Amazon Alexa GitHub Followers Counter

This post is part of the “Alexa” series. I will walk you through how to build an Amazon Alexa Skill with Node.js and AWS Lambda to get the number of followers & repositories on GitHub in real time.

Note: all the code is available on my GitHub.

Amazon Echo captures voice commands and sends them to the Alexa Skill, which converts them into structured text commands. A recognized command is sent to an AWS Lambda function that calls the GitHub API to get the response.

To get started, sign up for the Amazon Developer Console and create a new Alexa Skill:

The invocation name is what the user will say to trigger the skill. In our case it will be “github”.

Click on “Next” to bring up the Interaction Model page and use the intent schema below:
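For reference, a minimal schema in the classic Alexa intent-schema format might look like this (the repository-count intent names and the Language slot are my assumptions, based on the intents referenced later in the post):

  {
    "intents": [
      { "intent": "GetGithubFollowerCount" },
      { "intent": "GetGithubRepositoryCount" },
      {
        "intent": "GetGithubRepositoryCountByLanguage",
        "slots": [
          { "name": "Language", "type": "LIST_OF_LANGUAGES" }
        ]
      }
    ]
  }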

Intents map the user’s voice commands to services that our Alexa skill can address. For instance, here I defined an intent called GetGithubFollowerCount, which lines up with a portion of code in my Lambda function that I will cover in a bit.

The programming languages are defined as a Custom Slot Type, with the following possible values:
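The exact list is up to you; a handful of illustrative values for the custom slot type (which I called LIST_OF_LANGUAGES in the schema above):

  java
  javascript
  python
  go
  ruby
  php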

Now that our intents are defined, we need to link them to the phrases a user might say to trigger them. To do this, multiple sample sentences (utterances) are listed to make the interaction as natural as possible.
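As a sketch, the sample utterances (one per line, prefixed with the intent they map to; the wordings here are illustrative) could be:

  GetGithubFollowerCount how many followers do I have
  GetGithubFollowerCount for my followers count
  GetGithubRepositoryCount how many repositories do I have
  GetGithubRepositoryCountByLanguage how many {Language} repositories do I have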

Result:

Click on “Next” and you will move on to a page that allows us to use an ARN (Amazon Resource Name) to link to AWS Lambda.

Before that, let’s create our Lambda function. Log in to the AWS Management Console, then navigate to the Lambda Dashboard and create a new function from scratch:

Select Alexa Skills Kit as trigger:

I wrote the Lambda function in Node.js; the code isn’t actually that interesting, so I won’t go into it in much detail.

This function is fired when there is an incoming request from Alexa (a simplified sketch follows the list below). The function will:

  • Process the request
  • Call GitHub API
  • Send the response back to Alexa
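Here is a trimmed-down sketch of what such a handler can look like in Node.js. The intent names match the schema above; the environment variable name, response wording, and error handling are my own assumptions, and the real code in the repository is more complete:

  'use strict';

  const https = require('https');

  const USERNAME = process.env.GITHUB_USERNAME;

  // Call the public GitHub REST API and return the parsed JSON body.
  function githubGet(path) {
    const options = {
      hostname: 'api.github.com',
      path: path,
      headers: { 'User-Agent': 'alexa-github-skill' }
    };
    return new Promise((resolve, reject) => {
      https.get(options, (res) => {
        let body = '';
        res.on('data', (chunk) => body += chunk);
        res.on('end', () => resolve(JSON.parse(body)));
      }).on('error', reject);
    });
  }

  // Wrap plain text in the response envelope Alexa expects.
  function buildResponse(text) {
    return {
      version: '1.0',
      response: {
        outputSpeech: { type: 'PlainText', text: text },
        shouldEndSession: true
      }
    };
  }

  exports.handler = (event, context, callback) => {
    const intent = event.request.intent ? event.request.intent.name : '';

    if (intent === 'GetGithubFollowerCount') {
      githubGet('/users/' + USERNAME)
        .then((user) => callback(null, buildResponse('You have ' + user.followers + ' followers')))
        .catch(callback);
    } else if (intent === 'GetGithubRepositoryCount') {
      githubGet('/users/' + USERNAME)
        .then((user) => callback(null, buildResponse('You have ' + user.public_repos + ' repositories')))
        .catch(callback);
    } else {
      callback(null, buildResponse('Sorry, I did not understand that'));
    }
  };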

Create a zip file consisting of the function above and any dependencies (node_modules). Then, specify the .zip file name as your deployment package when you create the Lambda function. Don’t forget to set your GitHub username as an environment variable:
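Assuming the handler lives in index.js, this can be done from the terminal roughly as follows (the function name, IAM role, and the GITHUB_USERNAME variable name are placeholders of my own):

  zip -r deployment.zip index.js node_modules/

  aws lambda create-function \
      --function-name GithubCounter \
      --runtime nodejs6.10 \
      --handler index.handler \
      --zip-file fileb://deployment.zip \
      --role arn:aws:iam::ACCOUNT_ID:role/lambda_basic_execution \
      --environment Variables={GITHUB_USERNAME=your-github-username}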

Back in the Alexa Skill we need to link our Lambda function as our endpoint for the Alexa Skill:

That’s it! Let’s test it out using the Service Simulator by clicking on “Next”.

GetGithubFollowerCount Intent:

GetGithubRepositoryCount Intent:

GetGithubRepositoryCountByLanguage Intent:

You can see that the Lambda function responds as expected!

Test it now with Amazon Echo by saying "Alexa, ask GitHub for …":

 

Highly Available WordPress Blog

In this post you will learn the easiest way to deploy a fault-tolerant and scalable WordPress blog on AWS.

To get started, set up a Swarm cluster on AWS by following my tutorial “Setup Docker Swarm on AWS using Ansible & Terraform”:

Your cluster is now ready to use.

WordPress stores some files on disk (plugins, themes, images …), which causes a problem if you want to run your blog on a fleet of EC2 instances to handle high traffic:

That’s where AWS EFS (Elastic File System) comes into play. The idea is to mount a shared volume using the NFS protocol on each host to synchronize files between all nodes in the cluster.

So create an Elastic File System, and make sure to deploy it in the same VPC in which your Swarm cluster is created:

Once created, note the DNS name:

Now, mount Amazon EFS file systems via the NFSv4.1 protocol on each node:
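On each node, that boils down to something like this (replace the DNS name with the one you noted above; the /mnt/efs mount point is my own choice):

  sudo mkdir -p /mnt/efs
  sudo mount -t nfs4 -o nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2 \
      fs-xxxxxxxx.efs.us-east-1.amazonaws.com:/ /mnt/efs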

We can verify the mount with a plain df -h command:

WordPress requires a relational database. Create an Amazon Aurora database:

Wait a couple of minutes for the database to become available, then copy its endpoint:

To deploy the stack, I’m using the following Docker Compose file:
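Something along these lines (a simplified sketch: the image tags, domain name, database credentials, and Traefik flags/labels are placeholders, so the actual file in the repository may differ):

  version: "3"

  services:
    traefik:
      image: traefik:1.4
      command: --docker --docker.swarmmode --docker.watch --web
      ports:
        - "80:80"
        - "8080:8080"
      volumes:
        - /var/run/docker.sock:/var/run/docker.sock
      deploy:
        placement:
          constraints: [node.role == manager]

    wordpress:
      image: wordpress:latest
      environment:
        WORDPRESS_DB_HOST: <aurora-cluster-endpoint>:3306
        WORDPRESS_DB_USER: wordpress
        WORDPRESS_DB_PASSWORD: <password>
        WORDPRESS_DB_NAME: wordpress
      volumes:
        - /mnt/efs/wp-content:/var/www/html/wp-content
      deploy:
        labels:
          - "traefik.port=80"
          - "traefik.frontend.rule=Host:blog.example.com"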

In addition to the WordPress container, I’m using Traefik as a reverse proxy to be able to scale out my blog easily with the docker service scale command.

On your manager node, run the following command to deploy the stack:
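For example (the stack name wordpress is arbitrary):

  docker stack deploy --compose-file docker-compose.yml wordpress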

At this point, you should have a clean install of WordPress running.

Fire up your browser and point it to the manager’s public IP address; you will be greeted with the familiar WordPress setup page:

If you’re expecting high traffic, you can easily scale the WordPress service using the following command:
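For example, to run five replicas (the service name follows the <stack>_<service> convention, so adjust it to your stack):

  docker service scale wordpress_wordpress=5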

Verify Traefik Dashboard:

That’s how to build a scalable WordPress blog with no single point of failure.

Butler CLI: Export/Import Jenkins Plugins & Jobs

Not long ago, I had to migrate Jenkins jobs from an old server to a new one. That’s where Stack Overflow comes into play; below are the most upvoted answers I found:

In spite of their advantages, those solutions come with downsides, especially if you have a large number of jobs to move or no root access to the server. But guess what? I didn’t stop there. I came up with a CLI to make your life easier and export/import not only Jenkins jobs but also plugins like a boss.

To get started, find the appropriate package for your system and download it. For Linux:

Note: For Windows, make sure the butler binary is available on the PATH. This page contains instructions for setting the PATH on Windows.

Once done, verify that the installation worked by opening a new terminal session and checking if butler is available:

1 – Plugins Management

To export Jenkins plugins, you need to provide the URL of the source Jenkins instance:
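The invocation looks roughly like this (the --server, --username, and --password flag names are my assumptions, so double-check them against butler’s help output; the Jenkins URL and credentials are placeholders):

  butler plugins export --server http://old-jenkins.example.com:8080 --username admin --password admin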

As shown above, butler will dump the list of installed plugins to stdout, and a new file, plugins.txt, will be generated containing the installed Jenkins plugins as name and version pairs:

Now, to import the plugins to the new Jenkins instance, use the command below with the URL of the Jenkins target instance as an argument:
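Roughly (with the same caveat about flag names as above):

  butler plugins import --server http://new-jenkins.example.com:8080 --username admin --password admin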

Butler will install each plugin on the target Jenkins instance by issuing API calls.

2 – Jobs Management

To export Jenkins jobs, just provide the URL of the source Jenkins server:
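Again, something like (flag names as above):

  butler jobs export --server http://old-jenkins.example.com:8080 --username admin --password admin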

A new jobs/ directory will be created containing every job in Jenkins. Each job has its own configuration file, config.xml.

Now, to import the jobs to the new Jenkins instance, issue the following command:
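Something like:

  butler jobs import --server http://new-jenkins.example.com:8080 --username admin --password admin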

Butler will use the configuration files created earlier to issue API calls to the target Jenkins instance to create the jobs.

Once you are done, check Jenkins and you should see your jobs successfully created:

Hope it helps! The CLI is still in its early stages, so you are welcome to contribute to the project on GitHub.

Chatbot with Angular 5 & DialogFlow

I have seen many posts on how to build a chatbot for a wide variety of collaboration platforms such as Slack, Facebook Messenger, HipChat, and more. So I decided to build a chatbot from scratch to production using Angular’s latest release (v5.0.0), DialogFlow, and AWS.

Here is how our chatbot will look at the end of this post:

Note: This project is open source and can be found on my GitHub.

To get started, create a brand new Angular project using the Angular CLI:
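For example (the project name is up to you):

  ng new chatbot
  cd chatbot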

1 – Chatbot Architecture

We will split our chat app into separate components, and each component will be able to communicate with the others using attribute directives:

2 – Message Entity

Create an empty class by issuing the following command:
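Assuming we keep models under src/app/models (my choice, which also matches the @app/models alias used later):

  ng generate class models/message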

The message entity has 3 fields:
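As a sketch, with field names that are my assumptions (the text content of the message, the avatar to display, and a timestamp):

  export class Message {
    constructor(
      public content: string,
      public avatar: string,
      public timestamp?: Date
    ) {}
  }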

3 – Message List Component

Generate a new component:
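For example (nesting it under components/ is my own choice):

  ng generate component components/message-list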

Now we can display the messages by iterating over them:
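In message-list.component.html, something along these lines, assuming the message-item component built in the next section:

  <div class="chat-list">
    <app-message-item *ngFor="let message of messages" [message]="message"></app-message-item>
  </div>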

The code of this component should look like this:
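A minimal sketch, receiving the list of messages from the parent component via an @Input binding (this assumes an index.ts barrel under app/models that re-exports Message):

  import { Component, Input } from '@angular/core';
  import { Message } from '@app/models';

  @Component({
    selector: 'app-message-list',
    templateUrl: './message-list.component.html',
    styleUrls: ['./message-list.component.css']
  })
  export class MessageListComponent {
    @Input() messages: Message[];
  }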

Note the usage of @app/models instead of a relative path; this is called an alias. To be able to use aliases, we have to add the paths property to our tsconfig.json file like this:
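The relevant part of tsconfig.json (the rest of compilerOptions stays unchanged):

  {
    "compilerOptions": {
      "baseUrl": "src",
      "paths": {
        "@app/*": ["app/*"],
        "@env": ["environments/environment"]
      }
    }
  }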

Note: I also added @env alias to be able to access environment variables from anywhere in our application.

4 – Message Item Component

Let’s build a component that will simply display a message in our message list:

In message-item.component.html, add the following content:
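Along these lines (the markup and CSS class names are placeholders):

  <div class="message-item">
    <img class="avatar" [src]="message.avatar" alt="avatar" />
    <p class="content">{{ message.content }}</p>
  </div>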

The code of the component should look like this:
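A minimal sketch of the component class; it just receives the message to render via @Input:

  import { Component, Input } from '@angular/core';
  import { Message } from '@app/models';

  @Component({
    selector: 'app-message-item',
    templateUrl: './message-item.component.html',
    styleUrls: ['./message-item.component.css']
  })
  export class MessageItemComponent {
    @Input() message: Message;
  }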

5 – Message Form Component

Let’s build the form that will be responsible for sending the messages:

In the message-form.component.html, add the following content:
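Something like this (the two-way binding on message.content requires FormsModule to be imported in AppModule):

  <div class="message-form">
    <input type="text" [(ngModel)]="message.content" placeholder="Type a message..." />
    <button (click)="sendMessage()">Send</button>
  </div>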

And its corresponding TypeScript code in message-form.component.ts:
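A first version, before the DialogFlow integration (the avatar path is a placeholder; the messages array is received from the parent component):

  import { Component, Input } from '@angular/core';
  import { Message } from '@app/models';

  @Component({
    selector: 'app-message-form',
    templateUrl: './message-form.component.html',
    styleUrls: ['./message-form.component.css']
  })
  export class MessageFormComponent {
    @Input() messages: Message[];

    message: Message = new Message('', 'assets/images/user.png');

    sendMessage() {
      this.message.timestamp = new Date();
      this.messages.push(this.message);
      this.message = new Message('', 'assets/images/user.png');
    }
  }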

The sendMessage() method will be called each time the user clicks on the send button.

That’s it! Try it by yourself and you will see that it’s working.

At this point, you won’t get any response; that’s where NLP comes into play.

6 – NLP Backend

I chose to go with DialogFlow. Sign up for DialogFlow and create a new agent:

Then, enable the Small Talk feature to have a simple chitchat:

Note: You can easily change the responses to the questions if you don’t like them. To go further you can create your own Intents & Entities as described in my previous tutorial.

Copy the DialogFlow Client Access Token. It will be used for making queries.

Paste the token into your environments/environment.ts file:
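For example (the property name under which the token is stored is my own choice):

  export const environment = {
    production: false,
    dialogflow: {
      accessToken: 'YOUR_DIALOGFLOW_CLIENT_ACCESS_TOKEN'
    }
  };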

7 – DialogFlow Service

Generate a DialogFlow service that will call the DialogFlow API to retrieve the corresponding response:

It uses the DialogFlow API to process natural language in the form of text. Each API request includes the Authorization field in the HTTP header.
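A sketch of such a service against the DialogFlow V1 /query endpoint (HttpClientModule must be imported in AppModule and the service registered as a provider; the hard-coded sessionId is a placeholder):

  import { Injectable } from '@angular/core';
  import { HttpClient, HttpHeaders } from '@angular/common/http';
  import { map } from 'rxjs/operators';
  import { environment } from '@env';

  @Injectable()
  export class DialogflowService {
    private baseURL = 'https://api.dialogflow.com/v1/query?v=20150910';
    private token = environment.dialogflow.accessToken;

    constructor(private http: HttpClient) {}

    // Send the user's sentence to DialogFlow and return the bot's reply as a string.
    getResponse(query: string) {
      const data = { query: query, lang: 'en', sessionId: '12345' };
      const headers = new HttpHeaders().set('Authorization', `Bearer ${this.token}`);
      return this.http
        .post<any>(this.baseURL, data, { headers: headers })
        .pipe(map(res => res.result.fulfillment.speech));
    }
  }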

Update the sendMessage() method in MessageFormComponent as follows:
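Roughly like this (it assumes DialogflowService is injected in the component’s constructor; the bot avatar path is a placeholder):

  sendMessage() {
    this.message.timestamp = new Date();
    this.messages.push(this.message);

    // Ask DialogFlow for a reply and append it to the conversation.
    this.dialogflowService.getResponse(this.message.content).subscribe(speech => {
      this.messages.push(new Message(speech, 'assets/images/bot.png', new Date()));
    });

    this.message = new Message('', 'assets/images/user.png');
  }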

Finally, in app.component.html, copy and paste the following code to include the message-list and message-form directives:
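For instance (this assumes AppComponent declares a messages: Message[] = [] property that both components share):

  <div class="chat-container">
    <app-message-list [messages]="messages"></app-message-list>
    <app-message-form [messages]="messages"></app-message-form>
  </div>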

8 – Deployment to AWS

Generate production-grade artifacts:
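With the Angular CLI:

  ng build --prod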

The build artifacts will be stored in the dist/ directory.

Next, create an S3 bucket with the AWS CLI:
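For example (the bucket name and region are placeholders):

  aws s3 mb s3://angular5-chatbot --region us-east-1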

Upload the build artifacts to the bucket:
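For example, making the files publicly readable:

  aws s3 sync dist/ s3://angular5-chatbot --acl public-read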

Finally, turn on website hosting for your bucket:
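For example:

  aws s3 website s3://angular5-chatbot --index-document index.html --error-document index.html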

If you point your browser to the S3 bucket website URL, you should see the chatbot:

MySQL Monitoring with Telegraf, InfluxDB & Grafana

This post will walk you through each step of creating an interactive, real-time & dynamic dashboard to monitor your MySQL instances using Telegraf, InfluxDB & Grafana.

Start by enabling the MySQL input plugin in /etc/telegraf/telegraf.conf:
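The relevant sections look roughly like this (the credentials, hosts, and database name are placeholders):

  [[inputs.mysql]]
    servers = ["telegraf:password@tcp(127.0.0.1:3306)/"]

  [[outputs.influxdb]]
    urls = ["http://localhost:8086"]
    database = "telegraf"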

Once Telegraf is up and running, it’ll start collecting data and writing it to the InfluxDB database:
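You can double-check that measurements are arriving with the influx CLI (database name as configured above):

  influx -database 'telegraf' -execute 'SHOW MEASUREMENTS'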

Finally, point your browser to your Grafana URL and log in as the admin user. Choose ‘Data Sources’ from the menu, then click ‘Add new’ in the top bar.

Fill in the configuration details for the InfluxDB data source:

You can now import the dashboard.json file by opening the dashboard dropdown menu and clicking ‘Import’:

Note: Check my GitHub for more interactive & beautiful Grafana dashboards.