Using Webhooks With API.AI

Josh Feinberg · Published in Dev Tutorials · May 12, 2017

Hopefully you’ve read my last entry about setting up a conversation action. If not, head back there and get started; or, if you just want to learn how to use webhooks with API.AI, this is the place to start.

A webhook lets you link your conversation action to an API and have that API provide the results for the conversation. The requirements for how the data going in and out should look can be seen here, but we’re going to use a helpful tool provided by Google called the ApiAiAssistant.
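For context, the data going back to API.AI is just a small JSON object. Here is a minimal sketch of what an API.AI v1 webhook response looks like on the wire (the buildWebhookResponse helper and the source label are hypothetical; the ApiAiAssistant builds this for us):

```javascript
// Minimal sketch of the JSON body an API.AI v1 webhook sends back.
// "speech" is what the assistant says aloud; "displayText" is shown on
// text surfaces like the chat simulator.
function buildWebhookResponse(speech) {
  return {
    speech: speech,
    displayText: speech,
    source: 'cta-bus-webhook' // hypothetical label identifying our service
  };
}

console.log(JSON.stringify(buildWebhookResponse('The bus is 9 minutes away.')));
```

Knowing this shape makes it much easier to debug the raw requests and responses in the Google Cloud Functions logs later.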

The ApiAiAssistant is a Node.js library that simplifies reading data from and sending data back to API.AI. It provides helpful tools for parsing arguments and telling the action how to respond.

First we need to set up a few things. As a Chicagoan, I am going to use the Chicago Transit Authority’s Bus Tracker API. This tutorial should work with minor tweaks for other transit systems, but my code will use the CTA. Be sure to take note of the API key the CTA provides after you sign up, as you will need it for the code.

Next up, we need somewhere to host our script. I found that the Google Cloud Platform works perfectly for this, specifically the Google Cloud Functions feature. It allows us to upload our Node script and get up to 2 million requests for free. So let’s go ahead and set up our endpoint.

First, pick any name for your route, and if you would like, you can change the region. The default 256 MB of memory should be more than enough, and the 60-second timeout is perfect. (Note: the Google Home will only wait about 5 seconds for a response, so we need to make sure our requests are fast.) Our trigger needs to be an HTTP trigger, and you should be given a URL. Create any bucket and set the function to execute to “findBus”.

For the source, I’ve provided both our index.js and package.json files in a gist. (Please note: I am not a JavaScript developer, so feel free to tell me how to improve these.) All you need to do is input your apiKey and a default bus stop and it should be good to go. For now we will hardcode the bus stop, but in our next lesson we’ll use the device’s location to find the closest bus stop. Now let’s take a look at these files.

In the responseHandler function we set up the call to the CTA’s API. Here we use the default http package provided by Node to fetch and parse the JSON response. We also get a first look at our ApiAiAssistant when grabbing our arguments:

var busNumber = assistant.getArgument('busnumber');
var busDirection = assistant.getArgument('direction').toLowerCase();

Next we have a couple of helper functions. parseData does exactly that: it reads in the JSON, returning that there are no buses found for any errors or when the API simply does not return any predictions for the next bus arrival. findBus loops through the results until it finds a bus that is heading in the right direction. Here we see two more uses of the ApiAiAssistant.
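Roughly, the two helpers look like this. This is a simplified sketch, not the gist’s exact code, and it assumes the CTA’s response shape (a bustime-response wrapper, a prd array of predictions, and an rtdir direction field) from the Bus Tracker docs:

```javascript
// Sketch of parseData: unwrap the CTA response and treat errors or an
// empty prediction list as "no buses found".
function parseData(response) {
  var bustime = response['bustime-response'];
  if (!bustime || bustime.error || !bustime.prd || bustime.prd.length === 0) {
    return null;
  }
  return bustime.prd;
}

// Sketch of findBus: return the first prediction heading in the
// requested direction (compared case-insensitively), or null.
function findBus(predictions, busDirection) {
  for (var i = 0; i < predictions.length; i++) {
    if (predictions[i].rtdir.toLowerCase() === busDirection) {
      return predictions[i];
    }
  }
  return null;
}
```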

The first is tell:

assistant.tell("Quick, the " + busNumber + " is here!");

Tell is used when you want to immediately close the microphone after the assistant is done speaking. Since the bus is here, the person should run out, so we just close the microphone.

The other function is ask:

assistant.ask("<speak>The next <say-as interpret-as=\"cardinal\">" + busNumber + "</say-as> will arrive in " + prd[i].prdctdn + " minutes. Would you like to find another?</speak>");

Ask is used when you want the microphone to be left open. Here we allow the user to find another bus if they want. We also get a look at how Speech Synthesis Markup Language (SSML) works. It gives the assistant cues on how to pronounce certain items, in this case making sure the bus number is read as a number.

Finally, we set up which actions our node.js script can handle with this:

const actionMap = new Map();
actionMap.set('bus-requested', responseHandler);
assistant.handleRequest(actionMap);

This just tells API.AI that we can handle the action ‘bus-requested’ with the responseHandler function.
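Under the hood, handleRequest is doing roughly this lookup. This is a simplified sketch of the dispatch idea, not the library’s actual code, and responseHandler here is a stand-in for our real handler:

```javascript
// Simplified sketch of what ApiAiAssistant.handleRequest does with the
// action map: find the handler registered for the incoming action name
// and invoke it.
function dispatch(actionMap, actionName, assistant) {
  var handler = actionMap.get(actionName);
  if (!handler) {
    throw new Error('No handler registered for action: ' + actionName);
  }
  return handler(assistant);
}

// Stand-in for our real responseHandler.
function responseHandler(assistant) {
  return 'handled';
}

var actionMap = new Map();
actionMap.set('bus-requested', responseHandler);
```

This is why the action name set here has to match exactly what we configure on the intent in API.AI later.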

That’s it for our script, so take note of the URL and let’s go back to our API.AI Agent.

The first step in API.AI is to enable our webhook. To do this, we go to the Fulfillment page and enable the webhook. Then just copy in the URL that you were handed during the Google Cloud Functions creation.

Now go back to our intent “Lookup Bus”. Here we are going to enable our webhook by going to the Fulfillment section and checking “Use webhook”. We also want to make sure we uncheck “End conversation”, as our webhook will decide when to close out now.

We also need to add our action ‘bus-requested’ to the intent. Last up is setting the default response for when the webhook fails. Choose a string or just leave the default echo that we set up in the first lesson.

Since we do sometimes leave the microphone open with the ask command, we need to have a way for the user to leave. Create another intent called “Goodbye” and set up the goodbye commands and response. Here is what mine looks like:

Goodbye Intent

Now let’s go test this out. Open the web simulator back up and request a bus.

Only 9 minutes away!

And because it’s more fun on the Google Home:

Next up we’ll go into user permissions so we can figure out the closest bus stop.

Edit: Part three where we add in user permissions is now available here!
