The advent of AI has made voice-enabled chatbots and voice assistants part of our lives. Think of Siri and Google Assistant on your phone, or Alexa and Google Home in your house. All of these systems are, at their core, voice-enabled chatbots.
Step by step guide to making a voice-enabled chatbot
We will use two tools to build the voice chatbot: Jovo and Dialogflow.
Jovo is an open-source framework for building voice apps for Amazon Alexa and Google Assistant. Dialogflow is a conversational interface builder by Google. Both tools are free to use, so you can try them out right away.
Let’s jump into the crux of the article now.
Install Jovo in your system by this command:
$ npm install -g jovo-cli
Next, create a new Jovo project with the following command:
$ jovo new <directory>
We will start with a simple HelloWorld example, so enter HelloWorld as the directory name.
Open your HelloWorld project in the Visual Studio Code editor. We will use a webhook for local development; you can check the server configuration in the index.js file.
You can understand the app logic, intents, and configuration by scanning through the app.js file.
Intents are triggers that perform certain actions on launch, initialization, and replies. You can find some basic intents set up in the handler: “LAUNCH”, “HelloWorldIntent”, and “MyNameIsIntent”.
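To make the handler logic concrete, here is a simplified sketch of the three intents. This is a stand-in, not the real Jovo API surface: in an actual Jovo v1 app these functions live inside app.setHandler({...}) and respond via this.ask() and this.tell(), whereas here each intent simply returns the response string used by the HelloWorld template.

```javascript
// Simplified sketch of the HelloWorld handler logic.
// In a real Jovo v1 app, these functions are registered with
// app.setHandler({...}) and respond via this.ask()/this.tell();
// here they just return the response strings.
const handler = {
  // 'LAUNCH' fires when the app is opened; the template routes it
  // straight to HelloWorldIntent.
  LAUNCH() {
    return this.HelloWorldIntent();
  },
  // Greets the user and asks for their name.
  HelloWorldIntent() {
    return "Hello World! What's your name?";
  },
  // Receives the 'name' input collected by the NLU platform.
  MyNameIsIntent(name) {
    return `Hey ${name}, nice to meet you!`;
  },
};

console.log(handler.LAUNCH());               // launch routes to HelloWorldIntent
console.log(handler.MyNameIsIntent("Alex")); // personalized reply
```

Routing LAUNCH into HelloWorldIntent keeps the greeting logic in one place, which is the same pattern the template follows.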
A Jovo v1 project contains a ‘models’ folder with an en-US.json file, which holds the language model. The language model provides the structure for creating platform-specific models, which pass the data on to NLU platforms such as Dialogflow and Alexa. It contains generic elements, such as intents and input types, plus a few platform-specific elements. You can find more information in the Jovo documentation.
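For orientation, a minimal en-US.json in this format looks roughly like the sketch below. The sample phrases and the AMAZON.US_FIRST_NAME / @sys.given-name type names are illustrative; check your generated file for the actual contents.

```json
{
  "invocation": "my test app",
  "intents": [
    {
      "name": "HelloWorldIntent",
      "phrases": ["hello", "say hello", "say hi"]
    },
    {
      "name": "MyNameIsIntent",
      "phrases": ["{name}", "my name is {name}", "i am {name}"],
      "inputs": [
        {
          "name": "name",
          "type": {
            "alexa": "AMAZON.US_FIRST_NAME",
            "dialogflow": "@sys.given-name"
          }
        }
      ]
    }
  ]
}
```

Note how the `type` object maps one logical input to a platform-specific slot type for each NLU platform.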
Passing the data to Dialogflow
To train and deploy your chatbot, you need to pass the data on to Dialogflow. First, you need to create and initialize the Google Action folder. You can find it under platforms > googleAction. The ‘googleAction’ folder contains the ‘dialogflow’ folder with all the files needed to deploy the agent to the Dialogflow Console.
$ jovo init googleAction
This will create an app.json file containing a ‘googleAction’ object. Here, ‘nlu’ stands for Natural Language Understanding and points to the NLU platform, Dialogflow.
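After running the command, the ‘googleAction’ object in app.json looks roughly like this. Treat the exact shape as a sketch; it may differ between Jovo versions:

```json
{
  "googleAction": {
    "nlu": {
      "name": "dialogflow"
    }
  }
}
```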
Next, you need to create the platform-specific files based on the app.json file and the models folder. Use this command:
$ jovo build
This will create a new folder, /platforms/googleAction/dialogflow. To use it in the Dialogflow Console, deploy it with this command:
$ jovo deploy
This will create a new file called dialogflow_agent.zip, which we will use in the Dialogflow Console. This makes it very easy to import the language model into your Dialogflow agent.
Creating Dialogflow agent
Now, log in to your Dialogflow Console and create a new agent by clicking the “Create Agent” button.
In Dialogflow, an agent is your NLU module, which converts inputs into actionable data. Name your agent “HelloWorldAgent” and click the “Create” button.
The “HelloWorldAgent” agent will be created with two intents, “Default Fallback Intent” and “Default Welcome Intent”. These are two basic intents for the welcome and fallback actions.
Next, import your dialogflow_agent.zip file into Dialogflow agent.
- Click the settings (gear) icon next to your agent’s name > Export and Import > RESTORE FROM ZIP > SELECT FILE
- Upload the dialogflow_agent.zip file from Jovo.
- Click Restore, then Done after a successful restore.
This will create new intents based on the imported zip file. The “HelloWorldIntent” and “MyNameIsIntent” will start showing up here.
The webhook URL will also be filled in automatically in the Fulfillment section. Fulfillment allows data to be exchanged with any service or backend over webhooks: it gathers information and passes it on to the service or backend dynamically.
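To give a sense of what travels over the webhook, a Dialogflow (v1-era) fulfillment response is a small JSON payload along these lines. The field names are illustrative; consult the Dialogflow documentation for your API version:

```json
{
  "speech": "Hey Alex, nice to meet you!",
  "displayText": "Hey Alex, nice to meet you!"
}
```

Jovo builds this payload for you from the this.tell()/this.ask() calls in your handler, so you rarely construct it by hand.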
Integrate with Google Assistant
Next, we need to integrate with Google Assistant to exchange voice messages with the newly created voice chatbot. Go to Integrations > Google Assistant > Integration Settings.
You will find everything from the imported file already present there; this is the power Dialogflow brings. Click Test > Continue to test your voice chatbot. This will take you to the Actions on Google console.
You will find your new Actions on Google project, HelloWorldAgent, here. Your voice chatbot is now almost ready for testing. The Actions on Google platform needs to communicate with your local environment, so run the local server and create a link to it with the command below:
$ jovo run
Test your voice-enabled chatbot
As a best practice, first test your voice chatbot using Dialogflow and then using the built-in simulator of the Actions on Google Console.
Type “say hi” > default response: <speak>Hello World! What’s your name?</speak>
Type “Alex” > default response: <speak>Hey Alex, nice to meet you!</speak>
Now, test this using the Google simulator. Click “Talk to my test app”, or just click the input field and press Enter. Respond using the mic button to input your answers. The bot will reply according to your responses: “Hey name, nice to meet you!”.
The sample conversation goes like this:
Google: “Alright. Here’s the test version of my test app. Hello World! What’s your name?”
You: “Alex”
Google: “Hey Alex, nice to meet you!”
That is it: your voice-enabled chatbot is ready. You can now create customized intents and responses to build a more versatile voice chatbot.
At Kommunicate, we are building a world-beating customer support solution to empower the new era of customer support. We would love to have you on board for a first-hand experience of Kommunicate. You can sign up for free and start delighting your customers right away.