Your first Google Assistant skill

How to build a conversational app for Google Home or Google Assistant

Smart home speakers, assistant platforms and cross-device solutions mean you can talk to your smartwatch and see the result on your TV or your car's dashboard. Personal assistants and VUIs are slowly appearing around us, and it's pretty likely that they will make our lives much easier.
Because of my strong belief that natural language will be the next human-machine interface, I decided to start a new blog post series and build open source code showing how to create a new kind of app: conversation-oriented, device-independent assistant skills that give us freedom in the platform and hardware we use – and bring the most natural interface for humans: voice.

WaterLog assistant skill

In this post we'll start with the simplest implementation of an assistant skill: WaterLog, an app for logging the water we drink. We'll use Dialogflow for handling the conversation, and Firebase Cloud Functions and Realtime Database for the app backend.

The starting point is the file which also serves as the implementation of the HTTP trigger for our Firebase Cloud Function (an endpoint, in short 🙂).
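The original file isn't reproduced in this excerpt, so here is a minimal sketch of what such a trigger can look like, assuming the v1 actions-on-google client library; the exported function name, local module paths and the second intent are illustrative, not copied from the repository:

```javascript
// index.js – illustrative sketch of the HTTP trigger (Cloud Function)
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const DialogflowApp = require('actions-on-google').DialogflowApp;
const Conversation = require('./conversation'); // assumed local module
const WaterLog = require('./water-log');        // assumed local module

admin.initializeApp(functions.config().firebase);

exports.waterLog = functions.https.onRequest((request, response) => {
  const dialogflowApp = new DialogflowApp({request, response});
  const conversation = new Conversation(dialogflowApp, new WaterLog());

  // Map each Dialogflow intent (action name) to its fulfillment function
  const actionMap = new Map();
  actionMap.set('log_water', () => conversation.actionLogWater());
  actionMap.set('get_logged_water', () => conversation.actionGetLoggedWater()); // illustrative

  dialogflowApp.handleRequest(actionMap);
});
```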

In our Cloud Function we define a mapping of intents to the functions which need to be called as fulfillment for the conversation.
As an example, let's see conversation.actionLogWater() (the fulfillment for the log_water intent):
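The method itself isn't shown in this excerpt, so here is a sketch of what it can look like, assuming the Conversation class keeps references to the DialogflowApp instance and a waterLog helper, and that the Dialogflow parameter is named water_volume (all of these names are illustrative):

```javascript
// Illustrative sketch of Conversation.actionLogWater()
actionLogWater() {
  // 1. Argument extracted by Dialogflow from the utterance, e.g. for
  //    "Log 500ml of water" we get {"amount": 500, "unit": "ml"}
  const loggedWater = this.dialogflowApp.getArgument('water_volume');
  const userId = this.dialogflowApp.getUser().userId; // anonymous user id

  // 2. Save the entry in the Firebase Realtime Database ...
  return this.waterLog.saveLoggedWater(userId, loggedWater)
    // 3. ... then read back today's total and respond; tell() closes
    //    the conversation
    .then(() => this.waterLog.getLoggedWaterToday(userId))
    .then(totalMl => this.dialogflowApp.tell(
      `Ok, I've logged ${loggedWater.amount}${loggedWater.unit} of water. ` +
      `In total you have drunk ${totalMl}ml today.`
    ));
}
```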

    Here is what happens:

1. The app gets the argument extracted from the utterance by Dialogflow. For the input Log 500ml of water we'll get an object {"amount": 500, "unit": "ml"}.
2. Through the waterLog implementation, the app saves this data into the Firebase Realtime Database.
3. At the end, the app gets the sum of logged water and passes it as a response to the dialogflowApp object. The tell() function responds to the user and closes the conversation (docs).

See the full code of the Conversation class: conversation.js.

The rest of the code doesn't do anything much more interesting. The Conversation class is responsible for handling user input. WaterLog saves and retrieves data about logged water from the Firebase Realtime Database. UserManager adds some helpers for (anonymous) user handling.
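To give an idea of how the Realtime Database part can be wired up, here is a rough sketch of a WaterLog-like class built on firebase-admin; the database paths, field names and the unit conversion are illustrative, the real implementation lives in the repository:

```javascript
// Illustrative sketch of a WaterLog-like class (firebase-admin is assumed
// to be initialized in index.js)
const admin = require('firebase-admin');

class WaterLog {
  saveLoggedWater(userId, loggedWater) {
    // Push a new entry under the user's node, normalized to milliliters
    return admin.database()
      .ref(`users/${userId}/loggedWater`)
      .push({
        milliliters: toMilliliters(loggedWater),
        timestamp: admin.database.ServerValue.TIMESTAMP
      });
  }

  getLoggedWaterToday(userId) {
    // Sum all of the user's entries (day filtering is omitted in this sketch)
    return admin.database()
      .ref(`users/${userId}/loggedWater`)
      .once('value')
      .then(snapshot => {
        let total = 0;
        snapshot.forEach(entry => { total += entry.val().milliliters; });
        return total;
      });
  }
}

// Naive conversion helper, only for this sketch
function toMilliliters(loggedWater) {
  return loggedWater.unit === 'l' ? loggedWater.amount * 1000 : loggedWater.amount;
}

module.exports = WaterLog;
```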

    Unit testing

While this paragraph isn't directly connected with assistant apps or voice interfaces, I believe unit testing is still extremely important in every kind of software we build. Just imagine that every time you change something in the code, you need to deploy the function and start a conversation with your app. In the WaterLog app this was relatively simple (but it still took at least tens of deployments). In bigger apps it will be critical to have unit tests – they will speed up development time by an order of magnitude.

All unit tests for our classes can be found under the functions/test/ directory. While the tests in this project aren't extremely sophisticated (they use the sinon.js and chai libraries without any additional extensions), they still helped a lot with getting to production in a relatively short time.
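As an illustration, a test in this spirit could stub out the DialogflowApp and the database wrapper with sinon and assert on the interactions with chai; the Conversation constructor shape, the module path and the mocha-style describe/it runner are assumptions of this sketch:

```javascript
// functions/test/conversation-test.js – illustrative example
const chai = require('chai');
const sinon = require('sinon');
const expect = chai.expect;
const Conversation = require('../conversation'); // assumed module path

describe('Conversation', () => {
  it('saves water and responds with the daily total', () => {
    // Stub the Assistant client and the database wrapper
    const dialogflowApp = {
      getArgument: sinon.stub().returns({amount: 500, unit: 'ml'}),
      getUser: sinon.stub().returns({userId: 'user-1'}),
      tell: sinon.spy()
    };
    const waterLog = {
      saveLoggedWater: sinon.stub().resolves(),
      getLoggedWaterToday: sinon.stub().resolves(1500)
    };

    const conversation = new Conversation(dialogflowApp, waterLog);
    return conversation.actionLogWater().then(() => {
      expect(waterLog.saveLoggedWater.calledOnce).to.equal(true);
      expect(dialogflowApp.tell.calledOnce).to.equal(true);
    });
  });
});
```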

Here is the output from $ npm test:

    Source code

The full source code of the WaterLog app, with:

    • Firebase Cloud Functions
    • Dialogflow agent configuration
    • Assets required for app distribution

can be found on GitHub:

    frogermcs/WaterLog-assistant-app

    Thanks for reading! 😊


