Google acquired API.ai and also released their own home-baked Cloud Natural Language API; Amazon introduced Amazon Lex, a conversational API; and Facebook’s Wit.ai keeps updating its Stories, making them even better.
More and more developers are coming up with the idea to make their own bot for Slack, Telegram, Skype, Kik, Messenger and, probably, several other platforms that might pop up over the next couple of months.
We also focus on the platforms that we can use for our bots today: LUIS from Microsoft, Wit.ai from Facebook, API.ai from Google, Watson from IBM, and the Alexa Skills Kit and Lex from Amazon.
But if we are curious enough, we can also ask Google Keyword Planner for other related ideas and extend our list by about 800 phrases related to the search term “asian food near me”.
Let’s say, however, that we want to create a curated list of Asian food places; in this case we can see that the results are still highly relevant to the service that we want to provide to the users.
An Intent is the core concept in building the conversational UI in chat systems, so the first thing that we can do with the incoming message from the user is to understand its Intent.
Along with the Intent, it’s necessary to extract the parameters of actions from the phrase.
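As a toy illustration of these two steps, here is a rule-based sketch (the intent names, patterns, and parameter logic are all hypothetical; real platforms such as LUIS or Wit.ai use trained statistical models rather than regular expressions):

```python
import re

# Hypothetical rule-based understanding: map an utterance to an Intent,
# then pull out action parameters from the same phrase.
INTENT_PATTERNS = {
    "BookFlight": re.compile(r"\b(flight|fly)\b", re.IGNORECASE),
    "FindFood": re.compile(r"\b(food|restaurant)\b", re.IGNORECASE),
}

def understand(utterance):
    """Return the detected intent and any extracted parameters."""
    intent = next(
        (name for name, pat in INTENT_PATTERNS.items() if pat.search(utterance)),
        "None",
    )
    # Parameter extraction: pull a destination out of "to <City>" phrases.
    match = re.search(r"\bto ([A-Z][a-z]+)", utterance)
    params = {"destination": match.group(1)} if match else {}
    return intent, params

print(understand("I need a flight to London"))
# → ('BookFlight', {'destination': 'London'})
```

The point is only the shape of the result: one intent label plus a dictionary of parameters, which is what every platform below returns in one form or another.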
Examples of entity types that are commonly supported in language understanding systems are dates and times, numbers, durations, locations, and contact details such as email addresses and phone numbers.
An example of one session is booking a flight: you start with ‘I need a flight to London’ (the intent), then, through subsequent questions and answers, you provide the details, get information about the booked flight, and finish the interaction.
We can think about context as a shared basket that we carry through the whole session and use as short-term memory.
Let’s say that after a user expression representing the BookFlight intent, we start a new context, BookFlightContext, which indicates that we are currently collecting all the parameters needed for the booking.
After the question about flight dates, the user decides to request info from the calendar, thus expressing a new intent, CalendarEvents, and starting a new context, CalendarEventsContext, that saves the state of the user interaction during the dialogue about calendar events.
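The “shared basket” idea above can be sketched as a stack of contexts per session. The context names come from the example in the text; the data structure itself is an assumption, not any platform’s actual API:

```python
# Minimal sketch of session context handling: each context is a named
# dict of collected parameters, and the newest context is the active one.
class Session:
    def __init__(self):
        self.contexts = []  # stack: the last pushed context is active

    def push(self, name):
        self.contexts.append({"name": name, "params": {}})

    def active(self):
        return self.contexts[-1] if self.contexts else None

    def remember(self, key, value):
        self.active()["params"][key] = value

session = Session()
session.push("BookFlightContext")          # user: "I need a flight to London"
session.remember("destination", "London")
session.push("CalendarEventsContext")      # user switches to calendar questions
# After the calendar digression, drop back to continue booking the flight.
session.contexts.pop()
print(session.active()["name"])  # → BookFlightContext
```

Stacking contexts this way lets the flight parameters survive the calendar digression, which is exactly the short-term memory behavior described above.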
LUIS was introduced during the Microsoft Build 2016 event in San Francisco, together with the Microsoft Bot Framework and the Skype Developer Platform, which can be used to create Skype bots.
LUIS provides Entities that you can define and then teach the LUIS system to recognize in free-text expressions.
Moving closer to automatic language understanding and acting on completed Intents with their parameters, there is another feature called Action Fulfilment. It is currently available only in preview mode, but you can already play with it and plan for the future.
To train the model with different utterances, LUIS provides a web interface where we can type an expression, see the output from the model, and make changes to labels or assign new intents.
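To give a feel for what the trained model hands back to your code, here is a sketch of parsing a LUIS-style JSON response. The shape follows the LUIS v2 response format as documented at the time of writing and may differ in later versions; the sample query, intent, and entity values are made up:

```python
import json

# Hypothetical response body, shaped like LUIS's v2 endpoint output.
raw = """
{
  "query": "I need a flight to London",
  "topScoringIntent": {"intent": "BookFlight", "score": 0.97},
  "entities": [
    {"entity": "london", "type": "Location", "startIndex": 19, "endIndex": 24}
  ]
}
"""

response = json.loads(raw)
intent = response["topScoringIntent"]["intent"]
params = {e["type"]: e["entity"] for e in response["entities"]}
print(intent, params)  # → BookFlight {'Location': 'london'}
```

Again, the output reduces to an intent label plus a parameter dictionary, which your bot logic then acts on.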
During the F8 conference in April 2016, Facebook introduced a major update to their platform and rolled out their own version of Bot Engine, which extends the previous intent-oriented approach to a story-oriented one.
Under the hood, during the logic implementation, you still work extensively with the context and need to do all tasks required to maintain the conversation’s correct state.
Effectively, the Converse API will resolve the user utterance and the given state into the next state/action of your system, thus giving you the tool to build a Finite State Machine that describes sequences of speech acts.
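Such a finite state machine can be sketched directly as a transition table from (current state, resolved intent) to the next state, which is essentially what a Converse-style API resolves for you. All state and intent names below are hypothetical:

```python
# Hypothetical dialogue FSM: (current_state, intent) -> next_state.
TRANSITIONS = {
    ("start", "BookFlight"): "ask_destination",
    ("ask_destination", "GiveDestination"): "ask_dates",
    ("ask_dates", "GiveDates"): "confirm",
    ("confirm", "Yes"): "done",
}

def step(state, intent):
    """Resolve the user's intent and the current state into the next state."""
    return TRANSITIONS.get((state, intent), state)  # unknown intent: stay put

state = "start"
for intent in ["BookFlight", "GiveDestination", "GiveDates", "Yes"]:
    state = step(state, intent)
print(state)  # → done
```

Each row of the table corresponds to one speech act in the sequence, so adding a new branch to the conversation is just adding a new transition.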
Open projects can be forked, and you can create your own version of the model on top of existing community projects.
Since the release of the first version of this article, they’ve made a better builder for Stories and added support for Quick Replies, Branches (if/else), and Jumps in Stories, which is great for describing complex flows.
This means that your new Agent on the system can recognize these Intents without any additional training, and it even provides you with response text that you can use as the next thing your bot will say.
When you create an Intent, you directly define which Context the Intent should expect and which it should produce as a result.
There is also an embedded integration mode available, so you can have an agent that works without an internet connection and is independent of any API. This looks like a decent solution for building sophisticated conversational interfaces.
Good for the founders, but it means the community has lost a powerful independent NLP service, although a couple of other startups are emerging from stealth mode.
At first glance, this looks like the simplest language processing algorithm available among all other systems, but it’s deployed, tested and exposed to more than 3 million Amazon Alexa users who are already using conversational interfaces on a daily basis.
It feels like they are still working on their own version of machine learning in order to simplify the work needed for model training.
UPDATE from Dec 1, 2016: Yesterday, Amazon revealed Amazon Lex, a conversational interface API with NLP features and tight integration with Amazon services such as Lambda, DynamoDB, SNS/SES, and others.
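The tight Lambda integration means that slot filling and fulfillment can live in a small function. The sketch below follows the Lambda event/response shape Lex documented at launch; the OrderFood intent and its slots are made up for illustration:

```python
# Sketch of an AWS Lambda fulfillment handler for Amazon Lex (v1-era
# event/response shape; OrderFood and its slots are hypothetical).
def lambda_handler(event, context=None):
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"]["slots"]
    if intent == "OrderFood":
        msg = "Ordering {Dish} from {Place}.".format(**slots)
    else:
        msg = "Sorry, I can't handle that yet."
    # Lex expects a dialogAction telling it how to continue the conversation.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": msg},
        }
    }

event = {"currentIntent": {"name": "OrderFood",
                           "slots": {"Dish": "pad thai", "Place": "Bangkok Cafe"}}}
print(lambda_handler(event)["dialogAction"]["message"]["content"])
# → Ordering pad thai from Bangkok Cafe.
```

Lex itself handles the speech recognition and slot elicitation; your Lambda only sees the resolved intent and slots.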
So the good news is that IBM moved the technology behind Watson into the cloud and released a set of APIs that you can use in your own conversational applications.
There are a lot of building blocks that you can use in your application, but you probably will spend a fair amount of time integrating them into one solution.
We think that IBM’s solution is the ideal choice for enterprises that want to be 100% sure of their API provider.
For a recent IBM Watson demonstration you can watch a fireside chat with Dr
You just don’t want to be in a situation where a company shuts down its service and you are completely unprepared.
We also often use Chatfuel, as it’s an easy-to-use builder of conversational flows with powerful JSON API integrations.
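Chatfuel’s JSON API plugin works by calling your endpoint and rendering whatever messages it returns; the response shape below follows Chatfuel’s documented format at the time of writing, with a made-up greeting as the payload:

```python
import json

# Response body for Chatfuel's JSON API plugin: a list of messages
# that the bot will send back to the user (the text is made up).
body = {"messages": [{"text": "Welcome! What would you like to eat today?"}]}
print(json.dumps(body))
```

Your web service simply returns this JSON, and Chatfuel takes care of delivering the messages inside Messenger.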
The Alexa Skills Kit is proprietary to Amazon Echo devices, so you can’t use it for language processing with arbitrary bots on Slack or Facebook Messenger, but it is ideal for smart home bots that augment your kitchen or living room environment and are built specifically for Alexa.
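A custom Alexa skill is essentially a handler for the JSON requests the Alexa service sends to your backend. The sketch below follows the request/response shape of the Alexa Skills Kit custom-skill interface; the TurnOnLights intent and Room slot are hypothetical smart-home examples:

```python
# Sketch of an Alexa Skills Kit custom-skill handler (smart-home-flavored
# example; the TurnOnLights intent and Room slot are hypothetical).
def handle_request(event):
    request = event["request"]
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "TurnOnLights":
        room = request["intent"]["slots"]["Room"]["value"]
        text = "Turning on the lights in the " + room + "."
    else:
        text = "Sorry, I didn't get that."
    # Alexa speaks the outputSpeech text back to the user.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

event = {"request": {"type": "IntentRequest",
                     "intent": {"name": "TurnOnLights",
                                "slots": {"Room": {"value": "kitchen"}}}}}
print(handle_request(event)["response"]["outputSpeech"]["text"])
# → Turning on the lights in the kitchen.
```

As with Lex, Alexa does the speech-to-intent work on Amazon’s side; your code only deals with resolved intents and slot values.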
Since the first version of this article in May 2016, Google has launched its own NLP service, the Cloud Natural Language API, which is packed with text analytics, content classification, relationship graphs, and deep learning models that you can use for your chatbot needs.