You can structure these modules to flow in any way you like, ranging from free-form to sequential. The Bot Framework SDK provides several libraries that allow you to construct any conversational flow your bot needs. For example, the prompts library lets you ask users for input, the waterfall library lets you define a sequence of question/answer pairs, and the dialog control library lets you modularize your conversational flow logic. All of these libraries are tied together through a dialogs object. Let's take a closer look at how modules are implemented as dialogs to design and manage conversation flows, and see how that flow resembles traditional application flow.
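To make this concrete, here is a minimal sketch of a waterfall dialog built from those pieces, assuming the Node.js botbuilder-dialogs package; the dialog name, prompt ids, and the two questions are illustrative placeholders rather than anything prescribed by the SDK:

```typescript
// Minimal waterfall sketch: a TextPrompt asks two questions in sequence,
// then a final step summarizes the answers.
import {
  ComponentDialog,
  DialogTurnResult,
  TextPrompt,
  WaterfallDialog,
  WaterfallStepContext,
} from 'botbuilder-dialogs';

const TEXT_PROMPT = 'textPrompt';   // placeholder prompt id
const WATERFALL = 'waterfall';      // placeholder dialog id

export class GreetingDialog extends ComponentDialog {
  constructor() {
    super('greetingDialog');
    this.addDialog(new TextPrompt(TEXT_PROMPT));
    this.addDialog(new WaterfallDialog(WATERFALL, [
      this.askName.bind(this),
      this.askCity.bind(this),
      this.summarize.bind(this),
    ]));
    this.initialDialogId = WATERFALL;
  }

  private async askName(step: WaterfallStepContext): Promise<DialogTurnResult> {
    // Step 1: prompt the user for their name.
    return step.prompt(TEXT_PROMPT, 'What is your name?');
  }

  private async askCity(step: WaterfallStepContext): Promise<DialogTurnResult> {
    // The previous prompt's answer arrives in step.result; stash it for later.
    const values = step.values as { name?: string };
    values.name = step.result as string;
    return step.prompt(TEXT_PROMPT, 'Which city are you in?');
  }

  private async summarize(step: WaterfallStepContext): Promise<DialogTurnResult> {
    const values = step.values as { name?: string };
    await step.context.sendActivity(
      `Thanks ${values.name}, I'll look for help near ${step.result}.`);
    return step.endDialog();
  }
}
```

Each step receives the previous step's answer in step.result, which is what gives the waterfall its question/answer-pair shape.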

User message. Once authenticated, the user sends a message to the bot. The bot reads the message and routes it to a natural language understanding service such as LUIS. This step extracts the intents (what the user wants to do) and entities (what things the user is interested in). The bot then builds a query and passes it to a knowledge service, such as Azure Search for document retrieval, QnA Maker for FAQs, or a custom knowledge base. The bot uses these results to construct a response. To give the best result for a given query, the bot might make several back-and-forth calls to these remote services.
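As a rough illustration of that routing step, here is a sketch assuming the Node.js botbuilder-ai package; the application IDs, keys, hosts, and the "FAQ" intent name are placeholders you would replace with your own resources:

```typescript
// Route an incoming message: LUIS extracts intent/entities, and FAQ-style
// questions are answered from a QnA Maker knowledge base.
import { TurnContext } from 'botbuilder';
import { LuisRecognizer, QnAMaker } from 'botbuilder-ai';

const luis = new LuisRecognizer({
  applicationId: '<LUIS_APP_ID>',                                  // placeholder
  endpointKey: '<LUIS_KEY>',                                       // placeholder
  endpoint: 'https://<region>.api.cognitive.microsoft.com',        // placeholder
});

const faq = new QnAMaker({
  knowledgeBaseId: '<KB_ID>',                                      // placeholder
  endpointKey: '<QNA_KEY>',                                        // placeholder
  host: 'https://<resource>.azurewebsites.net/qnamaker',           // placeholder
});

export async function routeMessage(context: TurnContext): Promise<void> {
  // 1. Ask LUIS what the user wants (intent) and what they mention (entities).
  const recognized = await luis.recognize(context);
  const intent = LuisRecognizer.topIntent(recognized);

  if (intent === 'FAQ') {
    // 2a. FAQ-style questions go to the QnA Maker knowledge base.
    const answers = await faq.getAnswers(context);
    if (answers.length > 0) {
      await context.sendActivity(answers[0].answer);
      return;
    }
  }

  // 2b. Otherwise fall back to whatever knowledge service the bot uses
  //     (e.g. Azure Search), possibly making several calls to refine the result.
  await context.sendActivity(`I understood the intent "${intent}", let me look that up...`);
}
```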


Not integrated. This goes hand-in-hand with contextual knowledge: chatbots often suffer from “death by data silo,” where their access to data is limited. If a chatbot is “chatting with” a customer, it not only needs the customer’s contextual data but also access to every place where the answer to the customer’s question may reside: the product documentation site, the customer community, and the various other websites where that answer can live.
Whilst the payout wasn't huge in the early days of Amazon, those who got in early are now seeing huge rewards, with 38% of shoppers starting their buying journey on Amazon (source), making it the number one retail search engine. Some studies suggest that Amazon is responsible for 80% of e-commerce growth for publicly traded web retailers (source).
2. Flow-based: these work through user interaction with buttons and text. If you have used Matthew’s chatbot, that is a flow-based chatbot. The chatbot asks a question and then offers options in the form of buttons (Matthew’s offers a yes/no choice). These are more limited, but they let you tightly steer the conversation and make sure your users don’t stray off the path (see the sketch below).
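For example, the yes/no exchange described above could be sketched with a ChoicePrompt, again assuming the Node.js botbuilder-dialogs package; the dialog ids, question, and replies are made up for illustration:

```typescript
// Flow-based yes/no exchange: a ChoicePrompt offers buttons, and the next
// step branches on which one the user picked.
import {
  ChoicePrompt,
  ComponentDialog,
  DialogTurnResult,
  FoundChoice,
  WaterfallDialog,
  WaterfallStepContext,
} from 'botbuilder-dialogs';

const CHOICE_PROMPT = 'choicePrompt';  // placeholder prompt id
const FLOW = 'yesNoFlow';              // placeholder dialog id

export class YesNoDialog extends ComponentDialog {
  constructor() {
    super('yesNoDialog');
    this.addDialog(new ChoicePrompt(CHOICE_PROMPT));
    this.addDialog(new WaterfallDialog(FLOW, [
      this.askQuestion.bind(this),
      this.handleAnswer.bind(this),
    ]));
    this.initialDialogId = FLOW;
  }

  private async askQuestion(step: WaterfallStepContext): Promise<DialogTurnResult> {
    // The channel renders these choices as buttons where supported.
    return step.prompt(CHOICE_PROMPT, {
      prompt: 'Would you like to see our current offers?',
      choices: ['Yes', 'No'],
    });
  }

  private async handleAnswer(step: WaterfallStepContext): Promise<DialogTurnResult> {
    // ChoicePrompt returns a FoundChoice; branch on its value.
    const choice = (step.result as FoundChoice).value;
    if (choice === 'Yes') {
      await step.context.sendActivity('Great, here are the offers...');
    } else {
      await step.context.sendActivity('No problem, ask me anything else.');
    }
    return step.endDialog();
  }
}
```

Because the options render as buttons on channels that support them, the user can only answer within the path you defined.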
Users want to ask questions in their own language and have bots help them. A statement such as “My login isn’t working! I haven’t been able to log into your on-line billing system” sounds straightforward to us, but there is a lot a bot needs to understand in it. Watson Conversation Services has learned from Wikipedia and, along with its deep learning techniques, is able to work out what the user is asking.

It takes bold visionaries and risk-takers to turn future technologies into realities. In the field of chatbots, many companies across the globe are working on this mission. Our mega list of artificial intelligence, machine learning, natural language processing, and chatbot companies covers the top companies and startups innovating in this space.
Most chatbots try to mimic human interactions, which can frustrate users when a misunderstanding arises. Watson Assistant does more. It knows when to search for an answer in a knowledge base, when to ask for clarification, and when to direct you to a human. Watson Assistant can run on any cloud – allowing businesses to bring AI to their data and apps wherever they are.
The chatbot is trained to map input data to a desired output value. Given this data, it analyzes the input and forms the context it needs to point to the relevant data when reacting to spoken or written prompts. With deep learning, the machine discovers new patterns in the data without hand-crafted rules or features, then extracts and stores those patterns.

The term "ChatterBot" was originally coined by Michael Mauldin (creator of the first Verbot, Julia) in 1994 to describe these conversational programs. Today, most chatbots are either accessed via virtual assistants such as Google Assistant and Amazon Alexa, via messaging apps such as Facebook Messenger or WeChat, or via individual organizations' apps and websites.[2] [3] Chatbots can be classified into usage categories such as conversational commerce (e-commerce via chat), analytics, communication, customer support, design, developer tools, education, entertainment, finance, food, games, health, HR, marketing, news, personal, productivity, shopping, social, sports, travel and utilities.[4]


At this year’s I/O, Google announced its own Facebook Messenger competitor called Allo. Apart from some neat features around privacy and self-expression, the really interesting part of Allo is @google, the app’s AI digital assistant. Google’s assistant is interesting because the company has roughly a decade-long head start in machine learning applied to search, so it’s likely that Allo’s chatbot will be very useful. In fact, you could see Allo becoming the primary interface for interacting with Google search over time. This interaction model would more closely resemble Larry Page’s long-term vision for search, which goes far beyond the clumsy search query + results page model of today:


Niki is a personal assistant developed in India to perform an impressively wide variety of tasks, including booking taxis, buses, hotels, movies, and events, paying utility bills, recharging your phone, and even organizing laundry pickup and delivery. The application has proven to be a huge success across India and won the Deep Tech prize at the 2017 AWS Mobility Awards.

Several studies by analytics agencies such as Juniper and Gartner [34] report significant reductions in the cost of customer service, amounting to billions of dollars in savings over the next 10 years. Gartner predicts that by 2020 chatbots will be integrated into at least 85% of all customer-service applications. Juniper's study projects an impressive $8 billion in annual savings by 2022 thanks to the use of chatbots.