Telegram launched its bot API in 2015, and launched version 2.0 of its platform in April 2016, adding support for bots to send rich media and access geolocation services. As with Kik, Telegram’s bots feel spartan and lack compelling features at this point, but that could change over time. Telegram has also yet to add payment features, so there are not yet any shopping-related bots on the platform.
ELIZA's key method of operation (copied by chatbot designers ever since) involves the recognition of cue words or phrases in the input, and the output of corresponding pre-prepared or pre-programmed responses that can move the conversation forward in an apparently meaningful way (e.g. by responding to any input that contains the word 'MOTHER' with 'TELL ME MORE ABOUT YOUR FAMILY'). Thus an illusion of understanding is generated, even though the processing involved has been merely superficial. ELIZA showed that such an illusion is surprisingly easy to generate, because human judges are so ready to give the benefit of the doubt when conversational responses are capable of being interpreted as "intelligent".
This is a lot less complicated than it appears. Given a set of sentences, each belonging to a class, and a new input sentence, we can count the occurrence of each word in each class, account for its commonality and assign each class a score. Factoring for commonality is important: matching the word “it” is considerably less meaningful than a match for the word “cheese”. The class with the highest score is the one most likely to belong to the input sentence. This is a slight oversimplification as words need to be reduced to their stems, but you get the basic idea.
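A minimal Python sketch of this scoring idea might look like the following. The example sentences, class names, and the simple commonality weighting are illustrative assumptions rather than a reference implementation, and word stemming is omitted for brevity.

```python
# A minimal sketch of the class-scoring idea described above (illustrative only).
from collections import Counter, defaultdict

training_data = [
    ("make me a sandwich", "sandwich"),
    ("can you make a sandwich", "sandwich"),
    ("what time is it", "time"),
    ("do you know the time", "time"),
]

# Count how often each word appears in each class, and overall (for commonality).
class_words = defaultdict(Counter)
corpus_words = Counter()
for sentence, cls in training_data:
    for word in sentence.lower().split():
        class_words[cls][word] += 1
        corpus_words[word] += 1

def score(sentence):
    """Score each class; common words like 'it' contribute less than rare ones."""
    scores = {}
    for cls, counts in class_words.items():
        scores[cls] = sum(
            counts[w] / corpus_words[w]            # weight down common words
            for w in sentence.lower().split()
            if w in counts
        )
    return scores

print(score("what time is it please"))   # the class with the highest score wins
```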
You may remember Facebook’s big chatbot push in 2016, when it announced that it was opening up the Messenger platform to chatbots of all varieties. Every organization suddenly needed to get its hands on the technology. The idea of conversational chatbot technology was enthralling, but behind all the glitz, glamour, and tech sex appeal was something a little less exciting. To quote Gizmodo writer Darren Orf:
“They’re doing things we’re simply not doing in the U.S. Imagine if you were going to start a city from scratch. Rather than having to deal with all the infrastructure created 200 years ago, you could hit the ground running on the latest technology. That’s what China’s doing — they’re accessing markets for the first time through mobile apps and payments.” — Brian Buchwald, CEO of consumer intelligence firm Bomoda
If you visit a Singapore government website in the near future, chances are you’ll be using a chatbot to access the services you need, as part of the country’s Smart Nation initiative. In Australia, Deakin University students now access campus services using its ‘Genie’ virtual assistant platform, made up of chatbots, artificial intelligence (AI), voice recognition and predictive analytics.
User message. Once authenticated, the user sends a message to the bot. The bot reads the message and routes it to a natural language understanding service such as LUIS. This step gets the intents (what the user wants to do) and entities (what things the user is interested in). The bot then builds a query that it passes to a service that serves information, such as Azure Search for document retrieval, QnA Maker for FAQs, or a custom knowledge base. The bot uses these results to construct a response. To give the best result for a given query, the bot might make several back-and-forth calls to these remote services.
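To illustrate that routing flow, here is a hedged Python sketch. The stub functions recognize_intent, search_documents, and query_faq are placeholders standing in for calls to LUIS, Azure Search, and QnA Maker; they are not the real SDK calls.

```python
# Illustrative sketch of the message-handling flow described above.
def recognize_intent(message: str) -> dict:
    """Stub for an NLU service such as LUIS: returns intent and entities."""
    if "opening hours" in message.lower():
        return {"intent": "faq", "entities": []}
    return {"intent": "find_document", "entities": message.lower().split()}

def search_documents(entities: list) -> list:
    """Stub for a document-retrieval service such as Azure Search."""
    return [f"document matching {' '.join(entities)}"]

def query_faq(message: str) -> list:
    """Stub for an FAQ service such as QnA Maker."""
    return ["We are open 9am-5pm."]

def handle_user_message(message: str) -> str:
    nlu = recognize_intent(message)                # intent + entities
    if nlu["intent"] == "faq":
        results = query_faq(message)
    else:
        results = search_documents(nlu["entities"])
    # In practice the bot may make several back-and-forth calls before answering.
    return results[0] if results else "Sorry, I couldn't find anything."

print(handle_user_message("What are your opening hours?"))
```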
At this year’s I/O, Google announced its own Facebook Messenger competitor called Allo. Apart from some neat features around privacy and self-expression, the really interesting part of Allo is @google, the app’s AI digital assistant. Google’s assistant is interesting because the company has roughly a decade-long head start in machine learning applied to search, so it’s likely that Allo’s chatbot will be very useful. In fact, you could see Allo becoming the primary interface for interacting with Google search over time. This interaction model would more closely resemble Larry Page’s long-term vision for search, which goes far beyond the clumsy search query + results page model of today:
Chatbots – also known as “conversational agents” – are software applications that mimic written or spoken human speech for the purposes of simulating a conversation or interaction with a real person. There are two primary ways chatbots are offered to visitors: via web-based applications or standalone apps. Today, chatbots are used most commonly in the customer service space, assuming roles traditionally performed by living, breathing human beings such as Tier-1 support operatives and customer satisfaction reps.
NBC Politics Bot allowed users to engage with the conversational agent via Facebook to identify breaking news topics that would be of interest to the network’s various audience demographics. After beginning the initial interaction, the bot provided users with customized news results (prioritizing video content, a move that undoubtedly made Facebook happy) based on their preferences.
Having a conversation with a computer might have seemed like science fiction even a few years ago. But now, most of us already use chatbots for a variety of tasks. For example, as end users, we ask the virtual assistant on our smartphones to find a local restaurant and provide directions. Or, we use an online banking chatbot for help with a loan application.
Build a bot directly within one of the top messaging apps themselves. By building a bot in Telegram, you can run it inside the application itself. The company recently open-sourced its bot code, making it easy for third parties to integrate and create bots of their own. The Telegram Bot API can send customized notifications, news, reminders, or alerts, and it can be integrated with other popular services such as YouTube and GitHub for a unique customer experience.
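For example, a notification like the ones described above can be sent with a single HTTPS call to the Bot API’s sendMessage method. The token and chat ID below are placeholders you would obtain from @BotFather and your own chat.

```python
# Minimal sketch of sending a notification via the Telegram Bot API.
import requests

BOT_TOKEN = "123456:ABC-your-bot-token"   # placeholder token from @BotFather
CHAT_ID = "987654321"                     # placeholder chat identifier

def send_alert(text: str) -> None:
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    resp = requests.post(url, json={"chat_id": CHAT_ID, "text": text})
    resp.raise_for_status()               # raises if Telegram rejects the request

send_alert("New video uploaded to your YouTube channel!")
```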
The progressive advance of technology has seen an increase in businesses moving from traditional to digital platforms to transact with consumers. Businesses are delivering convenience through technology by implementing Artificial Intelligence (AI) techniques on their digital platforms. One AI technique that is growing in application and use is the chatbot. Examples of chatbot technology include virtual assistants such as Amazon’s Alexa and Google Assistant, and messaging apps such as WeChat and Facebook Messenger.
More and more businesses are choosing AI chatbots as part of their customer service team, for several reasons. Chatbots can answer customers’ inquiries cheaply, quickly, and in real time. Another reason is the ease of installation: once you have a live chat app in place, it takes only a couple of minutes to integrate a chatbot with it.
This chatbot aims to make medical diagnoses faster, easier, and more transparent for both patients and physicians – think of it like an intelligent version of WebMD that you can talk to. MedWhat is powered by a sophisticated machine learning system that offers increasingly accurate responses to user questions based on behaviors that it “learns” by interacting with human beings.
As reported by Forbes, Salesforce’s chief scientist Richard Socher spoke at a conference about his insights into NLP and machine translation: “I can’t speak for all chatbot deployments in the world – there are some that aren’t done very well… but in our case we’ve heard very positive feedback, because when a bot correctly answers questions or fills your requirements it does it very, very fast.”
Most chatbots try to mimic human interactions, which can frustrate users when a misunderstanding arises. Watson Assistant is more. It knows when to search for an answer from a knowledge base, when to ask for clarity, and when to direct you to a human. Watson Assistant can run on any cloud – allowing businesses to bring AI to their data and apps wherever they are.
There has been a great deal of controversy about the use of bots in an automated trading function. Auction website eBay has been to court in an attempt to suppress a third-party company from using bots to traverse their site looking for bargains; this approach backfired on eBay and attracted the attention of further bots. The United Kingdom-based bet exchange Betfair saw such a large amount of traffic coming from bots that it launched a WebService API aimed at bot programmers, through which it can actively manage bot interactions.
When one dialog invokes another, the Bot Builder adds the new dialog to the top of the dialog stack. The dialog that is on top of the stack is in control of the conversation. Every new message sent by the user will be subject to processing by that dialog until it either closes or redirects to another dialog. When a dialog closes, it's removed from the stack, and the previous dialog in the stack assumes control of the conversation.
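A toy model of this stack behaviour, assuming nothing about the actual Bot Builder SDK API, could look like this: the dialog on top of the stack handles every incoming message until it ends, at which point control returns to the dialog beneath it.

```python
# Toy model of the dialog stack described above (not the real Bot Builder API).
class DialogStack:
    def __init__(self):
        self.stack = []

    def begin(self, dialog):
        """Invoking a dialog pushes it on top; it now controls the conversation."""
        self.stack.append(dialog)

    def end(self):
        """Closing the active dialog pops it; the previous dialog resumes control."""
        self.stack.pop()

    def on_message(self, text):
        if self.stack:
            self.stack[-1].handle(text)   # only the top dialog sees the message

class GreetingDialog:
    def handle(self, text):
        print(f"GreetingDialog handling: {text}")

stack = DialogStack()
stack.begin(GreetingDialog())
stack.on_message("hello")   # handled by GreetingDialog
stack.end()                 # control returns to whatever dialog was beneath
```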
Back in April, National Geographic launched a Facebook Messenger bot to promote their new show about the theoretical physicist's work and personal life. Developed by 360i, the charismatic Einstein bot reintroduced audiences to the scientific figure in a more intimate setting, inviting them to learn about the lesser-known aspects of his life through a friendly, natural conversation with the man himself.
I will not go into the details of extracting each feature value here; they can be found in the rasa-core documentation linked above. So, assuming we have extracted all the required feature values from the sample conversations in the required format, we can train an AI model, such as an LSTM followed by a softmax layer, to predict the next_action. Referring to the figure above, this is what the ‘dialogue management’ component does. Why is an LSTM more appropriate? As mentioned above, we want our model to be context-aware and to look back into the conversational history when predicting the next_action. This is akin to a time-series problem (please see my other LSTM time-series article) and hence is best captured in the memory state of an LSTM model. The amount of conversational history the model looks back over can be a configurable hyper-parameter.
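A hedged Keras sketch of such a model is shown below. The history length, feature dimension, and number of actions are illustrative hyper-parameters, and the training data here is random dummy data standing in for featurised sample conversations.

```python
# Sketch: an LSTM over the conversation history followed by a softmax over actions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

max_history = 5      # how many past turns the model looks back over (hyper-parameter)
n_features = 20      # size of the featurised state at each turn
n_actions = 10       # number of possible next_action labels

model = Sequential([
    LSTM(32, input_shape=(max_history, n_features)),
    Dense(n_actions, activation="softmax"),   # probability distribution over next actions
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# X: featurised conversation histories, y: one-hot next actions (dummy data here).
X = np.random.rand(100, max_history, n_features)
y = np.eye(n_actions)[np.random.randint(0, n_actions, 100)]
model.fit(X, y, epochs=5, verbose=0)

# At inference time, the action with the highest probability is chosen.
next_action = np.argmax(model.predict(X[:1]), axis=-1)
```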
Chatbots are a great way to answer customer questions. According to a case study, Amtrak uses chatbots to answer roughly 5,000,000 questions a year. Not only are the questions answered promptly, but Amtrak saved $1,000,000 in customer service expenses in the year the study was conducted. It also experienced a 25 percent increase in travel bookings.
The trained neural network is less code than a comparable hand-written algorithm, but it requires a potentially large matrix of “weights”. In a relatively small sample, where the training sentences have 150 unique words and 30 classes, this would be a matrix of 150x30. Imagine multiplying a matrix of this size 100,000 times to establish a sufficiently low error rate. This is where processing speed comes in.
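To make that cost concrete, here is a rough illustration using the sizes quoted above; the data is random, and the loop simply repeats the forward multiplication that each training pass would perform.

```python
# Rough illustration of the training cost: a bag-of-words vector over 150 words
# multiplied against a 150x30 weight matrix, repeated 100,000 times.
# The data is random dummy data; real training would also update the weights.
import numpy as np

n_words, n_classes = 150, 30
weights = np.random.rand(n_words, n_classes)        # the "potentially large" weight matrix
bag_of_words = np.random.randint(0, 2, n_words)     # 1 if a word appears in the sentence

for _ in range(100_000):                            # each training pass repeats this multiply
    scores = bag_of_words @ weights                 # shape (30,): one score per class
```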

This importance is reinforced by Jacqueline Payne, Customer Support Manager at Paperclip Digital, who says ‘Customer service isn’t a buzzword. But too many businesses treat it like it is. As a viable avenue from which to lower customer acquisition costs and cultivate a loyal customer base, chat bots can play a pivotal role in driving business growth.’
Your first question is how much of it does she want? 1 litre? 500ml? 200? She tells you she wants a 1 litre Tropicana 100% Orange Juice. Now you know that regular Tropicana is easily available, but 100% is hard to come by, so you call up a few stores beforehand to see where it’s available. You find one store that’s pretty close by, so you go back to your mother and tell her you found what she wanted. It’s $2, maybe $3, and after asking her for the money, you go on your way.
I've come across this challenge many times, which has made me very focused on adopting new channels that have potential at an early stage to reap the rewards. Just take video ads within Facebook as an example. We're currently at a point where video ads are reaching their peak; cost is still relatively low and engagement is high, but, like with most ad platforms, increased competition will drive up those prices and make it less and less viable for smaller companies (and larger ones) to invest in it.
“Bots go bust” — so went the first of the five AI startup predictions for 2017 by Bradford Cross, countering some recent excitement around conversational AI (see for example O’Reilly’s “Why 2016 is shaping up to be the Year of the Bot”). The main argument was that social intelligence, rather than artificial intelligence, is what is lacking, rendering bots utilitarian and boring.
If AI struggles with fourth-grade science question answering, should AI be expected to hold an adult-level, open-ended chit-chat about politics, entertainment, and weather? It is thus encouraging to see that Microsoft’s Satya Nadella did not give up on Tay after its debacle, and that Amazon’s Jeff Bezos is sponsoring an Alexa social chatbot competition. I love the quote below from Jeff:
A chatbot that functions through machine learning has an artificial neural network inspired by the neural nodes of the human brain. The bot is programmed to self-learn as it is introduced to new dialogues and words. In effect, as a chatbot receives new voice or textual dialogues, the number of inquiries it can reply to and the accuracy of each response it gives increase. Facebook has a machine learning chatbot platform that lets companies interact with their consumers through the Facebook Messenger application. Using Messenger bots, users can buy shoes from Spring, order a ride from Uber, and have election conversations with the New York Times, which used a Messenger bot to cover the 2016 presidential election between Hillary Clinton and Donald Trump. If a user asked the New York Times bot a question like “What’s new today?” or “What do the polls say?”, the bot would reply to the request.