Chatting with a bot should be like talking to a human who knows everything. If you're using a bot to change an airline reservation, the bot should know whether you have an unused credit on your account and whether you typically pick the aisle or the window seat. Artificial intelligence will continue to radically shape this front, but a bot should also connect with your existing systems, so that a shared contact record can drive personalization.
Earlier, I made a rather lazy joke with a reference to the Terminator movie franchise, in which an artificial intelligence system known as Skynet becomes self-aware and identifies the human race as the greatest threat to its own survival, triggering a global nuclear war by preemptively launching the missiles under its command at cities around the world. (If by some miracle you haven’t seen any of the Terminator movies, the first two are excellent but I’d strongly advise steering clear of later entries in the franchise.)

As artificial intelligence continues to evolve (it’s predicted that AI could double economic growth rates by 2035), conversational bots are becoming a powerful tool for businesses worldwide. By 2020, it’s predicted that 85% of customers’ relationships with businesses will be handled without engaging a human at all. Businesses are even abandoning their mobile apps to adopt conversational bots.
A chatbot, short for chat robot, is a computer program that simulates human conversation, or chat, through artificial intelligence. Typically, a chatbot communicates with a real person, but applications are being developed in which two chatbots can communicate with each other. Chatbots are used in applications such as e-commerce customer service, call centers and Internet gaming. Chatbots used for these purposes are typically limited to conversations about a specialized topic rather than the entire range of human communication.

Intents: an intent is essentially the action the chatbot should perform when the user says something. For instance, the same intent should be triggered whether the user types “I want to order a red pair of shoes”, “Do you have red shoes? I want to order them” or “Show me some red pair of shoes”; all of these messages should trigger a single command that shows the user options for a red pair of shoes.
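To make this concrete, here is a minimal sketch of mapping differently worded messages to a single intent. In a real NLU service this mapping is learned from training phrases; the word-overlap rule and the order_shoes intent name below are purely illustrative assumptions.

```python
# A naive sketch of intent classification: pick the intent whose example
# phrases share the most words with the user's message. Real chatbot platforms
# learn this mapping with machine-learned NLU models instead.
TRAINING_PHRASES = {
    "order_shoes": [
        "I want to order a red pair of shoes",
        "Do you have red shoes? I want to order them",
        "Show me some red pair of shoes",
    ],
}

def classify_intent(user_text: str) -> str:
    words = set(user_text.lower().split())
    best_intent, best_overlap = "fallback", 0
    for intent, phrases in TRAINING_PHRASES.items():
        overlap = max(len(words & set(p.lower().split())) for p in phrases)
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    return best_intent

print(classify_intent("show me red shoes please"))  # -> order_shoes
```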

Unfortunately, my mom can’t really engage in meaningful conversations anymore, but many people suffering from dementia retain much of their conversational abilities as their illness progresses. However, the shame and frustration that many dementia sufferers experience often make routine, everyday talks with even close family members challenging. That’s why Russian technology company Endurance developed its companion chatbot.

Founded by Pavel Durov, creator of Russia’s equivalent to Facebook, Telegram launched in 2013 as a lightweight messaging app that combines the speed of WhatsApp with the ephemerality of Snapchat, along with claimed enhanced privacy and security through its use of the MTProto protocol (Telegram has offered a $200k prize to any developer who can crack MTProto’s security). Telegram has 100M monthly active users, putting it in the second tier of messaging apps in terms of popularity.
[…] But how can simple code assimilate something as complex as speech in the span of only a handful of years? It took humans hundreds of generations to identify, compose and collate the English language. Chatbots have a leg up on humans because of the way they dissect the vast amounts of data given to them. Now that we have a grip on the basics, we’ll look at how chatbots work in the next part of this series. […]
If AI struggles with fourth-grade science question answering, should AI be expected to hold an adult-level, open-ended chit-chat about politics, entertainment, and the weather? It is thus encouraging to see that Microsoft’s Satya Nadella did not give up on Tay after its debacle, and that Amazon’s Jeff Bezos is sponsoring an Alexa social chatbot competition. I love this quote from Jeff:
Your first question is how much of it does she want? 1 litre? 500ml? 200? She tells you she wants a 1 litre Tropicana 100% Orange Juice. Now you know that regular Tropicana is easily available, but 100% is hard to come by, so you call up a few stores beforehand to see where it’s available. You find one store that’s pretty close by, so you go back to your mother and tell her you found what she wanted. It’s $3 and after asking her for the money, you go on your way.
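In chatbot terms, this back-and-forth is about collecting the parameters the bot needs (quantity, brand, price confirmation) before it can act. A minimal sketch of that idea, using hypothetical slot names and prompts rather than any particular framework's API:

```python
# A toy sketch of gathering required parameters ("slots") before fulfilling a
# request, mirroring the juice errand above. Slot names and prompts are
# hypothetical illustrations.
REQUIRED_SLOTS = {
    "quantity": "How much does she want? 1 litre? 500ml? 200ml?",
    "brand": "Which brand and variety?",
}

def next_step(filled_slots: dict) -> str:
    # Ask for the first missing slot; fulfil the request once everything is known.
    for slot, prompt in REQUIRED_SLOTS.items():
        if slot not in filled_slots:
            return prompt
    return f"Found {filled_slots['quantity']} of {filled_slots['brand']} nearby for $3."

print(next_step({}))                                         # asks for the quantity
print(next_step({"quantity": "1 litre"}))                    # asks for the brand
print(next_step({"quantity": "1 litre",
                 "brand": "Tropicana 100% Orange Juice"}))   # fulfils the request
```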
Chatbots are often used online and in messaging apps, but they are also now included in many operating systems as intelligent virtual assistants, such as Siri for Apple products and Cortana for Windows. Dedicated chatbot appliances, such as Amazon's Echo speakers running Alexa, are also becoming increasingly common. These chatbots can perform a wide variety of functions based on user commands.
Chatbots are predicted to become progressively more present in businesses and to automate tasks that do not require skill-based talents. Companies are getting smarter with touchpoints, and customer service now comes in the form of instant messaging as well as phone calls. IBM recently predicted that 85% of customer service enquiries will be handled by AI as early as 2020.[62] Call centre workers may be particularly at risk from AI.[63]
I will not go into the details of extracting each feature value here; they can be found in the rasa-core documentation linked above. So, assuming we have extracted all the required feature values from the sample conversations in the required format, we can then train an AI model, such as an LSTM followed by a softmax layer, to predict the next_action. Referring to the figure above, this is what the ‘dialogue management’ component does. Why is an LSTM more appropriate? As mentioned above, we want our model to be context-aware and to look back into the conversational history to predict the next_action. This is akin to a time-series problem (please see my other article on LSTMs for time series), and so the history can best be captured in the memory state of the LSTM model. The amount of conversational history we want to look back over can be a configurable hyper-parameter of the model.
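As a rough illustration (not rasa-core's actual implementation), a dialogue policy of this shape can be sketched in Keras: an LSTM over the featurized conversation history, followed by a softmax layer over the bot's possible actions. The history length, feature size and action count below are placeholder values, and random arrays stand in for real featurized conversations.

```python
# A minimal sketch of a dialogue policy: LSTM over the featurized conversation
# history, softmax over the next_action. Dimensions and data are placeholders.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

MAX_HISTORY = 5      # how many past turns the policy looks back over
NUM_FEATURES = 20    # size of one featurized turn (intent, slots, previous action)
NUM_ACTIONS = 8      # size of the bot's action vocabulary

model = Sequential([
    # The LSTM's memory state carries context across the conversation history.
    LSTM(32, input_shape=(MAX_HISTORY, NUM_FEATURES)),
    # Softmax distribution over all possible next actions.
    Dense(NUM_ACTIONS, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# X: featurized conversation histories, y: one-hot next_action labels
# (random data here stands in for features extracted from sample conversations).
X = np.random.rand(100, MAX_HISTORY, NUM_FEATURES)
y = np.eye(NUM_ACTIONS)[np.random.randint(NUM_ACTIONS, size=100)]
model.fit(X, y, epochs=5, verbose=0)

# At inference time, the highest-probability action becomes the next_action.
next_action = int(np.argmax(model.predict(X[:1]), axis=-1)[0])
print(next_action)
```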
ELIZA's key method of operation (copied by chatbot designers ever since) involves the recognition of clue words or phrases in the input, and the output of corresponding pre-prepared or pre-programmed responses that can move the conversation forward in an apparently meaningful way (e.g. by responding to any input that contains the word 'MOTHER' with 'TELL ME MORE ABOUT YOUR FAMILY').[9] Thus an illusion of understanding is generated, even though the processing involved has been merely superficial. ELIZA showed that such an illusion is surprisingly easy to generate, because human judges are so ready to give the benefit of the doubt when conversational responses are capable of being interpreted as "intelligent".
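A toy sketch of that keyword-and-canned-response mechanism (the rules below are illustrative stand-ins, not Weizenbaum's original script):

```python
# ELIZA-style keyword matching: scan the input for clue words and return the
# corresponding pre-programmed response; fall back to a generic prompt.
RULES = {
    "MOTHER": "TELL ME MORE ABOUT YOUR FAMILY",
    "ALWAYS": "CAN YOU THINK OF A SPECIFIC EXAMPLE?",
    "SORRY": "PLEASE DON'T APOLOGIZE",
}
DEFAULT = "PLEASE GO ON"

def respond(user_input: str) -> str:
    upper = user_input.upper()
    for keyword, response in RULES.items():
        if keyword in upper:
            return response
    return DEFAULT

print(respond("My mother always criticizes me"))  # -> TELL ME MORE ABOUT YOUR FAMILY
```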