Authentication. Users start by authenticating themselves using whatever mechanism is provided by their channel of communication with the bot. The bot framework supports many communication channels, including Cortana, Microsoft Teams, Facebook Messenger, Kik, and Slack. For a list of channels, see Connect a bot to channels. When you create a bot with Azure Bot Service, the Web Chat channel is automatically configured. This channel allows users to interact with your bot directly in a web page. You can also connect the bot to a custom app by using the Direct Line channel. The user's identity is used to provide role-based access control, as well as to serve personalized content.
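As a rough illustration of how a bot might use the channel-supplied identity for role-based access and personalized content, here is a minimal sketch using the Bot Framework SDK for Node.js (botbuilder). The PersonalizedBot class and the lookupRole helper are hypothetical stand-ins for your own identity provider or user store; only the ActivityHandler plumbing comes from the SDK.

```typescript
import { ActivityHandler, TurnContext } from 'botbuilder';

// Hypothetical role lookup -- a real bot would query its identity provider
// or user store, keyed by the channel-supplied user id.
async function lookupRole(userId: string): Promise<'admin' | 'customer'> {
  return userId.endsWith('-admin') ? 'admin' : 'customer';
}

class PersonalizedBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context: TurnContext, next) => {
      // The connected channel (Web Chat, Teams, Direct Line, ...) supplies
      // the user's identity on every incoming activity.
      const userId = context.activity.from.id;
      const userName = context.activity.from.name ?? 'there';

      const role = await lookupRole(userId);
      if (role === 'admin') {
        await context.sendActivity(`Welcome back, ${userName}. Admin tools are enabled.`);
      } else {
        await context.sendActivity(`Hi ${userName}, how can I help you today?`);
      }
      await next();
    });
  }
}
```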
The most advanced bots are powered by artificial intelligence, which helps them understand complex requests, personalize responses, and improve interactions over time. This technology is still in its infancy, so most bots follow a set of rules programmed by a human via a bot-building platform. It can be as simple as arranging an ordered list of if-then statements and writing canned responses, often without needing to write a line of code.

The upcoming task-oriented dialog agents (TODA) are good at one thing, and one thing only. As Facebook found out with the ambitious Project M, building general personal assistants that can help users across multiple tasks (cross-domain agents) is hard. Think: awfully hard. Beyond the obvious increase in scope, knowledge, and vocabulary, there is no built-in data generator that feeds the hungry learning machine (barring an unlikely concerted effort to aggregate the data silos of multiple businesses). The jury is still out on whether the army of human agents that Project M employs can scale, even with Facebook's kind of resources. In addition, cross-domain agents will probably need major advances in areas such as domain adaptation, transfer learning, dialog planning and management, reinforcement/apprenticeship learning, automatic dialog evaluation, etc.
A number of e-commerce stores are taking advantage of chatbots as well. One example I played with was from Fynd, which lets you ask for specific products and displays them directly within Messenger. What's more, Facebook even allows you to make payments via Messenger bots, opening up a whole world of possibilities for e-commerce stores.

Human touch. Because chatbots provide an interface similar to human-to-human interaction, they are more intuitive and easier to use than a standard mobile banking application. They don't require any additional software installation, and they are more adaptive, since machine learning allows them to be personalized over the course of use. Chatbots are also instant, and therefore much faster than phone calls, which some studies have shown to be perceived as tedious. They thus satisfy both the speed and the personalization requirements of interacting with a bank.


As you roll out new features or bug fixes to your bot, it's best to use multiple deployment environments, such as staging and production. Using App Service deployment slots allows you to do this with zero downtime: you can test your latest upgrades in the staging slot before swapping them into production. In terms of handling load, App Service is designed to scale up or out, either manually or automatically. Because your bot is hosted in Microsoft's global datacenter infrastructure, the App Service SLA promises high availability.

Previous generations of chatbots were present on company websites, e.g., Ask Jenn from Alaska Airlines, which debuted in 2008,[20] or Expedia's virtual customer service agent, which launched in 2011.[20][21] The newer generation of chatbots includes IBM Watson-powered "Rocky", introduced in February 2017 by the New York City-based e-commerce company Rare Carat to provide information to prospective diamond buyers.[22][23]
Originally purely text-based, chatbots have evolved through steadily improving speech recognition and speech synthesis, and now offer, in addition to pure text dialogs, fully spoken dialogs or a mix of both. Other media can also be used, for example images and videos. Especially with the heavy use of mobile devices (smartphones, wearables), this way of using chatbots will continue to grow (as of Nov. 2016).[10] As they continue to improve, chatbots are no longer limited to a few narrow topic areas (weather forecasts, news, and so on) but enable extended dialogs and services for the user. In this way they are evolving into intelligent personal assistants.

“Utility gets something done following a prompt. At a higher level, the more entertainment-related chatbots are able to answer all questions and get things done. Siri and Cortana you can have small talk with, as well as getting things done, so they are much harder to build. They took years and years of giant companies’ efforts. Different companies that don’t have those resources, like Facebook, will build more constrained utility bots.”
There was a time when even some of the most prominent minds believed that a machine could not be as intelligent as humans, but in 1991 the Loebner Prize competitions began to prove otherwise. The competition awards the best-performing chatbot, the one that comes closest to convincing the judges that it displays some form of intelligence. Yet despite the tremendous development of chatbots and their ability to exhibit seemingly intelligent behavior, they still cannot accurately understand the context of questions in every situation, every time.
The challenge in programming a chatbot lies in assembling the recognitions in a sensible way. Precise recognitions for specific questions are complemented by global recognitions that refer to only a single word and can serve as a fallback (the bot roughly recognizes the topic, but not the exact question). Some chatbot programs support development through priority ranks that can be assigned to individual answers. Chatbots are usually built with development environments that allow questions to be categorized, answers to be prioritized, and recognitions to be managed.[5][6] Some also allow the design of a conversational context based on recognitions and possible follow-up recognitions ("Would you like to learn more about this?"). Once the knowledge base has been built, the bot is optimized in as many training conversations with users from the target group as possible.[7] Incorrect recognitions, recognition gaps, and missing answers can be identified this way.[8] The development environment usually offers analysis tools for evaluating the conversation logs efficiently.[9] A good chatbot achieves an average recognition rate of more than 70% of questions this way, which is enough for most users to accept it as an engaging conversational counterpart.
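The interplay of precise recognitions, single-word global recognitions, and priority ranks can be sketched in a few lines. The rule set, priorities, and canned answers below are invented for illustration and do not reflect any particular chatbot development environment, which would typically replace the regular expressions with trained recognizers.

```typescript
// Minimal sketch of prioritized recognitions with a global fallback.
interface Recognition {
  pattern: RegExp;  // what the rule matches in the user's utterance
  priority: number; // higher priority wins when several rules match
  answer: string;   // canned answer for this recognition
}

const recognitions: Recognition[] = [
  // Precise recognition for a specific question.
  { pattern: /how (much|many).*shipping cost/i, priority: 10, answer: 'Shipping is free for orders over 50 EUR.' },
  // Global recognition: a single keyword acting as a topic-level fallback.
  { pattern: /shipping/i, priority: 1, answer: 'I can help with shipping. What exactly would you like to know?' },
];

function respond(utterance: string): string {
  const matches = recognitions
    .filter(r => r.pattern.test(utterance))
    .sort((a, b) => b.priority - a.priority);
  // No recognition at all: admit the gap instead of guessing.
  return matches.length > 0 ? matches[0].answer : "Sorry, I didn't understand that. Could you rephrase?";
}

console.log(respond('How much does shipping cost to Berlin?')); // precise rule wins
console.log(respond('Tell me about shipping'));                 // topic fallback
```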
With natural language processing (NLP), a bot can understand what a human is asking. The computer translates the natural language of a question into its own artificial language. It breaks down human inputs into coded units and uses algorithms to determine what is most likely being asked of it. From there, it determines the answer. Then, with natural language generation (NLG), it creates a response. NLG software allows the bot to construct and provide a response in the natural language format.
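To make the split between understanding and generation concrete, here is a minimal sketch in which a hypothetical understand function reduces an utterance to an intent plus entities (the "coded units") and a generate function renders the structured answer back into natural language. The intent names, entities, and templates are invented for the example; a real bot would use a trained NLP service rather than a regular expression.

```typescript
// Illustrative NLU -> answer -> NLG pipeline.
type Intent = 'GetWeather' | 'Unknown';

interface Understanding {
  intent: Intent;
  entities: Record<string, string>;
}

// "NLU": reduce the free-form utterance to a coded intent plus entities.
function understand(utterance: string): Understanding {
  const match = utterance.match(/weather (?:in|for) ([a-z]+)/i);
  if (match) {
    return { intent: 'GetWeather', entities: { city: match[1] } };
  }
  return { intent: 'Unknown', entities: {} };
}

// "NLG": turn the structured answer back into natural language.
function generate(u: Understanding): string {
  switch (u.intent) {
    case 'GetWeather':
      // A real bot would call a weather service here; the forecast is canned.
      return `The forecast for ${u.entities.city} is sunny with a high of 22 degrees.`;
    default:
      return "I'm not sure what you're asking. Could you rephrase?";
  }
}

console.log(generate(understand("What's the weather in Lisbon today?")));
```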
Operator calls itself a “request network” aiming to “unlock the 90% of commerce that’s not on the internet.” The Operator app, developed by Uber co-founder Garrett Camp, connects you with a network of “operators” who act as concierges and can execute any shopping-related request. You can order concert tickets, get gift ideas, or even get interior design recommendations for new furniture. Operator seems to be positioning itself toward “high consideration” purchases: bigger-ticket items that require more research and expertise, where its operators can add significant value to a transaction.
Like apps and websites, bots have a UI, but it is made up of dialogs, rather than screens. Dialogs help preserve your place within a conversation, prompt users when needed, and execute input validation. They are useful for managing multi-turn conversations and simple "forms-based" collections of information to accomplish activities such as booking a flight.
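As a rough sketch of a dialog that collects information over multiple turns, the example below uses the Bot Framework dialogs library (botbuilder-dialogs) with a waterfall dialog and a validated text prompt. The dialog ids, the validator, and the flight-search reply are illustrative only, not a prescribed pattern.

```typescript
import {
  ComponentDialog, TextPrompt, WaterfallDialog, WaterfallStepContext,
} from 'botbuilder-dialogs';

const DESTINATION_PROMPT = 'destinationPrompt';
const BOOK_FLIGHT_WATERFALL = 'bookFlightWaterfall';

export class BookFlightDialog extends ComponentDialog {
  constructor() {
    super('bookFlightDialog');

    // Prompt with a simple validator: reject empty or one-character answers.
    this.addDialog(new TextPrompt(DESTINATION_PROMPT, async prompt =>
      (prompt.recognized.value ?? '').trim().length > 1));

    // Each waterfall step handles one turn, so the dialog "remembers its
    // place" in the conversation between user messages.
    this.addDialog(new WaterfallDialog(BOOK_FLIGHT_WATERFALL, [
      async (step: WaterfallStepContext) => {
        return step.prompt(DESTINATION_PROMPT, {
          prompt: 'Where would you like to fly to?',
          retryPrompt: 'Please give me a city name for your destination.',
        });
      },
      async (step: WaterfallStepContext) => {
        step.values['destination'] = step.result;
        await step.context.sendActivity(`Searching flights to ${step.result}...`);
        return step.endDialog(step.values);
      },
    ]));

    this.initialDialogId = BOOK_FLIGHT_WATERFALL;
  }
}
```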

The classic historic early chatbots are ELIZA (1966) and PARRY (1972).[10][11][12][13] More recent notable programs include A.L.I.C.E., Jabberwacky and D.U.D.E (Agence Nationale de la Recherche and CNRS 2006). While ELIZA and PARRY were used exclusively to simulate typed conversation, many chatbots now include functional features such as games and web searching abilities. In 1984, a book called The Policeman's Beard is Half Constructed was published, allegedly written by the chatbot Racter (though the program as released would not have been capable of doing so).[14]