[In] artificial intelligence ... machines are made to behave in wondrous ways, often sufficient to dazzle even the most experienced observer. But once a particular program is unmasked, once its inner workings are explained ... its magic crumbles away; it stands revealed as a mere collection of procedures ... The observer says to himself "I could have written that". With that thought he moves the program in question from the shelf marked "intelligent", to that reserved for curios ... The object of this paper is to cause just such a re-evaluation of the program about to be "explained". Few programs ever needed it more.
One of the most thriving eLearning innovations is chatbot technology. Chatbots work on the principle of interacting with users in a human-like manner, and these intelligent bots are often deployed as virtual assistants. The best-known example is Google Allo, an intelligent messaging app with built-in Google Assistant that interacts with the user by texting back and replying to queries. The app supports both voice and text queries.
Most chatbots draw on a prefabricated database, the so-called knowledge base, containing answers and recognition patterns. The program first breaks the entered question into parts and processes them according to predefined rules. In doing so, spellings can be harmonized (capitalization, umlauts, etc.), punctuation interpreted, and typos corrected (preprocessing). In the second step the actual recognition of the question takes place. This is usually handled via recognition patterns; some chatbots additionally allow different pattern matchers to be nested via so-called macros. If an answer matching the question is recognized, it can still be adapted (for example, script-computed data can be inserted: "In Ulm it is 37 °C today."). This step is called postprocessing. The resulting answer is then output. Modern commercial chatbot programs additionally allow direct access to the entire processing chain via built-in scripting languages and programming interfaces.
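To make this three-step pipeline concrete, here is a minimal sketch in Python of such a rule-based chatbot; the knowledge base, the patterns, and the temperature lookup are hypothetical stand-ins for illustration, not the internals of any particular product:

    import re

    # Hypothetical knowledge base: recognition patterns mapped to answer templates.
    KNOWLEDGE_BASE = [
        (re.compile(r"weather in (\w+)"), "In {city} it is {temp} °C today."),
        (re.compile(r"\bhello\b"), "Hello! How can I help you?"),
    ]

    def preprocess(text):
        """Step 1: harmonize spelling: lowercase, drop punctuation, collapse spaces."""
        text = re.sub(r"[^\w\s]", " ", text.lower())
        return re.sub(r"\s+", " ", text).strip()

    def lookup_temperature(city):
        """Stand-in for script-driven data (e.g. a call to a weather service)."""
        return 37  # placeholder value

    def postprocess(template, match):
        """Step 3: insert script-computed data into the recognized answer."""
        if "{city}" in template:
            city = match.group(1)
            return template.format(city=city.capitalize(),
                                   temp=lookup_temperature(city))
        return template

    def answer(question):
        cleaned = preprocess(question)            # step 1: preprocessing
        for pattern, template in KNOWLEDGE_BASE:  # step 2: pattern recognition
            match = pattern.search(cleaned)
            if match:
                return postprocess(template, match)
        return "Sorry, I did not understand that."

    print(answer("What's the weather in Ulm?"))  # In Ulm it is 37 °C today.

In this scheme, the macros mentioned above would correspond to one recognized pattern delegating to further pattern matchers before an answer is emitted.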
H&M’s consistent sales growth over the past year and its August announcement that it would launch an eCommerce presence in Canada and South Korea in the fall of 2016, along with 11 new H&M online markets (for a total of 35 markets by the end of the year), appear to signify positive results for its chatbot implementation (though no direct correlations are available on its website).
Online chatbots save time and effort by automating customer support. Gartner forecasts that by 2020, over 85% of customer interactions will be handled without a human. However, the opportunities provided by chatbot systems go far beyond answering customers’ inquiries. They are also used for other business tasks, such as collecting information about users, helping to organize meetings, and reducing overhead costs. It is no wonder that the chatbot market is growing exponentially.
Love them or hate them, chatbots are here to stay. Chatbots have become extraordinarily popular in recent years largely due to dramatic advancements in machine learning and other underlying technologies such as natural language processing. Today’s chatbots are smarter, more responsive, and more useful – and we’re likely to see even more of them in the coming years.
"A very common request that we get is people want to practice conversation," said Duolingo's co-founder and CEO, Luis von Ahn. The company originally tried pairing non-native speakers with native speakers for practice sessions, but according to von Ahn, "about three-quarters of the people we try it with are very embarrassed to speak in a foreign language with another person."
While messaging and voice interfaces are central components, they fit into a larger picture: technology is becoming more deeply woven into our daily lives, unlocking new potential for brand-to-consumer interaction, and the entire ecosystem is enjoying tighter cohesion through the increasing availability and sophistication of APIs. Smart companies are finding new and innovative touch points with consumers that are contextual, relevant, highly personal, and, yes, conversational. Commerce is becoming not only more conversational but more ubiquitous and seamlessly integrated into our lives, and the way we interact with brands will be forever changed as a result.
Not integrated. This goes hand in hand with the contextual-knowledge problem: chatbots often suffer from "death by data silo," where their access to data is limited. A chatbot "chatting with" a customer not only needs that customer's contextual data but also access to every place where the answer to the customer's question may reside: the product documentation site, the customer community, and various other websites.
Amazon’s Echo device has been a surprise hit, selling over 3 million units in less than 18 months. Although part of this success can be attributed to the massive awareness-building power of the Amazon.com homepage, the device receives positive reviews from customers and experts alike and has even prompted Google to develop its own version of the same device, Google Home.
Social networking bots are sets of algorithms that take on repetitive sets of instructions in order to establish a service or connection among social networking users. Designs of networking bots range from chatbots, algorithms designed to converse with a human user, to social bots, algorithms designed to mimic human behavior and converse in patterns similar to those of a human user. The history of social botting can be traced back to Alan Turing in the 1950s and his vision of designing sets of instructional code that could pass the Turing test. ELIZA, a natural language processing program created by Joseph Weizenbaum between 1964 and 1966, is an early example of artificial intelligence algorithms that inspired programmers to design programs capable of matching behavior patterns to their sets of instructions. As a result, natural language processing has become an influential factor in the development of artificial intelligence and social bots, as technological advances are made alongside the ever-faster spread of information and opinion on social media websites.
Efforts by servers hosting websites to counteract bots vary. Servers may outline rules for the behavior of internet bots by implementing a robots.txt file: a plain-text file stating the rules governing a bot's behavior on that server. Any bot that does not follow these rules when interacting with (or 'spidering') any server should, in theory, be denied access to, or removed from, the affected website. If the only rule implementation by a server is a posted text file with no associated program or software, then adhering to those rules is entirely voluntary; in reality there is no way to enforce those rules, or even to ensure that a bot's creator or implementer acknowledges, or even reads, the robots.txt file's contents. Some bots are "good" – e.g. search engine spiders – while others can be used to launch malicious attacks, most notably in political campaigns.
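As a sketch of how this voluntary compliance works, the following Python snippet checks a hypothetical robots.txt (the site, paths, and user-agent names are invented for illustration) using the standard library's urllib.robotparser; a well-behaved crawler calls can_fetch before requesting a page, while a non-compliant bot simply never runs such a check:

    from urllib.robotparser import RobotFileParser

    # A minimal robots.txt as it might appear at https://example.com/robots.txt
    rules = """
    User-agent: *
    Disallow: /private/

    User-agent: BadBot
    Disallow: /
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # A compliant crawler asks before fetching; nothing forces it to.
    print(parser.can_fetch("GoodBot", "https://example.com/public/page"))   # True
    print(parser.can_fetch("GoodBot", "https://example.com/private/page"))  # False
    print(parser.can_fetch("BadBot", "https://example.com/public/page"))    # False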
1-800-Flowers’ 2017 first-quarter results showed total revenues had increased 6.3 percent to $165.8 million, with the company’s Gourmet Food and Gift Baskets business as a significant contributor. CEO Chris McCann stated, “…our Fannie May business recorded positive same-store sales as well as solid eCommerce growth, reflecting the success of the initiatives we have implemented to enhance its performance.” While McCann doesn’t go into specifics, we assume those initiatives include the implementation of GWYN, which also seems to be supported by CB Insights’ finding that 70% of customers ordering through the chatbot were new 1-800-Flowers customers as of June 2016.
Kik Messenger, which has 275 million registered users, recently announced a bot store. It includes one bot that sends people Vine videos and another that offers makeup suggestions from Sephora. Twitter has had bots for years, such as a bot that tweets about earthquakes as soon as they are registered, or a Domino’s bot that lets you order a pizza by tweeting a pizza emoji.
Interestingly, the as-yet-unnamed conversational agent is currently an open-source project, meaning that anyone can contribute to the development of the bot’s codebase. The project is still in its early stages but has great potential to help scientists, researchers, and care teams better understand how Alzheimer’s disease affects the brain. A Russian version of the bot is already available, and an English version is expected at some point this year.
In 1950, Alan Turing's famous article "Computing Machinery and Intelligence" was published, which proposed what is now called the Turing test as a criterion of intelligence. This criterion depends on the ability of a computer program to impersonate a human in a real-time written conversation with a human judge, sufficiently well that the judge is unable to distinguish reliably—on the basis of the conversational content alone—between the program and a real human. The notoriety of Turing's proposed test stimulated great interest in Joseph Weizenbaum's program ELIZA, published in 1966, which seemed to be able to fool users into believing that they were conversing with a real human. However, Weizenbaum himself did not claim that ELIZA was genuinely intelligent, and the introduction to his paper presented it more as a debunking exercise: