Despite almost half a century of effort, most chatbots are still easily unmasked, but over the coming decades they will keep getting smarter, until eventually it is the humans who are distinguished by their sillier answers, as opposed to the much smarter chatbots. The real acceleration will begin once a single chatbot becomes smarter than a single human being. Chatbots will then be able to learn from each other instead of from human beings, their knowledge will explode, and they will be able to design even better learning mechanisms. In the long run, we will learn language from chatbots instead of the other way around.
Conversable is an enterprise-class SaaS platform that builds your bot with you. The company works with many Fortune 500 clients (it is behind the Whole Foods, Pizza Hut, 7-Eleven, and Dunkin' Donuts bots, among others). It goes beyond Facebook Messenger, making sure your conversations happen across all channels, including voice-based ones such as OnStar.
The classic early chatbots are ELIZA (1966) and PARRY (1972). More recent notable programs include A.L.I.C.E., Jabberwacky, and D.U.D.E (Agence Nationale de la Recherche and CNRS 2006). While ELIZA and PARRY were used exclusively to simulate typed conversation, many chatbots now include functional features such as games and web-searching abilities. In 1984, a book called The Policeman's Beard is Half Constructed was published, allegedly written by the chatbot Racter (though the program as released would not have been capable of doing so).
At Facebook’s F8 Developers Conference, Messenger Bots were announced. These bots are being developed rapidly by media corporations and retailers alike, which raises the question of what a Messenger bot is and why it is useful to so many different types of companies. Even more important is what these bots mean for the average user: will they always be safe, or could they pose a threat if built by someone with malicious intent? Here’s the answer to all that and more.