Yes. Messenger bots are approved by Facebook before being made available inside the Messenger app, so you can rest assured that they aren’t trying to steal your identity (or anything else). What’s more, a bot is tied to both a Facebook page and a Facebook app, which makes it that much more inconvenient to use for fraudulent activity. That said, don’t exchange private or personal information with a bot. Finally, because Messenger doesn’t yet support credit cards and in-app purchasing, anything you buy will likely be completed in a browser, with the bot assisting you only as far as placing the order. And if you want, you can always block a bot.
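If you’re curious what “tied to a Facebook page and a Facebook app” means in practice, the sketch below shows the general shape of a Messenger bot: a small web service that the Messenger Platform calls through a webhook, which replies using a page-scoped access token. This is an illustrative sketch, not Facebook’s official sample code; the VERIFY_TOKEN and PAGE_ACCESS_TOKEN values are placeholders you would obtain from your own Facebook app settings.

```python
# Minimal sketch of a Messenger bot webhook (illustrative assumptions only).
# The bot exists only in relation to one page/app pair: Facebook verifies the
# endpoint, forwards that page's messages here, and replies are sent back with
# the page's access token.
import os

import requests
from flask import Flask, request

app = Flask(__name__)

VERIFY_TOKEN = os.environ.get("VERIFY_TOKEN", "my-verify-token")       # value you choose in the app dashboard
PAGE_ACCESS_TOKEN = os.environ.get("PAGE_ACCESS_TOKEN", "page-token")  # token issued for your Facebook page


@app.route("/webhook", methods=["GET"])
def verify():
    # Facebook calls this once to confirm you control the webhook URL.
    if request.args.get("hub.verify_token") == VERIFY_TOKEN:
        return request.args.get("hub.challenge", ""), 200
    return "Verification token mismatch", 403


@app.route("/webhook", methods=["POST"])
def handle_messages():
    # Each POST carries one or more messaging events for the linked page.
    payload = request.get_json(silent=True) or {}
    for entry in payload.get("entry", []):
        for event in entry.get("messaging", []):
            sender_id = event.get("sender", {}).get("id")
            text = event.get("message", {}).get("text")
            if sender_id and text:
                send_text(sender_id, f"You said: {text}")  # simple echo reply
    return "ok", 200


def send_text(recipient_id, text):
    # Replies go out through the Graph API Send API, scoped to this page's token.
    requests.post(
        "https://graph.facebook.com/v12.0/me/messages",
        params={"access_token": PAGE_ACCESS_TOKEN},
        json={"recipient": {"id": recipient_id}, "message": {"text": text}},
        timeout=10,
    )


if __name__ == "__main__":
    app.run(port=5000)
```

Because every reply must be signed with a page access token issued to a registered app, a bot can’t easily operate anonymously, which is part of why fraudulent use is inconvenient.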
The use of digital assistants is on the rise, and more people are turning to chatbots as a first point of contact with businesses. While chatbots have traditionally supported customer service departments, more businesses now use them to automate marketing and sales efforts. For a simple entry point into the chatbot world, look no further than Facebook Messenger.
It didn’t take long, however, for Turing’s headaches to begin. The BabyQ bot drew the ire of Chinese officials by speaking ill of the Communist Party. In one exchange, a user commented, “Long Live the Communist Party!” In response, BabyQ asked the user, “Do you think that such a corrupt and incompetent political regime can live forever?”
Conversable is an enterprise-class SaaS platform that will build your bot with you. They work with many Fortune 500 companies (they’re behind the Whole Foods, Pizza Hut, 7-Eleven, and Dunkin’ Donuts bots, among others). They go beyond Facebook Messenger and will make sure your conversations happen across all channels, including voice-based ones such as OnStar.
The term "ChatterBot" was originally coined by Michael Mauldin (creator of the first Verbot, Julia) in 1994 to describe these conversational programs.[3] Today, most chatbots are accessed via virtual assistants such as Google Assistant and Amazon Alexa, via messaging apps such as Facebook Messenger or WeChat, or via individual organizations' apps and websites.[4][5] Chatbots can be classified into usage categories such as conversational commerce (e-commerce via chat), analytics, communication, customer support, design, developer tools, education, entertainment, finance, food, games, health, HR, marketing, news, personal, productivity, shopping, social, sports, travel and utilities.[6]
Marketer’s Take: The bot was surprisingly effective, yet it fell short several times when queries like “Show me Blue Jeans” drew the canned response, “Sorry, I didn't find any products for this criteria,” even though I know they sell blue jeans. Still, the bot is one of the best eCommerce bots I’ve seen on the platform thus far, and marketers should study it.
Chatbot Eliza can be regarded as the ancestor and grandmother of the large chatbot family we have listed on our website. As you can see in our directory tab, there are hundreds of online chatbots available in the public domain, though we believe hundreds of thousands more have been created by enthusiastic artificial intelligence amateurs on platforms such as Pandorabots, MyCyberTwin, or Personality Forge AI. Most of these chatbots give similar, default responses; training a chatbot in another field of expertise takes a long time and a great deal of patience, and not all amateur developers are willing to invest that much effort. Most of the chatbots created this way are no longer accessible. Only a small portion of fanatical botmasters manage to fight their way out of the crowd and gain some visibility in the public domain.

At Facebook’s F8 Developers Conference, Messenger bots were announced. These bots are being developed by media corporations and retailers alike, and very quickly, which raises the question of what a Messenger bot is and how it’s useful to so many different types of companies. Even more important to know is what these bots mean for the average user: will they always be safe, or can they pose a threat if they’re developed by someone with malicious intent? Here’s the answer to all that and more.