Tay, an AI chatbot that learns from previous interactions, caused major controversy after it was targeted by internet trolls on Twitter. The bot was exploited and, within 16 hours, began sending extremely offensive tweets to users. This suggests that although the bot learnt effectively from experience, adequate protections were not put in place to prevent misuse.
Think of a message thread as the place where you connect and interact with your users. Build just one bot, and your experience is available on every platform where Messenger exists, including iOS, Android, and the web. It also removes the friction of your users having to download yet another app, on top of all the apps they already have and may not use, given that Messenger is now used by 900 million people every month.
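To make the "one bot, every platform" idea concrete, here is a minimal sketch of the JSON payload a bot sends to Facebook's Send API to reply in a message thread. The endpoint version, the `USER_PSID` recipient id, and the access token handling are placeholders for illustration, not real values; a deployed bot would POST this payload with its page access token.

```python
import json

# Graph API Send endpoint as documented in the 2016-era Messenger Platform docs.
GRAPH_URL = "https://graph.facebook.com/v2.6/me/messages"

def build_reply(recipient_id, text):
    """Build the payload the Send API expects for a plain text message."""
    return {
        "recipient": {"id": recipient_id},  # the user's page-scoped id (PSID)
        "message": {"text": text},          # the reply shown in the thread
    }

# "USER_PSID" is a hypothetical placeholder for a real page-scoped user id.
payload = build_reply("USER_PSID", "Hello from the bot!")
print(json.dumps(payload))
```

Because the same payload format works regardless of whether the user opened the thread on iOS, Android, or the web, the bot's code does not need any per-platform branches.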
The term 'chatbot' is closely related to 'chat bot' and 'chatterbot'. 'Chatterbot' is more often used for bots that talk a lot but are not necessarily very intelligent in processing the user's answers. 'Chat bot' is used by technical people who consider the word 'bot' a general term for robotised actions; for them, a 'chat bot' is a special kind of bot. Of the three, 'chatbot' is the most popular term and has the broadest meaning.
Yes. Messenger bots are approved by Facebook before being made available inside the Messenger app, so you can rest assured that they aren't trying to steal your identity (or anything else). What's more, a bot is tied to a Facebook page and a Facebook app, making it all the more inconvenient to use for fraudulent activity. That said, don't exchange private or personal information with a bot. Finally, because Messenger doesn't yet support credit cards and purchasing, anything you buy will likely be completed in a browser, with the bot aiding you only as far as placing the order. If you want, you can always block a bot.