Efforts by servers hosting websites to counteract bots vary. Servers may outline rules for the behaviour of internet bots by implementing a robots.txt file: a plain-text file stating the rules governing a bot's behaviour on that server. Any bot that does not follow these rules when interacting with (or 'spidering') a server should, in theory, be denied access to, or removed from, the affected website. However, because robots.txt is usually just a posted text file with no associated enforcement software, adhering to its rules is entirely voluntary – in practice there is no way to enforce them, or even to ensure that a bot's creator or operator acknowledges, or even reads, the file's contents. Some bots are "good" – e.g. search engine spiders – while others can be used to launch malicious attacks, most notably in political campaigns.
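As an illustration of how voluntary this compliance is, a well-behaved crawler typically checks robots.txt itself before fetching a page; the sketch below uses Python's standard urllib.robotparser module, with "MyExampleBot" and the example.com URLs as placeholder assumptions:

```python
# A well-behaved bot checks robots.txt before fetching a page.
# Nothing forces this check: a bot that skips it is not stopped
# by the file itself, which is the point made above.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()  # fetch and parse the server's rules

url = "https://example.com/private/page.html"
if rp.can_fetch("MyExampleBot", url):
    print("robots.txt allows MyExampleBot to crawl", url)
else:
    print("robots.txt disallows", url, "for MyExampleBot")
```

A non-compliant bot would simply request the page without ever calling can_fetch, which is why servers that want real protection pair robots.txt with active measures such as rate limiting or IP blocking.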
Messenger bots may also change how customer support is delivered. Facebook has become a popular platform for brands to interact with their customers, and customers often take a complaint to a brand's Facebook page and have it resolved over chat. A Messenger bot makes it easier for customers to get help. The quality of the support varies, but for smaller businesses that rely on Facebook for sales, a bot can help them stay 'online' 24/7.
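Architecturally, such a bot is a webhook that receives message events from Facebook's Graph API and posts replies through its Send API. The following is a minimal sketch only, assuming a Flask app, a hypothetical PAGE_ACCESS_TOKEN environment variable, and a canned reply; the webhook verification handshake and any real support logic are omitted:

```python
import os
import requests
from flask import Flask, request

app = Flask(__name__)
# Hypothetical: a page access token issued by Facebook for the brand's page.
PAGE_ACCESS_TOKEN = os.environ["PAGE_ACCESS_TOKEN"]
SEND_API = "https://graph.facebook.com/v19.0/me/messages"

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json()
    # Each entry may carry several messaging events (messages, postbacks, ...).
    for entry in payload.get("entry", []):
        for event in entry.get("messaging", []):
            if "message" in event and "text" in event["message"]:
                sender_id = event["sender"]["id"]
                # Canned reply stands in for real support logic,
                # keeping the page responsive around the clock.
                reply(sender_id, "Thanks for reaching out! "
                                 "An agent will follow up shortly.")
    return "ok", 200

def reply(recipient_id, text):
    # Send a text message back to the user via the Send API.
    requests.post(
        SEND_API,
        params={"access_token": PAGE_ACCESS_TOKEN},
        json={"recipient": {"id": recipient_id},
              "message": {"text": text}},
    )
```

In a real deployment the canned reply would be replaced by intent matching or a hand-off to a human agent, but even this shape shows how a small business can acknowledge every inquiry immediately.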