What are bots... how do they work?

A bot is a software program that accesses websites over the internet and performs automated tasks. There are two types: good bots and bad bots.

Image: Kaspersky

What is a Good bot

A good bot works across the internet, using various methods to improve the experience of internet users. Good bots and bad bots can behave similarly, so a bot management strategy must tell them apart: good bots should be allowed through while bad bots are blocked.

What is a Bad bot

Malicious bots (bad bots) are malware designed to steal information from infected hosts (servers, computers, or other devices).

Introducing the Good Bots.

#Search Engine Bots This bot is also called a web crawler or spider. Search engines such as Google and Yahoo crawl the web by following links between websites and recording the pages in an index; when a user searches for something, the results are pulled from that index and displayed. This is where SEO (search engine optimization) comes in: you prepare your website according to the guidelines set by the search engine so that the search engine bot can read and record it easily.
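The core step of the crawling described above is extracting links from a page so the bot knows where to go next. Below is a minimal sketch in Python using only the standard library; the page HTML and URLs are made-up examples, and a real crawler would also fetch pages over HTTP and respect robots.txt.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects href links from one page -- the step a crawler repeats."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content for illustration.
page = '<html><body><a href="/about">About</a> <a href="https://example.org/blog">Blog</a></body></html>'
collector = LinkCollector("https://example.com/")
collector.feed(page)
print(collector.links)  # ['https://example.com/about', 'https://example.org/blog']
```

Each discovered link would be queued, fetched, and indexed in turn, which is how the search engine builds the database it answers queries from.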

#Copyright Bots This bot crawls websites and checks them for copyright violations. Either a company or an individual can create and run such a bot to check whether text, video, images, music, etc. have been duplicated.
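One simple way a bot can detect duplicated text is by comparing content fingerprints. The sketch below hashes normalized text so that trivial edits (case, extra spaces) don't hide a copy; real copyright bots use far more sophisticated fuzzy and perceptual matching, so treat this only as an illustration of the idea.

```python
import hashlib

def fingerprint(text):
    # Normalize whitespace and case so trivial edits don't hide a copy.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Made-up sample texts for illustration.
original = "All rights reserved.  This article may not be reproduced."
copied = "all rights reserved. This article may not be reproduced."
print(fingerprint(original) == fingerprint(copied))  # True
```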

#Site Monitoring bots are bots used for website monitoring. The website owner uses Google or some other service to monitor website downtime and be notified by email. For example, Cloudflare's Always Online service shows a cached version of a web page when the origin server is down.
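The basic check such a monitoring bot performs can be sketched in a few lines of Python. This is a minimal, assumed design (not how any particular service works): fetch the URL, then classify the HTTP status code as up or down; a real bot would run this on a schedule and send an email on failure.

```python
from urllib import request, error

def site_status(url, timeout=5):
    """Return the HTTP status code, or None if the site is unreachable."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code  # server answered, but with an error status
    except (error.URLError, OSError):
        return None  # no answer at all (DNS failure, timeout, refused)

def is_up(status):
    # Treat 2xx/3xx as "up"; anything else means the site needs attention.
    return status is not None and 200 <= status < 400

print(is_up(200), is_up(503), is_up(None))  # True False False
```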

#Commercial bots are run by commercial companies; they crawl the web to collect the information they need for market research.

#Feed Bots This bot crawls websites on the internet collecting new content for a platform's news feed.
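Many feed bots collect new content by reading a site's RSS feed rather than scraping pages. As a sketch, the snippet below parses an RSS 2.0 document (a made-up example feed) with Python's standard library and pulls out each item's title and link, which is the raw material a news feed is built from.

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS 2.0 feed content for illustration.
rss = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>https://example.com/1</link></item>
  <item><title>Second post</title><link>https://example.com/2</link></item>
</channel></rss>"""

root = ET.fromstring(rss)
# Collect (title, link) pairs for every <item> in the feed.
items = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]
print(items)  # [('First post', 'https://example.com/1'), ('Second post', 'https://example.com/2')]
```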

#Chatbots simulate human conversation. Social channels and businesses use chatbots to provide information to customers. These bots are gradually becoming more advanced and more intelligent.
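At their simplest, such customer-service chatbots are keyword rules with a fallback to a human. The sketch below is an assumed minimal design (the keywords and replies are invented); modern chatbots replace the rule table with language models, but the request/reply loop is the same.

```python
def chatbot_reply(message):
    """Tiny rule-based chatbot: match a keyword, else fall back."""
    rules = {
        "hours": "We are open 9am-5pm, Monday to Friday.",
        "price": "Pricing details are listed on our plans page.",
        "hello": "Hi there! How can I help you today?",
    }
    text = message.lower()
    for keyword, reply in rules.items():
        if keyword in text:
            return reply
    # No rule matched: hand off to a human agent.
    return "Sorry, I didn't catch that. A human agent will follow up."

print(chatbot_reply("Hello!"))
print(chatbot_reply("What are your hours?"))
```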

#Personal assistant bots These bots, such as Siri and Alexa, are more advanced than ordinary bots: they can understand people better and behave in a more personalized way.

Websites need to manage bots to conserve resources. robots.txt is a file of instructions for bots: which pages may be crawled, which pages are off limits, and so on. Using it is straightforward: just create a robots.txt file and place it on the website. Below are Cloudflare's and Google's robots.txt files; they are interesting to read. Almost every website has one, and you can configure yours on your site however you like.

https://www.cloudflare.com/robots.txt
https://www.google.com/robots.txt

The link below has a few guidelines for writing robots.txt. You may want to try writing one for your own website.

The points above are just the basics. Thanks for reading to the end of the article.