What Is a Robots.txt File and How to Implement It on a Website
Robots.txt is a plain text file used in SEO to tell search engine crawlers (such as the Google spider, Googlebot) which pages or folders of a website they should not crawl. Compliant bots follow the instructions in the robots.txt file. You can block the whole website, a particular folder, or a single landing page. It is simply a set of instructions for search engine bots, for example https://www.abc.com/robots.txt. The robots.txt file must always be uploaded to the root folder of the website, and the site can be submitted in Google Search Console (formerly Google Webmaster Tools). Note that blocking a page from crawling does not guarantee it will be removed from the index; it only stops bots from fetching it.
User-agent: Specifies which crawler the following rules apply to (Googlebot, msnbot, Bingbot, etc.); an asterisk (*) applies the rules to all bots.
Disallow: Blocks crawling of the specified page or folder.
Allow: Permits crawling of a specified path, typically used to open up a page inside an otherwise disallowed folder.
Crawl-delay: Asks bots to wait a number of seconds between requests; note that Googlebot ignores this directive.
Sitemap: Gives bots the link to the XML sitemap so they can easily find and crawl all updated pages.
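Putting the directives above together, a minimal robots.txt for the example domain might look like this (the /private/ folder, page name, and sitemap path are placeholder assumptions, not real paths):

```
# Rules for all crawlers
User-agent: *
# Block this folder (hypothetical path)
Disallow: /private/
# But allow one page inside it (hypothetical path)
Allow: /private/public-page.html
# Ask bots to wait 10 seconds between requests (ignored by Googlebot)
Crawl-delay: 10
# Point bots to the XML sitemap
Sitemap: https://www.abc.com/sitemap.xml
```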
Block the entire website:-
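To block every crawler from the whole site, disallow the root path:

```
User-agent: *
Disallow: /
```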
Allow crawlers full access to the website:-
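To allow all crawlers to access everything, leave the Disallow value empty:

```
User-agent: *
Disallow:
```

An empty Disallow line blocks nothing, which has the same effect as having no robots.txt file at all.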
Block a specific folder:-
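To block one folder for all bots, disallow its path (the folder name here is a placeholder):

```
User-agent: *
Disallow: /folder-name/
```

The trailing slash ensures only the folder and its contents are blocked, not pages whose names merely begin with the same text.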
Block a specific page:-
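To block a single landing page, disallow its exact path (the file name here is a placeholder):

```
User-agent: *
Disallow: /landing-page.html
```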
Block a specific crawler:-
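To block only one bot while leaving all others unrestricted, name it in the User-agent line; for example, using msnbot as mentioned above:

```
# Block msnbot from the whole site
User-agent: msnbot
Disallow: /

# All other bots may crawl everything
User-agent: *
Disallow:
```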