What is Robots.txt?
• Robots.txt is a text file created to instruct search engine crawlers, such as Googlebot, how to crawl your website.
• Crawlers then visit and index the pages of your website that the file allows.
• Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them to visit.
• Robots.txt is by no means mandatory for search engines; crawlers can still visit a site without one, but well-behaved crawlers follow its rules when the file exists (a simple example is shown below).
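For example, a simple robots.txt that asks all crawlers to stay out of one folder looks like this (the /private/ folder name here is only a placeholder):

User-agent: *
Disallow: /private/

The User-agent line says which crawler the rules apply to (* means all of them), and each Disallow line names a path the crawler should not visit.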
How to check the Robots.txt on your website:
Ex: www.Domainname.com/robots.txt
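For instance, www.google.com/robots.txt shows Google's own file in plain text; if the URL returns a 404 error, the site simply has no robots.txt.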
How to create a Robots.txt for your website:
• Open a new file in Notepad (or any plain-text editor).
• Copy the robots.txt rules into the file (a basic example is shown after these steps).
• Save the file with a .txt extension, named exactly robots.txt.
• Upload the file to the root folder of your website (you will need your site admin or FTP login details).
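The most basic robots.txt, which lets every crawler visit every page, is just these two lines (adjust the rules if you want to block anything):

User-agent: *
Disallow:

An empty Disallow means nothing is blocked; writing something like Disallow: /admin/ instead would keep crawlers out of that folder (the /admin/ path is only an example).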