Wednesday, July 2, 2014

Interesting facts regarding Robots.txt file

June 30, 2014 marked the 20th anniversary of robots.txt. The file was created by Martijn Koster in 1994, while he was working at Nexor, to cope with crawlers hitting his sites too hard. All major search engines of the time, including WebCrawler, Lycos, and AltaVista, quickly adopted it, and even 20 years later they continue to support and obey it.

The most common robots.txt mistakes:

Implementing a robots.txt file is not always the right way to keep URLs out of search results. Treat a robots.txt Disallow rule as a last resort; first consider alternatives such as a 410 HTTP response, a noindex meta tag, or rel=canonical.
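As a rough illustration of how crawlers interpret a Disallow rule, here is a minimal sketch using Python's standard-library urllib.robotparser. The robots.txt content and the example.com URLs below are hypothetical and used only for demonstration.

import urllib.robotparser

# Hypothetical robots.txt content that blocks a /private/ section of a site
sample_robots_txt = """User-agent: *
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(sample_robots_txt.splitlines())

# A well-behaved crawler checks each URL against the rules before fetching it
print(parser.can_fetch("*", "http://example.com/private/page.html"))  # False (disallowed)
print(parser.can_fetch("*", "http://example.com/public/page.html"))   # True (allowed)

Keep in mind that a Disallow rule only stops compliant crawlers from fetching the page; it does not remove an already-indexed URL from search results, which is why noindex, rel=canonical, or a 410 response are often the better tools.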


