Tuesday, 4 February 2014

Google: Remove The Robots.txt File Completely

Believe it or not, I am not a big fan of adding robots.txt files to websites unless you want to specifically block content and sections from Google or other search engines. It just always felt redundant to tell a search engine it can crawl your site, since it will do so unless you tell it not to.

Google's JohnMu confirmed in a Google Webmaster Help thread and even recommended to one webmaster that he/she should remove their robots.txt file "completely."

John said:
I would recommend going even a bit further, and perhaps removing the robots.txt file completely. The general idea behind blocking some of those pages from crawling is to prevent them from being indexed. However, that's not really necessary -- websites can still be crawled, indexed and rank well with pages like their terms and conditions or shipping information indexed (sometimes that's even useful to the user :-)).
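
To make that concrete, the kind of crawl-blocking John is describing would typically look something like this in robots.txt (the paths below are invented for illustration, not taken from the thread):

    # Hypothetical example: blocking "unimportant" pages from crawling
    User-agent: *
    Disallow: /terms-and-conditions/
    Disallow: /shipping-info/

His point is that rules like these are unnecessary if the only reason for them is to keep such pages out of Google's index.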

I know many SEOs feel it is mandatory to have a robots.txt file and just have it say, User-agent: * Allow: /. Why, when they will eat up your content anyway?
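
For reference, that minimal allow-everything file would be something along these lines, which is functionally the same as having no robots.txt at all -- exactly what John is suggesting:

    # Permissive robots.txt: lets all crawlers crawl everything
    User-agent: *
    Allow: /
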
Anyway, it is nice to see a Googler confirming this, at least in this case.
Forum discussion at Google Webmaster Help.
