Welcome to SEOChat, a community dedicated to helping beginners and professionals alike improve their Search Engine Optimization knowledge.
Jan 19th, 2005, 09:35 AM
Yet another robots.txt question
I've been surfing the net and the forum but I'm still confused. In the past someone advised me that it was better to use an empty robots.txt file to get rid of the 404 errors in my logs. OK, this works fine. But now I've been reading that it is better to:
1. not have a robots.txt if you don't want to exclude anything
2. have one, but empty
3. have one and use this:
User-agent: *
Disallow:
Now, what is true and what isn't?
I prefer to allow all robots access to everything; if necessary, I can exclude a file to keep them happy, a silly image or something like that.
Kind regards, Jef
My site can be found at http://jefpatat.freefronthost.com; for the moment the robots.txt is still empty.
Jan 19th, 2005, 11:07 AM
All three methods are valid and have the same effect, although only methods 2 and 3 get rid of the 404 errors in your logs. Choose whichever method you like...
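Whether the methods really behave the same can be checked with a parser. A minimal sketch using Python's standard urllib.robotparser, showing that an empty robots.txt (method 2) leaves every path fetchable (the URLs are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# An empty robots.txt: no rules at all (method 2 above).
rp = RobotFileParser()
rp.parse([])  # parse the empty file

# With no Disallow rules, every user agent may fetch every path.
print(rp.can_fetch("Googlebot", "/index.html"))  # True
print(rp.can_fetch("*", "/images/logo.gif"))     # True
```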
Jan 20th, 2005, 02:34 AM
I've read somewhere that an empty robots.txt file is not valid, which is true if you check it in a validator. Won't this cause harm? Don't robots object to an invalid robots.txt?
Jan 20th, 2005, 07:32 PM
The safest method is number 3, that is, a robots.txt file containing just:
User-agent: *
Disallow:
This way it will not generate a 404 (Not Found) error, and it will give spiders full access to the site.
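The same standard parser can confirm this. A minimal sketch, assuming the two-line allow-all file (`User-agent: *` followed by an empty `Disallow:`) and made-up example paths:

```python
from urllib.robotparser import RobotFileParser

# The explicit allow-all robots.txt (method 3): an empty Disallow
# value means "disallow nothing", i.e. full access for all robots.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

print(rp.can_fetch("Googlebot", "/any/page.html"))  # True
print(rp.can_fetch("*", "/stupid-image.jpg"))       # True
```

Because the file exists and parses cleanly, the server returns it with a 200 status, so no 404 entries appear in the logs.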