I've been searching the net and the forum but I'm still confused. In the past someone advised me it was better to use an empty robots.txt file to get rid of the 404 errors in my logs. Ok, this works fine. But now I've been reading it is better to:
1. not have a robots.txt if you don't want to exclude anything
2. have one, but empty
3. have one and use this:
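(the snippet I saw was, I assume, the standard "allow everything" rules:)

```
User-agent: *
Disallow:
```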
Now, what is true and what isn't?
I prefer to allow all robots access to everything; if necessary I can exclude a single file to keep them happy, a trivial image or something like that.
Kind regards, Jef
My site can be found at http://jefpatat.freefronthost.com; for the moment the robots.txt is still empty.