#1
  1. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2012
    Posts
    5
    Rep Power
    0

    Big mistake with Robots.txt - disallowed by accident


    Hi everybody,

    I recently made a big mistake with a site that has been online for 6+ years and had done a lot of work over that time to build up a high PageRank. The site was on the first page for several keywords.

    The site was recently redesigned and redeveloped, but the problem is that robots.txt had been set up to disallow everything during development and was not restored correctly after the new site launched.

    This went on for a few weeks before it was noticed. It was corrected immediately, but the damage seems to have already been done: the site is no longer on the first page (or any of the following pages) and shows very poor scores and page ranks in online SEO tests.
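
    For reference, a quick way to double-check that the corrected robots.txt really does allow crawling again is Python's built-in robotparser; this is only a minimal sketch, and www.example.com plus the sample paths are placeholders for the real site:

        # Minimal sketch: confirm the corrected robots.txt no longer blocks crawlers.
        # "www.example.com" and the paths below are placeholders for the real site.
        from urllib import robotparser

        rp = robotparser.RobotFileParser()
        rp.set_url("https://www.example.com/robots.txt")
        rp.read()

        # A leftover blanket "Disallow: /" would make every check below print BLOCKED.
        for path in ("/", "/products/", "/contact/"):
            allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
            print(path, "allowed" if allowed else "BLOCKED")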

    Is there any way we can 'roll back' to before the robots.txt mistake, or is it really starting from scratch again?

    Thanks in advance!
  2. #2
  3. SEO Since 97
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2011
    Location
    Arizona
    Posts
    5,212
    Rep Power
    1930
    If you have the robots.txt fixed, that should resolve the issue; you just have to wait, but be sure to check every page. I'm not sure what score you're talking about that you got online, but a redesign can make your rankings drop if your on-page elements aren't what they used to be. It's hard to say without looking at the site and comparing it to the old version.

    You can roll back if you still have the old files, though that depends on how the site was redesigned and redeveloped.
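
    For the "check every page" part, something like the rough sketch below can spot-check that key pages return a 200 and didn't keep a stray noindex directive from the development version (the URLs are placeholders):

        # Rough sketch: spot-check that key pages return HTTP 200 and contain no
        # stray "noindex" directive left over from development. URLs are placeholders.
        import urllib.request

        PAGES = [
            "https://www.example.com/",
            "https://www.example.com/about/",
        ]

        for url in PAGES:
            with urllib.request.urlopen(url) as resp:
                body = resp.read().decode("utf-8", errors="replace").lower()
                has_noindex = "noindex" in body  # crude check; inspect any matches by hand
                print(url, resp.status, "check noindex!" if has_noindex else "ok")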
  4. #3
  5. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2012
    Posts
    5
    Rep Power
    0
    Hi Test-ok, thank you for your reply.

    I did have the robots.txt fixed, but only after a few weeks, when the damage was already done.

    What do you mean by rolling back?
    I do still have the old website available on a separate subdomain.
    But judging by how the new site is set up, its structure and content should be more extensive than what was there before.

    So I would assume that the biggest problem was the robots.txt being set to disallow everything, which pulled ALL of the existing pages and rankings out of Google.
  6. #4
  7. SEO Since 97
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2011
    Location
    Arizona
    Posts
    5,212
    Rep Power
    1930
    What do you mean by rolling back?
    Didn't you ask?
    Is there any way we can 'roll back' to before the robots.txt mistake, or is it really starting from scratch again?
    Back to square one, and yes, it would be starting over... I wouldn't do it that way; I'd move forward. It sounds like you've already fixed the one issue, so now it's a waiting game.
  8. #5
  9. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2012
    Posts
    5
    Rep Power
    0
    Originally Posted by Test-ok
    Didn't you ask?

    Back to square one, and yes, it would be starting over... I wouldn't do it that way; I'd move forward. It sounds like you've already fixed the one issue, so now it's a waiting game.
    Yes, what I actually meant was how I could roll back from the new site to how it was before, given that I still have the old site.

    I misread and thought it was possible to regain all the search result positions etc. as they were before the robots.txt mistake.
  10. #6
  11. Cool bra. Go with Christ!
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2004
    Location
    Gloucester (South West UK).
    Posts
    5,967
    Rep Power
    1220
    Originally Posted by johnjohn
    Yes, what I actually meant was how I could roll back from the new site to how it was before, given that I still have the old site.

    I misread and thought it was possible to regain all the search result positions etc. as they were before the robots.txt mistake.
    If you have fixed the robots.txt, then after a short wait you will "revert" to the search positions you would have had before the rogue robots.txt was introduced (there are no lasting effects).

    When you did the redesign, did you change any URLs? If so, did you add 301 redirects?
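
    If you're not sure, a quick check like the minimal sketch below (the hostname and path are placeholders) shows the status code and Location header an old URL returns:

        # Minimal sketch: confirm an old URL answers with a 301 and points at the
        # new location. "old.example.com" and the path are placeholders.
        import http.client

        conn = http.client.HTTPSConnection("old.example.com")
        conn.request("HEAD", "/old-page.html")
        resp = conn.getresponse()
        print(resp.status, resp.getheader("Location"))
        conn.close()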
    ClickyB
    "The quality of the visitor is more important than the volume".. Egol 22nd Feb 2008
    New to SEO/SeoChat?...Click Here
    Canonical Problems?... Click Here
    Forum Rules & Posting Guidelines
  12. #7
  13. SEO Consultant
    SEO Chat Genius (4000 - 4499 posts)

    Join Date
    Jul 2004
    Location
    Minneapolis, MN, USA
    Posts
    4,210
    Rep Power
    979
    It sounds very much like there are two separate sites with old and new URLs. Are any of the old site URLs still accessible?

    I would guess that if the robots.txt error happened, then there probably aren't any 301s in place either, and that is likely the reason the new site has dropped.
  14. #8
  15. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2012
    Posts
    5
    Rep Power
    0
    The faulty robots.txt was in place for a few weeks, so Google had enough time to 'disallow' the entire site.

    With the new design/development, the site also uses new URLs, and all the old URLs are 301-redirected to the new site.

    This also applies to several domain names, which are likewise 301-redirected to the 'main' site.
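
    As a sanity check on those redirects, a rough sketch like the one below (assuming the third-party requests package; the URL is a placeholder) follows an old URL and prints the whole redirect chain, to make sure it reaches the main site in a single 301 hop rather than a long chain or a 302:

        # Rough sketch, assuming the third-party "requests" package is installed:
        # follow an old URL and print the redirect chain. The URL is a placeholder.
        import requests

        r = requests.get("https://old-domain.example/old-page.html",
                         allow_redirects=True, timeout=10)
        for hop in r.history:
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print("final:", r.status_code, r.url)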

    The problem is just that when I run some standard online SEO tests, the site's ranking is very low, even though the site was built to be SEO-friendly/accessible.
    I figure the low ranking is simply due to the robots.txt screwing things up.

    The other measurement is Google Analytics, which shows that visitors went from 10,000 to merely 1,800.

    So, moving forward, what would be the best ways to make the site perform like it used to?
    Will it help to simply pay more for AdWords at the beginning and hopefully get more visitors and a higher ranking in Google searches, or are there other (cheaper) ways?
  16. #9
  17. SEO Since 97
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2011
    Location
    Arizona
    Posts
    5,212
    Rep Power
    1930
    The other measurement is Google Analytics, which shows that visitors went from 10,000 to merely 1,800.
    It's not being found because of the robots.txt blocking indexing; there's really nothing you can do but wait.

    I'm not sure why, or what bad scores you're seeing. What program are you using, and what's being scored?
  18. #10
  19. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2012
    Posts
    5
    Rep Power
    0
    Originally Posted by Test-ok
    It's not being found because of the robots.txt blocking indexing; there's really nothing you can do but wait.

    I'm not sure why, or what bad scores you're seeing. What program are you using, and what's being scored?
    I have used searchmetrics<dot>com, for example, and see that our SEO visibility is 6, whereas a competitor's website scores 997!

    The site has an Alexa ranking of 1,920,000+, whereas the competitor's website is at 292,000+.

    Both do have a PageRank of 3, though.
