#1
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2004
    Posts
    48
    Rep Power
    15

    Optimal site depth, ceteris paribus: 3 links deep or 6 links deep? (shallow vs. deep)


    What would be the optimal site depth for a site with--for the sake of simplicity--1 million pages and no cross-linked pages (everything flows from the main page)? Assume that the link structure is the only thing that changes; all else is equal.

    Would it be 3 links deep--e.g., 100 links on the main page, then 100 links on each of the pages the main page links to, then 100 links on each of those pages? Thus, a spider would only have to traverse 3 levels to reach all of the pages.

    Or would the spider prefer a site with only 10 links per page--thus necessitating a structure 6 levels deep to spider all the pages?
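    A quick back-of-the-envelope check (a hypothetical sketch using the branching factors from the question, assuming a uniform link tree with no cross-links) confirms that both layouts reach the same million pages:

    ```python
    # Pages reachable at each level of a uniform link tree, for the two
    # hypothetical structures from the question (no cross-links assumed).
    def pages_by_level(links_per_page, depth):
        return [links_per_page ** level for level in range(1, depth + 1)]

    shallow = pages_by_level(100, 3)  # 100 links per page, 3 levels deep
    deep = pages_by_level(10, 6)      # 10 links per page, 6 levels deep

    print(shallow)       # [100, 10000, 1000000]
    print(deep)          # [10, 100, 1000, 10000, 100000, 1000000]
    print(sum(shallow))  # 1010100 pages below the main page
    print(sum(deep))     # 1111110
    ```

    The deep layout actually carries about 10% more intermediate pages for the same million leaves, so the crawl workload is not identical even before any depth limits come into play.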

    I think the former, shallower structure would probably be better for search engine purposes. The danger with this is being penalized for too many links--like being categorized as a link farm--however, if all the links are internal, I do not think there would be any problems. On any given page, Yahoo, Amazon, and many other prominent sites have over 100 internal links.

    On the other hand, I think Googlebot and other spiders might have set limits on how many links deep they will spider a site, or how often they will go X links deep. So they might not like a site with very deep links.

    Your thoughts?
  2. #2
    mick.sawyer
    Guest
    SEO Chat Mastermind (5000+ posts)
    Build it for your users and put the content where it needs to be for the user to navigate it, and the rest will follow.

    I would say there is no answer to your question and any answer would be pure opinion.
  4. #3
  5. Wymiatacz
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Sep 2004
    Location
    Warsaw, Poland
    Posts
    174
    Rep Power
    15
    6 levels deep? A robot would just never get there.
    3 levels is pretty shallow, so I think you could go 4 levels deep, but the pages should interlink and there should be a link from the main page to the deepest level.
  6. #4
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2004
    Posts
    48
    Rep Power
    15
    Originally Posted by mick.sawyer
    Build it for your users and put the content where it needs to be for the user to navigate it, and the rest will follow.
    Certainly the primary question would be ease of navigation for your users. That would make more difference with regard to increasing quality long-term recurring traffic than trying to please bots.

    However I am asking a purely theoretical question about which method search engines would prefer. I do not think every answer would be simply opinion--no doubt one of the linking structures described above IS preferred by bots, ceteris paribus. And it is likely someone out there possesses data that enlightens the issue.
  8. #5
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2004
    Posts
    48
    Rep Power
    15
    Originally Posted by Tojan
    6 levels deep? A robot would just never get there.
    I agree--I think that spiders would shy away from going so deep into a site, so the shallow structure is preferable. However I have no data to back up my beliefs, only logic; and perhaps there is more to it that I haven't thought of.
  10. #6
    mick.sawyer
    Guest
    SEO Chat Mastermind (5000+ posts)
    Originally Posted by Kadence
    Certainly the primary question would be ease of navigation for your users. That would make more difference with regard to increasing quality long-term recurring traffic than trying to please bots.

    However I am asking a purely theoretical question about which method search engines would prefer. I do not think every answer would be simply opinion--no doubt one of the linking structures described above IS preferred by bots, ceteris paribus. And it is likely someone out there possesses data that enlightens the issue.
    2 levels deep is enough.
    If you do it any deeper it takes ages for bots to re-crawl.
    Unless you reverse link it.
    Put links to your home page and lots of links from regularly crawled front pages to a site map or list of links in a second level directory.

    If you want to go six to 50 levels deep, there are always plenty of ways to manipulate the bots into coming in every few days.
  12. #7
  13. Wymiatacz
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Sep 2004
    Location
    Warsaw, Poland
    Posts
    174
    Rep Power
    15
    I have some proof. I have a site with a forum where the posts are at level 4, and Google hardly spidered them at all (although I have decent PR).
    Since I put links to the latest posts on the main page, Google spiders level 4 like crazy.
  14. #8
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2004
    Posts
    48
    Rep Power
    15
    There are indeed many ways to manipulate bots into going deep. Getting links to deep internal pages posted in directories would help a lot with this, as would getting plenty of deep links from other sites in general. A site map wouldn't really work for the theoretical site--with the 100-link structure, the front page is already the most adequate functional site map you are likely to get. But bots can be "enticed" into going deep into your site.

    Though I am more concerned with the algorithmic question than practical application. Simply looking at the PageRank algorithm, it should make no difference. However, there are probably things that bots are simply programmed not to do, and going past a certain number of levels is probably one of them.

    So the general opinion so far seems to be that bots prefer shallow structures, because they are (by themselves, without manipulation) disinclined to follow deep links. I am curious whether anyone would challenge this. (Again, purely from an abstract viewpoint with no other factors considered, not practical--effective strategies certainly exist for optimizing both shallow and deep structures.)
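    For anyone who wants to poke at the "PageRank should make no difference" intuition abstractly, here is a toy power-iteration sketch on miniature stand-in trees (branching 4 x depth 3 versus branching 2 x depth 6, standing in for the 100^3 and 10^6 structures; damping 0.85 and uniform redistribution of dangling leaf rank are my assumptions, not anything confirmed in this thread):

    ```python
    def tree_pagerank(branching, depth, damping=0.85, iters=200):
        """Per-page PageRank at each level of a complete tree where every
        page links only to its children (the no-cross-links setup above).
        Pages at the same depth are symmetric, so we track one total rank
        mass per level instead of one value per page."""
        counts = [branching ** d for d in range(depth + 1)]  # pages per level
        n = sum(counts)
        mass = [c / n for c in counts]  # start from the uniform distribution
        for _ in range(iters):
            dangling = damping * mass[depth]  # leaves have no outlinks
            # teleport share plus redistributed dangling mass, per level
            new = [(1 - damping + dangling) * c / n for c in counts]
            for d in range(depth):            # parents pass rank downward
                new[d + 1] += damping * mass[d]
            mass = new
        # convert each level's total mass to rank per page
        return [mass[d] / counts[d] for d in range(depth + 1)]

    shallow = tree_pagerank(4, 3)  # stand-in for 100 links x 3 levels
    deep = tree_pagerank(2, 6)     # stand-in for 10 links x 6 levels
    ```

    One curiosity this sandbox exposes: with strictly downward links, deeper pages actually end up with slightly more per-page rank than the root, since the root only receives teleport rank while every child also inherits a share from its parent. Real sites with cross-links and backlinks behave differently, so treat this purely as an abstract toy.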

