#1
  1. Contributing User
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Jan 2003
    Location
    Texas!
    Posts
    1,137
    Rep Power
    47

    Linkage loops in a cascade of sites


    Ok, I've got a link problem that I am trying to work out. If you have opinions or know the answer, please let me know. This is basically a PR calculation between websites linked in a cascading fashion. I will present a few versions of this. Which one is the best link structure? Also, assume a closed network of just the sites 1, 2, and 3. What I am trying to determine is: if I have, for example, 3 sites, each linking like so:

    A. Site 1 --> Site 2 --> Site 3
         (PR4)      (PR3)      (PR2)

    How would my PR be affected if Site 3 linked back to Site 1?

    B. Site 1 --> Site 2 --> Site 3
          ^                     |
          '---------------------'
        (PR ?)      (PR ?)      (PR ?)

    Next, how would the PR be affected if I linked like so, where Site 3 now has two outgoing links, back to Site 1 and Site 2, the sites that originally contributed some of their PR to Site 3?

    C. Site 1 --> Site 2 --> Site 3
          ^          ^          |  |
          |          '----------'  |
          '------------------------'
        (PR ?)      (PR ?)      (PR ?)


    I could also ask whether it would be better to avoid a link from Site 3 back to Site 1 and concentrate the PR in a smaller network, but I assume there is no harm in that link, because Site 1 contributes PR to Site 2, which in turn contributes to Site 3.

    Also, if all three sites were fully interlinked, with two outgoing and two incoming links at each site, that would be the maximum use of this PR. But that is not what I am trying to find, nor is it possible in most cases when the sites are not all owned by one company or person. What I want to know is what happens when barriers like that do occur, and how to best maximize PR around them.
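    The three configurations can be checked numerically. Below is a minimal sketch (my own, not from the thread) using the classic PageRank formula PR(p) = (1-d) + d * sum(PR(q)/outdeg(q)) with damping d = 0.85, treating each site as a single page in a closed network. The PR4/PR3/PR2 numbers above are imaginary, so the absolute values here are only illustrative:

    ```python
    # Toy power-iteration PageRank for the three closed 3-site networks
    # discussed above. Illustrative only: real PageRank depends on every
    # page's full link neighborhood, which this closed model ignores.

    def pagerank(links, d=0.85, iters=100):
        """links maps each site number to the list of sites it links to."""
        pr = {site: 1.0 for site in links}
        for _ in range(iters):
            pr = {site: (1 - d) + d * sum(pr[q] / len(links[q])
                                          for q in links if site in links[q])
                  for site in links}
        return {s: round(v, 3) for s, v in pr.items()}

    # A: plain chain -- Site 3 links nowhere, so PR "leaks" out of the loop
    print(pagerank({1: [2], 2: [3], 3: []}))
    # B: chain plus Site 3 -> Site 1 -- a closed cycle, all sites end up equal
    print(pagerank({1: [2], 2: [3], 3: [1]}))
    # C: Site 3 links back to both Site 1 and Site 2 -- Site 3's vote is split
    print(pagerank({1: [2], 2: [3], 3: [1, 2]}))
    ```

    In this toy model, B keeps all PR circulating (every site converges to the same value), while in C Site 3 splits its vote, so Site 1 receives only half of Site 3's contribution.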

    Look forward to discussion.

    Ben P.
  2. #2
  3. rod@missionop.com
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2003
    Location
    Palm Beach Gardens FL 33410
    Posts
    16,979
    Rep Power
    0
    The best possible design:

    Design as if it were a single website, but without a single root index.html page.

    Use a single graphic design. Remember that entering a new site is disorienting, so if you're crosslinking multiple sites "for the visitor and not for the bot," inconsistent designs make visitors even more disoriented.

    Use keyword.html on each site as a common linked interface, and have each of these common pages (buttons) clearly labeled as if they were internal links.

    Add a copyright statement and note that the company operates websites for keyword1, keyword2, keyword3, etc., having each link go to the index.html of that site.

    Add an index.html that links only to the associated "common" page of the interface.

    In this setup, if you actually draw a map of all the sites together, only one page per domain is crosslinking; the remainder are one-way links only (linking by proxy).

    You cannot achieve anything better than this. Of course, if all your sites are just repurposed information meant to manipulate PageRank, then visitors will obviously not be impressed when they see the exact same thing ten times over.

    So the most important thing, and the easiest way to design it: build a single site, then place each unique directory on a different domain and change the necessary links from relative to absolute.
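    As a rough illustration of that last step (assumed details: the directory names, the domain mapping, and the regex are all invented here, and a real rewrite would use an HTML parser rather than a regex):

    ```python
    import re

    # Invented directory-to-domain mapping: each unique directory of the
    # single master design becomes its own domain, per the advice above.
    DOMAIN_FOR_DIR = {
        "keyword1": "https://keyword1-site.example",
        "keyword2": "https://keyword2-site.example",
    }

    def absolutize(html, current_dir):
        """Rewrite relative hrefs that point into another directory as
        absolute links on that directory's own domain; leave the rest alone."""
        def fix(match):
            href = match.group(1)
            top, _, rest = href.partition("/")
            if rest and top in DOMAIN_FOR_DIR and top != current_dir:
                return 'href="%s/%s"' % (DOMAIN_FOR_DIR[top], rest)
            return match.group(0)
        # Only touch quoted, scheme-less (relative) hrefs.
        return re.sub(r'href="([^":]+)"', fix, html)

    print(absolutize('<a href="keyword2/index.html">widgets</a>', "keyword1"))
    # -> <a href="https://keyword2-site.example/index.html">widgets</a>
    ```

    Same-directory links (e.g. `href="page.html"`) stay relative, so each domain's internal navigation keeps working unchanged.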

    20K per day plus (market dependent), or about 7 million per year.
  4. #3
  5. Contributing User
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Jan 2003
    Location
    Texas!
    Posts
    1,137
    Rep Power
    47
    Originally posted by "fathom"

    The best possible design:


    You cannot achieve anything better than this. Of course, if all your sites are just repurposed information meant to manipulate PageRank, then visitors will obviously not be impressed when they see the exact same thing ten times over.

    Fathom, great explanation, but I don't think you understood what I was getting at, or maybe I didn't explain it correctly. I am looking at just raw PR in this model. Site structure is highly important, but link structure and how it affects PageRank is what I am asking about.
    I do think that through proper linking you can achieve maximum PageRank. It's like getting the most for your money.
    Look at my first example. I have imaginary PageRank values placed on it:

    A. Site 1 --> Site 2 --> Site 3
         (PR4)      (PR3)      (PR2)


    Each site "contributes" PageRank to the next site. Now, what I am trying to figure out is how to maximize this. Throw out common user navigation, visitors, anything external, etc. Think inside the box: those are not part of the model and theory. I am looking only at the group of sites 1, 2, 3.

    Then take what you know about PageRank and think about the best way to cross-link these pages.

    Next, put up a barrier: you can NOT link Site 2 to Site 1; that link is impossible. Read my first post for clarification.
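    Under that barrier, the constrained question can be brute-forced in the same toy model (a sketch under assumptions of my own: single-page sites, a closed network, the classic formula with d = 0.85, and "best" meaning the structure that maximizes Site 1's score):

    ```python
    # Enumerate every 3-site link structure that omits the barred
    # Site 2 -> Site 1 link, and find the one maximizing Site 1's PR.
    # Toy model only -- not how Google actually scores pages.
    from itertools import product

    EDGES = [(1, 2), (1, 3), (2, 3), (3, 1), (3, 2)]  # (2, 1) is barred

    def pagerank(links, d=0.85, iters=200):
        pr = {s: 1.0 for s in links}
        for _ in range(iters):
            pr = {s: (1 - d) + d * sum(pr[q] / len(links[q])
                                       for q in links if s in links[q])
                  for s in links}
        return pr

    best = None
    for mask in product([0, 1], repeat=len(EDGES)):
        links = {1: [], 2: [], 3: []}
        for keep, (src, dst) in zip(mask, EDGES):
            if keep:
                links[src].append(dst)
        pr = pagerank(links)
        if best is None or pr[1] > best[0][1]:
            best = (pr, links)

    print(best[1], {s: round(v, 3) for s, v in best[0].items()})
    ```

    In this toy model the winner funnels both other sites into Site 3 and keeps Site 3's single outgoing link pointed back at Site 1; whether that survives contact with the real ranking system is another question.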
  6. #4
  7. rod@missionop.com
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2003
    Location
    Palm Beach Gardens FL 33410
    Posts
    16,979
    Rep Power
    0
    Regardless of whether you are looking at the theory in practical application to maximize PageRank across various nodes, or at a feasible multi-domain model designed for navigation and usability... I have already described the best direction.

    Pure theory "does not work".

    1. The only way you could conceivably implement "pure theory" is with 3 sites that are 100% identical in design, content, and structure, so that there would be virtually no difference anywhere, for any page of any site, to target anything different from the corresponding pages on the adjacent sites.

    In that case PageRank would be irrelevant, due to the duplicate-content penalties your theoretical design would incur, though at least one "theory site" would probably survive.

    The answer to your real question lies in the physical difference between a "crosslink" and a "reciprocal link"...

    That said... your best bet is to forget about PageRank, design to address navigational and usability issues, and let PageRank develop by itself... You'd be surprised what you can achieve... when you're not thinking about it.

