#1
  1. thall89553
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2004
    Posts
    308
    Rep Power
    15

    2 Sites, Near-Identical Content, Google Penalty


    I sure hope someone can give me some advice on this issue. I am working for a national non-profit organization of auto transmission repair shops. I set the organization up with a single template website that uses a lot of session variables to populate page content, including meta tags. When a shop wants a website I simply populate a database with its address, shop title, phone, directions, etc. Here is the issue I am having. Look at these two sites, http://www.pacnwtrans.com and http://www.samstransmissions.com. You will see they are nearly identical but are in fact two different businesses in the same market. It appears Google has dropped Sams Transmission for "copying" and I am not sure how to resolve this issue. I am NOT trying to game any system, only to set up legitimate businesses with effective websites. Ultimately there might be 10-20 sites per market as my program continues to expand, but I will hit a brick wall if, once the first site in a market is indexed, every site set up after it is perceived as a "copy-cat". I would think Google would be conscientious about this issue. If someone were searching for "Salem Transmission", what would it matter if two (or 20, for that matter) websites came up, all representing different businesses? Please help.
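    For illustration, here is a minimal sketch of the kind of database-driven template setup described above. The field names, sample shop records, and the use of Python are purely hypothetical assumptions; the actual sites reportedly populate pages from session variables on the server.

Code:
# Minimal sketch of a one-template, many-shops page generator.
# Field names and sample data are hypothetical placeholders.
PAGE_TEMPLATE = """<html>
<head>
  <title>{shop_name} - Transmission Repair in {city}</title>
  <meta name="description" content="{shop_name} in {city}. Call {phone}.">
</head>
<body>
  <h1>{shop_name}</h1>
  <p>Certified transmission repair in {city}. Call {phone}.</p>
  <p>Directions: {directions}</p>
</body>
</html>"""

shops = [
    {"shop_name": "Example Shop One", "city": "Salem",
     "phone": "555-0100", "directions": "placeholder directions"},
    {"shop_name": "Example Shop Two", "city": "Salem",
     "phone": "555-0101", "directions": "placeholder directions"},
]

for shop in shops:
    page = PAGE_TEMPLATE.format(**shop)
    # Every generated page shares the same boilerplate; only a few
    # database fields differ, which is why the sites look nearly identical.
    print(page)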
  2. #2
  3. KSL Consulting
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Apr 2004
    Posts
    427
    Rep Power
    18
    Originally Posted by thall89553
    I sure hope someone can give me some advice on this issue. I am working for a national non-profit organization of auto transmission repair shops. I set the organization up with a single template website that uses a lot of session variables to populate page content, including meta tags. When a shop wants a website I simply populate a database with its address, shop title, phone, directions, etc. Here is the issue I am having. Look at these two sites, http://www.pacnwtrans.com and http://www.samstransmissions.com. You will see they are nearly identical but are in fact two different businesses in the same market. It appears Google has dropped Sams Transmission for "copying" and I am not sure how to resolve this issue. I am NOT trying to game any system, only to set up legitimate businesses with effective websites. Ultimately there might be 10-20 sites per market as my program continues to expand, but I will hit a brick wall if, once the first site in a market is indexed, every site set up after it is perceived as a "copy-cat". I would think Google would be conscientious about this issue. If someone were searching for "Salem Transmission", what would it matter if two (or 20, for that matter) websites came up, all representing different businesses? Please help.
    Actually, www.pacnwtrans.com/ only has one page listed in Google, so it's not banned or totally removed due to duplicate content. As this site has PageRank 0 with no indicated backlinks, it may just need more inbound links to get the other pages indexed.

    http://www.samstransmissions.com only has 5 pages indexed by Google, and 4 of those pages are supplemental. The most likely explanation is that Google has found duplicate content, as you suggest, and made these pages supplemental. Again, this site has very few inbound links (Google shows only 1 backlink).

    I've seen a similar issue with one of my clients where he posted a mirror site on another domain for a while and many of the pages on his more established domain went supplemental. His site had few inbound links too, so it was a similar scenario. Unfortunately, it's not easy to get "supplemental results" fully reinstated once they appear. In my client's case, even removing the mirror site still left the supplemental pages on his other domain (several months later).

    My remedial advice is:

    1. Reduce the similarity of content between these sites. Do some new copywriting so that the text is at least noticeably different from site to site. The text and layout are currently very similar between sites, and that is asking for trouble.

    2. Get more inbound links for both domains to get them more established. Currently neither has enough links to get properly indexed by Google. Google changed its web page indexing priorities after the Big Daddy update, and sites with very few links are now often only partially indexed.

    3. Wait and pray that the additional links and more dissimilar content resolve the matter.
  4. #3
  5. Contributing User
    SEO Chat Hero (2000 - 2499 posts)

    Join Date
    Aug 2004
    Location
    Bay Area, CA
    Posts
    2,350
    Rep Power
    56
    Originally Posted by thall89553
    Ultimately there might be 10-20 sites per market as my program continues to expand, but I will hit a brick wall if, once the first site in a market is indexed, every site set up after it is perceived as a "copy-cat". I would think Google would be conscientious about this issue. If someone were searching for "Salem Transmission", what would it matter if two (or 20, for that matter) websites came up, all representing different businesses? Please help.
    Google is very "conscientious" about the issue and has acted accordingly. What you want to do is create 20 websites with the same content and have all of them show up as results for the same search terms. That is not in Google's best interest, nor in a searcher's best interest.

    You may not "think" you are trying to game the system, but you are in fact trying to take shortcuts. You are creating one website, putting 20 different sets of contact information on it, and reselling it as 20 different websites. As far as Google is concerned (and as far as 99% of searchers are concerned), different websites should have different content. If your clients want "their own website", they need their own content. It is that simple.

    Bottom line, there is no "solution" to your problem. Google does not see it as a problem, nor do most of Google's constituents. Write unique content for each site or the sites will get deindexed almost as fast as you release them.

    Comments on this post

    • googler agrees : well said

    "Live never to be ashamed of what is written about you. Even if what is written is not true" -- Richard Bach

    Yahoo Store SEO
  6. #4
  7. B afraid.. B very afraid!
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2004
    Location
    Land of enchantment... deserts of the Southwest
    Posts
    9,697
    Rep Power
    2402
    There is a way around it. Require each affiliate to submit content that can be interspersed with your canned content. You can have them submit a minimum of x words describing their management, another paragraph for driving directions, another paragraph on specialty services, etc. Intermix this with your common content by having mandatory unique content from them and all your duplicate problems will go away. IMO Google will not penalize any sites that are less than 90-95% duplicate. (Note: we have 90-95% duplicate content on 50 U.S. states X 2; all 100 pages place in the top 10 of SERPs with Google on a state by state basis).
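    To make that concrete, here is a rough Python sketch of the intermixing idea: shared boilerplate interleaved with mandatory per-shop sections, plus a crude estimate of how much of the finished page is canned text. The section names, sample copy, and the 90% check are illustrative assumptions based on the post above, not a documented Google threshold.

Code:
# Sketch of interleaving mandatory unique sections with shared boilerplate.
# Section names and the 90% figure are illustrative, per the post above.
SHARED_INTRO = "We are an ATRA-certified transmission repair shop ..."
SHARED_WARRANTY = "All rebuilds carry the nationwide ATRA warranty ..."

REQUIRED_SECTIONS = ["about_management", "driving_directions", "specialty_services"]

def build_page(shop):
    missing = [s for s in REQUIRED_SECTIONS if not shop.get(s)]
    if missing:
        raise ValueError(f"shop must supply unique text for: {missing}")
    # Alternate canned copy with the shop's own copy.
    blocks = [
        SHARED_INTRO,
        shop["about_management"],
        SHARED_WARRANTY,
        shop["driving_directions"],
        shop["specialty_services"],
    ]
    return "\n\n".join(blocks)

def canned_fraction(page):
    # Very rough: share of characters that come from the common boilerplate.
    return (len(SHARED_INTRO) + len(SHARED_WARRANTY)) / len(page)

shop = {
    "about_management": "Family owned since 1987, run by ... (unique copy)",
    "driving_directions": "Two blocks east of the fairgrounds ... (unique copy)",
    "specialty_services": "We specialize in 4L60E rebuilds ... (unique copy)",
}
page = build_page(shop)
print(f"canned text is {canned_fraction(page):.0%} of the page")
if canned_fraction(page) > 0.9:
    print("too much shared copy; require longer unique sections")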
    ...Never mistake activity for achievement...

    ...Wise men don't need advice. Fools won't take it....
    Benjamin Franklin
  8. #5
  9. rod@missionop.com
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2003
    Location
    Palm Beach Gardens FL 33410
    Posts
    16,979
    Rep Power
    0
    Originally Posted by SEO_AM
    IMO Google will not penalize any sites that are less than 90-95% duplicate. (Note: we have 90-95% duplicate content on 50 U.S. states X 2; all 100 pages place in the top 10 of SERPs with Google on a state by state basis).
    May indeed be true... but I suspect there is a little more involved than 90-95% duplication.

    You can have 50 websites with 100% duplication and they will all survive provided each has unique linkage to it from elsewhere, and none link to the other 49.

    Duplication of copy and linkage to the original/copies is the problem.

    It is also worth noting that while any level of duplication can survive, once it is detected, reducing the duplication will not work short of going back to nearly 100% original content.
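    As an aside, one common way to put a number on how "duplicate" two pages are is shingle overlap (Jaccard similarity). Google's actual duplicate detection is not public, so the sketch below is only an illustration of the kind of measurement being argued about here, not the method itself.

Code:
# Illustrative only: estimate page-to-page duplication with word shingles.
# This is not Google's algorithm, just a common similarity measure.
def shingles(text, k=5):
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def similarity(text_a, text_b, k=5):
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "Certified transmission repair serving the Salem area since 1987 call today"
page_b = "Certified transmission repair serving the Salem area since 1992 call today"
print(f"estimated overlap: {similarity(page_a, page_b):.0%}")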

    Comments on this post

    • newbieuk23 agrees : I agree. Once duplicates are found it's difficult to resolve.
  10. #6
  11. thall89553
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2004
    Posts
    308
    Rep Power
    15
    Your statement "what you want to do is create 20 websites with the same content" is not accurate at all. What I want to do is create 20 similar websites for 20 completely different "competing" businesses. Hang with me a second if you would. ATRA is a national non-profit transmission rebuilders organization that certifies small transmission repair shops trying to compete with large chain transmission shops. Many of these shop owners live month to month on what their shops bring in. That being said, let us say someone in Salem needs a mechanic and searches for "Salem Transmission Repair". What does it matter if 10 sites that all look similar come up, IF those are all 10 distinct shops that are certified by a reputable organization and ALL have proven track records of good service? Each site (shop) will have its own specials, own testimonials, own bio, etc. It seems to me that the Google system is a bit flawed, but what can you do when there are so many idiots out there trying to "game" the system, which is what I am NOT trying to do...
  12. #7
  13. thall89553
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2004
    Posts
    308
    Rep Power
    15
    Originally Posted by fathom
    May indeed be true... but I suspect there is a little more involved than 90-95% duplication.

    You can have 50 websites with 100% duplication and they will all survive provided each has unique linkage to it from elsewhere, and none link to the other 49.

    Duplication of copy and linkage to the original/copies is the problem.

    It is also worth noting that while any level of duplication can survive, once it is detected, reducing the duplication will not work short of going back to nearly 100% original content.
    Whereas you are doing 50 distinct states, I am challenged with up to 2,500 sites across America. So maybe there will be 50 in Denver, 20 in Phoenix, 10 in Cleveland, etc. This is really a challenge. Thanks for your feedback, though; it is encouraging.
  14. #8
  15. rod@missionop.com
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2003
    Location
    Palm Beach Gardens FL 33410
    Posts
    16,979
    Rep Power
    0
    Originally Posted by thall89553
    It seems to me that the Google system is a bit flawed, but what can you do when there are so many idiots out there trying to "game" the system, which is what I am NOT trying to do...
    That answer is simple... give every website owner a budget to have their own content written independently of you.

    That way everything will "be unique" and you will then no longer be one of those idiots attempting to 'game' the system.
  16. #9
  17. thall89553
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2004
    Posts
    308
    Rep Power
    15
    When comparing "content", does that encompass both the meta tags and the page content?
  18. #10
  19. rod@missionop.com
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Mar 2003
    Location
    Palm Beach Gardens FL 33410
    Posts
    16,979
    Rep Power
    0
    Originally Posted by thall89553
    When comparing "content", does that encompass both the meta tags and the page content?
    Meta tags are worthless... ignore them.

    Please note: <title></title> is the title element and not a meta tag... it is extremely valuable [but "content" here applies to what is visible on the page].
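    As a small illustration of that advice applied to a template setup like the one described earlier, a distinct title element could be generated per shop from database fields. The field names below are hypothetical.

Code:
# Sketch: build a distinct <title> per shop from database fields.
# Per the advice above, the title element carries far more weight
# than the description meta tag; field names are hypothetical.
def title_for(shop):
    return f"{shop['shop_name']} | Transmission Repair in {shop['city']}, {shop['state']}"

shop = {"shop_name": "Example Shop One", "city": "Salem", "state": "OR"}
print(f"<title>{title_for(shop)}</title>")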
  20. #11
  21. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2006
    Posts
    332
    Rep Power
    13
    Originally Posted by SEO_AM
    There is a way around it. Require each affiliate to submit content that can be interspersed with your canned content. You can have them submit a minimum of x words describing their management, another paragraph for driving directions, another paragraph on specialty services, etc. Intermix this with your common content by having mandatory unique content from them and all your duplicate problems will go away. IMO Google will not penalize any sites that are less than 90-95% duplicate. (Note: we have 90-95% duplicate content on 50 U.S. states X 2; all 100 pages place in the top 10 of SERPs with Google on a state by state basis).
    Still, it is not certain that you are playing a safe game. There is always a possibility of being blacklisted at any time.

    Secondly, judging from one or two of these exceptional cases, we can't assume the same content can safely be used across different sites. It is too risky for any webmaster.

    Thirdly, if you are giving only general information on your websites, there is a better chance of Google sparing your sites.

    So the theme of the site also matters a lot in this case.

    Would you agree?
