#1
  1. Philip@SearchBenefit.com
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Oct 2009
    Location
    Massachusetts, USA
    Posts
    1,384
    Rep Power
    1014

    Question: Some duplicates more duplicate than others?


    Fellow SEOs, I would appreciate your ideas. We all know that search engines, and Google in particular, don't like duplicate content across multiple sites.

    In some cases one has to work with duplicate content, and my concern here is to figure out how to make the most of such a situation. I am looking at someone's site right now. It's an extremely large site mostly comprising items that are also available in other Web content repositories. I am not at liberty to say very much about the site itself or its contents, but the owner has a legitimate interest in having these items indexed and findable: they are made available on this site by an arrangement among these websites, so this owner's claim to them is no weaker than anyone else's. The site has had trouble getting much of its content indexed, and most of the pages that do get indexed are eventually dropped from the index; I believe this is a duplicate content issue. I have been asked to help with this, so I am not in a position simply to recommend that the site focus on original content only: that would be contrary to the nature of the enterprise. I want articles on this site to be indexed and placed in the main index rather than in the supplemental results.

    So I have been thinking about this a bit. As most SEOs have probably noticed, not all duplicate content is treated the same way. It seems Google tries to return, as the first result, the page that it decides has the best claim to a given piece of content. Other pages can:
    -- rank below the first result in the main Google index
    -- be relegated to the supplementary index
    -- be dropped from the index altogether
    I wonder what decides how such duplicate pages are treated. My sense is that various factors play a role here, such as:
    -- whether the title tags are different
    -- minor differences in the pages, including content that surrounds the main content
    -- backlinks
    -- outbound links to sites with the same content
    -- duplicate pages on the same site

    These are just guesses, and it seems that none of these factors is decisive. For example, if you search for the phrase "With a struggling economy, rising drug costs and an aging population using ever more prescription medicines" you see a small handful of main results, and many more if you "repeat the search with the omitted results included." Some of the supplemental results have unique titles, so unique titles by themselves evidently don't keep duplicate content from being relegated to the supplemental index. Furthermore, I see reports everywhere of duplicate content that was previously indexed and displayed being dropped from the index or otherwise losing rankings, and this may be happening more often now than it used to.

    So there may be no reliable way to get duplicate articles to rank, but I want to do my best, and I am thinking that giving such pages unique title tags, introducing unique/distinct page elements, eliminating on-site duplicate content (including 301-redirecting the root domain to the www subdomain, or vice versa), removing links to pages that carry the same content on other sites, and building backlinks to the individual pages might help. These are my thoughts, but I am not sure how effective these measures will be. What do you think? Do you have any other ideas? I would appreciate hearing them.
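    For what it's worth, here is a rough way I am thinking of checking the www-canonicalization part of that list. This is only a sketch in Python (standard library only); example.com and the sample path are placeholders rather than the actual site, and in practice one would run it over many URLs, but it shows the idea of confirming that the bare (non-www) host answers with a 301 pointing at the www host:

        import http.client

        def first_response(host, path):
            # Ask for the page without following redirects, so we can see
            # exactly what status code and Location header this host returns.
            conn = http.client.HTTPConnection(host, timeout=10)
            conn.request("HEAD", path)
            resp = conn.getresponse()
            location = resp.getheader("Location")
            conn.close()
            return resp.status, location

        if __name__ == "__main__":
            # Placeholder host and path -- not the real site.
            status, location = first_response("example.com", "/some-article.html")
            if status == 301 and location and location.startswith("http://www.example.com"):
                print("OK: bare domain returns a permanent redirect to", location)
            else:
                print("Possible on-site duplication: HTTP", status, "Location:", location)

    If the bare domain returns 200 instead of a 301, both versions of every URL are crawlable, which is exactly the kind of on-site duplication I want to eliminate.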
  2. #2
  3. from the horses mouth
    SEO Chat Hero (2000 - 2499 posts)

    Join Date
    Nov 2004
    Location
    UK
    Posts
    2,105
    Rep Power
    556
    To make duplicate content rank, you just need to make it appear to be the original source, and you can make that happen with backlinks.

    And in a lot of cases you don't need that many; I've seen social bookmarking backlinks alone be enough to rank a duplicate above the original source.
  4. #3
  5. SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Apr 2007
    Location
    Philippines
    Posts
    1,148
    Rep Power
    24
    You can use a tool to revise the article so you create your own content with the same idea; I use a spinner so I don't end up with duplicates. It's hard, in my experience, to rank higher using duplicate content.
  6. #4
  7. Philip@SearchBenefit.com
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Oct 2009
    Location
    Massachusetts, USA
    Posts
    1,384
    Rep Power
    1014
    Many thanks for the responses. I'd like to specify that we are talking about many millions of pages, and in this case changing the main content of those pages is out of the question: it must remain intact. Most of the content existed elsewhere on the Web before it was added to this site.
    Last edited by PhilipSEO; Mar 24th, 2010 at 11:45 AM.

Similar Threads

  1. Duplicate content email
    By jperks1985 in forum Google Optimization
    Replies: 8
    Last Post: Nov 1st, 2007, 04:38 PM
  2. How to avoid duplicate contents filter even after copying.
    By awesomehimanshu in forum Google Optimization
    Replies: 26
    Last Post: Mar 17th, 2007, 05:39 AM
  3. Duplicate Content Question...
    By frankc in forum Google Optimization
    Replies: 6
    Last Post: Jan 25th, 2006, 02:57 PM
  4. Duplicate content and PR? any way out?
    By Scott.G in forum Google Optimization
    Replies: 7
    Last Post: Jun 30th, 2004, 12:10 PM
  5. What counts as duplicate pages
    By mat dowle in forum Google Optimization
    Replies: 5
    Last Post: Jun 19th, 2003, 08:45 AM
