Welcome to SEOChat, a community dedicated to helping beginners and professionals alike improve their Search Engine Optimization knowledge.
Mar 27th, 2013, 11:36 AM
Is it still duplicate content if you are the author?
I have been wondering recently what the effect of duplicate content would be if you are the author of the content.
Let's say, for example, you contribute an article to a popular website and you have rel=author in place on the article. But then let's say you also want to include that article, or a version of it, on your own site, with rel=author on that page as well. What would be the best approach?
1) Use a canonical tag pointing to the original
2) Noindex the duplicated article
3) Use robots.txt to prevent the page from being crawled
4) Not risk using the article on another site at all (even though it's your own and has rel=author in place)
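For reference, the first three options correspond to standard markup and crawl directives. A minimal sketch follows; the URL and path are made-up examples, not real pages:

```html
<!-- Option 1: on the duplicated copy, a canonical tag pointing at the
     original. The href below is a hypothetical example URL. -->
<link rel="canonical" href="https://www.popularsite.example/original-article">

<!-- Option 2: keep the duplicate out of the index while still letting
     crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 3 lives in robots.txt at the site root, not in the page itself;
     the path is illustrative:

     User-agent: *
     Disallow: /articles/my-republished-article/
-->
```

One caveat when weighing options 2 and 3 together: a noindex directive only works if the page can be crawled, so blocking the same page in robots.txt would prevent Google from ever seeing the noindex.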
I would be very interested to hear your thoughts.
Mar 27th, 2013, 01:15 PM
It's still duplicate content regardless of whether you use rel=author.
If you want to publish content both on your site and on another site, you would be much better off publishing it on your own site first (making sure Google has found it) and then publishing it on the other site. Most sites will likely have a problem with this, though, as fewer and fewer sites are happy to publish duplicate content these days.
On the other hand, if you have already published the article on another site and then decide you would like it featured on your site as well, I wouldn't see it being a problem if you use a canonical tag pointing to the original, so Google knows you have only put the article on your site for your users rather than because you want to pinch content.
Having said that, it's not something I would be likely to do, but that's simply because anything of value I would publish on my own site first rather than later down the line.
Mar 27th, 2013, 05:15 PM
I agree with Nathaniel. If you wanted to publish the article on the authoritative site as well as your own then either option #1 or #2 would work. At this point, I don't think the rel=author tag is going to save you from Google seeing it as duplicate content.
Mar 29th, 2013, 08:42 AM
Unfortunately, duplication is duplication. I'm not convinced that noindex saves you either. Google has never got to grips with why duplication is sometimes required and sometimes downright necessary. Why shouldn't your article be on six sites in six different sectors? Why shouldn't a 30-franchisee operation each have their own identical websites? How many pages need to exist on the web before there are natural duplicates of articles? At some point that must happen.
How many estate agents use the same website template and put their identical property listings on as many other websites as possible? How many contact-us pages are the same? The answer is loads and loads. Really, there's nothing wrong with this; who wouldn't want to market their product in as many places as possible? In real life (lol) you can put the same advert in the Times, the Telegraph and the Sun, and there's no duplication penalty.
IMO Google's overarching view on duplication is totally wrong, and even the penalties are not justly applied. I've seen an original site penalised while a new competitor with copied content survived, which is disgraceful; Google can't tell who had the content first. However, I totally understand that no one wants their content stolen, and there should be safeguards.
I think if Google wants to keep penalising duplication, it should put out a serious tool that really marks ownership of content, and when, along with an ability to allow or justify the use of duplication. Since Google says it can detect duplication, it should make this tool available to us so we can see where our content has been published elsewhere (in the same way they do).
The only time I've come across duplication being OK is when you want to translate your articles/content into another language. HOWEVER, said translation must not be software-generated; it MUST be written by a native speaker in a natural way.
Apr 6th, 2013, 08:47 AM
Google looks at duplication as duplication regardless of whether it is produced by the same author. It asks for unique material in your content in order for your content to be approved. Unique content gives authority to a website. Readers demand something new and interesting; reading the same material over and over again does not interest anyone, so it is mandatory to keep your content fresh and unique. If your websites are related and require related content, then you may spin your articles and add fresh material here and there; otherwise it is regarded as spam.