Welcome to SEOChat, a community dedicated to helping beginners and professionals alike improve their Search Engine Optimization knowledge.
Nov 15th, 2012, 05:47 AM
Faster crawling of duplicate content pages
How can we get Googlebot to crawl our rel=canonical pages faster?
We have a lot of duplicate content pages and they are still in Google's index. We've implemented rel=canonical to get rid of the duplicates, but how can we get Googlebot to recrawl those duplicate pages faster so the change is picked up?
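For anyone else landing on this thread: the rel=canonical tag goes in the <head> of each duplicate page and points at the URL you want indexed. A minimal sketch (the example.com URLs are placeholders, not from the original poster's site):

```html
<!-- In the <head> of the duplicate page,
     pointing Google at the preferred version of the content -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```

Google treats this as a hint rather than a directive, so the duplicates will only drop out of the index after Googlebot has recrawled them and processed the tag.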
Nov 20th, 2012, 02:23 AM
Originally Posted by Gebruiker2500
Hi. If you want Googlebot to crawl your website faster, you can change your crawl rate in Google Webmaster Tools: log in, go to Configuration > Settings, select "Limit Google's maximum crawl rate", and adjust the rate as needed. Hope that helps.
Nov 20th, 2012, 05:28 AM
Thanks. Are you sure? This is what I read on Google's Webmaster Help pages:
Crawl rate refers to the speed of Googlebot's requests during the crawl process. It doesn't have any effect on how often we crawl or how deeply we crawl your URL structure. If you want Google to crawl new or updated content on your site, use Fetch as Google instead.
Changing the crawl rate can cause some problems (for example, Google will not be able to crawl at a faster rate than the custom rate you set), so don't do this unless you are noticing specific problems caused by Googlebot accessing your servers too often.