Can I improve page speed only for search engine crawlers?

Should I serve Googlebot pages optimized for speed? - answered by Matt Cutts

Summary:

You shouldn't serve Googlebot content-only pages to improve load speed; that would be seen as cloaking. It wouldn't even work, because Google doesn't use only Googlebot to determine a page's load speed. Besides, CSS and JavaScript are usually external files, and Googlebot doesn't necessarily load them at crawl time, so you shouldn't have to do anything special to speed things up for Googlebot.

Matt's answer:

The answer to that question is no, and if I could reach through the screen and hit the Esc key, Ctrl+C, and Break, I would, because that's cloaking. You never want to do something completely different for Googlebot than you'd do for regular users. That's the definition of cloaking: having something in your code that says, "if the user agent equals Googlebot, or if the IP address is Google's, do something different."
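
To make the anti-pattern concrete, here is a minimal sketch of the kind of check Matt is describing, using Flask as an illustrative framework (the route and template names are assumptions, not from the video). This is exactly what you should not have in your code:

```python
# ANTI-PATTERN: branching on the crawler's user agent is cloaking.
# Minimal illustrative sketch using Flask; route and template names are made up.
from flask import Flask, request, render_template

app = Flask(__name__)

@app.route("/article")
def article():
    user_agent = request.headers.get("User-Agent", "")
    if "Googlebot" in user_agent:
        # Serving a stripped-down, text-only page just to the crawler is
        # the "do something different for Googlebot" behavior Matt warns about.
        return render_template("article_text_only.html")
    return render_template("article.html")
```

The same problem applies to branching on Google's IP ranges instead of the user agent; what matters is that the crawler receives different content than a regular visitor.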

Doing something different for Googlebot is the very definition of cloaking

So you might think, "Page load speed is a factor I should care about, so let's make things fast for Googlebot." But that is not the right approach, because:

  • we're not using only Googlebot to determine how fast a particular page or site loads, so it wouldn't even work;
  • if you're changing the content that shows up, people will be able to look at the cached page and see that it's nothing but a text page. That's very strange, and they'll complain about your site cloaking.

 

Think about it: whenever you include CSS, JavaScript, or images, most of the time those are external files, and we're not even going off to load them at that particular time. Knowing that external stuff exists doesn't necessarily mean we're going to fetch it and incorporate it all into the page. But you do want to show the same page to users that you show to Googlebot. Don't do anything to try to speed things up only for Googlebot, because that's cloaking, and the risk is much higher than any benefit you'd get from speeding things up just for the crawler.
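
For contrast, a sketch of the safe approach under the same assumed setup: drop the conditional entirely and apply any speed optimizations, such as caching headers or minified assets, to every request, so Googlebot and regular users receive the identical page.

```python
# Safe approach: one optimized page for every visitor, crawler or not.
from flask import Flask, render_template, make_response

app = Flask(__name__)

@app.route("/article")
def article():
    # No user-agent or IP branching: Googlebot sees exactly what users see.
    response = make_response(render_template("article.html"))
    # Speed optimizations (here, a caching header) apply to everyone equally.
    response.headers["Cache-Control"] = "public, max-age=3600"
    return response
```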


by Matt Cutts - Head of Google's Search Quality Team

 
