The answer to that question is no, and if I could reach through the screen and hit the ESC key, Ctrl+C, and Break, I would, because that's cloaking. You never want to do something completely different for Googlebot than you'd do for regular users; that's the definition of cloaking: having something in your code that says, "If the user agent equals Googlebot, or if the IP address is Google's, do something different."
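To make the anti-pattern concrete, here is a minimal sketch of the kind of user-agent branch being warned against. The handler names and page strings are hypothetical, not from the original; the point is the shape of the check, not any particular framework.

```python
# ANTI-PATTERN (cloaking): a hypothetical request handler that branches
# on the crawler's user agent and serves it different content.
def render_page(user_agent: str) -> str:
    if "Googlebot" in user_agent:              # the exact check being warned against
        return "stripped-down, fast page"      # what the crawler would see
    return "full page with scripts and images" # what real users would see

# The fix is simply to delete the branch and serve one page to everyone:
def render_page_correctly(user_agent: str) -> str:
    return "full page with scripts and images"  # same HTML for every visitor
```

The corrected version never inspects the user agent at all, which is the safest way to guarantee Googlebot and users see the same page.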
So you might think, "Page load speed is a factor I should care about, so let's make things fast for Googlebot." But that is not the right approach, because:
Think about it: whenever you include CSS, JavaScript, or images, most of the time those are external resources, and we're not even going off to load them at that particular time. So knowing that external stuff exists doesn't necessarily mean that we're going to go off and fetch it and incorporate it all into the page. But you do want to show the same page to users that you show to Googlebot. Don't do anything to try to speed things up only for Googlebot, because that's cloaking, and that's a much higher risk than any benefit you'd get from speeding things up for Googlebot alone.
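If speed is the concern, the safe alternative is to optimize uniformly for every client. As a hedged illustration (the function and header handling are a simplified sketch, not a real server's API), this keys compression on the client's `Accept-Encoding` header rather than on who the client is:

```python
import gzip

def respond(body: bytes, accept_encoding: str) -> tuple[bytes, dict]:
    """Compress the response for ANY client that advertises gzip support.

    Note: the decision is based on the request's Accept-Encoding header,
    never on the user agent, so Googlebot and browsers are treated alike.
    """
    if "gzip" in accept_encoding:
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    return body, {}
```

Because every visitor gets the same optimization path, this speeds pages up without ever serving Googlebot something different.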
by Matt Cutts, head of Google's Search Quality team