Will Google improve its crawling of AJAX?


Matt's answer:

Today’s question comes from Daniel Voogsgerd in the Netherlands, who asks, “At the moment, Google is able to parse AJAX, but not that well. Will this change in 2011?”

There is a team of people working on crawling and indexing AJAX, indexing JavaScript, and parsing and executing JavaScript, as well as other types of rich content. The trend in 2011 is going to be the same as it was in 2010: improving our ability to understand JavaScript, to index AJAX, and to index rich content, Flash, and all of those things.

In general we try to do well, but if you’re using stuff that’s really complex or idiomatic, we might not understand it. If you can stick to relatively standard libraries and JavaScript that is relatively well understood, that’s going to be a little easier, and probably faster for us to get to, than custom code that does really weird, esoteric things. So those are a few factors to think about.

There is also an indexing standard for AJAX: rather than using just a hash (#) in your URLs, you can use a hash followed by an exclamation point (#!). That basically tells Google this is something we actually do want to be able to index. Both Twitter and Facebook are now using that to get various amounts of their AJAX indexed, and it can work quite well. If you do a search for that information (we can hopefully leave a link in the description of the video), that’s one way you can try to get your AJAX indexed a little better.

So we’ll keep working on it, and if you kind of meet us halfway, we’ll find a reasonable middle ground.
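To make the hash-bang scheme Matt mentions concrete: under Google's AJAX crawling scheme of that era, a URL containing `#!` signaled that the crawler could request an alternate, crawlable URL in which the fragment is passed as an `_escaped_fragment_` query parameter. Here is a minimal sketch in Python of that URL mapping; the function name is our own, and this illustrates only the URL rewrite, not the server-side rendering the scheme also required.

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a hash-bang (#!) URL to the URL a crawler would fetch
    under Google's (since-deprecated) AJAX crawling scheme."""
    if "#!" not in url:
        return url  # no hash-bang: nothing to rewrite
    base, fragment = url.split("#!", 1)
    # The fragment becomes the value of the _escaped_fragment_
    # query parameter, percent-encoded (slashes kept readable here)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="/=&")

# The well-known Twitter-style example from that period:
print(escaped_fragment_url("http://twitter.com/#!/mattcutts"))
# → http://twitter.com/?_escaped_fragment_=/mattcutts
```

The server was then expected to answer the `_escaped_fragment_` request with a static HTML snapshot of the page state that the `#!` fragment would have produced in the browser.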

by Matt Cutts - Google's Head of Search Quality Team


Original video: