Monday, July 28, 2003

From the horse's mouth... GoogleGuy says: "Okay, I'm waiting for something to finish compiling, so I thought I'd write up a few webmaster tips. Most of these are on our help pages somewhere, but not many people know all of these tidbits. So here goes:

Tip #1: Use If-Modified-Since (IMS). IMS lets your webserver tell Googlebot whether a page has changed since the last time the page was fetched. If the page hasn't changed, we can re-use the content from the last time we fetched that page. That in turn lets the bot download more pages and save bandwidth. I highly recommend that you check to see if your server is configured to support If-Modified-Since. It's an easy win for static pages, and sometimes even pages with parameters can benefit from IMS.
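The server-side decision behind IMS can be sketched in a few lines of Python. This is a minimal illustration, not a production handler: it compares the page's last-modified timestamp against the `If-Modified-Since` header and decides between a full 200 response and a bodyless 304.

```python
from email.utils import formatdate, parsedate_to_datetime

def respond(last_modified_ts, ims_header):
    """Decide whether to send a full 200 response or a bodyless 304,
    given the page's last-modified time (epoch seconds) and the
    If-Modified-Since header the bot sent (None if absent)."""
    if ims_header:
        ims = parsedate_to_datetime(ims_header).timestamp()
        if last_modified_ts <= ims:
            return 304  # unchanged: the bot re-uses its cached copy
    return 200  # changed, or no IMS header: send the full page

page_mtime = 1059350400  # e.g. taken from os.stat(path).st_mtime
print(respond(page_mtime, formatdate(page_mtime, usegmt=True)))       # 304
print(respond(page_mtime + 60, formatdate(page_mtime, usegmt=True)))  # 200
```

For static files most servers (Apache, for instance) do this automatically; the win GoogleGuy mentions for dynamic pages comes from computing a stable last-modified time yourself and applying the same comparison.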

Tip #2: You can use wildcards in robots.txt, and patterns can end in '$' to indicate the end of a name. So if you don't want Googlebot to fetch any PDF files, for example, you could say
Disallow: /*.pdf$
Don't forget that in the robots.txt file, every URL pattern needs to start with a '/' to be valid. That's a pretty common webmaster error (maybe the most common robots.txt mistake), so keep it in mind and save yourself some angst. :)
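The wildcard syntax above can be translated into an ordinary regular expression, which is a handy way to test a pattern before deploying it. A minimal sketch of that translation in Python; `pattern_to_regex` is a name chosen here for illustration:

```python
import re

def pattern_to_regex(pattern):
    """Translate a robots.txt path pattern into a compiled regex.
    '*' matches any run of characters; a trailing '$' anchors the
    pattern to the end of the URL path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile(body + ("$" if anchored else ""))

pdf_rule = pattern_to_regex("/*.pdf$")
print(bool(pdf_rule.match("/docs/manual.pdf")))       # True: blocked
print(bool(pdf_rule.match("/docs/manual.pdf.html")))  # False: not blocked
```

Without the trailing '$', the same pattern would also match `/docs/manual.pdf.html`, which is usually not what you want.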

Tip #3: Googlebot also permits an 'Allow' directive in robots.txt. This lets you specifically flag areas that are okay to crawl. When two directives could both apply, we follow the longest (i.e., most specific) one. See http://www.google.com/webmasters/faq.html#robots for an example.
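The "longest directive wins" rule is easy to sketch. Here is a small illustration (plain path prefixes only, no wildcards) of how an Allow can carve an exception out of a broader Disallow; the function name and rule list are examples, not Googlebot's actual code:

```python
def is_allowed(path, rules):
    """Apply the most-specific-rule-wins policy: among all rules whose
    prefix matches the path, the one with the longest prefix decides.
    rules: list of (directive, prefix) pairs."""
    best = None
    for directive, prefix in rules:
        if path.startswith(prefix):
            if best is None or len(prefix) > len(best[1]):
                best = (directive, prefix)
    return best is None or best[0] == "Allow"  # no matching rule: allowed

rules = [("Disallow", "/folder/"), ("Allow", "/folder/page.html")]
print(is_allowed("/folder/page.html", rules))  # True: Allow is longer
print(is_allowed("/folder/other.html", rules)) # False: Disallow applies
```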

Tip #4: Avoid session IDs. If you can, use fewer dynamic parameters and stay away from the parameter 'id=' in URLs--Googlebot tries to stay away from things that might be session IDs.
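One practical way to act on this tip is to canonicalize URLs by stripping likely session parameters before they are linked or logged. A minimal Python sketch using the standard library; the parameter names in `SESSION_PARAMS` are an assumed example list, not an official one:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of parameter names that commonly carry session IDs.
SESSION_PARAMS = {"id", "sid", "sessionid", "phpsessid"}

def strip_session_params(url):
    """Drop likely session-ID parameters so every visitor (and every
    bot) sees one canonical URL for the same content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_session_params("http://example.com/page?cat=2&id=a1b2c3"))
# http://example.com/page?cat=2
```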

Tip #5: Make sure that you can reach every page on your site with a text browser like lynx."
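A quick way to approximate the lynx test programmatically is to extract only the plain HTML anchors from a page: anything reachable solely through JavaScript or forms is invisible to a text browser (and, in 2003, to Googlebot). A small sketch with Python's standard `html.parser`; the class name is an example:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from plain <a> tags -- roughly the set of
    links a text browser like lynx can follow (no JavaScript)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<a href="/about.html">About</a> <a onclick="go()">JS-only</a>'
parser = LinkCollector()
parser.feed(html)
print(parser.links)  # ['/about.html'] -- the JS-only link is invisible
```

Crawling your own site with a collector like this and comparing the reachable set against your full page list will surface pages that only script-driven navigation can reach.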

This work is licensed under a Creative Commons License.