Just looked at the Wikipedia article on Googlebot, and it says:
"If a webmaster wishes to restrict the information on their site available to a Googlebot, or another well-behaved spider, they can do so with the appropriate directives in a robots.txt file."
Greek to me, but it may be useful to someone.
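For anyone curious, a robots.txt file is just a plain-text file placed in the site's root directory (e.g. example.com/robots.txt). A minimal sketch might look like this (the /private/ path is made up for illustration):

```
# Tell Googlebot to stay out of one directory
User-agent: Googlebot
Disallow: /private/

# All other crawlers may access everything
User-agent: *
Disallow:
```

Note this only works for "well-behaved" spiders like the quote says; nothing forces a crawler to obey it.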
Del