Robots.txt is a file that tells search engines what NOT to crawl on your site. Ideally, you want to screen out everything a search engine has no business indexing -- like your userlist or your users' profiles.
To make a robots.txt file, just create a plain text file named "robots.txt" and upload it to your root directory. Search engines will automatically find it and (supposedly) follow it.
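One thing to watch: the file has to sit at the very top of your site, not inside the forum directory. Using www.yoursite.com as a stand-in for your own domain, crawlers will look for it at http://www.yoursite.com/robots.txt, and a bare-bones file that blocks nothing at all looks like this:
Code:
User-agent: *
Disallow: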
This is the contents of mine:
Code:
User-agent: Googlebot
User-agent: *
Disallow: /*?
Disallow: /forum/admin/
Disallow: /forum/db/
Disallow: /forum/files/
Disallow: /forum/images/
Disallow: /forum/includes/
Disallow: /forum/language/
Disallow: /forum/spelling/
Disallow: /forum/templates/
Disallow: /forum/common.php
Disallow: /forum/config.php
Disallow: /forum/faq.php
Disallow: /forum/glance_config.php
Disallow: /forum/groupcp.php
Disallow: /forum/login.php
Disallow: /forum/memberlist.php
Disallow: /forum/mini_cal.php
Disallow: /forum/modcp.php
Disallow: /forum/posting,
Disallow: /forum/posting.php
Disallow: /forum/printview.php
Disallow: /forum/privmsg,
Disallow: /forum/privmsg.php
Disallow: /forum/profile.php
Disallow: /forum/search.php
Disallow: /forum/viewforum,
Disallow: /forum/viewforum.php
Disallow: /forum/viewonline.php
Disallow: /forum/viewtopic,
Disallow: /forum/viewtopic.php
Disallow: /forum/updates-topic.html*
Disallow: /forum/stop-updates-topic.html*
Disallow: /forum/ptopic
Disallow: /forum/ntopic
Disallow: /forum/mark
Disallow: /forum/post-
Replace "forum" with whatever the name of your phpBB2 directory is (I believe the default name is "phpBB2").
The first disallow, "Disallow: /*?", keeps search engines from crawling ANY dynamic URL (any URL that includes a question mark). Do NOT use this line UNLESS you have mod_rewrite installed -- without mod_rewrite, every phpBB2 URL contains a question mark, so this line would block your entire board.
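To illustrate with made-up URLs (the exact rewritten format depends on which mod_rewrite mod you run), that line sorts requests like this:
Code:
Disallow: /*?
# Blocked:       /forum/viewtopic.php?t=123    (dynamic, has a question mark)
# Still crawled: /forum/viewtopic,t,123.html   (rewritten, no question mark)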
Many of these lines will not work in all search engines -- the wildcard entries in particular are a nonstandard extension that Google honors but some other crawlers ignore. I threw a ton of stuff in there in the hope that at least one of the lines will filter out what I want filtered out.