Use robots.txt

If you have a section of your site where pages move around or get added and deleted a lot, consider blocking search engines from crawling those pages so they never end up in the engines’ indexes. Don’t rely on per-page META robots tags for pages that move a lot, either: a crawler has to fetch a page before it can read its META tags, whereas robots.txt turns the crawler away before it ever requests the page.

Create a file called robots.txt in your site’s root directory (the one you refer to as ‘/’), so that crawlers can find it at http://www.yoursite.net/robots.txt. Follow this format:

# robots.txt for http://www.yoursite.net/
User-agent: *
Disallow: /disappearing/ # These pages move a lot.
Disallow: /soontobe404.htm

‘User-agent: *’ means the rules apply to all search engines; substitute a particular robot’s name to target only that one. Anything after a ‘#’ is simply a comment, a note to yourself that crawlers ignore. As the example shows, you can block entire directories or individual files using robots.txt. Use it!
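If you want to confirm the file says what you think it says, you can test it with Python’s standard urllib.robotparser module, which parses robots.txt rules and answers whether a given URL may be fetched. Here is a minimal sketch, assuming the example rules above; the robot name ‘ExampleBot’ is arbitrary:

# Test robots.txt rules with Python's standard library.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /disappearing/ # These pages move a lot.
Disallow: /soontobe404.htm
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Inside the blocked directory: not allowed.
print(parser.can_fetch("ExampleBot", "http://www.yoursite.net/disappearing/page.htm"))  # False

# Not covered by any Disallow line: allowed.
print(parser.can_fetch("ExampleBot", "http://www.yoursite.net/index.htm"))  # True

Any robot that honors the protocol applies the same logic, so if can_fetch returns False here, a well-behaved crawler will skip that URL.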