From the developer: "The standard edition of Robot-Manager allows the user to effortlessly create a robots.txt file for their web site. This file helps direct search-engine spiders and other crawlers to appropriate pages once they arrive at your web site. Most spiders will only gather about 30% of your web site's content, so it's critical that you help them find the most appropriate pages on your site. At present, Robot-Manager can retrieve the directory structure of your web site from your local hard drive or via an FTP server. Once the file is created, Robot-Manager can upload it to the appropriate directory on your web server."
Note: The trial edition is limited to working with 5 spiders in your robots.txt file. There is no time limit on its use.
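For context, a robots.txt file is a plain-text file placed in the root directory of a web site; each User-agent block addresses one named spider (or "*" for all others) and lists the paths it should or should not crawl. A minimal sketch of the kind of file Robot-Manager produces (the directory names and domain below are illustrative, not taken from the product):

```
# Illustrative robots.txt — paths and domain are examples only.
# Rules for Google's crawler specifically:
User-agent: Googlebot
Disallow: /drafts/

# Default rules for every other spider:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
```

The file must be served from the site's root (e.g. /robots.txt) to be honored, which is why the program's upload step targets a specific directory on the web server.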