With the IIS7 SEO Toolkit you can manage the Robots.txt file, defining Allow/Disallow rules for your web application. To use this tool, type INETMGR in the Start menu search box to open the IIS Management Console, then navigate to your website using the tree view on the left side. Select the SEO Toolkit in the management console:
You will find Site Analysis, Sitemaps and Sitemap Indexes, and Robots Exclusion. You can run Site Analysis to examine your website for SEO issues.
The Robots Exclusion protocol, as used by the SEO Toolkit, relies on Allow/Disallow rules to tell search engines which URL paths may be crawled and which should be ignored. You can set rules for one specific user agent or for all user agents, identified by the User-Agent HTTP header.
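For illustration, a Robots.txt file expressing such rules might look like the following sketch (the paths shown are hypothetical examples, not output produced by the toolkit):

```text
# Rules for all crawlers (the * user agent)
User-agent: *
Disallow: /admin/        # hypothetical path: block the admin area
Disallow: /scripts/      # hypothetical path: block script files
Allow: /                 # everything else may be crawled

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /staging/      # hypothetical path: hidden from Googlebot only
```

A crawler reads the group matching its own User-Agent header first and falls back to the `*` group if no specific match exists.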
To create or update the existing Robots.txt file, select the appropriate link under Robots Exclusion depending on your requirement: add a new Allow rule, add a new Disallow rule, or view existing rules. If you select Add a new Disallow Rule, the Add Disallow Rules dialog opens. Select the user agent you are targeting. In the URL structure section, you can choose the physical location, use a previously performed Site Analysis, or run a new IIS Site Analysis. Based on your selections, URLs are displayed under URL Path. Select the URLs to which you want to apply the rules, then press OK to confirm the changes. The Robots.txt file will be updated (or created, if it does not already exist).
Use sitemaps to inform search engines which locations are available for indexing. You can create sitemaps using the SEO Toolkit: select Create a New Sitemap under Sitemaps and Sitemap Indexes, then give the sitemap a name (such as sitemap.xml) in the New Sitemap dialog. Once it is created, a dialog opens for adding URLs to the sitemap. Set the domain name and URL structure, then select the URLs under the URL Path tree view.
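The generated sitemap is a standard XML file following the sitemaps.org protocol. A minimal sketch of what it contains (the domain and URLs here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>           <!-- hypothetical URL -->
    <lastmod>2010-01-01</lastmod>                <!-- last modification date -->
    <changefreq>weekly</changefreq>              <!-- crawl-frequency hint -->
    <priority>0.8</priority>                     <!-- relative priority, 0.0 to 1.0 -->
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>  <!-- hypothetical URL -->
  </url>
</urlset>
```

Only `<loc>` is required per URL; the other elements are optional hints to crawlers.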
Once the sitemap is created, you can reference sitemap.xml from the Robots.txt file by selecting Add to Robots.txt.
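Adding the sitemap to Robots.txt appends a Sitemap directive pointing at the sitemap's absolute URL; the resulting line looks like this (the domain is a hypothetical example):

```text
Sitemap: http://www.example.com/sitemap.xml
```

This lets crawlers discover the sitemap automatically without the URL being submitted to each search engine separately.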