
CSS for a Disabled Anchor Tag in ASP.NET

In earlier versions of ASP.NET, disabling a link automatically applied the disabled styling to the rendered anchor tag. With .NET Framework 4.0, those colors are no longer applied automatically. Instead, a new property, DisabledCssClass, was introduced on WebControl: a string property that gets or sets the CSS class rendered for disabled controls. It applies to all Web controls.
This property can be used to change the class name that is rendered for disabled Web controls. By default, it returns “aspNetDisabled”.
When SupportsDisabledAttribute is overridden in a derived class to return false, the value of the DisabledCssClass property is rendered as the value of the class attribute of the HTML element for a disabled control. In that case, if the CssClass property also has a value, both CSS classes are applied to the rendered element: the class attribute contains the DisabledCssClass value followed by the CssClass value, separated by a space.
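
As a quick illustration (the control, URL, and class names below are only examples), a disabled HyperLink that also has its own CssClass ends up with both classes on the rendered anchor:

// In code-behind, e.g. Page_Load (requires using System.Web.UI.WebControls)
HyperLink link = new HyperLink();
link.Text = "My link";
link.NavigateUrl = "~/SomePage.aspx";
link.CssClass = "myLink";
link.Enabled = false;
form1.Controls.Add(link); // form1 is the page's <form runat="server">; the ID is just an example
// With the default settings this renders roughly as:
// <a class="aspNetDisabled myLink">My link</a>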
This property is static, which means that you can set it only for the WebControl class. Whatever value you set it to is used for all controls in a Web application. You cannot specify different values for individual controls.
If you want to use a different class name for all controls across the application, you can set it in the Application_Start event in Global.asax:

void Application_Start(object sender, EventArgs e)
{
    // Requires: using System.Web.UI.WebControls;
    // This class name is applied to every disabled control in the application.
    WebControl.DisabledCssClass = "someCssClass";
}
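
If you change the class name this way, your stylesheet rules need to target the new name rather than the default aspNetDisabled; for example, using the placeholder class name from the snippet above:

.someCssClass
{
    color: #808080; /* pick whatever disabled color suits your site */
}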

If you just want to style disabled anchor tags and leave the default class name alone, you can target the default class in the stylesheet:

a.aspNetDisabled
{
    color: #000FFF;
}

This rule applies only to anchor tags and does not affect any other controls.

IIS SEO Toolkit to create Robots.txt and Sitemap.xml

With the IIS7 SEO Toolkit you can manage Robots.txt Allow/Disallow rules for your web application. To use the tool, run INETMGR from the Start menu to open the IIS Management Console, navigate to your website using the tree view on the left, and select the SEO Toolkit in the management console:
There you will find Site Analysis, Sitemaps and Sitemap Indexes, and Robots Exclusion. You can run a Site Analysis to crawl and analyze your website.

The Robots Exclusion feature of the SEO Toolkit uses ‘Allow/Disallow’ rules to inform search engines which URL paths can be crawled and which should be ignored. Rules can be set for one specific user agent or for all user agents, identified by the User-Agent HTTP header.

To create or update an existing Robots.txt file, select the appropriate link under Robots Exclusion, depending on whether you want to add new Allow rules, add new Disallow rules, or view the existing rules. If you choose to add a new Disallow rule, the Add Disallow Rules dialog opens. Select the user agent you are targeting; for the URL structure you can choose the physical location, a previously performed Site Analysis, or run a new IIS Site Analysis. Based on your selection, the URLs are displayed under URL Path. Select the URLs to which you want to apply the rules and press ‘OK’ to confirm the changes. The Robots.txt file will be updated (or a new Robots.txt will be created if one does not exist).
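
The result is an ordinary Robots.txt file at the root of the site. The paths below are only an illustration of what the tool writes once you have picked a user agent and a few URL paths:

User-agent: *
Disallow: /scripts/
Disallow: /admin/
Allow: /images/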

Use sitemaps to inform search engines which locations are available for indexing. You can create sitemaps with the SEO Toolkit: select ‘Create new Sitemap’ under Sitemaps and Sitemap Indexes and give the sitemap a name (like sitemap.xml) in the New Sitemap dialog. Once it is created, a dialog opens to add URLs to the sitemap. Set the domain name and URL structure, then select the URLs under the URL Path tree view.
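
A sitemap created this way is a standard XML file following the sitemaps.org schema; the URLs below are hypothetical:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-12-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/Products.aspx</loc>
  </url>
</urlset>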

Once the sitemap is created, you can add it to the Robots.txt file by selecting ‘Add to Robots.txt’.
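
This appends a Sitemap directive to Robots.txt so crawlers can discover the file; with the hypothetical domain used above, the added line would look like:

Sitemap: http://www.example.com/sitemap.xml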