8 packages returned for Tags:"Robots.txt"

RobotsTxt: A robots.txt parser for .NET
A robots.txt parser for .NET. Supports:
- Allow directives.
- Crawl-delay directives.
- Sitemap declarations.
- * and $ wildcards.
See https://bitbucket.org/cagdas/robotstxt for usage examples.
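A minimal usage sketch, assuming the package exposes a Robots type with Load, IsPathAllowed, and CrawlDelay members roughly as shown (these member names are assumptions; the Bitbucket page above documents the actual API):

```csharp
using System;
using RobotsTxt; // NuGet: RobotsTxt (namespace assumed to match the package name)

class RobotsDemo
{
    static void Main()
    {
        // Raw robots.txt content, e.g. fetched from http://example.com/robots.txt
        string content =
            "User-agent: *\n" +
            "Disallow: /private/\n" +
            "Allow: /private/public-page.html\n" +
            "Crawl-delay: 10\n" +
            "Sitemap: http://example.com/sitemap.xml\n";

        // Parse and query; the calls below are assumptions based on the
        // feature list above, not a confirmed API surface.
        Robots robots = Robots.Load(content);

        Console.WriteLine(robots.IsPathAllowed("MyBot", "/private/"));                 // blocked
        Console.WriteLine(robots.IsPathAllowed("MyBot", "/private/public-page.html")); // allowed
        Console.WriteLine(robots.CrawlDelay("MyBot"));                                 // crawl-delay value
    }
}
```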
SimpleSitemap
SimpleSitemap is a lightweight library that helps you create web sitemaps for collections or lists of items. These sitemaps follow the Sitemap Protocol. Both sitemapindex and urlset links are generated, based on the size of the data collection and the 'page size'. Examples of this could be a list of your users, ...
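To illustrate the urlset/sitemapindex split described above, here is a self-contained sketch of the underlying Sitemap Protocol logic in plain C#. It is not SimpleSitemap's actual API; it only shows the decision the library automates: a single urlset when the collection fits in one page, otherwise a sitemapindex pointing at per-page sitemaps.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

static class SitemapSketch
{
    static readonly XNamespace Ns = "http://www.sitemaps.org/schemas/sitemap/0.9";

    // Emit either a <urlset> or a <sitemapindex>, depending on how many
    // pages of 'pageSize' items the collection spans.
    public static XDocument Build(IReadOnlyList<string> urls, int pageSize, string baseUrl)
    {
        int pages = (urls.Count + pageSize - 1) / pageSize; // ceiling division

        if (pages <= 1)
        {
            // Small collection: one <urlset> with a <url> entry per item.
            return new XDocument(new XElement(Ns + "urlset",
                urls.Select(u => new XElement(Ns + "url", new XElement(Ns + "loc", u)))));
        }

        // Large collection: a <sitemapindex> pointing at one sitemap per page,
        // e.g. {baseUrl}/sitemap-1.xml ... {baseUrl}/sitemap-N.xml (naming assumed).
        return new XDocument(new XElement(Ns + "sitemapindex",
            Enumerable.Range(1, pages).Select(p =>
                new XElement(Ns + "sitemap",
                    new XElement(Ns + "loc", $"{baseUrl}/sitemap-{p}.xml")))));
    }
}
```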
Robots Handler
The Robots Handler package provides editors with the ability to dynamically change the contents of a site's robots file. Instead of storing the contents of a robots file on the file system, an editor can specify them in an Umbraco content page. The property content is then served via an Http...
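The package targets Umbraco, but the general technique is a plain ASP.NET HTTP handler mapped to the robots.txt path. The sketch below shows that idea only; it is not the package's code, and LoadRobotsContentFromCms is a hypothetical stand-in for reading the editor-managed property:

```csharp
using System.Web;

// Generic ASP.NET sketch: a handler mapped to /robots.txt (via the
// system.webServer/handlers section of web.config) that serves
// editor-managed text instead of a static file on disk.
public class RobotsTxtHandler : IHttpHandler
{
    public bool IsReusable => true;

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";
        context.Response.Write(LoadRobotsContentFromCms());
    }

    private static string LoadRobotsContentFromCms()
    {
        // Hypothetical: in the real package this value would come from a
        // property on an Umbraco content page.
        return "User-agent: *\nDisallow: /umbraco/\n";
    }
}
```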
Tam Tam Conditional Robots
- It adds a robots.txt and a robots_closed.txt to the root of the website.
- It adds a rewrite rule to the web.config that rewrites robots.txt to robots_closed.txt for all URLs ending in tamtam.nl (see the sketch below).
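For reference, an IIS URL Rewrite rule of the kind described could look like the following in web.config; the rule name and patterns are illustrative assumptions, not the exact rule the package ships:

```xml
<!-- Serves robots_closed.txt in place of robots.txt when the host ends in
     tamtam.nl. Requires the IIS URL Rewrite module; rule name and patterns
     are assumptions for illustration. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ConditionalRobots" stopProcessing="true">
        <match url="^robots\.txt$" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="tamtam\.nl$" />
        </conditions>
        <action type="Rewrite" url="robots_closed.txt" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```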