Packages returned for Tags:"Robots.Txt"
RobotsTxt : A robots.txt parser for .NET
Supports:
- Allow directives.
- Crawl-delay directives.
- Sitemap declarations.
- * and $ wildcards.
See https://bitbucket.org/cagdas/robotstxt for usage examples.
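The directives the parser lists above are standard robots.txt syntax; a hypothetical file exercising all of them (paths and host are illustrative):

```
# Hypothetical robots.txt
User-agent: *
Disallow: /private/
Allow: /private/public-page.html   # Allow overrides the broader Disallow
Crawl-delay: 10                    # seconds between requests
Disallow: /*.pdf$                  # '*' matches any sequence, '$' anchors the end
Sitemap: https://example.com/sitemap.xml
```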
SimpleSitemap is a lightweight library that helps you create web sitemaps for collections or lists of items.
These sitemaps follow the Sitemap Protocol. Both sitemapindex and urlset links are generated, based on the size of the data collection and the 'page size'.
Examples of this could be a list of your users,...
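Under the Sitemap Protocol, a sitemapindex document points at one or more urlset documents; minimal sketches of both (URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: index pointing at the paged sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-1.xml</loc>
  </sitemap>
</sitemapindex>

<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-1.xml: one page of URLs -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/users/1</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```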
A simple and powerful library for handling web robot control strategies.
The Robots Handler package provides editors with the ability to dynamically change the contents of a site's robots file. Instead of storing the contents of a robots file on the file system, an editor can specify its contents in an Umbraco content page. The property content is then served via a Http...
Tam Tam Conditional Robots
- It adds a robots.txt and a robots_closed.txt to the root of the website
- It adds a rewrite rule to the web.config that rewrites robots.txt to robots_closed.txt for all URLs ending with tamtam.nl
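The rewrite rule described above would look roughly like the following in web.config, using the IIS URL Rewrite module (the rule name and patterns are illustrative, not taken from the package):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Serve robots_closed.txt instead of robots.txt on tamtam.nl hosts -->
        <rule name="ConditionalRobots" stopProcessing="true">
          <match url="^robots\.txt$" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="tamtam\.nl$" />
          </conditions>
          <action type="Rewrite" url="robots_closed.txt" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```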
A tool for accessing and querying the contents of a robots.txt file.
A tool for loading and reading XmlSitemaps.