Crumpled.RobotsTxt 3.0.3

.NET CLI
dotnet add package Crumpled.RobotsTxt --version 3.0.3

Package Manager Console (Visual Studio)
NuGet\Install-Package Crumpled.RobotsTxt -Version 3.0.3
This command is intended for the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="Crumpled.RobotsTxt" Version="3.0.3" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
For projects that support CPM, the version lives in the solution's Directory.Packages.props file and the project file references the package without a version.
Directory.Packages.props:
<PackageVersion Include="Crumpled.RobotsTxt" Version="3.0.3" />
Project file:
<PackageReference Include="Crumpled.RobotsTxt" />

Paket CLI
paket add Crumpled.RobotsTxt --version 3.0.3

Script & Interactive
#r "nuget: Crumpled.RobotsTxt, 3.0.3"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the script's source code to reference the package.

File-based apps
#:package Crumpled.RobotsTxt@3.0.3
The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake
#addin nuget:?package=Crumpled.RobotsTxt&version=3.0.3  (install as a Cake Addin)
#tool nuget:?package=Crumpled.RobotsTxt&version=3.0.3  (install as a Cake Tool)

Crumpled.RobotsTxt

A flexible, configuration-driven robots.txt solution for Umbraco v13, v14, v15, v16 & v17 that protects your non-production environments from search engine indexing by default, while giving you granular control over crawling rules across multiple sites and environments.

Key Features

  • πŸ›‘οΈ Safe by Default - Blocks all bots by default to prevent accidental indexing of development, staging, or preview environments
  • 🌍 Multi-Site & Environment-Aware - Configure different robots.txt rules for different domains/hostnames and environments (Production, Development, Staging, etc.)
  • πŸ“ Flexible Rule Configuration - Define reusable rulesets with Allow/Disallow patterns for different user agents
  • πŸ—ΊοΈ Sitemap Integration - Include sitemap URLs per site
  • ☁️ Umbraco Cloud Ready - Default behaviour is designed for Umbraco Cloud, making it ideal for hiding those often-overlooked *.umbraco.io environment domains
  • βš™οΈ Zero Code Setup - Works out of the box with auto-registration

Install NuGet package

dotnet add package Crumpled.RobotsTxt

Setup

The package automatically registers itself via an Umbraco Composer. No code changes required!

Manual Registration (Advanced)

If you prefer to manually register the package in Program.cs, first disable the composer in your appsettings.json:

"Crumpled": {
  "RobotsTxt": {
    "DisableComposer": true
  }
}

Then add to your Program.cs:

.AddCrumpledRobotsTxt()
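
In context, this is a sketch of where the call could sit in a stock Umbraco v13+ Program.cs. Only .AddCrumpledRobotsTxt() is specific to this package; everything else is the standard Umbraco template, and the exact position in the builder chain is an assumption:

```csharp
// Program.cs - minimal sketch based on the stock Umbraco v13 template.
// Only the .AddCrumpledRobotsTxt() line is specific to this package;
// its exact position in the chain is an assumption.
WebApplicationBuilder builder = WebApplication.CreateBuilder(args);

builder.CreateUmbracoBuilder()
    .AddBackOffice()
    .AddWebsite()
    .AddComposers()
    .AddCrumpledRobotsTxt() // manual registration (requires "DisableComposer": true)
    .Build();

WebApplication app = builder.Build();

await app.BootUmbracoAsync();

app.UseUmbraco()
    .WithMiddleware(u =>
    {
        u.UseBackOffice();
        u.UseWebsite();
    })
    .WithEndpoints(u =>
    {
        u.UseBackOfficeEndpoints();
        u.UseWebsiteEndpoints();
    });

await app.RunAsync();
```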

Default Behavior - Protection First

The package prioritizes protecting your content from unintended indexing. When no Sites are configured, smart defaults kick in:

  • Custom Default: If you specify a DefaultRuleset, that ruleset will be used as the fallback

  • Umbraco Cloud Live Environment: If the environment variable UMBRACO__CLOUD__DEPLOY__ENVIRONMENTNAME equals "live", all bots are allowed by default:

    User-agent: *
    Allow: /
    
  • All Other Environments: All bots are blocked by default for safety - protecting staging, development, and preview environments:

    User-agent: *
    Disallow: /
    

⚠️ Note: Once you configure Sites, these defaults are ignored and your custom RuleSets take full control.
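
If you only want to change the no-Sites fallback without configuring any sites, a minimal appsettings.json sketch could look like this (key names follow the full example below; the "Staging" ruleset name and its rules are purely illustrative):

```json
"Crumpled": {
  "RobotsTxt": {
    "DefaultRuleset": "Staging",
    "RuleSets": {
      "Staging": {
        "Allow": {
          "SemrushBot": [ "/" ]
        },
        "Disallow": {
          "*": [ "/" ]
        }
      }
    }
  }
}
```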

Unmatched Domains - Additional Protection

When Sites are configured, any domain that doesn't match the configured HostNames (e.g., temporary preview URLs, forgotten subdomains) will get a protective fallback:

  • Custom Default: If you specify a DefaultRuleset, that ruleset will be used
  • Otherwise: Blocks all bots for safety:
    User-agent: *
    Disallow: /
    

This prevents unintended crawling of staging, preview, or other unlisted domains - ensuring only your explicitly configured production domains are indexed.

Configuration Example - Multi-Site Setup

Configure different robots.txt rules for different environments and domains using reusable rulesets:

"Crumpled": {
  "RobotsTxt": {
    "DefaultRuleset": "NonProduction",
    "RuleSets": { // There can be multiple rulesets for complex scenarios!
      "Production": {
        "Allow": {
          "*" : ["/"]
        },
        "Disallow": {
          "*": [ "/cdn-cgi/challenge-platform/", "/cdn-cgi/email-platform/" ]
        }
      },
      "NonProduction": { 
        "Allow": {
          "SemrushBot": [ "/" ],
          "SemrushBot-SA": [ "/" ],
          "SemrushBot-Desktop": [ "/" ],
          "SemrushBot-Mobile": [ "/" ],
          "SiteAuditBot": [ "/" ]
        },
        "Disallow": {
          "*": [ "/" ]
        }
      }
    },
    "Sites": {
      "Prod": {
        "HostNames": "www.mysite.com",
        "SiteMapDomain": "www.mysite.com",
        "RuleSet": "Production"
      },
      "AnotherProd": {
        "HostNames": "www.anothermysite.com",
        "SiteMapDomain": "www.anothermysite.com",
        "RuleSet": "Production" // or can define alternate production ruleset for this site
      },
      "Stage": {
        "HostNames": "mysite-staging-uksouth01.umbraco.io,staging.mysite.com", 
        "SiteMapDomain": "staging.mysite.com",
        "RuleSet": "NonProduction"
      },
      "Dev": {
        "HostNames": "mysite-dev-uksouth01.umbraco.io,dev.mysite.com",
        "SiteMapDomain": "dev.mysite.com",
        "RuleSet": "NonProduction"
      }
    }
  }
}
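
For illustration, a request to https://www.mysite.com/robots.txt under the configuration above could be expected to return something along these lines. The directive ordering and the sitemap path are assumptions; the Sitemap line is inferred from the SiteMapDomain setting:

```
User-agent: *
Allow: /
Disallow: /cdn-cgi/challenge-platform/
Disallow: /cdn-cgi/email-platform/

Sitemap: https://www.mysite.com/sitemap.xml
```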
Compatible and additional computed target framework versions

  • .NET: net8.0 is compatible; net9.0 and net10.0 were computed.
  • Platform-specific variants (android, browser, ios, maccatalyst, macos, tvos, windows) of net8.0, net9.0, and net10.0 were computed.

Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
3.0.3 98 2/14/2026
3.0.2 100 2/12/2026
3.0.1 78 2/12/2026
3.0.0 84 2/12/2026