WebSpark.HttpClientUtility 2.1.2

.NET CLI:
dotnet add package WebSpark.HttpClientUtility --version 2.1.2

Package Manager (run inside the Visual Studio Package Manager Console, which provides the NuGet module's version of Install-Package):
NuGet\Install-Package WebSpark.HttpClientUtility -Version 2.1.2

PackageReference (for projects that support PackageReference, copy this XML node into the project file):
<PackageReference Include="WebSpark.HttpClientUtility" Version="2.1.2" />

Central Package Management (CPM) — add the version to the solution's Directory.Packages.props file and reference the package from the project file:

Directory.Packages.props:
<PackageVersion Include="WebSpark.HttpClientUtility" Version="2.1.2" />

Project file:
<PackageReference Include="WebSpark.HttpClientUtility" />

Paket:
paket add WebSpark.HttpClientUtility --version 2.1.2

F# Interactive / Polyglot Notebooks (copy into the interactive tool or script source):
#r "nuget: WebSpark.HttpClientUtility, 2.1.2"

C# file-based apps (.NET 10 preview 4 and later; place in a .cs file before any lines of code):
#:package WebSpark.HttpClientUtility@2.1.2

Cake Addin:
#addin nuget:?package=WebSpark.HttpClientUtility&version=2.1.2

Cake Tool:
#tool nuget:?package=WebSpark.HttpClientUtility&version=2.1.2

WebSpark.HttpClientUtility

Drop-in HttpClient wrapper with Polly resilience, response caching, and OpenTelemetry for .NET 8-10+ APIs, configured in one line



Stop writing 50+ lines of HttpClient setup. Get enterprise-grade resilience (retries, circuit breakers), intelligent caching, structured logging with correlation IDs, and OpenTelemetry tracing in a single AddHttpClientUtility() call. Perfect for microservices, background workers, and web scrapers.


🚀 Why Choose WebSpark.HttpClientUtility?

Your HTTP setup in 1 line vs. 50+

| Feature | WebSpark.HttpClientUtility | Raw HttpClient | RestSharp | Refit |
|---|---|---|---|---|
| Setup Complexity | ⭐ One line | ⭐⭐⭐ 50+ lines manual | ⭐⭐ Low | ⭐⭐ Low |
| Built-in Retry/Circuit Breaker | ✅ Polly integrated | ❌ Manual Polly setup | ❌ Manual | ❌ Manual |
| Response Caching | ✅ Configurable, in-memory | ❌ Manual | ❌ Manual | ❌ Manual |
| Correlation IDs | ✅ Automatic | ❌ Manual middleware | ❌ Manual | ❌ Manual |
| OpenTelemetry | ✅ Built-in | ❌ Manual ActivitySource | ❌ Manual | ❌ Manual |
| Structured Logging | ✅ Rich context | ❌ Manual ILogger | ⭐⭐ Basic | ⭐⭐ Basic |
| Web Crawling | ✅ Separate package | ❌ No | ❌ No | ❌ No |
| Production Trust | ✅ 252+ tests, LTS support | ✅ Microsoft-backed | ✅ Popular (7M+ downloads) | ✅ Popular (10M+ downloads) |

When to use WebSpark:

  • ✅ Building microservices with distributed tracing requirements
  • ✅ Need resilience patterns without writing Polly boilerplate
  • ✅ Want intelligent caching for API rate-limit compliance
  • ✅ Building web scrapers or crawlers (with the Crawler package)

When NOT to use WebSpark:

  • ❌ You need declarative, type-safe API clients (use Refit)
  • ❌ You want maximum control and minimal magic (use raw HttpClient)
  • ❌ Legacy .NET Framework 4.x projects (WebSpark requires .NET 8+)

πŸ›‘οΈ Production Trust

Battle-Tested & Production-Ready

  • βœ… 252+ unit tests with 100% passing - tested on .NET 8, 9, and 10
  • βœ… Continuous Integration via GitHub Actions - every commit tested
  • βœ… Semantic Versioning - predictable, safe upgrades
  • βœ… Zero breaking changes within major versions - backward compatibility guaranteed
  • βœ… Framework Support: .NET 8 LTS (supported until Nov 2026), .NET 9, .NET 10 (Preview)
  • βœ… MIT Licensed - free for commercial use

Support & Maintenance

  • 🔄 Active development - regular updates and improvements
  • 📅 Long-term support - each major version supported for 18+ months
  • 💬 Community support - GitHub Discussions for questions and best practices
  • 📖 Comprehensive documentation - Full docs site

Breaking Change Commitment

We follow semantic versioning strictly:

  • Patch versions (2.0.x): Bug fixes only, zero breaking changes
  • Minor versions (2.x.0): New features, backward compatible
  • Major versions (x.0.0): Breaking changes with detailed migration guides

📦 v2.0 - Now in Two Focused Packages!

Starting with v2.0, the library is split into two packages:

| Package | Purpose | Size | Use When |
|---|---|---|---|
| WebSpark.HttpClientUtility | Core HTTP features | 163 KB | You need HTTP client utilities (authentication, caching, resilience, telemetry) |
| WebSpark.HttpClientUtility.Crawler | Web crawling extension | 75 KB | You need web crawling, robots.txt parsing, sitemap generation |

Upgrading from v1.x? Most users need no code changes! See Migration Guide.

📚 Documentation

View Full Documentation →

The complete documentation site includes:

  • Getting started guide
  • Feature documentation
  • API reference
  • Code examples
  • Best practices

⚡ 30-Second Quick Start

Install

dotnet add package WebSpark.HttpClientUtility

Minimal Example

// Program.cs
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddHttpClientUtility();
var app = builder.Build();

app.MapGet("/weather", async (IHttpRequestResultService http) =>
{
    var request = new HttpRequestResult<WeatherData>
    {
        RequestPath = "https://api.weather.com/forecast?city=Seattle",
        RequestMethod = HttpMethod.Get
    };
    var result = await http.HttpSendRequestResultAsync(request);
    return result.IsSuccessStatusCode ? Results.Ok(result.ResponseResults) : Results.Problem();
});

app.Run();

record WeatherData(string City, int Temp);

That's it! You now have:

  • ✅ Automatic correlation IDs for tracing
  • ✅ Structured logging with request/response details
  • ✅ Request timing telemetry
  • ✅ Proper error handling and exception management
  • ✅ Support for .NET 8 LTS, .NET 9, and .NET 10 (Preview)

<details> <summary>📖 Show more: Service-based pattern with error handling</summary>

// Program.cs
builder.Services.AddHttpClientUtility(options =>
{
    options.EnableCaching = true;      // Cache responses
    options.EnableResilience = true;   // Retry on failure
});

// WeatherService.cs
public class WeatherService
{
    private readonly IHttpRequestResultService _http;
    private readonly ILogger<WeatherService> _logger;

    public WeatherService(
        IHttpRequestResultService http,
        ILogger<WeatherService> logger)
    {
        _http = http;
        _logger = logger;
    }

    public async Task<WeatherData?> GetWeatherAsync(string city)
    {
        var request = new HttpRequestResult<WeatherData>
        {
            RequestPath = $"https://api.weather.com/forecast?city={city}",
            RequestMethod = HttpMethod.Get,
            CacheDurationMinutes = 10  // Cache for 10 minutes
        };

        var result = await _http.HttpSendRequestResultAsync(request);

        if (!result.IsSuccessStatusCode)
        {
            _logger.LogError(
                "Weather API failed: {StatusCode} - {Error}",
                result.StatusCode,
                result.ErrorDetails
            );
            return null;
        }

        return result.ResponseResults;
    }
}

</details>

<details> <summary>📖 Show more: Full-featured with auth and observability</summary>

// Program.cs - Advanced configuration
builder.Services.AddHttpClientUtility(options =>
{
    options.EnableCaching = true;
    options.EnableResilience = true;
    options.ResilienceOptions.MaxRetryAttempts = 3;
    options.ResilienceOptions.RetryDelay = TimeSpan.FromSeconds(2);
    options.DefaultTimeout = TimeSpan.FromSeconds(30);
});

// WeatherService.cs - Advanced usage
public async Task<WeatherData?> GetWeatherWithAuthAsync(string city, string apiKey)
{
    var request = new HttpRequestResult<WeatherData>
    {
        RequestPath = $"https://api.weather.com/forecast?city={city}",
        RequestMethod = HttpMethod.Get,
        CacheDurationMinutes = 10,
        Headers = new Dictionary<string, string>
        {
            ["X-API-Key"] = apiKey,
            ["Accept"] = "application/json"
        }
    };

    var result = await _http.HttpSendRequestResultAsync(request);

    // Correlation ID is automatically logged and propagated
    _logger.LogInformation(
        "Weather request completed in {Duration}ms with correlation {CorrelationId}",
        result.RequestDuration,
        result.CorrelationId
    );

    return result.IsSuccessStatusCode ? result.ResponseResults : null;
}

</details>

Web Crawling Features (Crawler Package)

Install Both Packages

dotnet add package WebSpark.HttpClientUtility
dotnet add package WebSpark.HttpClientUtility.Crawler

Register Services

// Program.cs
builder.Services.AddHttpClientUtility();
builder.Services.AddHttpClientCrawler();  // Adds crawler features

Use Crawler

public class SiteAnalyzer
{
    private readonly ISiteCrawler _crawler;
    
    public SiteAnalyzer(ISiteCrawler crawler) => _crawler = crawler;

    public async Task<CrawlResult> AnalyzeSiteAsync(string url)
    {
        var options = new CrawlerOptions
        {
            MaxDepth = 3,
            MaxPages = 100,
            RespectRobotsTxt = true
        };
        
        return await _crawler.CrawlAsync(url, options);
    }
}

🚀 Features

Base Package Features

  • Simple API - Intuitive request/response model
  • Authentication - Bearer token, Basic auth, API key providers
  • Correlation IDs - Automatic tracking across distributed systems
  • Structured Logging - Rich context in all log messages
  • Telemetry - Request timing and performance metrics
  • Error Handling - Standardized exception processing
  • Type-Safe - Strongly-typed request and response models
  • Caching - In-memory response caching (optional)
  • Resilience - Polly retry and circuit breaker policies (optional)
  • Concurrent Requests - Parallel request processing
  • Fire-and-Forget - Background request execution
  • Streaming - Efficient handling of large responses
  • OpenTelemetry - Full observability integration (optional)
  • CURL Export - Generate CURL commands for debugging
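
The "Concurrent Requests" bullet above composes with plain Task.WhenAll. Below is a minimal sketch that reuses only the types shown in the Quick Start (IHttpRequestResultService, HttpRequestResult<T>); the weather endpoint and WeatherData record are the placeholder examples from earlier, not part of the library.

```csharp
// Sketch: fan several typed requests out in parallel and collect the successes.
// Assumes the WeatherData record and placeholder endpoint from the Quick Start.
public async Task<List<WeatherData>> GetManyAsync(
    IHttpRequestResultService http,
    IEnumerable<string> cities)
{
    var tasks = cities.Select(city =>
        http.HttpSendRequestResultAsync(new HttpRequestResult<WeatherData>
        {
            RequestPath = $"https://api.weather.com/forecast?city={city}",
            RequestMethod = HttpMethod.Get
        }));

    var results = await Task.WhenAll(tasks);

    // Each result carries its own status, timing, and correlation ID;
    // keep only the successful payloads here.
    return results
        .Where(r => r.IsSuccessStatusCode)
        .Select(r => r.ResponseResults!)
        .ToList();
}
```

Because each HttpRequestResult wraps its own outcome, one failed city does not fault the whole batch.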

Crawler Package Features

  • Site Crawling - Full website crawling with depth control
  • Robots.txt - Automatic compliance with robots.txt rules
  • Sitemap Generation - Create XML sitemaps from crawl results
  • HTML Parsing - Extract links and metadata with HtmlAgilityPack
  • SignalR Progress - Real-time crawl progress updates
  • CSV Export - Export crawl results to CSV files
  • Performance Tracking - Monitor crawl speed and efficiency

📚 Common Scenarios

Enable Caching

builder.Services.AddHttpClientUtility(options =>
{
    options.EnableCaching = true;
});

// In your service
var request = new HttpRequestResult<Product>
{
    RequestPath = "https://api.example.com/products/123",
    RequestMethod = HttpMethod.Get,
    CacheDurationMinutes = 10  // Cache for 10 minutes
};

Add Resilience (Retry + Circuit Breaker)

builder.Services.AddHttpClientUtility(options =>
{
    options.EnableResilience = true;
    options.ResilienceOptions.MaxRetryAttempts = 3;
    options.ResilienceOptions.RetryDelay = TimeSpan.FromSeconds(2);
});

All Features Enabled

builder.Services.AddHttpClientUtilityWithAllFeatures();

🔄 Upgrading from v1.x

If You DON'T Use Web Crawling

No code changes required! Simply upgrade:

dotnet add package WebSpark.HttpClientUtility --version 2.0.0

Your existing code continues to work exactly as before. All core HTTP features (authentication, caching, resilience, telemetry, etc.) are still in the base package with the same API.

If You DO Use Web Crawling

Three simple steps to migrate:

Step 1: Install the crawler package

dotnet add package WebSpark.HttpClientUtility.Crawler --version 2.0.0

Step 2: Add using directive

using WebSpark.HttpClientUtility.Crawler;

Step 3: Update service registration

// v1.x (old)
services.AddHttpClientUtility();

// v2.0 (new)
services.AddHttpClientUtility();
services.AddHttpClientCrawler();  // Add this line

That's it! Your crawler code (ISiteCrawler, SiteCrawler, SimpleSiteCrawler, etc.) works identically after these changes.

Need Help? See the detailed migration guide or open an issue.


🎓 Sample Projects

Explore working examples in the samples directory:

  • BasicUsage - Simple GET/POST requests
  • WithCaching - Response caching implementation
  • WithResilience - Retry and circuit breaker patterns
  • ConcurrentRequests - Parallel request processing
  • WebCrawler - Site crawling example

🤝 Contributing

Contributions are welcome! See our Contributing Guide for details.

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for your changes
  4. Ensure all tests pass
  5. Submit a pull request

📊 Project Stats

  • 252+ Unit Tests - 100% passing
  • Supports .NET 8 LTS, .NET 9, & .NET 10 (Preview)
  • MIT Licensed - Free for commercial use
  • Active Maintenance - Regular updates

Testing Package

| Package | Purpose | Status |
|---|---|---|
| WebSpark.HttpClientUtility.Testing | Test helpers & fakes for unit testing | ✅ Available (v2.1.0+) |

Testing Package Features:

  • FakeHttpResponseHandler - Mock HTTP responses without network calls
  • Fluent API - Easy test setup with ForRequest().RespondWith()
  • Sequential Responses - Test retry behavior with multiple responses
  • Request Verification - Assert requests were made correctly
  • Latency Simulation - Test timeout scenarios

Install:

dotnet add package WebSpark.HttpClientUtility.Testing

See the Testing documentation for examples.
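
As a rough sketch of how the fluent testing API above might look in a test: FakeHttpResponseHandler and ForRequest().RespondWith() are named in the feature list, but the exact parameters and wiring below are assumptions; consult the Testing documentation for the real signatures.

```csharp
// Hypothetical usage sketch: method names come from the feature list above,
// but the parameter shapes shown here are assumptions.
var fake = new FakeHttpResponseHandler();

fake.ForRequest("https://api.weather.com/forecast?city=Seattle")
    .RespondWith(HttpStatusCode.OK, """{"city":"Seattle","temp":12}""");

// Use the fake handler in place of the real network stack, exercise the
// code under test, then verify the expected requests were actually made.
```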

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Questions or Issues? Open an issue or start a discussion!

Target frameworks

Compatible: net8.0, net9.0, net10.0. The platform-specific variants of each (android, browser, ios, maccatalyst, macos, tvos, windows) were computed as additional compatible target frameworks.

Learn more about Target Frameworks and .NET Standard.

NuGet packages (2)

Showing the top 2 NuGet packages that depend on WebSpark.HttpClientUtility:

Package Downloads
WebSpark.Bootswatch

WebSpark.Bootswatch provides Bootswatch themes for ASP.NET Core applications. It includes custom themes and styles that can be easily integrated with ASP.NET Core MVC or Razor Pages applications. Supports .NET 8.0, 9.0, and 10.0. ⚠️ IMPORTANT: This package requires WebSpark.HttpClientUtility to be installed and registered separately. SETUP: 1. Install: dotnet add package WebSpark.HttpClientUtility 2. Register: builder.Services.AddHttpClientUtility(); (BEFORE AddBootswatchThemeSwitcher) 3. Configure appsettings.json with HttpRequestResultPollyOptions section See package README for complete setup guide.

WebSpark.HttpClientUtility.Crawler

Web crawling extension for WebSpark.HttpClientUtility. Includes SiteCrawler and SimpleSiteCrawler with robots.txt compliance, HTML link extraction (HtmlAgilityPack), sitemap generation (Markdig), CSV export (CsvHelper), and real-time SignalR progress updates. Perfect for web scraping, SEO audits, and site analysis. Supports .NET 8 LTS, .NET 9, and .NET 10 (Preview). Requires WebSpark.HttpClientUtility base package [2.1.0]. Install both packages and call AddHttpClientUtility() + AddHttpClientCrawler() in your DI registration.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
2.1.2 211 12/4/2025
2.1.1 398 11/12/2025
2.0.0 203 11/5/2025
1.5.1 182 11/2/2025
1.5.0 184 11/2/2025
1.4.0 198 11/2/2025
1.3.2 135 11/1/2025
1.3.0 176 10/7/2025
1.2.0 131 9/26/2025
1.1.0 232 7/1/2025
1.0.10 132 5/24/2025
1.0.8 221 5/19/2025
1.0.5 316 5/4/2025
1.0.4 175 5/3/2025
1.0.3 112 5/3/2025
1.0.2 110 5/3/2025
0.1.0 81 5/3/2025

2.1.2 - Security patch: Fixed js-yaml prototype pollution (GHSA-mh29-5h37-fv8m,
MODERATE) and glob command injection (GHSA-5j98-mcp5-4vw2, HIGH) in documentation build
dependencies. All Dependabot alerts resolved. Zero breaking changes.
2.1.1 - GitHub Actions: Fixed package verification step. Zero breaking changes.
2.1.0 - Added .NET 10 (Preview) multi-targeting support. All projects now target net8.0,
net9.0, and net10.0. Updated Microsoft.Extensions packages to 10.0.0. All 291 tests
passing on all three frameworks (873 test runs, 0 failures). Zero breaking changes.
2.0.0 - MAJOR: Package split into base + crawler. Base package now 163 KB with 10
dependencies (down from 13). Zero breaking changes for core HTTP users. Web crawling
moved to separate WebSpark.HttpClientUtility.Crawler package. CurlCommandSaver now uses
JSON Lines format. All 474 tests passing.