WebSpark.HttpClientUtility
2.1.2
dotnet add package WebSpark.HttpClientUtility --version 2.1.2
NuGet\Install-Package WebSpark.HttpClientUtility -Version 2.1.2
<PackageReference Include="WebSpark.HttpClientUtility" Version="2.1.2" />
<PackageVersion Include="WebSpark.HttpClientUtility" Version="2.1.2" />
<PackageReference Include="WebSpark.HttpClientUtility" />
paket add WebSpark.HttpClientUtility --version 2.1.2
#r "nuget: WebSpark.HttpClientUtility, 2.1.2"
#:package WebSpark.HttpClientUtility@2.1.2
#addin nuget:?package=WebSpark.HttpClientUtility&version=2.1.2
#tool nuget:?package=WebSpark.HttpClientUtility&version=2.1.2
WebSpark.HttpClientUtility
Drop-in HttpClient wrapper with Polly resilience, response caching, and OpenTelemetry for .NET 8-10+ APIs, configured in one line
Stop writing 50+ lines of HttpClient setup. Get enterprise-grade resilience (retries, circuit breakers), intelligent caching, structured logging with correlation IDs, and OpenTelemetry tracing in a single AddHttpClientUtility() call. Perfect for microservices, background workers, and web scrapers.
Why Choose WebSpark.HttpClientUtility?
Your HTTP setup in 1 line vs. 50+
| Feature | WebSpark.HttpClientUtility | Raw HttpClient | RestSharp | Refit |
|---|---|---|---|---|
| Setup Complexity | ⭐ One line | ⭐⭐⭐ 50+ lines manual | ⭐⭐ Low | ⭐⭐ Low |
| Built-in Retry/Circuit Breaker | ✅ Polly integrated | ❌ Manual Polly setup | ❌ Manual | ❌ Manual |
| Response Caching | ✅ Configurable, in-memory | ❌ Manual | ❌ Manual | ❌ Manual |
| Correlation IDs | ✅ Automatic | ❌ Manual middleware | ❌ Manual | ❌ Manual |
| OpenTelemetry | ✅ Built-in | ❌ Manual ActivitySource | ❌ Manual | ❌ Manual |
| Structured Logging | ✅ Rich context | ❌ Manual ILogger | ⭐⭐ Basic | ⭐⭐ Basic |
| Web Crawling | ✅ Separate package | ❌ No | ❌ No | ❌ No |
| Production Trust | ✅ 252+ tests, LTS support | ✅ Microsoft-backed | ✅ Popular (7M+ downloads) | ✅ Popular (10M+ downloads) |
When to use WebSpark:
- ✅ Building microservices with distributed tracing requirements
- ✅ Need resilience patterns without writing Polly boilerplate
- ✅ Want intelligent caching for API rate-limit compliance
- ✅ Building web scrapers or crawlers (with Crawler package)
When NOT to use WebSpark:
- ❌ You need declarative, type-safe API clients (use Refit)
- ❌ You want maximum control and minimal magic (use raw HttpClient)
- ❌ Legacy .NET Framework 4.x projects (WebSpark requires .NET 8+)
Production Trust
Battle-Tested & Production-Ready
- ✅ 252+ unit tests with 100% passing - tested on .NET 8, 9, and 10
- ✅ Continuous Integration via GitHub Actions - every commit tested
- ✅ Semantic Versioning - predictable, safe upgrades
- ✅ Zero breaking changes within major versions - backward compatibility guaranteed
- ✅ Framework Support: .NET 8 LTS (supported until Nov 2026), .NET 9, .NET 10 (Preview)
- ✅ MIT Licensed - free for commercial use
Support & Maintenance
- Active development - regular updates and improvements
- Long-term support - each major version supported for 18+ months
- Community support - GitHub Discussions for questions and best practices
- Comprehensive documentation - Full docs site
Breaking Change Commitment
We follow semantic versioning strictly:
- Patch versions (2.0.x): Bug fixes only, zero breaking changes
- Minor versions (2.x.0): New features, backward compatible
- Major versions (x.0.0): Breaking changes with detailed migration guides
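Because breaking changes only land in major versions, you can express that guarantee directly in your project file with NuGet's version-range syntax. A minimal sketch (both forms are standard NuGet, not package-specific):

```xml
<!-- Floating version: restore resolves the highest available 2.x release -->
<PackageReference Include="WebSpark.HttpClientUtility" Version="2.*" />

<!-- Explicit range: any version >= 2.1.2 and < 3.0.0 is acceptable -->
<PackageReference Include="WebSpark.HttpClientUtility" Version="[2.1.2,3.0.0)" />
```

Either form lets you absorb patch and minor updates while ruling out a surprise major upgrade.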
v2.0 - Now in Two Focused Packages!
Starting with v2.0, the library is split into two packages:
| Package | Purpose | Size | Use When |
|---|---|---|---|
| WebSpark.HttpClientUtility | Core HTTP features | 163 KB | You need HTTP client utilities (authentication, caching, resilience, telemetry) |
| WebSpark.HttpClientUtility.Crawler | Web crawling extension | 75 KB | You need web crawling, robots.txt parsing, sitemap generation |
Upgrading from v1.x? Most users need no code changes! See Migration Guide.
Documentation
The complete documentation site includes:
- Getting started guide
- Feature documentation
- API reference
- Code examples
- Best practices
30-Second Quick Start
Install
dotnet add package WebSpark.HttpClientUtility
Minimal Example
// Program.cs
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddHttpClientUtility();
var app = builder.Build();
app.MapGet("/weather", async (IHttpRequestResultService http) =>
{
var request = new HttpRequestResult<WeatherData>
{
RequestPath = "https://api.weather.com/forecast?city=Seattle",
RequestMethod = HttpMethod.Get
};
var result = await http.HttpSendRequestResultAsync(request);
return result.IsSuccessStatusCode ? Results.Ok(result.ResponseResults) : Results.Problem();
});
app.Run();
record WeatherData(string City, int Temp);
That's it! You now have:
- ✅ Automatic correlation IDs for tracing
- ✅ Structured logging with request/response details
- ✅ Request timing telemetry
- ✅ Proper error handling and exception management
- ✅ Support for .NET 8 LTS, .NET 9, and .NET 10 (Preview)
<details> <summary>Show more: Service-based pattern with error handling</summary>
// Program.cs
builder.Services.AddHttpClientUtility(options =>
{
options.EnableCaching = true; // Cache responses
options.EnableResilience = true; // Retry on failure
});
// WeatherService.cs
public class WeatherService
{
private readonly IHttpRequestResultService _http;
private readonly ILogger<WeatherService> _logger;
public WeatherService(
IHttpRequestResultService http,
ILogger<WeatherService> logger)
{
_http = http;
_logger = logger;
}
public async Task<WeatherData?> GetWeatherAsync(string city)
{
var request = new HttpRequestResult<WeatherData>
{
RequestPath = $"https://api.weather.com/forecast?city={city}",
RequestMethod = HttpMethod.Get,
CacheDurationMinutes = 10 // Cache for 10 minutes
};
var result = await _http.HttpSendRequestResultAsync(request);
if (!result.IsSuccessStatusCode)
{
_logger.LogError(
"Weather API failed: {StatusCode} - {Error}",
result.StatusCode,
result.ErrorDetails
);
return null;
}
return result.ResponseResults;
}
}
</details>
<details> <summary>Show more: Full-featured with auth and observability</summary>
// Program.cs - Advanced configuration
builder.Services.AddHttpClientUtility(options =>
{
options.EnableCaching = true;
options.EnableResilience = true;
options.ResilienceOptions.MaxRetryAttempts = 3;
options.ResilienceOptions.RetryDelay = TimeSpan.FromSeconds(2);
options.DefaultTimeout = TimeSpan.FromSeconds(30);
});
// WeatherService.cs - Advanced usage
public async Task<WeatherData?> GetWeatherWithAuthAsync(string city, string apiKey)
{
var request = new HttpRequestResult<WeatherData>
{
RequestPath = $"https://api.weather.com/forecast?city={city}",
RequestMethod = HttpMethod.Get,
CacheDurationMinutes = 10,
Headers = new Dictionary<string, string>
{
["X-API-Key"] = apiKey,
["Accept"] = "application/json"
}
};
var result = await _http.HttpSendRequestResultAsync(request);
// Correlation ID is automatically logged and propagated
_logger.LogInformation(
"Weather request completed in {Duration}ms with correlation {CorrelationId}",
result.RequestDuration,
result.CorrelationId
);
return result.IsSuccessStatusCode ? result.ResponseResults : null;
}
</details>
Web Crawling Features (Crawler Package)
Install Both Packages
dotnet add package WebSpark.HttpClientUtility
dotnet add package WebSpark.HttpClientUtility.Crawler
Register Services
// Program.cs
builder.Services.AddHttpClientUtility();
builder.Services.AddHttpClientCrawler(); // Adds crawler features
Use Crawler
public class SiteAnalyzer
{
private readonly ISiteCrawler _crawler;
public SiteAnalyzer(ISiteCrawler crawler) => _crawler = crawler;
public async Task<CrawlResult> AnalyzeSiteAsync(string url)
{
var options = new CrawlerOptions
{
MaxDepth = 3,
MaxPages = 100,
RespectRobotsTxt = true
};
return await _crawler.CrawlAsync(url, options);
}
}
Features
Base Package Features
- Simple API - Intuitive request/response model
- Authentication - Bearer token, Basic auth, API key providers
- Correlation IDs - Automatic tracking across distributed systems
- Structured Logging - Rich context in all log messages
- Telemetry - Request timing and performance metrics
- Error Handling - Standardized exception processing
- Type-Safe - Strongly-typed request and response models
- Caching - In-memory response caching (optional)
- Resilience - Polly retry and circuit breaker policies (optional)
- Concurrent Requests - Parallel request processing
- Fire-and-Forget - Background request execution
- Streaming - Efficient handling of large responses
- OpenTelemetry - Full observability integration (optional)
- CURL Export - Generate CURL commands for debugging
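The Concurrent Requests feature pairs naturally with `Task.WhenAll`. A minimal sketch using only the `IHttpRequestResultService` API shown in the Quick Start; the `Product` record, the injected `_http` field, and the endpoint URL are illustrative assumptions:

```csharp
// Fan out several typed requests in parallel and await them together.
public async Task<List<Product?>> GetProductsAsync(IEnumerable<int> ids)
{
    var tasks = ids.Select(id =>
    {
        var request = new HttpRequestResult<Product>
        {
            RequestPath = $"https://api.example.com/products/{id}", // placeholder endpoint
            RequestMethod = HttpMethod.Get,
            CacheDurationMinutes = 5 // optional: serve repeats from cache
        };
        return _http.HttpSendRequestResultAsync(request);
    }).ToList();

    var results = await Task.WhenAll(tasks);
    return results
        .Select(r => r.IsSuccessStatusCode ? r.ResponseResults : null)
        .ToList();
}
```

Each request still gets its own correlation ID and timing telemetry, so parallel fan-out remains traceable per call.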
Crawler Package Features
- Site Crawling - Full website crawling with depth control
- Robots.txt - Automatic compliance with robots.txt rules
- Sitemap Generation - Create XML sitemaps from crawl results
- HTML Parsing - Extract links and metadata with HtmlAgilityPack
- SignalR Progress - Real-time crawl progress updates
- CSV Export - Export crawl results to CSV files
- Performance Tracking - Monitor crawl speed and efficiency
Common Scenarios
Enable Caching
builder.Services.AddHttpClientUtility(options =>
{
options.EnableCaching = true;
});
// In your service
var request = new HttpRequestResult<Product>
{
RequestPath = "https://api.example.com/products/123",
RequestMethod = HttpMethod.Get,
CacheDurationMinutes = 10 // Cache for 10 minutes
};
Add Resilience (Retry + Circuit Breaker)
builder.Services.AddHttpClientUtility(options =>
{
options.EnableResilience = true;
options.ResilienceOptions.MaxRetryAttempts = 3;
options.ResilienceOptions.RetryDelay = TimeSpan.FromSeconds(2);
});
All Features Enabled
builder.Services.AddHttpClientUtilityWithAllFeatures();
Upgrading from v1.x
If You DON'T Use Web Crawling
No code changes required! Simply upgrade:
dotnet add package WebSpark.HttpClientUtility --version 2.0.0
Your existing code continues to work exactly as before. All core HTTP features (authentication, caching, resilience, telemetry, etc.) are still in the base package with the same API.
If You DO Use Web Crawling
Three simple steps to migrate:
Step 1: Install the crawler package
dotnet add package WebSpark.HttpClientUtility.Crawler --version 2.0.0
Step 2: Add using directive
using WebSpark.HttpClientUtility.Crawler;
Step 3: Update service registration
// v1.x (old)
services.AddHttpClientUtility();
// v2.0 (new)
services.AddHttpClientUtility();
services.AddHttpClientCrawler(); // Add this line
That's it! Your crawler code (ISiteCrawler, SiteCrawler, SimpleSiteCrawler, etc.) works identically after these changes.
Need Help? See the detailed migration guide or open an issue.
Documentation
- Getting Started Guide - Complete walkthrough
- Configuration Options - All settings explained
- Caching Guide - Response caching strategies
- Resilience Guide - Retry and circuit breaker patterns
- Web Crawling - Site crawler features
- Migration Guide - From raw HttpClient
- API Reference - Complete API documentation
Sample Projects
Explore working examples in the samples directory:
- BasicUsage - Simple GET/POST requests
- WithCaching - Response caching implementation
- WithResilience - Retry and circuit breaker patterns
- ConcurrentRequests - Parallel request processing
- WebCrawler - Site crawling example
Contributing
Contributions are welcome! See our Contributing Guide for details.
- Fork the repository
- Create a feature branch
- Add tests for your changes
- Ensure all tests pass
- Submit a pull request
Project Stats
- 252+ Unit Tests - 100% passing
- Supports .NET 8 LTS, .NET 9, & .NET 10 (Preview)
- MIT Licensed - Free for commercial use
- Active Maintenance - Regular updates
Related Packages
| Package | Purpose | Status |
|---|---|---|
| WebSpark.HttpClientUtility.Testing | Test helpers & fakes for unit testing | ✅ Available (v2.1.0+) |
Testing Package Features:
- FakeHttpResponseHandler - Mock HTTP responses without network calls
- Fluent API - Easy test setup with ForRequest().RespondWith()
- Sequential Responses - Test retry behavior with multiple responses
- Request Verification - Assert requests were made correctly
- Latency Simulation - Test timeout scenarios
dotnet add package WebSpark.HttpClientUtility.Testing
See the Testing documentation for examples.
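The fluent API above can be sketched roughly as follows. `ForRequest()` and `RespondWith()` are named in the feature list, but the exact parameter shapes shown here are illustrative assumptions, not the package's real signatures:

```csharp
// Hypothetical test setup with FakeHttpResponseHandler (parameter shapes assumed).
var handler = new FakeHttpResponseHandler();

// Single canned response: no network call is made.
handler
    .ForRequest(HttpMethod.Get, "https://api.weather.com/forecast?city=Seattle")
    .RespondWith(HttpStatusCode.OK, """{"city":"Seattle","temp":55}""");

// Sequential responses: a failure followed by a success, to exercise retry logic.
handler
    .ForRequest(HttpMethod.Get, "https://api.weather.com/forecast?city=Portland")
    .RespondWith(HttpStatusCode.ServiceUnavailable)
    .RespondWith(HttpStatusCode.OK, """{"city":"Portland","temp":60}""");
```

Consult the Testing documentation for the actual setup, registration, and verification APIs.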
License
This project is licensed under the MIT License - see the LICENSE file for details.
Links
Questions or Issues? Open an issue or start a discussion!
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 is compatible. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 is compatible. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
net10.0
- Microsoft.Extensions.Caching.Abstractions (>= 10.0.0)
- Microsoft.Extensions.Caching.Memory (>= 10.0.0)
- Microsoft.Extensions.Http (>= 10.0.0)
- Newtonsoft.Json (>= 13.0.4)
- OpenTelemetry (>= 1.14.0)
- OpenTelemetry.Exporter.Console (>= 1.14.0)
- OpenTelemetry.Exporter.OpenTelemetryProtocol (>= 1.14.0)
- OpenTelemetry.Extensions.Hosting (>= 1.14.0)
- OpenTelemetry.Instrumentation.Http (>= 1.14.0)
- Polly (>= 8.6.5)
net8.0
- Microsoft.Extensions.Caching.Abstractions (>= 10.0.0)
- Microsoft.Extensions.Caching.Memory (>= 10.0.0)
- Microsoft.Extensions.Http (>= 10.0.0)
- Newtonsoft.Json (>= 13.0.4)
- OpenTelemetry (>= 1.14.0)
- OpenTelemetry.Exporter.Console (>= 1.14.0)
- OpenTelemetry.Exporter.OpenTelemetryProtocol (>= 1.14.0)
- OpenTelemetry.Extensions.Hosting (>= 1.14.0)
- OpenTelemetry.Instrumentation.Http (>= 1.14.0)
- Polly (>= 8.6.5)
net9.0
- Microsoft.Extensions.Caching.Abstractions (>= 10.0.0)
- Microsoft.Extensions.Caching.Memory (>= 10.0.0)
- Microsoft.Extensions.Http (>= 10.0.0)
- Newtonsoft.Json (>= 13.0.4)
- OpenTelemetry (>= 1.14.0)
- OpenTelemetry.Exporter.Console (>= 1.14.0)
- OpenTelemetry.Exporter.OpenTelemetryProtocol (>= 1.14.0)
- OpenTelemetry.Extensions.Hosting (>= 1.14.0)
- OpenTelemetry.Instrumentation.Http (>= 1.14.0)
- Polly (>= 8.6.5)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on WebSpark.HttpClientUtility:
| Package | Description |
|---|---|
| WebSpark.Bootswatch | WebSpark.Bootswatch provides Bootswatch themes for ASP.NET Core applications. It includes custom themes and styles that can be easily integrated with ASP.NET Core MVC or Razor Pages applications. Supports .NET 8.0, 9.0, and 10.0. ⚠️ IMPORTANT: This package requires WebSpark.HttpClientUtility to be installed and registered separately. SETUP: 1. Install: `dotnet add package WebSpark.HttpClientUtility` 2. Register: `builder.Services.AddHttpClientUtility();` (BEFORE AddBootswatchThemeSwitcher) 3. Configure appsettings.json with the HttpRequestResultPollyOptions section. See the package README for the complete setup guide. |
| WebSpark.HttpClientUtility.Crawler | Web crawling extension for WebSpark.HttpClientUtility. Includes SiteCrawler and SimpleSiteCrawler with robots.txt compliance, HTML link extraction (HtmlAgilityPack), sitemap generation (Markdig), CSV export (CsvHelper), and real-time SignalR progress updates. Perfect for web scraping, SEO audits, and site analysis. Supports .NET 8 LTS, .NET 9, and .NET 10 (Preview). Requires the WebSpark.HttpClientUtility base package [2.1.0]. Install both packages and call AddHttpClientUtility() + AddHttpClientCrawler() in your DI registration. |
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 2.1.2 | 211 | 12/4/2025 |
| 2.1.1 | 398 | 11/12/2025 |
| 2.0.0 | 203 | 11/5/2025 |
| 1.5.1 | 182 | 11/2/2025 |
| 1.5.0 | 184 | 11/2/2025 |
| 1.4.0 | 198 | 11/2/2025 |
| 1.3.2 | 135 | 11/1/2025 |
| 1.3.0 | 176 | 10/7/2025 |
| 1.2.0 | 131 | 9/26/2025 |
| 1.1.0 | 232 | 7/1/2025 |
| 1.0.10 | 132 | 5/24/2025 |
| 1.0.8 | 221 | 5/19/2025 |
| 1.0.5 | 316 | 5/4/2025 |
| 1.0.4 | 175 | 5/3/2025 |
| 1.0.3 | 112 | 5/3/2025 |
| 1.0.2 | 110 | 5/3/2025 |
| 0.1.0 | 81 | 5/3/2025 |
2.1.2 - Security patch: Fixed js-yaml prototype pollution (GHSA-mh29-5h37-fv8m,
MODERATE) and glob command injection (GHSA-5j98-mcp5-4vw2, HIGH) in documentation build
dependencies. All Dependabot alerts resolved. Zero breaking changes.
2.1.1 - GitHub Actions: Fixed package verification step. Zero breaking changes.
2.1.0 - Added .NET 10 (Preview) multi-targeting support. All projects now target net8.0,
net9.0, and net10.0. Updated Microsoft.Extensions packages to 10.0.0. All 291 tests
passing on all three frameworks (873 test runs, 0 failures). Zero breaking changes.
2.0.0 - MAJOR: Package split into base + crawler. Base package now 163 KB with 10
dependencies (down from 13). Zero breaking changes for core HTTP users. Web crawling
moved to separate WebSpark.HttpClientUtility.Crawler package. CurlCommandSaver now uses
JSON Lines format. All 474 tests passing.