StorageContextLib 1.0.30


StorageContextLib - Unified Cloud Storage API

A powerful .NET library that provides a unified abstraction layer for Azure Blob Storage and Amazon S3, enabling seamless multi-cloud storage operations.


Features

  • Multi-Cloud Support - Work with Azure Blob Storage and Amazon S3 through a single interface
  • Multi-Framework - Supports .NET 8.0, 9.0, and 10.0
  • Batch Operations - High-performance batch read, write, and copy operations
  • Streaming Support - Memory-efficient streaming with configurable chunk sizes
  • Async Enumerable - Stream large files with IAsyncEnumerable<ReadOnlyMemory<byte>>
  • Retry Logic - Built-in exponential backoff for transient failures
  • Cross-Account Operations - Copy files between different storage accounts
  • Background Services - Connection pool management with BackgroundService
  • Move Operations - Atomic move with overwrite support

⚠️ Note: Amazon S3 implementation is currently under active development. Azure Blob Storage is fully supported and production-ready.

Installation

dotnet add package StorageContextLib

Or via NuGet Package Manager:

Install-Package StorageContextLib

Configuration

1. Add to appsettings.json

{
  "AzureContext": [
    {
      "azureConnectionString": "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=...;EndpointSuffix=core.windows.net",
      "azureStorageName": "mystorageaccount",
      "azureContainerName": "documents"
    },
    {
      "azureConnectionString": "DefaultEndpointsProtocol=https;AccountName=myotherstorage;AccountKey=...;EndpointSuffix=core.windows.net",
      "azureStorageName": "myotherstorage",
      "azureContainerName": "documents"
    }
  ],
  "AmazonContext": [
    {
      "AccessKey": "your-aws-access-key",
      "SecretKey": "your-aws-secret-key",
      "BucketName": "my-s3-bucket",
      "region": 0
    }
  ],
  "BackgroundBlobService": {
    "ConnectionsInPool": 5,
    "PoolInterval": 50,
    "ServiceInterval": 60
  }
}

Background Service Configuration (Performance Optimization)

For improved performance and connection management, configure the BackgroundBlobService:

| Property | Description | Recommended Value |
|---|---|---|
| ConnectionsInPool | Number of concurrent connections to maintain per storage account | 5-10 for production |
| PoolInterval | Seconds of inactivity before pinging storage to keep the connection warm | 50-120 seconds |
| ServiceInterval | Seconds between background service health checks | 60-300 seconds |

Performance Impact:

  • ✅ Reduces cold-start latency on blob operations
  • ✅ Maintains warm connections to storage accounts
  • ✅ Prevents connection timeouts during idle periods
  • ⚡ Can improve operation speed by up to 40% under load

Set ConnectionsInPool to 0 to disable the background service if it is not needed.
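For example, opting out entirely only requires zeroing the pool size in the BackgroundBlobService section of appsettings.json shown above (the other values are then ignored):

```json
{
  "BackgroundBlobService": {
    "ConnectionsInPool": 0,
    "PoolInterval": 50,
    "ServiceInterval": 60
  }
}
```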

2. Register Services in Program.cs

using StorageContextLib;

var builder = WebApplication.CreateBuilder(args);

// Register multi-cloud storage services
builder.Services.AddSingleton<IEnumerable<IBlobStorageService>>(ctx =>
{
    var storageContext = new MultiContext
    {
        Azureprops = builder.Configuration
            .GetRequiredSection("AzureContext")
            .Get<List<AzureStorageProps>>(),
        
        Amazonprops = builder.Configuration
            .GetRequiredSection("AmazonContext")
            .Get<List<AmazonStorageProps>>()
    };

    return storageContext.CreateService();
});

// Optional: Enable background connection pool management (recommended for high-performance scenarios)
builder.Services.AddHostedService<BlobServiceBackgroundTask>();

builder.Services.AddControllers();
var app = builder.Build();

app.MapControllers();
app.Run();

Complete Usage Examples

Basic Controller Setup

using Microsoft.AspNetCore.Mvc;
using StorageContextLib;
using System.Collections.Concurrent;
using System.Diagnostics;

[ApiController]
[Route("api/[controller]")]
public class StorageController : ControllerBase
{
    private readonly IEnumerable<IBlobStorageService> _storageServices;
    private readonly ILogger<StorageController> _logger;

    public StorageController(
        IEnumerable<IBlobStorageService> storageServices,
        ILogger<StorageController> logger)
    {
        _storageServices = storageServices;
        _logger = logger;
    }
}

1. Upload Files

[HttpPost("upload")]
public async Task<IActionResult> UploadFile(IFormFile file)
{
    var storage = _storageServices.instance(
        storageAccountName: "mystorageaccount",
        containerName: "documents"
    );

    using var stream = file.OpenReadStream();
    await storage.UploadBlobAsync($"uploads/{file.FileName}", stream);

    return Ok(new { message = "File uploaded successfully" });
}

2. Download Files

[HttpGet("download/{*filePath}")]
public async Task<IActionResult> DownloadFile(string filePath)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var stream = await storage.DownloadBlobAsync(filePath);
    if (stream == null)
        return NotFound();

    return File(stream, "application/octet-stream", Path.GetFileName(filePath));
}

3. Stream Large Files (Memory-Efficient)

[HttpGet("stream/{*blobPath}")]
public async Task<IActionResult> StreamFile(
    string blobPath,
    [FromQuery] int chunkSizeInMB = 4)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var contentType = Path.GetExtension(blobPath).ToLower() switch
    {
        ".mp4" => "video/mp4",
        ".pdf" => "application/pdf",
        ".jpg" or ".jpeg" => "image/jpeg",
        ".png" => "image/png",
        _ => "application/octet-stream"
    };

    Response.ContentType = contentType;

    await foreach (var chunk in storage.StreamBlobInChunksAsync(
        blobPath, 
        chunkSizeInMB, 
        HttpContext.RequestAborted))
    {
        // Stream.WriteAsync accepts ReadOnlyMemory<byte> directly - no ToArray() copy needed
        await Response.Body.WriteAsync(chunk, HttpContext.RequestAborted);
        await Response.Body.FlushAsync();
    }

    return new EmptyResult();
}

4. Batch Read Operations

[HttpPost("batch-read")]
public async Task<IActionResult> BatchRead([FromBody] List<string> filePaths)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var paths = new ConcurrentBag<string>(filePaths);
    var results = await storage.ReadBatch(paths, exGlobal: true);

    var response = results
        .Where(r => r.stream != null)
        .Select(r => new
        {
            path = r.path,
            size = r.stream.Length,
            exists = true
        })
        .ToList();

    return Ok(response);
}

5. Batch Write Operations

[HttpPost("batch-write")]
public async Task<IActionResult> BatchWrite([FromBody] Dictionary<string, string> files)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var writeBatch = new ConcurrentBag<Batch>();
    
    foreach (var file in files)
    {
        var contentBytes = Encoding.UTF8.GetBytes(file.Value); // requires using System.Text;
        writeBatch.Add(new Batch
        {
            path = file.Key,
            stream = new MemoryStream(contentBytes)
        });
    }

    await storage.WriteBatch(writeBatch);

    return Ok(new { filesWritten = writeBatch.Count });
}

6. Move/Rename Files

[HttpPost("move")]
public async Task<IActionResult> MoveFile(
    [FromQuery] string sourcePath,
    [FromQuery] string destinationPath)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var success = await storage.Move(sourcePath, destinationPath);

    return Ok(new { success, message = "File moved successfully" });
}

7. Copy Files Across Storage Accounts

[HttpPost("copy-across-accounts")]
public async Task<IActionResult> CopyAcrossAccounts()
{
    var storage = _storageServices.instance("sourceaccount", "documents");
    
    var copyBatch = new CopyBatch
    {
        blobStorageServices = _storageServices,
        paths = new ConcurrentDictionary<string, string>
        {
            ["folder/source.pdf"] = "folder/destination.pdf",
            ["images/photo.jpg"] = "archive/photo.jpg"
        },
        fromStorageAccount = "sourceaccount",
        toStorageAccount = "destaccount"
    };

    await storage.CopyBatch(copyBatch);

    return Ok(new { message = "Files copied successfully" });
}

8. List Directory Contents

[HttpGet("list")]
public async Task<IActionResult> ListFiles(
    [FromQuery] string folder = "",
    [FromQuery] string pattern = "*")
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var files = await storage.GetDirectoryFilesPath(
        basePath: folder,
        searchoption: SearchOption.AllDirectories,
        pattern: pattern
    );

    return Ok(new { count = files.Count, files });
}

9. Delete Files

[HttpDelete("delete/{*filePath}")]
public async Task<IActionResult> DeleteFile(string filePath)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var deleted = await storage.DeleteBlobAsync(filePath);

    return deleted 
        ? Ok(new { message = "File deleted" })
        : NotFound();
}

10. Download Large Files in Chunks

[HttpGet("download-chunked/{*filePath}")]
public async Task<IActionResult> DownloadChunked(
    string filePath,
    [FromQuery] int chunkSizeInMB = 16)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    var sw = Stopwatch.StartNew();
    var stream = await storage.DownloadBlobInChunksAsync(filePath, chunkSizeInMB);
    sw.Stop();

    if (stream == null)
        return NotFound();

    _logger.LogInformation(
        "Downloaded {ByteCount} bytes in {ElapsedMs}ms",
        stream.Length, sw.ElapsedMilliseconds);

    return File(stream, "application/octet-stream", Path.GetFileName(filePath));
}

Advanced Scenarios

Using Multiple Storage Accounts

// Production storage
var prodStorage = _storageServices.instance("prodaccount", "documents");
await prodStorage.UploadBlobAsync("reports/report.pdf", stream);

// Development storage
var devStorage = _storageServices.instance("devaccount", "documents");
await devStorage.UploadBlobAsync("test/report.pdf", stream);

// QA storage
var qaStorage = _storageServices.instance("qaaccount", "documents");
await qaStorage.UploadBlobAsync("qa/report.pdf", stream);

Append Mode for Audio/Video Files

[HttpPost("append-audio")]
public async Task<IActionResult> AppendAudio(IFormFile audioChunk)
{
    var storage = _storageServices.instance("mystorageaccount", "documents");
    
    using var stream = audioChunk.OpenReadStream();
    
    // Append mode automatically handles WAV header merging
    await storage.UploadBlobAsync(
        "recordings/session1.wav",
        stream,
        appendMode: true
    );

    return Ok(new { message = "Audio chunk appended" });
}

Read/Write Text Files

// Write text (requires using System.Text.Json;)
var content = JsonSerializer.Serialize(new { data = "example" });
await storage.WriteAllText("config/settings.json", content);

// Read text
var jsonContent = await storage.ReadAllText("config/settings.json");
var settings = JsonSerializer.Deserialize<MySettings>(jsonContent);

Performance Tips

  1. Enable Background Service - Maintains warm connections for faster operations
  2. Use Batch Operations - 3-5x faster than individual operations for multiple files
  3. Stream Large Files - Use StreamBlobInChunksAsync for files > 100MB to avoid memory issues
  4. Adjust Chunk Size - Larger chunks (16-32MB) for high-bandwidth networks, smaller (4-8MB) for mobile
  5. Parallel Operations - The library is fully thread-safe and optimized for concurrent access
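Tips 2 and 5 can be combined. The sketch below fans one upload out to several storage accounts concurrently using the same `.instance(...)` accessor shown in the earlier examples; the account names, container name, and local file path are placeholders, and error handling is omitted for brevity:

```csharp
// Sketch: upload the same payload to multiple storage accounts in parallel.
// Assumes the IEnumerable<IBlobStorageService> registration from Program.cs above.
byte[] payloadBytes = await File.ReadAllBytesAsync("report.pdf");
var accounts = new[] { "prodaccount", "devaccount", "qaaccount" };

await Task.WhenAll(accounts.Select(async account =>
{
    var storage = _storageServices.instance(account, "documents");

    // Each task gets its own stream so read positions don't interfere.
    using var copy = new MemoryStream(payloadBytes);
    await storage.UploadBlobAsync("reports/report.pdf", copy);
}));
```

Because the library is thread-safe (tip 5), each task can resolve and use its own service instance without external locking.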

Amazon S3 Region Codes

| Region | Code | Description |
|---|---|---|
| US West 1 | 0 | US West (N. California) |
| US West 2 | 1 | US West (Oregon) |
| US East 1 | 2 | US East (N. Virginia) |
| US East 2 | 3 | US East (Ohio) |
| CA Central 1 | 4 | Canada (Central) |
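For example, to point the S3 context at US West (Oregon), set the numeric code in the AmazonContext entry from the configuration section earlier (credentials and bucket name are placeholders):

```json
{
  "AmazonContext": [
    {
      "AccessKey": "your-aws-access-key",
      "SecretKey": "your-aws-secret-key",
      "BucketName": "my-s3-bucket",
      "region": 1
    }
  ]
}
```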

Requirements

  • .NET 8.0, 9.0, or 10.0
  • Azure.Storage.Blobs 12.23.0+
  • AWSSDK.S3 3.7.309.7+
  • Microsoft.Extensions.Hosting 8.0.0+

Troubleshooting

Issue: Slow first request

Solution: Enable BlobServiceBackgroundTask with ConnectionsInPool: 5 to maintain warm connections

Issue: 412 ConditionNotMet errors on concurrent operations

Solution: The library's built-in retry logic with exponential backoff handles this automatically

Issue: Out of memory on large files

Solution: Use StreamBlobInChunksAsync or DownloadBlobInChunksAsync instead of DownloadBlobAsync

Issue: Task.Delay ArgumentOutOfRangeException

Solution: Ensure ServiceInterval in configuration is less than 2,147,483 seconds (~24 days). Recommended: 60-300 seconds.

Authors

  • Habib - Lead Developer
  • PracticeEHR Team - Architecture & Design

Support & Contributing

License

MIT License - Copyright © Habib and PracticeEHR 2024-2025

Roadmap

  • ✅ Azure Blob Storage (Complete)
  • 🚧 Amazon S3 (In Development)
  • 📋 Google Cloud Storage (Planned)
  • 📋 Cross-cloud migration tools (Planned)

Made with ❤️ by Habib & PracticeEHR Team


Release Notes

  • Multi-targeting support for .NET 8.0, 9.0, and 10.0
  • Azure Blob Storage and Amazon S3
  • Batch operations (Read/Write/Copy)
  • Streaming with chunk support
  • Background service for connection management
  • Cross-storage account operations