Kagamine.Extensions 2.0.0

.NET CLI:
    dotnet add package Kagamine.Extensions --version 2.0.0

Package Manager (for use within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):
    NuGet\Install-Package Kagamine.Extensions -Version 2.0.0

PackageReference (for projects that support it, copy this XML node into the project file):
    <PackageReference Include="Kagamine.Extensions" Version="2.0.0" />

Central Package Management (CPM) — version the package in the solution's Directory.Packages.props, then reference it without a version in the project file:
    <!-- Directory.Packages.props -->
    <PackageVersion Include="Kagamine.Extensions" Version="2.0.0" />
    <!-- Project file -->
    <PackageReference Include="Kagamine.Extensions" />

Paket:
    paket add Kagamine.Extensions --version 2.0.0

F# Interactive and Polyglot Notebooks (copy into the interactive tool or script source):
    #r "nuget: Kagamine.Extensions, 2.0.0"

C# file-based apps, starting in .NET 10 preview 4 (place in a .cs file before any lines of code):
    #:package Kagamine.Extensions@2.0.0

Cake Addin:
    #addin nuget:?package=Kagamine.Extensions&version=2.0.0

Cake Tool:
    #tool nuget:?package=Kagamine.Extensions&version=2.0.0

🍊 Kagamine.Extensions

Packages: Kagamine.Extensions · Kagamine.Extensions.EntityFramework

View on GitHub

This repository contains a suite of libraries that provide facilities commonly needed when creating production-ready applications (as Microsoft puts it). Human-coded, as with all of my work.

Hosting

ConsoleApplication.CreateBuilder()

Tailors the Generic Host framework for console apps, as WebApplication does for ASP.NET Core. Using IHost is desirable for its dependency injection, logging, and configuration setup, as well as for consistency with web apps (not to mention that EF Core migrations use it to discover the DbContext), but the out-of-box experience is designed mainly for background workers, which leads to some frustrations when trying to use it in a regular executable.

Example Program.cs:

using Kagamine.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;

var builder = ConsoleApplication.CreateBuilder();

builder.Services.AddDbContext<FooContext>();
builder.Services.AddScoped<IFooService, FooService>();

// May optionally be async and/or return an exit code
builder.Run((IFooService fooService, CancellationToken cancellationToken) =>
{
    fooService.DoStuff(cancellationToken);
});

Compared to repurposing IHostedService or BackgroundService to run a console app:

  • The entry point is much cleaner and more natural (reminiscent of minimal APIs)
  • SIGINT, SIGQUIT, and SIGTERM produce the correct exit codes (in background services, a Ctrl+C is supposed to trigger a "graceful" shutdown and exit with zero, which only makes sense for a long-lived worker or server)
  • Unhandled exceptions are the same as a regular console app, just going through the ILogger instead (not printed twice with a message for developers tacked on; services are also disposed to ensure logs are flushed)
  • Application lifetime events are properly handled (which a lot of IHostedService examples actually get wrong; implementing it correctly is surprisingly unintuitive, and it's not meant for this anyway)
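As the comment in the example above notes, the delegate may also be async and/or return an exit code. A minimal sketch of that variant, assuming the corresponding Run overload (TryDoStuffAsync is illustrative, not part of the library):

builder.Run(async (IFooService fooService, CancellationToken cancellationToken) =>
{
    bool ok = await fooService.TryDoStuffAsync(cancellationToken);
    return ok ? 0 : 1; // Non-zero exit code signals failure to the shell
});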

Several real-world examples of this being used can be found in Serifu.org's projects.

ASP.NET Core projects include a launchSettings.json by default which sets the environment to "Development" in dev, but you have to create this file yourself for a console app. The easiest way in Visual Studio is to open Debug > {Project Name} Debug Properties and under Environment Variables add DOTNET_ENVIRONMENT = Development. Note that the ASPNETCORE_ prefix won't work here, as that's specific to WebApplication.
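For reference, a minimal Properties/launchSettings.json that accomplishes the same thing (the profile name is arbitrary):

{
  "profiles": {
    "MyConsoleApp": {
      "commandName": "Project",
      "environmentVariables": {
        "DOTNET_ENVIRONMENT": "Development"
      }
    }
  }
}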

Collections

ValueArray<T>

There's currently no solution in .NET for putting a collection in a record while maintaining both immutability and value semantics. It's also sometimes necessary to have access to the underlying array for interop with APIs that do not support spans (especially for byte arrays, where copying can have a significant performance impact).

To solve this, I've created a ValueArray<T> type which represents a read-only array with value type semantics suitable for use in immutable records:

Type                       Immutable   Value equality   To/from array w/o copying
T[]                        —           —                —
List<T>                    —           —                —
ReadOnlyCollection<T>      ✅ (1)      —                —
IReadOnlyList<T>           ✅ (1)      —                —
ImmutableArray<T> (2)      ✅ (3,4)    —                ✅ (3)
ReadOnlyMemory<T>          ✅ (4,5)    —                ⚠️ (5)
ValueArray<T>              ✅ (4,6)    ✅               ✅ (6)
  1. ReadOnlyCollection<T> is merely a read-only view of a List<T>, and IReadOnlyList<T> is usually the List<T> itself.
  2. Has a bug caused by misuse of the null suppression operator that can cause a null reference exception which won't be caught by static analysis if any code returns its default. (ValueArray<T> fixes this by treating a null array as empty, as it is also a struct.)
  3. ImmutableCollectionsMarshal can be used to access the underlying array or create an instance backed by an existing array.
  4. Can be modified inadvertently if a reference is held to the array used to construct it, or if the underlying buffer is accessed and passed to a method that does not treat it as read-only.
  5. Depending on how the ReadOnlyMemory<T> was created, it may be possible to access the buffer using MemoryMarshal, but there's no guarantee the instance is backed by an actual array, or it may represent a slice of an array (like Span<T>).
  6. Supports implicit conversion from T[], and the underlying array can be accessed via explicit cast to T[].

ValueArray<T> supports both collection expressions and array initializers (via implicit cast):

record Song(string Title, ValueArray<string> Artists);

Song song = new("Promise", ["samfree", "Kagamine Rin", "Hatsune Miku"]);
Song song2 = song with { Artists = [.. song.Artists] /* Clone the array */ };

// These would fail if Artists were List<T>, despite the contents being identical
Assert.True(song == song2);
Assert.True(song.Artists == song2.Artists);

ValueArray<Song> songs = new[] { song, song2 };

It's interoperable with spans as well as APIs requiring arrays such as Entity Framework. Using a value converter, a ValueArray<byte> can be cast to its underlying byte[] to use as a BLOB column without the overhead of copying an array:

entity.Property<ValueArray<byte>>(x => x.Data)
    .HasColumnName("data")
    .HasConversion(model => (byte[])model, column => column);

When T is an unmanaged type, ValueArray<T> can also be marshaled to and from ReadOnlySpan<byte>. This could be used, for instance, to store an array of structs in a database as an opaque blob using their binary representation:

readonly record struct Alignment(ushort FromStart, ushort FromEnd, ushort ToStart, ushort ToEnd);

entity.Property<ValueArray<Alignment>>(q => q.AlignmentData)
    .HasConversion(
        model => ValueArray.ToByteArray(model), // Equivalent to model.AsBytes().ToArray()
        column => ValueArray.FromBytes<Alignment>(column));

Incidentally, this is how Serifu.org stores word alignment data in a local SQLite DB. I've also created a JsonConverter that uses the same technique to efficiently serialize a ValueArray<T> of structs in JSON as a base64 string (which it uses in production for storing the alignment data in Elasticsearch):

ValueArray<DateTime> dates = [ DateTime.Parse("2007-08-31"), DateTime.Parse("2007-12-27") ];

var options = new JsonSerializerOptions() { Converters = { new JsonBase64ValueArrayConverter() } };
var json = JsonSerializer.Serialize(dates, options); // "AIAeAnm5yQgAAN2OMhbKCA=="

Without a converter, a ValueArray<T> will serialize as a regular array. To deserialize a JSON array as ValueArray<T> (as System.Text.Json cannot natively deserialize to a custom readonly collection), use the JsonValueArrayConverter. Both converters have generic versions to mix-and-match for specific T's.
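As a sketch of the deserialization side, assuming JsonValueArrayConverter registers the same way as the base64 converter above:

var options = new JsonSerializerOptions() { Converters = { new JsonValueArrayConverter() } };
ValueArray<string> artists = JsonSerializer.Deserialize<ValueArray<string>>(
    """["samfree", "Kagamine Rin", "Hatsune Miku"]""", options);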

IO

TemporaryFileProvider

Provides a number of advantages for working with temp files over Path.GetTempFileName():

  • Unlike GetTempFileName(), it's possible to specify a file extension or suffix, which may be necessary when passing the file path to certain programs (and unlike common solutions on Stack Overflow, it guarantees that the file name is unique / avoids race conditions);
  • Temp files are stored in an application-specific directory which is removed if empty when the application quits;
  • The TemporaryFile can be placed in a using which will automatically clean up the temp file when disposed;
  • TemporaryFile doesn't maintain an open handle to the file, which allows for the file path to be passed to other programs like ffmpeg which may overwrite or replace the file (and expect it to not be in use);
  • 👉 Most importantly: TemporaryFile keeps a ref count and only deletes the file once it and all streams have been disposed, which means a method can return a FileStream backed by the temp file and not have to worry about cleanup, vastly simplifying common error handling patterns such as:
public async Task<Stream> ConvertToOpus(Stream inputStream, CancellationToken cancellationToken)
{
    using TemporaryFile inputFile = tempFileProvider.Create();
    await inputFile.CopyFromAsync(inputStream);

    using TemporaryFile outputFile = tempFileProvider.Create(".opus");

    await FFMpegArguments
        .FromFileInput(inputFile.Path)
        .OutputToFile(outputFile.Path, overwrite: true, options => options
            .WithAudioBitrate(Bitrate))
        .CancellableThrough(cancellationToken)
        .ProcessAsynchronously();

    // If ffmpeg throws, both temp files will be deleted.
    // 
    // If it succeeds, the input file is deleted, but the output file remains on
    // disk until the returned stream is closed, at which point the remaining
    // temp file will be cleaned up automatically.
    return outputFile.OpenRead();
}

ITemporaryFileProvider is added to the service container like so:

services.AddTemporaryFileProvider();

Or you can construct a TemporaryFileProvider yourself if not using DI. The temp directory and base filename format (a GUID by default) can be changed via the constructor's options (see its overloads).
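A minimal non-DI sketch using only the members shown above (RunSomeTool and audioBytes are placeholders for your own code):

var tempFiles = new TemporaryFileProvider();

using (TemporaryFile file = tempFiles.Create(".wav"))
{
    File.WriteAllBytes(file.Path, audioBytes);
    RunSomeTool(file.Path);
} // Temp file is deleted here (no streams were opened from it)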

Http

RateLimitingHttpHandler

A DelegatingHandler that uses System.Threading.RateLimiting to force requests to the same host to wait for a configured period of time since the last request completed before sending a new request:

// Add the rate limiter to all HttpClients
builder.Services.ConfigureHttpClientDefaults(builder => builder.AddRateLimiting());

// In libraries, consider adding it only to your own named or typed client; the
// rate limit won't stack even if the top-level project adds it to all clients
builder.Services.AddHttpClient("foo").AddRateLimiting();

// Alternatively, if not using DI
using RateLimitingHttpHandlerFactory rateLimiterFactory = new();
RateLimitingHttpHandler rateLimiter = rateLimiterFactory.CreateHandler();
rateLimiter.InnerHandler = new HttpClientHandler();
HttpClient client = new(rateLimiter);

When using DI, the per-host rate limit is shared across all named clients. This avoids accidentally hitting a host more frequently than intended simply because the code happens to use multiple clients.

To change the default time between requests or set different rate limits per host:

builder.Services.Configure<RateLimitingHttpHandlerOptions>(options =>
{
    // Setting it to null disables rate limiting by default; can also leave rate
    // limiting on by default and disable it for specific hosts instead.
    // Libraries that need to enforce a particular rate limit to their APIs
    // should avoid relying on the global TimeBetweenRequests.
    options.TimeBetweenRequests = null;
    options.TimeBetweenRequestsByHost.Add("example.com", TimeSpan.FromSeconds(5));
});

Note that the timer starts after the response has been received and returned to the caller, not before sending the request. Otherwise, slow responses and network latency could result in requests exhibiting effectively no rate limit.

Run the sample ConsoleApp for a demo.

Logging

BeginTimedOperation

A small extension method inspired by SerilogMetrics, which I've used on a number of projects in the past:

using (logger.BeginTimedOperation(nameof(DoStuff)))
{
    logger.Debug("Doing stuff...");
}
// [12:00:00 INF] DoStuff: Starting
// [12:00:00 DBG] Doing stuff...
// [12:00:01 INF] DoStuff: Completed in 39 ms
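The package's implementation isn't reproduced here, but the general pattern behind such an extension is a disposable that logs once on creation and again, with the elapsed time, on dispose. A self-contained sketch of the idea (using a plain Action<string> as a stand-in for the Serilog ILogger the real extension targets):

```csharp
using System;
using System.Diagnostics;

// Stand-in for logger.BeginTimedOperation(name): logs "Starting" now,
// "Completed in N ms" when disposed.
sealed class TimedOperation : IDisposable
{
    private readonly string name;
    private readonly Action<string> log;
    private readonly Stopwatch stopwatch = Stopwatch.StartNew();

    public TimedOperation(string name, Action<string> log)
    {
        this.name = name;
        this.log = log;
        log($"{name}: Starting");
    }

    public void Dispose() =>
        log($"{name}: Completed in {stopwatch.ElapsedMilliseconds} ms");
}

class Demo
{
    static void Main()
    {
        using (new TimedOperation("DoStuff", Console.WriteLine))
        {
            Console.WriteLine("Doing stuff...");
        }
    }
}
```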

Utilities

TerminalProgressBar

Sends ANSI escape codes to display a progress bar in the terminal and clear it automatically when disposed:

using var progress = new TerminalProgressBar();

for (int i = 0; i < foos.Count; i++)
{
    logger.Information("Foo {Foo} of {TotalFoos}", i + 1, foos.Count);
    progress.SetProgress(i, foos.Count);

    await fooService.DoStuff(foos[i]);
}
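The escape codes in question are presumably the ConEmu-originated "OSC 9;4" progress sequence, which Windows Terminal also supports. A self-contained sketch of the mechanism (not the package's actual implementation):

```csharp
using System;

static class TerminalProgress
{
    // OSC 9;4 — ESC ] 9 ; 4 ; <state> ; <percent> BEL
    // state 1 = show progress (0-100), state 0 = clear.
    public static string Set(int current, int total) =>
        $"\u001b]9;4;1;{(int)(100.0 * current / total)}\a";

    public static string Clear() => "\u001b]9;4;0;0\a";
}

class Demo
{
    static void Main()
    {
        Console.Write(TerminalProgress.Set(3, 10)); // 30%
        Console.Write(TerminalProgress.Clear());
    }
}
```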

EntityFramework

Update<T>(this DbSet<T> set, T entity, T valuesFrom)

Allows for replacing an existing entity with a new instance, correctly transferring both regular property values and navigation properties, since EF will throw if you try to pass a detached entity to Update() while another instance with the same primary key is tracked (e.g. by another query performed elsewhere):

var existingEntities = await db.Foos.ToDictionaryAsync(f => f.Id);

foreach (var entity in entities)
{
    if (existingEntities.Remove(entity.Id, out var existingEntity))
    {
        db.Foos.Update(existingEntity, entity);
    }
    else
    {
        db.Foos.Add(entity);
    }
}

db.Foos.RemoveRange(existingEntities.Values);
await db.SaveChangesAsync();

If it makes sense for your application, consider using IDbContextFactory instead and creating a new context for each unit of work. Doing so can avoid the sort of "spooky action at a distance" (other parts of the code affecting the state of the change tracker unpredictably) that makes doing something like this necessary.
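A sketch of that alternative using EF Core's IDbContextFactory (FooContext as in the earlier examples):

// Registration:
builder.Services.AddDbContextFactory<FooContext>();

// Each unit of work gets a fresh context and an empty change tracker:
using (FooContext db = dbContextFactory.CreateDbContext())
{
    db.Foos.Add(new Foo());
    await db.SaveChangesAsync();
} // Context disposed; no tracked state leaks into the next operation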

ToHashSetAsync<T>()

⚠️ Removed in v2.0.0, as this was made official in EF 9. Older projects can copy the method from here.

Mirrors ToArrayAsync and ToListAsync. Implemented using await foreach, like the other two, making it slightly more performant than calling ToListAsync followed by ToHashSet:

HashSet<string> referencedFiles = await db.Foos
    .Select(f => f.FilePath)
    .ToHashSetAsync(StringComparer.OrdinalIgnoreCase);

foreach (var file in Directory.EnumerateFiles(dir))
{
    if (!referencedFiles.Contains(file))
    {
        logger.Warning("Deleting orphaned file {Path}", file);
        File.Delete(file);
    }
}
Target frameworks: net9.0 (compatible). Computed: the net9.0 and net10.0 platform variants (android, browser, ios, maccatalyst, macos, tvos, windows) and net10.0.


Version history: 2.0.0 — 197 downloads — last updated 12/22/2025