Soenneker.SemanticKernel.Pool
3.0.15
A high-performance, thread-safe pool implementation for Microsoft Semantic Kernel instances with built-in rate limiting capabilities.
Features
- Kernel Pooling: Efficiently manages and reuses Semantic Kernel instances
- Rate Limiting: Built-in support for request rate limiting at multiple time windows:
  - Per-second rate limiting
  - Per-minute rate limiting
  - Per-day rate limiting
  - Token-based rate limiting
- Thread Safety: Fully thread-safe implementation using concurrent collections
- Async Support: Modern async/await patterns throughout the codebase
- Flexible Configuration: Configurable rate limits and pool settings
- Resource Management: Automatic cleanup of expired rate limit windows
Installation
dotnet add package Soenneker.SemanticKernel.Pool
services.AddSemanticKernelPoolAsSingleton()
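If you are not in an ASP.NET Core app, the same registration works in a generic host. A minimal sketch (assumes the package is referenced; `ISemanticKernelPool` is the interface the pool registers):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// Registers ISemanticKernelPool as a singleton in the container
builder.Services.AddSemanticKernelPoolAsSingleton();

using IHost app = builder.Build();

// The pool is now resolvable anywhere in the container
var pool = app.Services.GetRequiredService<ISemanticKernelPool>();
```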
Extension Packages
This library has several extension packages for different AI providers:
- Soenneker.SemanticKernel.Pool.Gemini - Google Gemini integration
- Soenneker.SemanticKernel.Pool.OpenAi - OpenAI/OpenRouter.ai/etc integration
- Soenneker.SemanticKernel.Pool.Ollama - Ollama integration
- Soenneker.SemanticKernel.Pool.OpenAi.Azure - Azure OpenAI integration
Usage
Startup Configuration
// In Program.cs or Startup.cs
public class Program
{
    public static async Task Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add the kernel pool as a singleton
        builder.Services.AddSemanticKernelPoolAsSingleton();

        var app = builder.Build();

        // Register kernels during startup
        var kernelPool = app.Services.GetRequiredService<ISemanticKernelPool>();

        // Manually create options, or use one of the extensions mentioned above
        var options = new SemanticKernelOptions
        {
            ApiKey = "your-api-key",
            Endpoint = "https://api.openai.com/v1",
            ModelId = "gpt-4",
            KernelFactory = async (opts, _) =>
            {
                return Kernel.CreateBuilder()
                             .AddOpenAIChatCompletion(modelId: opts.ModelId!,
                                 new OpenAIClient(new ApiKeyCredential(opts.ApiKey),
                                     new OpenAIClientOptions { Endpoint = new Uri(opts.Endpoint) }));
            },

            // Rate limiting
            RequestsPerSecond = 10,
            RequestsPerMinute = 100,
            RequestsPerDay = 1000,
            TokensPerDay = 10000
        };

        await kernelPool.Register("my-kernel", options);

        // Add more registrations... order matters!

        await app.RunAsync();
    }
}
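Because availability is checked in registration order, registering more than one kernel gives you a simple failover chain: when the first kernel exhausts a rate-limit window, the pool falls through to the next. A hedged sketch using the same `Register` API shown above (key names, models, and limits are illustrative):

```csharp
// Illustrative only: a primary and a fallback registration.
// When "primary" exhausts its per-minute quota, the pool
// falls through to "fallback" (registration order = preference).
var primary = new SemanticKernelOptions
{
    ApiKey = "primary-key",
    Endpoint = "https://api.openai.com/v1",
    ModelId = "gpt-4",
    // KernelFactory omitted for brevity; same shape as the factory shown above
    RequestsPerMinute = 100
};

var fallback = new SemanticKernelOptions
{
    ApiKey = "fallback-key",
    Endpoint = "https://api.openai.com/v1",
    ModelId = "gpt-3.5-turbo",
    // KernelFactory omitted for brevity
    RequestsPerMinute = 500
};

await kernelPool.Register("primary", primary);
await kernelPool.Register("fallback", fallback);
```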
Using the Pool
public class MyService
{
    private readonly ISemanticKernelPool _kernelPool;

    public MyService(ISemanticKernelPool kernelPool)
    {
        _kernelPool = kernelPool;
    }

    public async Task ProcessAsync()
    {
        // Get an available kernel that's within its rate limits, preferring the first registered
        var (kernel, entry) = await _kernelPool.GetAvailableKernel();

        // Get the chat completion service
        var chatCompletionService = kernel.GetService<IChatCompletionService>();

        // Create a chat history
        var chatHistory = new ChatHistory();
        chatHistory.AddMessage(AuthorRole.User, "What is the capital of France?");

        // Execute chat completion
        var response = await chatCompletionService.GetChatMessageContentAsync(chatHistory);
        Console.WriteLine($"Response: {response.Content}");

        // Access rate limit information through the entry
        var remainingQuota = await entry.RemainingQuota();
        Console.WriteLine($"Remaining requests - Second: {remainingQuota.Second}, Minute: {remainingQuota.Minute}, Day: {remainingQuota.Day}");
    }
}
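The `entry` returned alongside the kernel can also be used to gate work before sending it. A sketch building on the `RemainingQuota()` call shown above (the batch-size threshold is illustrative, not part of the library):

```csharp
// Sketch: check the remaining daily quota before dispatching a large batch.
public async Task<bool> CanProcessBatchAsync(int batchSize)
{
    var (_, entry) = await _kernelPool.GetAvailableKernel();
    var quota = await entry.RemainingQuota();

    // Only proceed if the day window can absorb the whole batch
    return quota.Day >= batchSize;
}
```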
Compatibility

Product | Compatible and computed target frameworks |
---|---|
.NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |

Dependencies (net9.0)
- Soenneker.SemanticKernel.Cache (>= 3.0.422)
NuGet packages (4)

Showing the top 4 NuGet packages that depend on Soenneker.SemanticKernel.Pool:

Package | Description |
---|---|
Soenneker.SemanticKernel.Pool.Ollama | Provides Ollama-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel. |
Soenneker.SemanticKernel.Pool.OpenAi.Azure | Provides Azure OpenAI-specific registration extensions for KernelPoolManager. |
Soenneker.SemanticKernel.Pool.Gemini | Provides Gemini-specific registration extensions for KernelPoolManager. |
Soenneker.SemanticKernel.Pool.OpenAi | Provides OpenAI-specific registration extensions for KernelPoolManager. |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
3.0.15 | 51 | 5/20/2025 |
3.0.14 | 54 | 5/19/2025 |
3.0.13 | 57 | 5/19/2025 |
3.0.12 | 31 | 5/19/2025 |
3.0.11 | 32 | 5/19/2025 |
3.0.10 | 49 | 5/19/2025 |
3.0.9 | 36 | 5/19/2025 |
3.0.8 | 82 | 5/19/2025 |
3.0.7 | 31 | 5/18/2025 |
3.0.6 | 42 | 5/18/2025 |
3.0.5 | 43 | 5/18/2025 |
3.0.4 | 26 | 5/18/2025 |
3.0.3 | 29 | 5/18/2025 |
3.0.2 | 29 | 5/18/2025 |
3.0.1 | 27 | 5/18/2025 |