# LiterLlm 1.0.0
<div align="center" style="display: flex; flex-wrap: wrap; gap: 8px; justify-content: center; margin: 20px 0;">
<a href="https://crates.io/crates/liter-llm"> <img src="https://img.shields.io/crates/v/liter-llm?label=Rust&color=007ec6" alt="Rust"> </a> <a href="https://pypi.org/project/liter-llm/"> <img src="https://img.shields.io/pypi/v/liter-llm?label=Python&color=007ec6" alt="Python"> </a> <a href="https://www.npmjs.com/package/@kreuzberg/liter-llm"> <img src="https://img.shields.io/npm/v/@kreuzberg/liter-llm?label=Node.js&color=007ec6" alt="Node.js"> </a> <a href="https://www.npmjs.com/package/@kreuzberg/liter-llm-wasm"> <img src="https://img.shields.io/npm/v/@kreuzberg/liter-llm-wasm?label=WASM&color=007ec6" alt="WASM"> </a> <a href="https://central.sonatype.com/artifact/dev.kreuzberg/liter-llm"> <img src="https://img.shields.io/maven-central/v/dev.kreuzberg/liter-llm?label=Java&color=007ec6" alt="Java"> </a> <a href="https://github.com/kreuzberg-dev/liter-llm/tree/main/packages/go"> <img src="https://img.shields.io/github/v/tag/kreuzberg-dev/liter-llm?label=Go&color=007ec6" alt="Go"> </a> <a href="https://www.nuget.org/packages/LiterLlm"> <img src="https://img.shields.io/nuget/v/LiterLlm?label=C%23&color=007ec6" alt="C#"> </a> <a href="https://packagist.org/packages/kreuzberg/liter-llm"> <img src="https://img.shields.io/packagist/v/kreuzberg/liter-llm?label=PHP&color=007ec6" alt="PHP"> </a> <a href="https://rubygems.org/gems/liter_llm"> <img src="https://img.shields.io/gem/v/liter_llm?label=Ruby&color=007ec6" alt="Ruby"> </a> <a href="https://hex.pm/packages/liter_llm"> <img src="https://img.shields.io/hexpm/v/liter_llm?label=Elixir&color=007ec6" alt="Elixir"> </a> <a href="https://github.com/kreuzberg-dev/liter-llm/pkgs/container/liter-llm"> <img src="https://img.shields.io/badge/Docker-007ec6?logo=docker&logoColor=white" alt="Docker"> </a> <a href="https://github.com/kreuzberg-dev/liter-llm/tree/main/crates/liter-llm-ffi"> <img src="https://img.shields.io/badge/C-FFI-007ec6" alt="C FFI"> </a>
<a href="https://github.com/kreuzberg-dev/liter-llm/blob/main/LICENSE"> <img src="https://img.shields.io/badge/License-MIT-007ec6" alt="License"> </a> <a href="https://docs.liter-llm.kreuzberg.dev"> <img src="https://img.shields.io/badge/docs-kreuzberg.dev-007ec6" alt="Docs"> </a> </div>
<div align="center" style="margin: 20px 0;"> <picture> <img width="100%" alt="kreuzberg.dev" src="https://github.com/user-attachments/assets/1b6c6ad7-3b6d-4171-b1c9-f2026cc9deb8" /> </picture> </div>
<div align="center" style="margin-bottom: 20px;"> <a href="https://discord.gg/xt9WY3GnKR"> <img height="22" src="https://img.shields.io/badge/Discord-Join%20our%20community-7289da?logo=discord&logoColor=white" alt="Discord"> </a> </div>
Universal LLM API client for .NET. Access 142+ LLM providers through a single type-safe interface with full async/await support and .NET 8.0+ compatibility.
## Installation

### Package Installation

Install via NuGet:

```shell
dotnet add package LiterLlm
```

Or via the NuGet Package Manager console:

```powershell
Install-Package LiterLlm
```
### System Requirements

- .NET 8.0+ required
- API keys supplied via environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`)
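The environment variables above can be exported in your shell before launching the application; a minimal sketch with placeholder values (substitute your real provider credentials):

```shell
# Placeholder keys shown; replace with your actual provider credentials.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```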
## Quick Start

### Basic Chat

Send a message to any provider using the `provider/model` prefix:

```csharp
using LiterLlm;

await using var client = new LlmClient(
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);

var response = await client.ChatAsync(new ChatCompletionRequest(
    Model: "openai/gpt-4o",
    Messages: [new UserMessage("Hello!")]
));

Console.WriteLine(response.Choices[0].Message.Content);
```
## Common Use Cases

### Streaming Responses

Stream tokens in real time:

```csharp
using LiterLlm;

await using var client = new LlmClient(
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);

var request = new ChatCompletionRequest(
    Model: "openai/gpt-4o-mini",
    Messages: [new UserMessage("Hello")]
);

await foreach (var chunk in client.ChatStreamAsync(request))
{
    Console.WriteLine(chunk);
}
```
## Next Steps
- Provider Registry - Full list of supported providers
- GitHub Repository - Source, issues, and discussions
## Features

### Supported Providers (142+)

Route to any provider using the `provider/model` prefix convention:
| Provider | Example Model |
|---|---|
| OpenAI | openai/gpt-4o, openai/gpt-4o-mini |
| Anthropic | anthropic/claude-3-5-sonnet-20241022 |
| Groq | groq/llama-3.1-70b-versatile |
| Mistral | mistral/mistral-large-latest |
| Cohere | cohere/command-r-plus |
| Together AI | together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo |
| Fireworks | fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct |
| Google Vertex | vertexai/gemini-1.5-pro |
| Amazon Bedrock | bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0 |
### Key Capabilities

- Provider Routing -- Single client for 142+ LLM providers via the `provider/model` prefix
- Unified API -- Consistent `chat`, `chat_stream`, `embeddings`, `list_models` interface
- Streaming -- Real-time token streaming via `chat_stream`
- Tool Calling -- Function calling and tool use across all supporting providers
- Type Safe -- Schema-driven types compiled from JSON schemas
- Secure -- API keys never logged or serialized, managed via environment variables
- Observability -- Built-in OpenTelemetry with GenAI semantic conventions
- Error Handling -- Structured errors with provider context and retry hints
### Performance
Built on a compiled Rust core for speed and safety:
- Provider resolution at client construction -- zero per-request overhead
- Configurable timeouts and connection pooling
- Zero-copy streaming with SSE and AWS EventStream support
- API keys wrapped in secure memory, zeroed on drop
### Provider Routing

Route to 142+ providers using the `provider/model` prefix convention:

```
openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest
```

See the provider registry for the full list.
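To make the convention concrete: the segment before the first `/` selects the provider, and everything after it is passed through as the provider-specific model id. The snippet below only illustrates the naming scheme with local string parsing; `LlmClient` performs its own prefix resolution internally.

```csharp
// Illustration of the provider/model convention only; LlmClient resolves
// the prefix itself. Note the model id may itself contain slashes, so the
// split is limited to two parts.
var model = "together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo";
var parts = model.Split('/', 2);
Console.WriteLine($"provider={parts[0]}");
Console.WriteLine($"model={parts[1]}");
```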
## Documentation

- Documentation -- Full docs and API reference
- GitHub Repository -- Source, issues, and discussions
- Provider Registry -- 142+ supported providers
Part of kreuzberg.dev.
## Contributing
Contributions are welcome! See CONTRIBUTING.md for guidelines.
Join our Discord community for questions and discussion.
## License
MIT -- see LICENSE for details.