# LlmExtract

Reliably extract and deserialize JSON from raw LLM text responses.
Every developer calling OpenAI, Anthropic, Ollama, or any LLM writes the same boilerplate to strip markdown fences, remove preamble text, fix trailing commas, and handle edge cases. LlmExtract does all of that in one call.
## The Problem
LLM responses are messy:

````
Sure! Here's the JSON you requested:

```json
{
  "name": "Alice",
  "age": 30, // trailing comma
}
```

Let me know if you need anything else!
````
Your `JsonSerializer.Deserialize<T>()` call throws. Every time.
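The hand-rolled cleanup most projects accumulate looks something like the sketch below (an illustration, not LlmExtract's actual implementation). Notice how much code it takes, and that it still breaks on trailing commas, single quotes, comments, and unquoted keys:

```csharp
using System;
using System.Text.Json;

// Typical hand-rolled cleanup: strip markdown fences, then keep everything
// between the outermost braces.
static string StripToJson(string response)
{
    var text = response.Replace("```json", "").Replace("```", "");
    int start = text.IndexOf('{');
    int end = text.LastIndexOf('}');
    if (start < 0 || end <= start)
        throw new InvalidOperationException("No JSON object found.");
    return text.Substring(start, end - start + 1);
}

var raw = "Sure! Here's the JSON:\n```json\n{ \"name\": \"Alice\", \"age\": 30 }\n```\nThanks!";
var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
var person = JsonSerializer.Deserialize<Person>(StripToJson(raw), options)!;
Console.WriteLine(person.Name); // Alice

// This version still fails on trailing commas, single quotes, JS comments,
// and unquoted keys -- the cases listed in the table further down.

record Person(string Name, int Age);
```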
## The Solution

```csharp
using LlmExtract;

// One line — handles fences, preamble, trailing text, and JSON repair
Person person = LlmJson.Extract<Person>(llmResponseText);
```
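The `Person` type in these examples isn't defined by the README; it is assumed to be any ordinary type that `System.Text.Json` can deserialize, e.g.:

```csharp
public record Person(string Name, int Age);
```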
## Installation

```shell
dotnet add package LlmExtract --version 0.1.0
```
## API
### `LlmJson.Extract<T>(text)`

Extract and deserialize in one call. Throws `ExtractionException` on failure.

```csharp
var person = LlmJson.Extract<Person>(responseText);
```
### `LlmJson.TryExtract<T>(text, out result, out error)`

Safe version — no exceptions.

```csharp
if (LlmJson.TryExtract<Person>(text, out var person, out var error))
{
    Console.WriteLine(person.Name);
}
else
{
    Console.WriteLine($"Failed: {error.Reason}");           // NoJsonFound or DeserializationFailed
    Console.WriteLine($"Extracted: {error.ExtractedJson}"); // What we tried to parse
}
```
### `LlmJson.ExtractRaw(text)`

Get the raw JSON string without deserializing.

```csharp
string? json = LlmJson.ExtractRaw(text); // null if no JSON found
```
### `LlmJson.ExtractAll<T>(text)`

Extract multiple JSON objects from one response.

```csharp
IReadOnlyList<Person> people = LlmJson.ExtractAll<Person>(multiObjectResponse);
```
### `LlmJsonAccumulator<T>` (streaming)

Accumulate streaming chunks and emit objects as they complete.

```csharp
var accumulator = new LlmJsonAccumulator<Person>();

foreach (var chunk in streamingResponse)
{
    if (accumulator.TryAdd(chunk, out var person))
        Console.WriteLine($"Got: {person.Name}");
}
```
## What It Handles

| Issue | Example | Fixed? |
|---|---|---|
| Markdown fences | `` ```json ... ``` `` | ✅ |
| Preamble text | `Sure! Here's the JSON: {...}` | ✅ |
| Trailing text | `{...} Let me know if you need more!` | ✅ |
| Trailing commas | `{"name": "Alice",}` | ✅ |
| Single quotes | `{'name': 'Alice'}` | ✅ |
| JS comments | `{"name": "Alice" // user}` | ✅ |
| Unquoted keys | `{name: "Alice"}` | ✅ |
| Case mismatch | `{"NAME": "Alice"}` | ✅ |
| Multiple objects | Two `{...}` in one response | ✅ |
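Several of those repairs can show up in a single response. A usage sketch combining them (assumes the LlmExtract package is referenced; the expected output follows from the table above):

````csharp
using System;
using LlmExtract;

// Preamble, fences, an unquoted key, single quotes, a JS comment,
// a trailing comma, and trailing chatter -- all in one response.
var messy = """
    Sure! Here's your data:
    ```json
    {
      name: 'Alice',
      "age": 30, // user record
    }
    ```
    Hope that helps!
    """;

var person = LlmJson.Extract<Person>(messy);
Console.WriteLine($"{person.Name}, {person.Age}"); // Alice, 30

public record Person(string Name, int Age);
````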
## Zero Dependencies

- Uses `System.Text.Json` (built into .NET 8+; a NuGet reference to `System.Text.Json` >= 8.0.5 for `netstandard2.0`)
- No external packages
- Multi-targets `netstandard2.0` and `net8.0`
## Works With Any LLM
OpenAI, Anthropic Claude, Google Gemini, Ollama, Azure OpenAI, Mistral, Llama.cpp — any provider that returns text responses.
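Whatever SDK or HTTP client you use, you end up holding a string. For example, Ollama's `/api/generate` endpoint wraps the model's text in a `response` field. A sketch of unwrapping it and handing the text to LlmExtract (the envelope below is a hand-written stand-in for a real server reply; assumes the LlmExtract package is referenced):

```csharp
using System;
using System.Text.Json;
using LlmExtract;

// A stand-in for the body Ollama's /api/generate returns.
var providerBody = """
    {"model":"llama3","response":"Here you go!\n```json\n{\"name\": \"Alice\", \"age\": 30,}\n```","done":true}
    """;

// Step 1: pull the model's raw text out of the provider envelope.
using var doc = JsonDocument.Parse(providerBody);
string modelText = doc.RootElement.GetProperty("response").GetString()!;

// Step 2: LlmExtract deals with the fences and the trailing comma.
var person = LlmJson.Extract<Person>(modelText);
Console.WriteLine(person.Name); // Alice

public record Person(string Name, int Age);
```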
## License

MIT