mostlylucid.mockllmapi
A lightweight ASP.NET Core middleware for generating realistic mock API responses using local LLMs (via Ollama). Add intelligent mock endpoints to any project with just 2 lines of code!
Features
- Super Simple: AddLLMockApi() + MapLLMockApi("/api/mock") = instant mock API
- Configurable: appsettings.json or inline configuration
- Shape Control: Specify the exact JSON structure via header, query param, or request body
- Real-time Streaming: Server-Sent Events (SSE) support with progressive JSON generation
- Highly Variable Data: Each request generates completely different realistic data
- All HTTP Methods: Supports GET, POST, PUT, DELETE, PATCH
- Wildcard Routing: Any path under your chosen endpoint works
- NuGet Package: Easy to add to existing projects
Quick Start
Installation
dotnet add package mostlylucid.mockllmapi
Prerequisites
- Install .NET 8.0 SDK or later
- Install Ollama and pull a model:
ollama pull llama3
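Before wiring anything up, it can help to confirm Ollama is reachable. A minimal C# sketch (assuming Ollama's default port 11434) that lists the locally installed models via its /api/tags endpoint:
using System.Net.Http;
// GET /api/tags lists the models installed in the local Ollama instance.
using var http = new HttpClient();
var models = await http.GetStringAsync("http://localhost:11434/api/tags");
Console.WriteLine(models); // should mention the model you pulled, e.g. llama3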
Basic Usage
Program.cs:
using mostlylucid.mockllmapi;
var builder = WebApplication.CreateBuilder(args);
// Add LLMock API services
builder.Services.AddLLMockApi(builder.Configuration);
var app = builder.Build();
// Map mock endpoints at /api/mock
app.MapLLMockApi("/api/mock");
app.Run();
appsettings.json:
{
"mostlylucid.mockllmapi": {
"BaseUrl": "http://localhost:11434/v1/",
"ModelName": "llama3",
"Temperature": 1.2
}
}
That's it! Now all requests to /api/mock/** return intelligent mock data.
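Any HTTP client can exercise the mock endpoints from here. A minimal sketch (assuming the app listens on http://localhost:5000, as in the curl examples below):
using System.Net.Http;
using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };
// Any path under /api/mock works thanks to wildcard routing.
var json = await client.GetStringAsync("/api/mock/users?limit=5");
Console.WriteLine(json);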
Configuration Options
Via appsettings.json (Recommended)
{
"mostlylucid.mockllmapi": {
"BaseUrl": "http://localhost:11434/v1/",
"ModelName": "llama3",
"Temperature": 1.2,
"TimeoutSeconds": 30,
"EnableVerboseLogging": false,
"CustomPromptTemplate": null
}
}
Via Code
builder.Services.AddLLMockApi(options =>
{
options.BaseUrl = "http://localhost:11434/v1/";
options.ModelName = "mixtral";
options.Temperature = 1.5;
options.TimeoutSeconds = 60;
});
Custom Endpoint Patterns
// Default: /api/mock/** and /api/mock/stream/**
app.MapLLMockApi("/api/mock");
// Custom pattern
app.MapLLMockApi("/demo");
// Creates: /demo/** and /demo/stream/**
// Without streaming
app.MapLLMockApi("/api/mock", includeStreaming: false);
Usage Examples
Basic Request
curl http://localhost:5000/api/mock/users?limit=5
Returns realistic user data generated by the LLM.
With Shape Control
curl -X POST http://localhost:5000/api/mock/orders \
-H "X-Response-Shape: {\"orderId\":\"string\",\"total\":0.0,\"items\":[{\"sku\":\"string\",\"qty\":0}]}" \
-H "Content-Type: application/json" \
-d '{"customerId":"cus_123"}'
LLM generates data matching your exact shape specification.
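The same request from C#, for reference; a sketch assuming the app runs on localhost:5000, passing the shape via the X-Response-Shape header exactly as in the curl call above:
using System.Net.Http;
using System.Text;
using var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:5000/api/mock/orders")
{
    Content = new StringContent("""{"customerId":"cus_123"}""", Encoding.UTF8, "application/json")
};
// Constrain the generated JSON to the exact structure we want back.
request.Headers.Add("X-Response-Shape",
    """{"orderId":"string","total":0.0,"items":[{"sku":"string","qty":0}]}""");
var response = await client.SendAsync(request);
Console.WriteLine(await response.Content.ReadAsStringAsync());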
Streaming
curl -N http://localhost:5000/api/mock/stream/products?category=electronics \
-H "Accept: text/event-stream"
Returns Server-Sent Events as JSON is generated:
data: {"chunk":"{","done":false}
data: {"chunk":"\"id\"","done":false}
...
data: {"content":"{full json here}","done":true}
Advanced Features
Custom Prompt Templates
Override the default prompts with your own:
{
"mostlylucid.mockllmapi": {
"CustomPromptTemplate": "Generate mock data for {method} {path}. Body: {body}. Use seed: {randomSeed}"
}
}
Available placeholders:
- {method} - HTTP method (GET, POST, etc.)
- {path} - Full request path with query string
- {body} - Request body
- {randomSeed} - Generated random seed (GUID)
- {timestamp} - Unix timestamp
- {shape} - Shape specification (if provided)
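The template can also be set inline through the code-based configuration shown earlier; a sketch using the same CustomPromptTemplate option:
builder.Services.AddLLMockApi(options =>
{
    // Placeholders are substituted per request (see the list above).
    options.CustomPromptTemplate =
        "Generate mock data for {method} {path}. Body: {body}. Use seed: {randomSeed}";
});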
Multiple Instances
Mount multiple mock APIs with different configurations:
// Development data with high randomness
builder.Services.AddLLMockApi("Dev", options =>
{
options.Temperature = 1.5;
options.ModelName = "llama3";
});
// Stable test data
builder.Services.AddLLMockApi("Test", options =>
{
options.Temperature = 0.3;
options.ModelName = "llama3";
});
app.MapLLMockApi("/api/dev");
app.MapLLMockApi("/api/test");
Shape Specification
Three ways to control response structure:
- Header (recommended): X-Response-Shape: {"field":"type"}
- Query param: ?shape=%7B%22field%22%3A%22type%22%7D (URL-encoded JSON; see the sketch below)
- Body field: {"shape": {...}, "actualData": ...}
Testing
Use the included LLMApi.http file with:
- Visual Studio / Rider HTTP client
- VS Code REST Client extension
- Any HTTP client
Unit Tests
The project includes comprehensive unit tests:
# Run all tests
dotnet test
# Run with detailed output
dotnet test --verbosity detailed
Test Coverage:
- Body reading (empty, JSON content)
- Shape extraction (query param, header, body field, precedence)
- Prompt generation (randomness, shape inclusion, streaming modes)
- Request building (temperature, model, messages)
- Edge cases (invalid JSON, missing data)
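The shipped suite exercises the helpers directly; in your own projects you can also check behaviour over the public HTTP surface. A hypothetical integration-style sketch using WebApplicationFactory from Microsoft.AspNetCore.Mvc.Testing (test and type names are illustrative, not part of the package, and a running Ollama instance is assumed):
using System.Net.Http;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class ShapePrecedenceTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly HttpClient _client;

    public ShapePrecedenceTests(WebApplicationFactory<Program> factory)
        => _client = factory.CreateClient();

    [Fact]
    public async Task QueryShape_IsAcceptedAlongsideHeaderShape()
    {
        // Per the shape-control flow below, the query param is checked first.
        var url = "/api/mock/items?shape=" + Uri.EscapeDataString("""{"id":"string"}""");
        using var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Add("X-Response-Shape", """{"name":"string"}""");

        var response = await _client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}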
Architecture
graph LR
Client[Client] -->|HTTP Request| API[LLMApi<br/>Minimal API]
API -->|Chat Completion| Ollama[Ollama API<br/>localhost:11434]
Ollama -->|Inference| Model[llama3 Model]
Model -->|Response| Ollama
Ollama -->|JSON/Stream| API
API -->|JSON/SSE| Client
API -.->|uses| Helper[AutoApiHelper]
style API fill:#4CAF50
style Helper fill:#2196F3
style Model fill:#FF9800
Request Flow
sequenceDiagram
participant C as Client
participant A as LLMApi
participant H as AutoApiHelper
participant O as Ollama
participant M as llama3
C->>A: GET/POST/PUT/DELETE /api/mock/**
A->>H: Extract context (method, path, body, shape)
H->>H: Generate random seed + timestamp
H->>H: Build prompt with randomness
H-->>A: Prompt + temperature=1.2
A->>O: POST /v1/chat/completions
O->>M: Run inference
M-->>O: Generated JSON
O-->>A: Response
A-->>C: JSON Response
Shape Control Flow
flowchart TD
Start[Request Arrives] --> CheckQuery{Shape in<br/>Query Param?}
CheckQuery -->|Yes| UseQuery[Use Query Shape]
CheckQuery -->|No| CheckHeader{Shape in<br/>Header?}
CheckHeader -->|Yes| UseHeader[Use Header Shape]
CheckHeader -->|No| CheckBody{Shape in<br/>Body Field?}
CheckBody -->|Yes| UseBody[Use Body Shape]
CheckBody -->|No| NoShape[No Shape Constraint]
UseQuery --> BuildPrompt[Build Prompt]
UseHeader --> BuildPrompt
UseBody --> BuildPrompt
NoShape --> BuildPrompt
BuildPrompt --> AddRandom[Add Random Seed<br/>+ Timestamp]
AddRandom --> SendLLM[Send to LLM]
style UseQuery fill:#4CAF50
style UseHeader fill:#4CAF50
style UseBody fill:#4CAF50
style NoShape fill:#FFC107
Projects:
- mostlylucid.mockllmapi: NuGet package library
- LLMApi: Demo application
- LLMApi.Tests: xUnit test suite (13 tests)
Why Use mostlylucid.mockllmapi?
- Rapid Prototyping: Frontend development without waiting for backend
- Demos: Show realistic data flows without hardcoded fixtures
- Testing: Generate varied test data for edge cases
- API Design: Experiment with response shapes before implementing
- Learning: Example of LLM integration in .NET minimal APIs
- Zero Maintenance: No database, no state, no mock data files to maintain
Building the NuGet Package
cd mostlylucid.mockllmapi
dotnet pack -c Release
The package will be written to bin/Release/mostlylucid.mockllmapi.{version}.nupkg
Contributing
This is a sample project demonstrating LLM-powered mock APIs. Feel free to fork and customize!
License
This is free and unencumbered software released into the public domain. See LICENSE for details or visit unlicense.org.