AntRunner.Chat 0.9.1

Install options:
- .NET CLI: dotnet add package AntRunner.Chat --version 0.9.1
- Package Manager: NuGet\Install-Package AntRunner.Chat -Version 0.9.1
- PackageReference: <PackageReference Include="AntRunner.Chat" Version="0.9.1" />
- Central Package Management: <PackageVersion Include="AntRunner.Chat" Version="0.9.1" /> together with <PackageReference Include="AntRunner.Chat" />
- Paket CLI: paket add AntRunner.Chat --version 0.9.1
- Script & Interactive: #r "nuget: AntRunner.Chat, 0.9.1"
- File-based apps: #:package AntRunner.Chat@0.9.1
- Cake Addin: #addin nuget:?package=AntRunner.Chat&version=0.9.1
- Cake Tool: #tool nuget:?package=AntRunner.Chat&version=0.9.1
AntRunner.Chat
AntRunner.Chat is a .NET library that lets you easily create and manage conversations with tool-based AI assistants. These AI agents can answer questions, run tools, or perform tasks, all through a simple chat interface.
What can you do with AntRunner.Chat?
- Converse naturally with AI assistants powered by OpenAI models like GPT-4o and gpt-4.1-mini.
- Manage multi-turn conversations with context.
- Switch between different AI assistants on the fly.
- Undo the last message if you want to change something.
- Save and load conversations to keep your chat history.
- Track token usage to monitor your API consumption.
Getting Started
Install the NuGet Package
You can add AntRunner.Chat to your .NET project using the NuGet package manager:
dotnet add package AntRunner.Chat --version 0.9.1
Or, add this directive in your C# notebook or script:
#r "nuget: AntRunner.Chat, 0.6.1"
Basic Usage Example
Here is a simple example showing how to start a conversation with an AI assistant and send messages:
using System;
using System.Threading.Tasks;
using AntRunner.Chat;

// Load your environment variables and AI service configuration (example method)
var envVariables = Settings.GetEnvironmentVariables();
foreach (var kvp in envVariables)
{
    Environment.SetEnvironmentVariable(kvp.Key, kvp.Value);
}

var config = AzureOpenAiConfigFactory.Get();

// Set up the chat options with the assistant name and model deployment
var chatConfiguration = new ChatRunOptions
{
    AssistantName = "Python Ants",
    DeploymentId = "gpt-4.1-mini",
};

// Create a new conversation instance
var conversation = await Conversation.Create(chatConfiguration, config);

// Convenience method to send a message and get a response
async Task<ChatRunOutput> Chat(string message)
{
    chatConfiguration.Instructions = message;
    var runnerOutput = await conversation.Chat(message);

    // DisplayAs renders the reply as Markdown in a .NET Interactive notebook;
    // in a console app, write runnerOutput.LastMessage out instead.
    runnerOutput.LastMessage.DisplayAs("text/markdown");
    return runnerOutput;
}
// Example usage: send a message
var output = await Chat("Hello AI! What can you do?");
More Features
Continue the Conversation
You can keep chatting by sending more messages:
var output = await Chat("Tell me about the prerequisites for setting up .NET 8 SDK and Docker.");
Undo the Last Message
If you want to remove the last message from the conversation:
conversation.Undo();
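For example, you can drop the last exchange and then ask a revised question through the same Chat helper defined earlier (a small sketch; the message text is illustrative):

// Sketch: remove the last exchange, then ask a revised question instead.
conversation.Undo();
var revised = await Chat("Actually, limit the answer to the Docker prerequisites.");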
Switch Assistants
Change to a different AI assistant anytime:
await conversation.ChangeAssistant("Web Ants");
var output = await Chat("What events are happening next week in my city?");
Save and Load Conversations
Save your conversation to a file:
conversation.Save("./savedConversation.json");
Load a conversation from a saved file:
var conversation2 = Conversation.Create(@"./savedConversation.json", AzureOpenAiConfigFactory.Get());
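A loaded conversation can then be continued like any other. Here is a minimal sketch that reuses the loaded instance; if this Create overload is asynchronous like the one in the basic example, await it first:

// Sketch: continue the restored conversation where it left off.
var followUp = await conversation2.Chat("Where did we leave off?");
followUp.LastMessage.DisplayAs("text/markdown");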
View Conversation and Usage Stats
See the last message or the full dialog:
var lastMessage = conversation.LastResponse.LastMessage;
var fullDialog = conversation.LastResponse.Dialog;
Check token usage for the entire conversation or a single turn:
var usage = conversation.Usage;
var lastTurnUsage = output.Usage;
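As a minimal sketch, you could log both after a turn. This simply writes the usage objects to the console and assumes their default string output is readable (the library may expose more detailed properties):

// Sketch: report per-turn and cumulative usage after a reply.
var turn = await Chat("Summarize our discussion so far.");
Console.WriteLine($"Last turn usage: {turn.Usage}");
Console.WriteLine($"Conversation usage so far: {conversation.Usage}");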
Container Sandboxes
A major feature of AntRunner.Chat is the use of container sandboxes to provide isolated environments for different AI assistants and tools. These sandboxes are implemented as Docker containers, enabling you to run services such as the .NET server, Python environments with or without CUDA support, PlantUML, and more.
Available Containers
- dotnet-server: The main .NET service container.
- python-app: Python 3.11 environment with .NET 9 SDK and optional CUDA support.
- plantuml: Container for PlantUML diagram rendering.
- qdrant: Vector search engine container.
- kernel-memory: Custom service container for kernel memory management.
Setup Guide and Build Scripts
To help you get started with these containers, we provide a comprehensive Setup Guide that covers all prerequisites and detailed instructions for building the local Docker images.
You can use the provided build scripts:
- build_local_images.sh for Linux/macOS
- build_local_images.ps1 for Windows PowerShell
These scripts will prompt you to select whether to build the CPU-only or CUDA-enabled Python images and will build all necessary images in the correct order.
Make sure to follow the setup guide to prepare your environment and build the containers before running the solution.
Summary
AntRunner.Chat makes it simple to build powerful AI chat experiences in your .NET applications. Whether you want to create helpful assistants, automate tasks, or build interactive tools, AntRunner.Chat provides an easy-to-use interface to OpenAI-powered conversations.
If you have questions or want to contribute, feel free to open an issue or pull request!
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. Computed: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows. |
Dependencies (net8.0)
- AntRunner.ToolCalling (>= 1.1.0)
- OpenAI-DotNet (>= 8.7.3)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
0.9.1 | 153 | 6/28/2025 |
0.9.0 | 189 | 6/25/2025 |
0.8.7 | 191 | 6/25/2025 |
0.8.6 | 175 | 6/25/2025 |
0.8.5 | 188 | 6/16/2025 |
0.8.4 | 194 | 5/28/2025 |
0.8.3 | 196 | 5/20/2025 |
0.8.2 | 220 | 5/19/2025 |
0.8.1 | 182 | 5/19/2025 |
0.7.1 | 279 | 5/12/2025 |
0.7.0 | 286 | 5/12/2025 |
0.6.5 | 138 | 5/10/2025 |
0.6.4 | 195 | 5/5/2025 |
0.6.3 | 197 | 5/5/2025 |
0.6.2 | 225 | 4/28/2025 |
0.6.1 | 215 | 4/28/2025 |
0.5.1 | 208 | 4/24/2025 |
Release notes (0.9.1): streaming support, intelligent model parameters so that thinking and non-thinking models can share the same definition, and many new tests.