AIHelperLibrary 1.0.0
Overview
The AIHelperLibrary is a modular and reusable .NET library for integrating with OpenAI's API. Designed for flexibility and efficiency, it supports creating dynamic prompts, managing chatbot sessions, and generating AI-driven responses with ease. Whether you're building simple AI integrations or complex, context-aware chatbot systems, this library provides the tools to streamline development.
Features
Core Functionality
OpenAI Client:
- Supports models such as GPT-3.5-Turbo, GPT-4, and GPT-4o.
- Configurable retry logic, proxy support, and error handling.
Dynamic Prompt Management:
- Define, store, and reuse custom prompts programmatically.
- Contextual system prompts to customize AI behavior.
Chatbot Session Management:
- Maintain multi-turn conversations with persistent context.
- Configurable chat history size for memory management.
Advanced Configuration:
- Fine-tune responses using temperature, token limits, and sampling parameters.
- Proxy support for restricted environments.
Additional Features
- Built-in Retry Logic: Robust retry mechanism with configurable delay.
- Logging Support: Optional logging of requests and responses for debugging.
Installation
Install the package via NuGet:
Install-Package AIHelperLibrary -Version 1.0.0
Or via the .NET CLI:
dotnet add package AIHelperLibrary --version 1.0.0
Getting Started
Configuration Example
Define your preferences using AIExtensionHelperConfiguration:
using AIHelperLibrary.Configurations;
using AIHelperLibrary.Services;
var config = new AIExtensionHelperConfiguration
{
    DefaultModel = AIModel.GPT_4,
    MaxTokens = 200,
    Temperature = 0.7,
    TopP = 1.0,
    RequestTimeoutMs = 10000,
    EnableLogging = true
};
var client = new OpenAIClient("your-api-key", config);
Generate AI Responses
Custom Prompt
Send a freeform prompt to the AI:
var response = await client.GenerateTextAsync("Explain the difference between AI and machine learning.");
Console.WriteLine(response);
Predefined Prompts
Use built-in templates for common tasks:
var predefinedPrompt = PromptManager.GetPrompt(PromptType.Summarize);
var summary = await client.GenerateTextWithPredefinedPromptAsync(predefinedPrompt, "Artificial intelligence is transforming the world...");
Console.WriteLine(summary);
Dynamic Prompts
Define and reuse custom prompts programmatically:
var promptManager = new DynamicPromptManager();
promptManager.AddPrompt("Greeting", "Greet the user warmly and politely.");
var response = await client.GenerateTextWithDynamicPromptAsync(promptManager, "Greeting", "Hi there!");
Console.WriteLine(response);
Chatbot Sessions
Maintain multi-turn conversations with persistent context:
var chatResponse = await client.GenerateChatResponseAsync(
    "CustomerSupportBot",
    "What are your working hours?",
    "You are a polite and helpful customer support assistant."
);
Console.WriteLine(chatResponse);
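Because the first argument names the session, a follow-up call with the same key should continue the earlier conversation. A minimal sketch, assuming `GenerateChatResponseAsync` keeps per-session history as described above (the question text here is illustrative):

```csharp
// Follow-up turn in the same session: reusing the "CustomerSupportBot" key
// is assumed to attach this call to the earlier conversation history.
var followUp = await client.GenerateChatResponseAsync(
    "CustomerSupportBot",
    "And are you open on weekends?",
    "You are a polite and helpful customer support assistant."
);
Console.WriteLine(followUp);
```

The configured chat history size (see Advanced Configuration) bounds how many prior turns are retained per session.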
Supported Models
General Purpose Models
- GPT-3.5-Turbo
- GPT-3.5-Turbo-16k
- GPT-4
- GPT-4-Turbo
Optimized Models
- GPT-4o
- GPT-4o-Mini
- GPT-o1
- GPT-o1-Mini
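Any of these models can be selected at configuration time through `DefaultModel`. A hedged sketch, assuming the `AIModel` enum exposes members mirroring the names above (only `AIModel.GPT_4` appears verbatim elsewhere in this README, so check the enum for the exact member names):

```csharp
// Selecting an optimized model; AIModel.GPT_4o_Mini is an assumed enum
// member corresponding to the "GPT-4o-Mini" entry above.
var config = new AIExtensionHelperConfiguration
{
    DefaultModel = AIModel.GPT_4o_Mini
};
```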
Advanced Configuration
| Property | Description | Default Value |
|---|---|---|
| DefaultModel | The AI model to use. | GPT-3.5-Turbo |
| MaxTokens | Maximum tokens for each response. | 150 |
| Temperature | Controls creativity (0.0 = deterministic). | 0.7 |
| TopP | Sampling parameter for diversity. | 1.0 |
| RequestTimeoutMs | Timeout for API requests (in milliseconds). | 10000 |
| EnableLogging | Logs requests and responses when enabled. | false |
| MaxRetryCount | Maximum retry attempts for failed requests. | 3 |
| RetryDelayMs | Delay between retries (in milliseconds). | 2000 |
| ProxyUrl | URL for proxy configuration. | (empty) |
| ProxyPort | Port for the proxy server. | 0 |
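The retry and proxy settings above combine in the same configuration object used in Getting Started. A sketch assuming the property names listed in the table; the proxy address is a hypothetical placeholder:

```csharp
var config = new AIExtensionHelperConfiguration
{
    DefaultModel = AIModel.GPT_4,
    MaxTokens = 150,
    MaxRetryCount = 3,      // retry failed requests up to 3 times
    RetryDelayMs = 2000,    // wait 2 seconds between attempts
    ProxyUrl = "http://proxy.example.internal",  // hypothetical proxy address
    ProxyPort = 8080
};
var client = new OpenAIClient("your-api-key", config);
```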
Contribution
Contributions are welcome! To contribute:
- Fork the repository.
- Create a feature branch.
- Submit a pull request.
License
This library is open-source and licensed under the MIT License.
Contact
For support or inquiries, reach out via GitHub or email at ns.dev.contact@gmail.com.
Compatibility
- .NET: net8.0 is compatible; net9.0, net10.0, and platform-specific targets (android, ios, maccatalyst, macos, tvos, windows, browser) are computed.
Dependencies (net8.0)
- Newtonsoft.Json (>= 13.0.3)