Dewiride.Azure.AI.OpenAI.Helper 1.0.3

Prefix Reserved

.NET CLI
dotnet add package Dewiride.Azure.AI.OpenAI.Helper --version 1.0.3

Package Manager (use within the Package Manager Console in Visual Studio, as this uses the NuGet module's version of Install-Package)
NuGet\Install-Package Dewiride.Azure.AI.OpenAI.Helper -Version 1.0.3

PackageReference (for projects that support PackageReference, copy this XML node into the project file)
<PackageReference Include="Dewiride.Azure.AI.OpenAI.Helper" Version="1.0.3" />

Central Package Management (for projects that support CPM, version the package in the solution's Directory.Packages.props file and reference it from the project file)
Directory.Packages.props:
<PackageVersion Include="Dewiride.Azure.AI.OpenAI.Helper" Version="1.0.3" />
Project file:
<PackageReference Include="Dewiride.Azure.AI.OpenAI.Helper" />

Paket CLI
paket add Dewiride.Azure.AI.OpenAI.Helper --version 1.0.3

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy this into the interactive tool or the script's source to reference the package)
#r "nuget: Dewiride.Azure.AI.OpenAI.Helper, 1.0.3"

File-based apps (the #:package directive can be used in C# file-based apps starting in .NET 10 preview 4; copy this into a .cs file before any lines of code)
#:package Dewiride.Azure.AI.OpenAI.Helper@1.0.3

Cake Addin
#addin nuget:?package=Dewiride.Azure.AI.OpenAI.Helper&version=1.0.3

Cake Tool
#tool nuget:?package=Dewiride.Azure.AI.OpenAI.Helper&version=1.0.3

Azure OpenAI Helper

The Azure OpenAI Helper provides utility methods to interact with Azure OpenAI endpoints for both standard API requests and streaming responses. It includes robust retry logic and detailed logging for handling errors and timeouts.

Features

  • Non-Streaming Responses: Send a request to Azure OpenAI and receive a deserialized response.
  • Streaming Responses: Stream responses from Azure OpenAI, useful for large or dynamic outputs.
  • Retry Logic: Automatically retries requests on failure with configurable retry count and delay.
  • Logging: Logs errors and retry attempts for easier debugging.

Installation

  1. Ensure your project references Newtonsoft.Json and Microsoft.Extensions.Logging.
  2. Register the helper in your project, typically as a service resolved through dependency injection.
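
A minimal registration sketch for a generic host (assumptions: an ASP.NET Core-style hosting setup, and that AzureOpenAiHelper takes an ILogger<AzureOpenAiHelper> constructor parameter, as the usage examples below suggest):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// Register logging so ILogger<AzureOpenAiHelper> can be resolved,
// then register the helper itself for injection into consumers.
builder.Services.AddLogging();
builder.Services.AddSingleton<AzureOpenAiHelper>();

using var host = builder.Build();
```

With this in place, any service that declares an AzureOpenAiHelper constructor parameter receives a ready-to-use instance with its logger wired up.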

Methods

1. GetChatCompletionResponseAsync

Send a standard request to the Azure OpenAI API and receive a deserialized response.

Method Signature
public async Task<TResponse?> GetChatCompletionResponseAsync<TRequest, TResponse>(
    string apiEndpoint,
    string apiKey,
    TRequest dataRequest,
    int maxRetryAttempts = 10,
    int retryDelayMs = 1000
);
Parameters
  • apiEndpoint (string): The URL of the Azure OpenAI endpoint.
  • apiKey (string): Your Azure OpenAI API key.
  • dataRequest (TRequest): The request payload.
  • maxRetryAttempts (int): Maximum number of retries in case of failure (default: 10).
  • retryDelayMs (int): Delay between retries in milliseconds (default: 1000).
Example Usage
var helper = new AzureOpenAiHelper(logger);

var request = new
{
    model = "gpt-4",
    messages = new[]
    {
        new { role = "system", content = "You are a helpful assistant." },
        new { role = "user", content = "Hello, how are you?" }
    }
};

var response = await helper.GetChatCompletionResponseAsync<object, OpenAiDataResponse>(
    apiEndpoint: "https://<your-endpoint>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=2024-01-01",
    apiKey: "<your-api-key>",
    dataRequest: request
);

if (response != null)
{
    Console.WriteLine($"Response: {response.Choices.FirstOrDefault()?.Message?.Content}");
}
else
{
    Console.WriteLine("Failed to retrieve a response.");
}

2. GetChatCompletionStreamedResponseAsync

Stream the response from Azure OpenAI, processing each chunk of data as it arrives.

Method Signature
public async Task GetChatCompletionStreamedResponseAsync<TRequest>(
    string azureOpenAiEndpoint,
    string azureOpenAiKey,
    TRequest request,
    Action<string> onMessageReceived,
    int maxRetryAttempts = 5,
    int retryDelayMs = 1000
);
Parameters
  • azureOpenAiEndpoint (string): The URL of the Azure OpenAI streaming endpoint.
  • azureOpenAiKey (string): Your Azure OpenAI API key.
  • request (TRequest): The request payload.
  • onMessageReceived (Action<string>): A callback to handle streamed content.
  • maxRetryAttempts (int): Maximum number of retries in case of failure (default: 5).
  • retryDelayMs (int): Delay between retries in milliseconds (default: 1000).
Example Usage
var helper = new AzureOpenAiHelper(logger);

var request = new
{
    model = "gpt-4",
    messages = new[]
    {
        new { role = "system", content = "You are a helpful assistant." },
        new { role = "user", content = "Tell me a story!" }
    },
    stream = true
};

await helper.GetChatCompletionStreamedResponseAsync(
    azureOpenAiEndpoint: "https://<your-endpoint>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=2024-01-01",
    azureOpenAiKey: "<your-api-key>",
    request: request,
    onMessageReceived: message =>
    {
        Console.WriteLine($"Streamed content: {message}");
    }
);
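
If you need the complete text once the stream finishes, the callback can simply accumulate chunks into a buffer (a sketch reusing the helper, request, and placeholder endpoint from the example above):

```csharp
using System.Text;

var buffer = new StringBuilder();

await helper.GetChatCompletionStreamedResponseAsync(
    azureOpenAiEndpoint: "https://<your-endpoint>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=2024-01-01",
    azureOpenAiKey: "<your-api-key>",
    request: request,
    onMessageReceived: message => buffer.Append(message) // collect each streamed chunk
);

Console.WriteLine($"Full response: {buffer}");
```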

Logging

The AzureOpenAiHelper relies on ILogger<AzureOpenAiHelper> for logging. To use this feature, ensure you inject an appropriate logger instance into the helper.
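
If you are not using a dependency injection container, a logger can be created directly with LoggerFactory (a minimal sketch; the AddConsole provider requires the Microsoft.Extensions.Logging.Console package):

```csharp
using Microsoft.Extensions.Logging;

// Build a standalone logger for the helper without a DI container.
using var loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());
ILogger<AzureOpenAiHelper> logger = loggerFactory.CreateLogger<AzureOpenAiHelper>();

var helper = new AzureOpenAiHelper(logger);
```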


Configuration

  • Retry Configuration: Tune maxRetryAttempts and retryDelayMs to match your application's latency and reliability requirements.
  • Timeout Handling: Both methods handle timeouts gracefully, retrying the request and logging a warning for each failed attempt.
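
To tune retry behavior, pass the optional parameters explicitly (a sketch reusing the non-streaming call from the example above):

```csharp
var response = await helper.GetChatCompletionResponseAsync<object, OpenAiDataResponse>(
    apiEndpoint: "https://<your-endpoint>.openai.azure.com/openai/deployments/<deployment-id>/chat/completions?api-version=2024-01-01",
    apiKey: "<your-api-key>",
    dataRequest: request,
    maxRetryAttempts: 3,  // fail faster than the default of 10
    retryDelayMs: 2000    // wait 2 seconds between attempts
);
```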

Dependencies

  • Newtonsoft.Json: For JSON serialization and deserialization.
  • Microsoft.Extensions.Logging: For logging errors and warnings.

License

This project is licensed under the MIT License. See LICENSE for details.

Company

  • Dewiride Technologies Private Limited

Product compatible and additional computed target framework versions

.NET
  • Compatible: net8.0
  • Computed: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows

Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
1.0.3 127 1/28/2025
1.0.2 109 1/28/2025
1.0.1 111 1/28/2025
1.0.0 105 1/28/2025

Release Notes

Fix Streaming Capability.