Aspire.Azure.AI.Inference 9.4.1-preview.1.25408.4

Prefix Reserved
This is a prerelease version of Aspire.Azure.AI.Inference.

.NET CLI
dotnet add package Aspire.Azure.AI.Inference --version 9.4.1-preview.1.25408.4

Package Manager
NuGet\Install-Package Aspire.Azure.AI.Inference -Version 9.4.1-preview.1.25408.4
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="Aspire.Azure.AI.Inference" Version="9.4.1-preview.1.25408.4" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
For projects that support Central Package Management, copy this XML node into the solution Directory.Packages.props file to version the package:
<PackageVersion Include="Aspire.Azure.AI.Inference" Version="9.4.1-preview.1.25408.4" />
and reference the package from the project file without a version:
<PackageReference Include="Aspire.Azure.AI.Inference" />

Paket CLI
paket add Aspire.Azure.AI.Inference --version 9.4.1-preview.1.25408.4

Script & Interactive
#r "nuget: Aspire.Azure.AI.Inference, 9.4.1-preview.1.25408.4"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

File-based apps
#:package Aspire.Azure.AI.Inference@9.4.1-preview.1.25408.4
The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake
#addin nuget:?package=Aspire.Azure.AI.Inference&version=9.4.1-preview.1.25408.4&prerelease (install as a Cake Addin)
#tool nuget:?package=Aspire.Azure.AI.Inference&version=9.4.1-preview.1.25408.4&prerelease (install as a Cake Tool)

Aspire.Azure.AI.Inference library

Registers ChatCompletionsClient as a singleton in the DI container for connecting to Azure AI Foundry and GitHub Models. Enables corresponding metrics, logging and telemetry.

Getting started

Prerequisites

An Azure subscription and an Azure AI Foundry project (or a GitHub Models endpoint) to connect to.
Install the package

Install the .NET Aspire Azure AI Inference library with NuGet:

dotnet add package Aspire.Azure.AI.Inference

Usage example

In the Program.cs file of your project, call the AddChatCompletionsClient extension method to register a ChatCompletionsClient for use via the dependency injection container. The method takes a connection name parameter.

builder.AddChatCompletionsClient("connectionName");

You can then retrieve the ChatCompletionsClient instance using dependency injection. For example, to retrieve the client from a Web API controller:

private readonly ChatCompletionsClient _client;

public CognitiveController(ChatCompletionsClient client)
{
    _client = client;
}

See the Azure AI Foundry SDK quickstarts for examples of using the ChatCompletionsClient.
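As a minimal sketch of a call with the injected client, assuming the Azure.AI.Inference chat completions API; the model name "my-deployment" is a placeholder, not a value prescribed by the library:

```csharp
using Azure;
using Azure.AI.Inference;

// _client is the injected ChatCompletionsClient from the constructor above.
var options = new ChatCompletionsOptions
{
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("How many feet are in a mile?"),
    },
    Model = "my-deployment" // placeholder; use your deployment or model name
};

// Send the request and print the assistant's reply.
Response<ChatCompletions> response = _client.Complete(options);
Console.WriteLine(response.Value.Content);
```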

Configuration

The .NET Aspire Azure AI Inference library provides multiple options to configure the Azure AI Foundry service based on the requirements and conventions of your project. Note that either an Endpoint and DeploymentId, or a ConnectionString, must be supplied.

Use a connection string

A connection string can be constructed from the Keys, Deployment ID, and Endpoint values with the format Endpoint={endpoint};Key={key};DeploymentId={deploymentId}. You can provide the name of the connection string when calling builder.AddChatCompletionsClient():

builder.AddChatCompletionsClient("connectionName");

The connection string will then be retrieved from the ConnectionStrings configuration section. Two connection formats are supported:

Azure AI Foundry Endpoint

The recommended approach is to use an Endpoint, which works with the ChatCompletionsClientSettings.Credential property to establish a connection. If no credential is configured, the DefaultAzureCredential is used.

{
  "ConnectionStrings": {
    "connectionName": "Endpoint=https://{endpoint}/;DeploymentId={deploymentName}"
  }
}
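If DefaultAzureCredential does not fit your environment, the Credential property can be supplied through the settings delegate instead. A minimal sketch, where AzureCliCredential is only an illustrative choice of TokenCredential:

```csharp
using Azure.Identity;

// Sketch: override the credential used with the endpoint-based connection.
// AzureCliCredential is illustrative; any TokenCredential implementation works.
builder.AddChatCompletionsClient("connectionName", settings =>
{
    settings.Credential = new AzureCliCredential();
});
```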
Connection string

Alternatively, a custom connection string can be used.

{
  "ConnectionStrings": {
    "connectionName": "Endpoint=https://{endpoint}/;Key={account_key};DeploymentId={deploymentName}"
  }
}

Use configuration providers

The .NET Aspire Azure AI Inference library supports Microsoft.Extensions.Configuration. It loads the ChatCompletionsClientSettings and AzureAIInferenceClientOptions from configuration by using the Aspire:Azure:AI:Inference key. Example appsettings.json that configures some of the options:

{
  "Aspire": {
    "Azure": {
      "AI": {
        "Inference": {
          "DisableTracing": false,
          "ClientOptions": {
            "UserAgentApplicationId": "myapp"
          }
        }
      }
    }
  }
}

Use inline delegates

You can also pass the Action<ChatCompletionsClientSettings> configureSettings delegate to set up some or all of the options inline, for example to disable tracing from code:

builder.AddChatCompletionsClient("connectionName", settings => settings.DisableTracing = true);

You can also set up the AzureAIInferenceClientOptions using the optional Action<IAzureClientBuilder<ChatCompletionsClient, AzureAIInferenceClientOptions>> configureClientBuilder parameter of the AddChatCompletionsClient method. For example, to set the network timeout for this client:

builder.AddChatCompletionsClient("connectionName", configureClientBuilder: builder => builder.ConfigureOptions(options => options.NetworkTimeout = TimeSpan.FromSeconds(2)));

This can also be used to add a custom scope for the TokenCredential when the specific error message is returned:

401 Unauthorized. Access token is missing, invalid, audience is incorrect, or have expired.

builder.AddAzureChatCompletionsClient("chat", configureClientBuilder: builder =>
{
    var credential = new DefaultAzureCredential();
    var tokenPolicy = new BearerTokenAuthenticationPolicy(credential, "https://cognitiveservices.azure.us/.default");
    builder.ConfigureOptions(options => options.AddPolicy(tokenPolicy, HttpPipelinePosition.PerRetry));
});

Experimental Telemetry

Azure AI Inference telemetry support is experimental; the shape of traces may change in the future without notice. It can be enabled by invoking:

AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);

or by setting the "AZURE_EXPERIMENTAL_ENABLE_ACTIVITY_SOURCE" environment variable to "true".
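Either switch must take effect before the client pipeline is first used. A minimal sketch of placing the AppContext switch at the top of Program.cs, assuming a typical ASP.NET Core host:

```csharp
// Enable the experimental Azure SDK activity source before any client is created.
AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);

var builder = WebApplication.CreateBuilder(args);
builder.AddChatCompletionsClient("connectionName");

var app = builder.Build();
app.Run();
```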

Additional documentation

Feedback & contributing

https://github.com/dotnet/aspire

Product compatible and additional computed target framework versions
.NET: net8.0 and net9.0 are compatible. net10.0, and the platform-specific TFMs (android, browser, ios, maccatalyst, macos, tvos, windows) for net8.0, net9.0, and net10.0, were computed.
Learn more about Target Frameworks and .NET Standard.


Version                    Downloads  Last Updated
9.4.1-preview.1.25408.4    13         8/12/2025
9.4.0-preview.1.25378.8    103        7/29/2025
9.3.1-preview.1.25305.6    317        6/10/2025
9.3.0-preview.1.25265.20   138        5/19/2025