Aspire.Azure.AI.Inference
9.4.1-preview.1.25408.4
Prefix Reserved
Aspire.Azure.AI.Inference library
Registers ChatCompletionsClient as a singleton in the DI container for connecting to Azure AI Foundry and GitHub Models. Enables corresponding metrics, logging and telemetry.
Getting started
Prerequisites
- Azure subscription - create one for free
- Azure AI Foundry Resource - create an Azure AI Foundry resource
Install the package
Install the .NET Aspire Azure Inference library with NuGet:
dotnet add package Aspire.Azure.AI.Inference
Usage example
In the AppHost.cs file of your project, call the AddChatCompletionsClient extension method to register a ChatCompletionsClient for use via the dependency injection container. The method takes a connection name parameter.
builder.AddChatCompletionsClient("connectionName");
You can then retrieve the ChatCompletionsClient instance using dependency injection. For example, to retrieve the client from a Web API controller:
private readonly ChatCompletionsClient _client;

public CognitiveController(ChatCompletionsClient client)
{
    _client = client;
}
See the Azure AI Foundry SDK quickstarts for examples on using the ChatCompletionsClient.
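As a sketch of what a call through the injected client might look like (the message contents are placeholders, and the exact option and property names follow the Azure.AI.Inference beta API, so verify them against the SDK version you use):

```csharp
using Azure;
using Azure.AI.Inference;

// Assumes _client was injected as in the controller above.
var options = new ChatCompletionsOptions
{
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("Say hello.")
    }
};

// CompleteAsync sends the request to the configured endpoint/deployment.
Response<ChatCompletions> response = await _client.CompleteAsync(options);
Console.WriteLine(response.Value.Content);
```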
Configuration
The .NET Aspire Azure AI Inference library provides multiple options to configure the Azure AI Foundry Service based on the requirements and conventions of your project. Note that either an Endpoint and DeploymentId, or a ConnectionString, must be supplied.
Use a connection string
A connection string can be constructed from the Keys, Deployment ID, and Endpoint tab with the format Endpoint={endpoint};Key={key};DeploymentId={deploymentId}. You can provide the name of the connection string when calling builder.AddChatCompletionsClient():
builder.AddChatCompletionsClient("connectionName");
The connection string will then be retrieved from the ConnectionStrings configuration section. Two connection formats are supported:
Azure AI Foundry Endpoint
The recommended approach is to use an Endpoint, which works with the ChatCompletionsClientSettings.Credential property to establish a connection. If no credential is configured, the DefaultAzureCredential is used.
{
  "ConnectionStrings": {
    "connectionName": "Endpoint=https://{endpoint}/;DeploymentId={deploymentName}"
  }
}
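If you need a credential other than DefaultAzureCredential with the Endpoint format, one option is to supply it through the settings delegate. A minimal sketch (AzureCliCredential is only an illustration; any TokenCredential works):

```csharp
using Azure.Identity;

// Override the credential used to authenticate against the endpoint.
// AzureCliCredential here is an example choice, not a requirement.
builder.AddChatCompletionsClient("connectionName",
    settings => settings.Credential = new AzureCliCredential());
```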
Connection string
Alternatively, a custom connection string can be used.
{
  "ConnectionStrings": {
    "connectionName": "Endpoint=https://{endpoint}/;Key={account_key};DeploymentId={deploymentName}"
  }
}
Use configuration providers
The .NET Aspire Azure AI Inference library supports Microsoft.Extensions.Configuration. It loads the ChatCompletionsClientSettings and AzureAIInferenceClientOptions from configuration by using the Aspire:Azure:AI:Inference key. Example appsettings.json that configures some of the options:
{
  "Aspire": {
    "Azure": {
      "AI": {
        "Inference": {
          "DisableTracing": false,
          "ClientOptions": {
            "UserAgentApplicationId": "myapp"
          }
        }
      }
    }
  }
}
Use inline delegates
You can also pass the Action<ChatCompletionsClientSettings> configureSettings delegate to set up some or all of the options inline, for example to disable tracing from code:
builder.AddChatCompletionsClient("connectionName", settings => settings.DisableTracing = true);
You can also set up the AzureAIInferenceClientOptions using the optional Action<IAzureClientBuilder<ChatCompletionsClient, AzureAIInferenceClientOptions>> configureClientBuilder parameter of the AddChatCompletionsClient method. For example, to set the network timeout for this client:
builder.AddChatCompletionsClient("connectionName", configureClientBuilder: builder => builder.ConfigureOptions(options => options.NetworkTimeout = TimeSpan.FromSeconds(2)));
This can also be used to add a custom scope for the TokenCredential when the following error message is returned:

401 Unauthorized. Access token is missing, invalid, audience is incorrect, or have expired.
builder.AddAzureChatCompletionsClient("chat", configureClientBuilder: builder =>
{
    var credential = new DefaultAzureCredential();
    var tokenPolicy = new BearerTokenAuthenticationPolicy(credential, "https://cognitiveservices.azure.us/.default");
    builder.ConfigureOptions(options => options.AddPolicy(tokenPolicy, HttpPipelinePosition.PerRetry));
});
Experimental Telemetry
Azure AI Inference telemetry support is experimental; the shape of traces may change in the future without notice. It can be enabled by invoking
AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);
or by setting the "AZURE_EXPERIMENTAL_ENABLE_ACTIVITY_SOURCE" environment variable to "true".
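A minimal sketch of where the switch fits, assuming a typical web application entry point (the connection name is a placeholder): the switch should be set before the host is built so it takes effect when the client's activity source is created.

```csharp
// Enable the experimental Azure SDK activity source before building
// the host, so tracing is active when the client is registered.
AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);

var builder = WebApplication.CreateBuilder(args);
builder.AddChatCompletionsClient("connectionName");

var app = builder.Build();
app.Run();
```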
Additional documentation
- https://learn.microsoft.com/dotnet/api/azure.ai.inference
- https://github.com/dotnet/aspire/tree/main/src/Components/README.md
Feedback & contributing
Product | Versions |
---|---|
.NET | net8.0 is compatible. net9.0 is compatible. net10.0 was computed. The platform-specific TFMs (-android, -browser, -ios, -maccatalyst, -macos, -tvos, -windows) for net8.0, net9.0, and net10.0 were computed. |
net8.0
- Azure.AI.Inference (>= 1.0.0-beta.5)
- Azure.Core (>= 1.47.0)
- Azure.Identity (>= 1.14.2)
- Microsoft.Extensions.AI (>= 9.7.0)
- Microsoft.Extensions.AI.AzureAIInference (>= 9.7.0-preview.1.25356.2)
- Microsoft.Extensions.Azure (>= 1.12.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 8.0.0)
- Microsoft.Extensions.Configuration.Binder (>= 8.0.2)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 8.0.2)
- Microsoft.Extensions.Diagnostics.HealthChecks (>= 8.0.18)
- Microsoft.Extensions.Hosting.Abstractions (>= 8.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 8.0.3)
- Microsoft.Extensions.Options (>= 8.0.2)
- Microsoft.Extensions.Primitives (>= 8.0.0)
- OpenTelemetry.Extensions.Hosting (>= 1.9.0)
net9.0
- Azure.AI.Inference (>= 1.0.0-beta.5)
- Azure.Core (>= 1.47.0)
- Azure.Identity (>= 1.14.2)
- Microsoft.Extensions.AI (>= 9.7.0)
- Microsoft.Extensions.AI.AzureAIInference (>= 9.7.0-preview.1.25356.2)
- Microsoft.Extensions.Azure (>= 1.12.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 9.0.7)
- Microsoft.Extensions.Configuration.Binder (>= 9.0.7)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 9.0.7)
- Microsoft.Extensions.Diagnostics.HealthChecks (>= 9.0.7)
- Microsoft.Extensions.Hosting.Abstractions (>= 9.0.7)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.7)
- Microsoft.Extensions.Options (>= 9.0.7)
- Microsoft.Extensions.Primitives (>= 9.0.7)
- OpenTelemetry.Extensions.Hosting (>= 1.9.0)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
9.4.1-preview.1.25408.4 | 13 | 8/12/2025 |
9.4.0-preview.1.25378.8 | 103 | 7/29/2025 |
9.3.1-preview.1.25305.6 | 317 | 6/10/2025 |
9.3.0-preview.1.25265.20 | 138 | 5/19/2025 |