Ater.DeepSeek.Core 1.2.0

.NET CLI

dotnet add package Ater.DeepSeek.Core --version 1.2.0

Package Manager

NuGet\Install-Package Ater.DeepSeek.Core -Version 1.2.0

This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference

<PackageReference Include="Ater.DeepSeek.Core" Version="1.2.0" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)

For projects that support Central Package Management, copy this XML node into the solution's Directory.Packages.props file to version the package:

Directory.Packages.props
<PackageVersion Include="Ater.DeepSeek.Core" Version="1.2.0" />

then reference the package without a version in the project file:

Project file
<PackageReference Include="Ater.DeepSeek.Core" />

Paket CLI

paket add Ater.DeepSeek.Core --version 1.2.0

Script & Interactive

#r "nuget: Ater.DeepSeek.Core, 1.2.0"

The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

#:package Ater.DeepSeek.Core@1.2.0

The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake

#addin nuget:?package=Ater.DeepSeek.Core&version=1.2.0   (install as a Cake Addin)

#tool nuget:?package=Ater.DeepSeek.Core&version=1.2.0    (install as a Cake Tool)

DeepSeekSDK-NET

DeepSeek API SDK specifically for .NET developers

中文文档 (Chinese documentation)

🚀 Features

  • List models
  • Chat & Chat streaming
  • Completions & Completions streaming (beta)
  • User balance
  • Local model support
  • ASP.NET Core integration support
  • Function calling

Usage Requirements

Go to DeepSeek's official website, register, and apply for an API key.

Supported .NET version: .NET 8

Usage

Install the NuGet package

Ater.DeepSeek.Core

dotnet add package Ater.DeepSeek.Core

Instantiate DeepSeekClient

Two methods are provided for instantiation:

public DeepSeekClient(string apiKey);
public DeepSeekClient(HttpClient http, string apiKey);

The first overload requires only the apiKey to create an instance.

The second overload also accepts an HttpClient, which is useful when the HttpClient is maintained through an HttpClientFactory and then used to create the instance.

The internal HttpClient's default timeout is 120 seconds. It can be changed before sending a request with the SetTimeout() method, or a timeout can be set for a specific request with a CancellationTokenSource.

To call a local model, customize the HttpClient and set its BaseAddress to the local address.
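A minimal sketch of both instantiation styles, assuming "your-api-key" is a placeholder and that the SetTimeout() argument is in seconds (the SDK documents the method name but not its unit here):

```csharp
using DeepSeek.Core;

// Style 1: apiKey only; the client creates its own HttpClient
// (120-second default timeout).
var client = new DeepSeekClient("your-api-key");
client.SetTimeout(60); // assumed to take seconds; call before sending requests

// Style 2: supply an HttpClient, e.g. one obtained from an
// IHttpClientFactory (shown here as a plain instance for brevity).
var http = new HttpClient();
var clientFromFactory = new DeepSeekClient(http, "your-api-key");

// Per-request timeout via CancellationTokenSource: the request is
// cancelled if it takes longer than 30 seconds.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
var models = await client.ListModelsAsync(cts.Token);
```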

Calling methods

The DeepSeekClient class provides six asynchronous methods to call DeepSeek's API:

Task<ModelResponse?> ListModelsAsync(CancellationToken cancellationToken);

Task<ChatResponse?> ChatAsync(ChatRequest request, CancellationToken cancellationToken);

Task<IAsyncEnumerable<Choice>?> ChatStreamAsync(ChatRequest request, CancellationToken cancellationToken);

Task<ChatResponse?> CompletionsAsync(CompletionRequest request, CancellationToken cancellationToken);

Task<IAsyncEnumerable<Choice>?> CompletionsStreamAsync(CompletionRequest request, CancellationToken cancellationToken);

Task<UserResponse?> GetUserBalanceAsync(CancellationToken cancellationToken);

List Models Sample

// Create an instance using the apiKey
var client = new DeepSeekClient(apiKey);

var modelResponse = await client.ListModelsAsync(new CancellationToken());
if (modelResponse is null)
{
    Console.WriteLine(client.ErrorMsg);
    return;
}
foreach (var model in modelResponse.Data)
{
    Console.WriteLine(model);
}

Chat Examples

// Create an instance using the apiKey
var client = new DeepSeekClient(apiKey);
// Construct the request body
var request = new ChatRequest
{
    Messages = [
        Message.NewSystemMessage("You are a language translator"),
        Message.NewUserMessage("Please translate 'They are scared!' into Chinese!")
    ],
    // Specify the model
    Model = Constant.Model.ChatModel
};

var chatResponse = await client.ChatAsync(request, new CancellationToken());
if (chatResponse is null)
{
    Console.WriteLine(client.ErrorMsg);
    return;
}
Console.WriteLine(chatResponse.Choices.First().Message?.Content);

Chat Examples (Stream)

// Create an instance using the apiKey
var client = new DeepSeekClient(apiKey);
// Construct the request body
var request = new ChatRequest
{
    Messages = [
        Message.NewSystemMessage("You are a language translator"),
        Message.NewUserMessage("Please translate 'They are scared!' into Chinese!")
    ],
    // Specify the model
    Model = Constant.Model.ChatModel
};

var choices = client.ChatStreamAsync(request, new CancellationToken());
if (choices is null)
{
    Console.WriteLine(client.ErrorMsg);
    return;
}
await foreach (var choice in choices)
{
    Console.Write(choice.Delta?.Content);
}
Console.WriteLine();
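The features list also mentions querying the user balance, which has no sample above. A minimal sketch following the same pattern as the other calls (the response is printed as-is rather than through assumed field names):

```csharp
// Create an instance using the apiKey
var client = new DeepSeekClient(apiKey);

// Query the account balance, with the same null-check pattern
// used by the other samples.
var userResponse = await client.GetUserBalanceAsync(new CancellationToken());
if (userResponse is null)
{
    Console.WriteLine(client.ErrorMsg);
    return;
}
Console.WriteLine(userResponse);
```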

Function Calling Example

For example, I have a local function definition:

internal class Functions
{
    public static string GetWeather(WeatherDto dto)
    {
        return $"The weather in {dto.City} on {dto.Date:yyyy-MM-dd} is sunny with a high of 25°C and a low of 15°C.";
    }
}

internal class WeatherDto
{
    public required string City { get; set; }
    
    [Description("The date, default is today's date")]
    public DateOnly Date { get; set; } = DateOnly.FromDateTime(DateTime.UtcNow);
}

When using LLM, pass in the function definition:

public static async Task CallFunctionExampleAsync(DeepSeekClient client)
{
    JsonSerializerOptions options = JsonSerializerOptions.Default;
    // Required configuration, otherwise the generated format will be incorrect.
    JsonSchemaExporterOptions exporterOptions = new()
    {
        TreatNullObliviousAsNonNullable = true,
    };
    var request = new ChatRequest
    {
        Messages = [Message.NewUserMessage("What is the weather in New York today?")],
        Model = DeepSeekModels.ChatModel,
        Stream = true,
        // Add tool definitions
        Tools =
        [
            new Tool
            {
                Function = new RequestFunction
                {
                    Name = "JustUselessFunction",
                    Description = "nothing to do",
                },
            },
            new Tool
            {
                Function = new RequestFunction
                {
                    Name = "GetWeather",
                    Description = "get the weather",
                    Parameters = options.GetJsonSchemaAsNode(
                        typeof(WeatherDto),
                        exporterOptions
                    ),
                },
            },
        ],
    };
    // The first time LLM is returned, it recognizes that a function is to be called and returns the function contents.
    var response = await client.ChatAsync(request, new CancellationToken());
    if (response is null)
    {
        Console.WriteLine(client.ErrorMsg);
        return;
    }

    var message = response.Choices[0].Message;
    if (message == null)
    {
        Console.WriteLine("no message");
        return;
    }
    request.Messages.Add(message); // The message must be added to the request for use in subsequent function calls.
    if (message.ToolCalls != null && message.ToolCalls.Count > 0)
    {
        // If a function call exists, use the local function to obtain the content.
        var tool = message.ToolCalls.FirstOrDefault();
        if (tool?.Function.Name == "GetWeather")
        {
            var weatherDto = JsonSerializer.Deserialize<WeatherDto>(
                tool.Function.Arguments.ToString(),
                options
            );
            if (weatherDto is null)
            {
                Console.WriteLine("Failed to deserialize the function arguments.");
                return;
            }

            var toolResult = Functions.GetWeather(weatherDto);
            // Add the local function call result to the message.
            request.Messages.Add(Message.NewToolMessage(toolResult, tool.Id));

            // Use LLM to process the result again.
            var toolResponse = await client.ChatAsync(request, new CancellationToken());
            if (toolResponse is null)
            {
                Console.WriteLine(client.ErrorMsg);
                return;
            }

            Console.WriteLine(toolResponse.Choices[0].Message?.Content);
        }
    }
    else
    {
        Console.WriteLine("No tool calls found in the response.");
    }
}

Local Model Examples

// Use a local model API
var httpClient = new HttpClient
{
    // Set your local API address
    BaseAddress = new Uri("http://localhost:5000"),
    Timeout = TimeSpan.FromSeconds(300),
};
// If an API key is required:
// httpClient.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "Bearer " + "your_token");

var localClient = new DeepSeekClient(httpClient);
localClient.SetChatEndpoint("/chat");
localClient.SetCompletionEndpoint("/completions");

var res = await localClient.ChatAsync(new ChatRequest
{
    Messages = new List<Message>
    {
        Message.NewUserMessage("hello")
    }
}, new CancellationToken());

return res?.Choices.First().Message?.Content;

ASP.NET Core Integration

Install Ater.DeepSeek.AspNetCore package

dotnet add package Ater.DeepSeek.AspNetCore

Usage in ASP.NET Core

using DeepSeek.AspNetCore;
using DeepSeek.Core;
using DeepSeek.Core.Models;
using Microsoft.AspNetCore.Mvc;

var builder = WebApplication.CreateBuilder(args);

var apiKey = builder.Configuration["DeepSeekApiKey"];
builder.Services.AddDeepSeek(option =>
{
    option.BaseAddress = new Uri("https://api.deepseek.com");
    option.Timeout = TimeSpan.FromSeconds(300);
    option.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", "Bearer " + apiKey);
});

var app = builder.Build();

app.MapGet("/test", async ([FromServices] DeepSeekClient client) =>
{
    var res = await client.ChatAsync(new ChatRequest
    {
        Messages = new List<Message>
        {
            Message.NewUserMessage("Why dotnet is good?")
        },
        MaxTokens = 200
    }, new CancellationToken());

    return res?.Choices.First().Message?.Content;
});

app.Run();

Usage in ASP.NET Core (Stream)

app.MapGet("/chat", async (HttpContext context, [FromServices] DeepSeekClient client, CancellationToken token) =>
{
    context.Response.ContentType = "text/plain;charset=utf-8";
    try
    {
        var choices = client.ChatStreamAsync(new ChatRequest
        {
            Messages = new List<Message>
            {
                Message.NewUserMessage("Why dotnet is good?")
            },
            MaxTokens = 200
        }, token);

        if (choices != null)
        {
            await foreach (var choice in choices)
            {
                await context.Response.WriteAsync(choice.Delta!.Content);
            }
        }
    }
    catch (Exception ex)
    {
        await context.Response.WriteAsync("The service is temporarily unavailable: " + ex.Message);
    }
    await context.Response.CompleteAsync();
});
Product compatible and additional computed target framework versions

.NET: net8.0 is compatible. The net8.0 platform variants (android, browser, ios, maccatalyst, macos, tvos, windows), plus net9.0 and net10.0 with their platform variants, were computed.

Included target framework(s) (in package):
  • net8.0

    • No dependencies.

Learn more about Target Frameworks and .NET Standard.
NuGet packages (1)

Showing the top 1 NuGet packages that depend on Ater.DeepSeek.Core:

Package Downloads
Ater.DeepSeek.AspNetCore

DeepSeek API SDK

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
1.2.0 139 8/20/2025
1.1.6 216 7/31/2025
1.1.5 1,762 4/18/2025
1.1.4 4,049 3/4/2025
1.1.3 902 2/11/2025
1.1.2 234 2/10/2025
1.1.1 167 2/8/2025
1.1.0 1,715 1/23/2025
1.0.1 635 5/16/2024
1.0.0 152 5/16/2024
1.0.0-RC1 127 5/11/2024

Release notes (1.2.0): added function calling support.