OnnxStack.Core 0.8.0

.NET CLI

dotnet add package OnnxStack.Core --version 0.8.0

Package Manager

NuGet\Install-Package OnnxStack.Core -Version 0.8.0

This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference

<PackageReference Include="OnnxStack.Core" Version="0.8.0" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management

For projects that support Central Package Management (CPM), copy this XML node into the solution Directory.Packages.props file to version the package:

<PackageVersion Include="OnnxStack.Core" Version="0.8.0" />

Then reference the package without a version in the project file:

<PackageReference Include="OnnxStack.Core" />

Paket CLI

paket add OnnxStack.Core --version 0.8.0

Script & Interactive

#r "nuget: OnnxStack.Core, 0.8.0"

The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

Cake

#addin nuget:?package=OnnxStack.Core&version=0.8.0

Install OnnxStack.Core as a Cake Addin.

#tool nuget:?package=OnnxStack.Core&version=0.8.0

Install OnnxStack.Core as a Cake Tool.

OnnxStack.Core - Onnx Services for .NET Applications

OnnxStack.Core is a library that provides higher-level ONNX services for use in .NET applications. It offers extensive support for dependency injection, .NET configuration, ASP.NET Core integration, and IHostedService.

You can configure a model set for runtime, offloading individual models to different devices to make better use of resources or to run on lower-end hardware, as sketched in the configuration examples below. The first use case is Stable Diffusion; support will be expanded with other model sets, such as object detection and classification.

Getting Started

OnnxStack.Core is available via the NuGet package manager; download and install it with:

PM> Install-Package OnnxStack.Core

.NET Core Registration

You can easily integrate OnnxStack.Core into your application services layer. This registration process sets up the necessary services and loads the appsettings.json configuration.

Example: Registering OnnxStack

builder.Services.AddOnnxStack();
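
In context, registration is a single call during host setup. A minimal sketch, assuming an ASP.NET Core host (the surrounding host code is standard .NET boilerplate, not part of OnnxStack):

var builder = WebApplication.CreateBuilder(args);

// Registers the OnnxStack services and loads the OnnxStackConfig
// section from appsettings.json (see the configuration example below).
builder.Services.AddOnnxStack();

var app = builder.Build();

// IOnnxModelService can now be constructor-injected into
// controllers, minimal API handlers, or hosted services.
app.Run();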

Configuration example

The appsettings.json file is the easiest option for configuring model sets. Below is an example for a CLIP tokenizer.

{
	"Logging": {
		"LogLevel": {
			"Default": "Information",
			"Microsoft.AspNetCore": "Warning"
		}
	},
	"AllowedHosts": "*",

	"OnnxStackConfig": {
		"Name": "Clip Tokenizer",
		"TokenizerLimit": 77,
		"ModelConfigurations": [{
			"Type": "Tokenizer",
			"DeviceId": 0,
			"ExecutionProvider": "Cpu",
			"OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\cliptokenizer.onnx"
		}]
	}
}
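
The ModelConfigurations array accepts multiple entries, each with its own DeviceId and ExecutionProvider, which is how a model set is spread across devices. Below is a hedged sketch using the same configuration types as the no-DI example further down; the Unet model type and both file paths are illustrative assumptions, not confirmed API:

var config = new OnnxStackConfig
{
    Name = "Multi-Device Example",
    TokenizerLimit = 77,
    ModelConfigurations = new List<OnnxModelSessionConfig>
    {
        // The lightweight tokenizer stays on the CPU.
        new OnnxModelSessionConfig
        {
            Type = OnnxModelType.Tokenizer,
            DeviceId = 0,
            ExecutionProvider = ExecutionProvider.Cpu,
            OnnxModelPath = "clip_tokenizer.onnx" // illustrative path
        },
        // A heavier model is offloaded to the GPU via DirectML.
        // OnnxModelType.Unet is an assumed enum member here.
        new OnnxModelSessionConfig
        {
            Type = OnnxModelType.Unet,
            DeviceId = 0,
            ExecutionProvider = ExecutionProvider.DirectML,
            OnnxModelPath = "unet.onnx" // illustrative path
        }
    }
};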

Basic C# Example


// Required namespaces (plus the OnnxStack.Core namespaces that
// provide IOnnxModelService, OnnxModelType, and the config types):
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Resolved from DI
IOnnxModelService _onnxModelService;

// Tokenizer model example
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });
var inputString = new List<NamedOnnxValue>
{
	NamedOnnxValue.CreateFromTensor("string_input", inputTensor)
};

// Run the ONNX clip tokenizer session and read the inference output.
using (var tokens = _onnxModelService.RunInference(OnnxModelType.Tokenizer, inputString))
{
	var resultTensor = tokens.ToArray();
}
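
Each value returned by RunInference wraps an output tensor from the tokenizer model (typically the token ids for the input string); the using block disposes the result collection and releases its native buffers.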

Basic C# Example (No DI)

// Create Configuration
var onnxStackConfig = new OnnxStackConfig
{
    Name = "OnnxStack",
    TokenizerLimit = 77,
    ModelConfigurations = new List<OnnxModelSessionConfig>
    {
        new OnnxModelSessionConfig
        {
            DeviceId = 0,
            ExecutionProvider = ExecutionProvider.DirectML,

            Type = OnnxModelType.Tokenizer,
            OnnxModelPath = "clip_tokenizer.onnx",
        }
    }
};

// Create Service
var onnxModelService = new OnnxModelService(onnxStackConfig);


// Tokenizer model Example
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });
var inputString = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("string_input", inputTensor)
};

// Run the ONNX clip tokenizer session and read the inference output.
using (var tokens = onnxModelService.RunInference(OnnxModelType.Tokenizer, inputString))
{
    var resultTensor = tokens.ToArray();
}
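
Since ONNX Runtime inference sessions are expensive to create, a service constructed manually like this is best created once and reused across calls, rather than rebuilt per request (assuming, as its session-owning role suggests, that OnnxModelService creates its sessions up front).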

Product compatible and additional computed target framework versions

.NET: net7.0 is compatible. Computed: net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows.

Learn more about Target Frameworks and .NET Standard.

NuGet packages (3)

Showing the top 3 NuGet packages that depend on OnnxStack.Core:

OnnxStack.StableDiffusion: Stable Diffusion Library for .NET

OnnxStack.ImageUpscaler: OnnxRuntime Image Upscale Library for .NET

OnnxStack.FeatureExtractor: OnnxRuntime Image Feature Extractor Library for .NET

GitHub repositories (1)

Showing the top 1 popular GitHub repository that depends on OnnxStack.Core:

TensorStack-AI/OnnxStack: C# Stable Diffusion using ONNX Runtime

Version history

Version   Downloads   Last updated
0.39.0 1,455 6/12/2024
0.31.0 363 4/25/2024
0.27.0 248 3/31/2024
0.25.0 225 3/14/2024
0.23.0 245 2/29/2024
0.22.0 203 2/23/2024
0.21.0 229 2/15/2024
0.19.0 226 2/1/2024
0.17.0 232 1/18/2024
0.16.0 183 1/11/2024
0.15.0 259 1/5/2024
0.14.0 250 12/27/2023
0.13.0 189 12/22/2023
0.12.0 182 12/15/2023
0.10.0 205 11/30/2023
0.9.0 179 11/23/2023
0.8.0 239 11/16/2023
0.7.0 185 11/9/2023
0.6.0 164 11/2/2023
0.5.0 181 10/27/2023
0.4.0 162 10/19/2023
0.3.1 181 10/9/2023
0.3.0 156 10/9/2023
0.2.0 162 10/3/2023
0.1.0 214 9/25/2023 (deprecated: no longer maintained)