Microsoft.ML.OnnxRuntime.Foundry 1.24.4 (prefix reserved)

Installation

| Tool | Command |
|---|---|
| .NET CLI | `dotnet add package Microsoft.ML.OnnxRuntime.Foundry --version 1.24.4` |
| Package Manager | `NuGet\Install-Package Microsoft.ML.OnnxRuntime.Foundry -Version 1.24.4` |
| PackageReference | `<PackageReference Include="Microsoft.ML.OnnxRuntime.Foundry" Version="1.24.4" />` |
| Central Package Management | `<PackageVersion Include="Microsoft.ML.OnnxRuntime.Foundry" Version="1.24.4" />` plus `<PackageReference Include="Microsoft.ML.OnnxRuntime.Foundry" />` |
| Paket CLI | `paket add Microsoft.ML.OnnxRuntime.Foundry --version 1.24.4` |
| Script & Interactive | `#r "nuget: Microsoft.ML.OnnxRuntime.Foundry, 1.24.4"` |
| File-based apps | `#:package Microsoft.ML.OnnxRuntime.Foundry@1.24.4` |
| Cake | `#addin nuget:?package=Microsoft.ML.OnnxRuntime.Foundry&version=1.24.4` or `#tool nuget:?package=Microsoft.ML.OnnxRuntime.Foundry&version=1.24.4` |
About

ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms.
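As a sketch of the basic C# inference workflow with this package family (the model path `model.onnx`, the input name `input`, and the tensor shape below are placeholders for your own model, not part of this package's metadata):

```csharp
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Load a model and run it on the default CPU execution provider.
// "model.onnx" and the input name "input" are placeholders for your model.
using var session = new InferenceSession("model.onnx");

// Build a float input tensor; the shape must match the model's input.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new[] { NamedOnnxValue.CreateFromTensor("input", input) };

// Run inference and read the first output back as a flat float array.
using var results = session.Run(inputs);
float[] output = results.First().AsEnumerable<float>().ToArray();
Console.WriteLine($"Output length: {output.Length}");
```

Input and output names can be discovered at runtime from `session.InputMetadata` and `session.OutputMetadata` rather than hard-coded.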
Learn more in the ONNX Runtime documentation.
NuGet Packages

ONNX Runtime native packages:

- Microsoft.ML.OnnxRuntime
  - Native libraries for all supported platforms
  - CPU Execution Provider
  - CoreML Execution Provider on macOS/iOS
  - XNNPACK Execution Provider on Android/iOS
- Microsoft.ML.OnnxRuntime.Gpu
  - Windows and Linux
  - TensorRT Execution Provider
  - CUDA Execution Provider
  - CPU Execution Provider
- Microsoft.ML.OnnxRuntime.DirectML
  - Windows
  - DirectML Execution Provider
  - CPU Execution Provider
- Microsoft.ML.OnnxRuntime.QNN
  - 64-bit Windows
  - QNN Execution Provider
  - CPU Execution Provider
- Intel.ML.OnnxRuntime.OpenVino
  - 64-bit Windows
  - OpenVINO Execution Provider
  - CPU Execution Provider

Other packages:

- Microsoft.ML.OnnxRuntime.Managed
  - C# language bindings
- Microsoft.ML.OnnxRuntime.Extensions
  - Custom operators for pre/post processing on all supported platforms
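The execution provider used for a session is selected at session-creation time via `SessionOptions`; providers the package ships with but that cannot handle a given operator fall back to the CPU Execution Provider. A minimal sketch, assuming the Gpu package is installed and `model.onnx` is a placeholder path:

```csharp
using Microsoft.ML.OnnxRuntime;

// Request the CUDA execution provider on GPU device 0. Operators that
// CUDA cannot run are assigned to the CPU execution provider instead.
using var options = new SessionOptions();
options.AppendExecutionProvider_CUDA(0);

// "model.onnx" is a placeholder for your own model file.
using var session = new InferenceSession("model.onnx", options);
```

The other provider packages expose analogous `SessionOptions` append methods (for example for DirectML); consult the ONNX Runtime execution-provider documentation for the exact call for each provider.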
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 is compatible. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETCoreApp 0.0:
- Microsoft.ML.OnnxRuntime.Gpu.Linux (>= 1.24.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.24.4)

.NETFramework 0.0:
- Microsoft.ML.OnnxRuntime.Gpu.Linux (>= 1.24.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.24.4)

.NETStandard 0.0:
- Microsoft.ML.OnnxRuntime.Gpu.Linux (>= 1.24.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.24.4)
NuGet packages (3)

Showing the top 3 NuGet packages that depend on Microsoft.ML.OnnxRuntime.Foundry:

| Package | Description |
|---|---|
| Microsoft.ML.OnnxRuntimeGenAI.Foundry | ONNX Runtime Generative AI Native Package |
| Microsoft.AI.Foundry.Local.Core | Microsoft Foundry Local Core native library: Native AOT compiled multi-platform AI inference library |
| Microsoft.AI.Foundry.Local.Core.WinML | Microsoft Foundry Local Core WinML native library: Native AOT compiled Windows AI inference library with WinML support |
GitHub repositories
This package is not used by any popular GitHub repositories.
Release Def:
- Branch: refs/heads/rel-1.24.4
- Commit: 2d924974ef147392ced8409d36bd6d2e7fcc8a74
- Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=1124821