AutoDiffNet 1.4.0
dotnet add package AutoDiffNet --version 1.4.0
NuGet\Install-Package AutoDiffNet -Version 1.4.0
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="AutoDiffNet" Version="1.4.0" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add AutoDiffNet --version 1.4.0
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: AutoDiffNet, 1.4.0"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install AutoDiffNet as a Cake Addin
#addin nuget:?package=AutoDiffNet&version=1.4.0
// Install AutoDiffNet as a Cake Tool
#tool nuget:?package=AutoDiffNet&version=1.4.0
Project Description
This library provides automatic differentiation of mathematical functions.
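To illustrate the idea behind automatic differentiation, here is a minimal, self-contained forward-mode sketch using dual numbers. This is illustrative only and is not how AutoDiffNet works internally (the library builds Term expression trees and compiles them); it computes the same function used in the usage example below, f(x0, x1) = ln(x0 + 2*x1).

```csharp
using System;

// Illustrative only: a tiny forward-mode AD sketch using dual numbers.
// A Dual carries a value together with a derivative that is propagated
// through every operation by the chain rule.
public struct Dual
{
    public double Value;  // function value
    public double Deriv;  // derivative value

    public Dual(double v, double d) { Value = v; Deriv = d; }

    public static Dual operator +(Dual a, Dual b) => new Dual(a.Value + b.Value, a.Deriv + b.Deriv);
    public static Dual operator *(double k, Dual a) => new Dual(k * a.Value, k * a.Deriv);
    public static Dual Ln(Dual a) => new Dual(Math.Log(a.Value), a.Deriv / a.Value);
}

public static class Demo
{
    // f(x0, x1) = ln(x0 + 2*x1), differentiated with respect to x0
    public static Dual F(double x0, double x1)
    {
        var a = new Dual(x0, 1.0); // active variable: seed derivative 1
        var b = new Dual(x1, 0.0); // passive variable: seed derivative 0
        return Dual.Ln(a + 2.0 * b);
    }

    public static void Main()
    {
        var f = F(1.0, 2.0);
        Console.WriteLine(f.Value);  // ln(5) ≈ 1.6094
        Console.WriteLine(f.Deriv);  // 1/(1 + 2*2) = 0.2
    }
}
```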
Usage example
```csharp
using System;
using AutoDiffNet;

namespace AutoDiffNetConsole
{
    class Program
    {
        static void Main(string[] args)
        {
            // Define f(x0, x1) = ln(x0 + 2*x1)
            Variable x = new Variable();
            Term f = Term.Ln(x[0] + 2 * x[1]);

            // Compile the function and its partial derivative with respect to x0
            Func<double[], double> fx = f.Compile();
            Func<double[], double> grad = f.Grad(0);

            var x0 = new double[] { 1, 2 };

            Console.WriteLine(f.GradString(0)); // textual form of df/dx0
            Console.WriteLine(grad(x0));        // 1/(1 + 2*2) = 0.2
            Console.WriteLine(fx(x0));          // ln(5) ≈ 1.6094
            Console.ReadLine();
        }
    }
}
```
Expression Optimizer
As part of release 1.0.0 I'm including a new feature: the Expression Optimizer, which optimizes the generated lambda (for both the function and its gradient).
The optimizer has several features, which can be enabled or disabled through the ExpressionOptimizerFlags flags enum:
```csharp
[Flags]
public enum ExpressionOptimizerFlags
{
    DisableAll = 1,
    ZeroDividedBy = 2,
    ReuseOfDuplicatedExpression = 4,
    MultiplyByZero = 8,
    Default = MultiplyByZero | ReuseOfDuplicatedExpression,
    Aggressive = Default | ZeroDividedBy
}
```
Flag | Description | Default behavior |
---|---|---|
MultiplyByZero | Any sub-expression of the form 0 * g(x) is replaced by 0, and g(x) is not evaluated. | Enabled |
ReuseOfDuplicatedExpression | The optimizer scans the function f(x) and evaluates each duplicated sub-expression only once; e.g. for f(x) = g(x)*2 + Log(g(x)), g(x) is evaluated once and the result is cached. | Enabled |
ZeroDividedBy | Any sub-expression of the form 0 / g(x) is replaced by 0, and g(x) is not evaluated; the case where g(x) itself is zero must be validated externally. | Disabled |
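The MultiplyByZero rewrite can be sketched with a standard expression-tree visitor. The code below is a self-contained illustration of the general technique, not AutoDiffNet's internal implementation: it replaces any sub-expression 0 * g(x) (or g(x) * 0) with the constant 0, so g(x) is never evaluated.

```csharp
using System;
using System.Linq.Expressions;

// Illustrative sketch of a MultiplyByZero-style rewrite: a visitor that
// folds 0 * g(x) into the constant 0 before the lambda is compiled.
public class MultiplyByZeroFolder : ExpressionVisitor
{
    protected override Expression VisitBinary(BinaryExpression node)
    {
        // Rewrite children first, so nested multiplications are folded too.
        var left = Visit(node.Left);
        var right = Visit(node.Right);
        if (node.NodeType == ExpressionType.Multiply && (IsZero(left) || IsZero(right)))
            return Expression.Constant(0.0);
        return node.Update(left, node.Conversion, right);
    }

    static bool IsZero(Expression e) =>
        e is ConstantExpression c && c.Value is double d && d == 0.0;
}

public static class OptimizerDemo
{
    public static Func<double, double> Optimize(Expression<Func<double, double>> f)
    {
        var body = new MultiplyByZeroFolder().Visit(f.Body);
        return Expression.Lambda<Func<double, double>>(body, f.Parameters).Compile();
    }

    public static void Main()
    {
        // Without the rewrite, 0 * Log(-1) evaluates to 0 * NaN = NaN;
        // with it, the compiled lambda returns 0 and Log is never called.
        var g = Optimize(x => 0.0 * Math.Log(x));
        Console.WriteLine(g(-1.0)); // 0
    }
}
```

Note the side effect visible here: skipping g(x) also skips any NaN or exception it would have produced, which is why ZeroDividedBy is off by default and its g(x) = 0 case must be validated externally.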
What's new 1.4.0
- AutoDiffNet is now compatible with .NET Standard 1.3
- Fixed some issues in the Expression Optimizer
Version 1.3.0
- Improved performance for Expression optimizer
Version 1.2.0
- Added a Gradient function that evaluates the whole gradient vector at once
Version 1.0.0
- New Expression Optimizer feature (enabled by default)
- Added some unit tests
Version 0.7.4
- Performance improvements
- Term.Sum to add multiple terms
- Tree-based evaluation to cover cases where the compiled expression is too large
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp1.0 was computed. netcoreapp1.1 was computed. netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard1.3 is compatible. netstandard1.4 was computed. netstandard1.5 was computed. netstandard1.6 was computed. netstandard2.0 was computed. netstandard2.1 was computed. |
.NET Framework | net46 was computed. net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen30 was computed. tizen40 was computed. tizen60 was computed. |
Universal Windows Platform | uap was computed. uap10.0 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 1.3
- NETStandard.Library (>= 1.6.1)