CSharpNumerics 2.0.1

.NET CLI:
dotnet add package CSharpNumerics --version 2.0.1

Package Manager:
NuGet\Install-Package CSharpNumerics -Version 2.0.1

PackageReference:
<PackageReference Include="CSharpNumerics" Version="2.0.1" />

Central Package Management (PackageVersion + PackageReference):
<PackageVersion Include="CSharpNumerics" Version="2.0.1" />
<PackageReference Include="CSharpNumerics" />

Paket CLI:
paket add CSharpNumerics --version 2.0.1

Script & Interactive:
#r "nuget: CSharpNumerics, 2.0.1"

File-based apps:
#:package CSharpNumerics@2.0.1

Cake:
#addin nuget:?package=CSharpNumerics&version=2.0.1
#tool nuget:?package=CSharpNumerics&version=2.0.1
🧮 CSharpNumerics
A comprehensive numerical library for scientific computing, mathematical analysis, and iterative processes in C#.
✨ Features
- 🔢 Numerical extensions (Factorial, derivatives, integrals, root finding, etc.)
- 📈 Vectors, matrices, and complex numbers
- 🌊 Vector fields (Gradient, Divergence, Curl, Laplacian)
- 🧠 Complex and real function analysis
- 🔬 Fourier and Laplace transforms, Monte Carlo integration
- 📉 Differential equation solvers (Runge–Kutta, Trapezoidal, etc.)
- 📊 Statistics and regression tools
- ✨ Interpolation methods
- 🤖 Machine Learning
- 🔗 Full integration with LINQ and extension methods
📘 Numeric Extensions
Factorial
int result = 5.Factorial(); // 120
Root Finding (Newton–Raphson)
Func<double, double> func = x => Math.Pow(x, 2) - 4;
double root = func.NewtonRaphson(); // 2
Derivative
Func<double, double> f = x => Math.Pow(x, 2);
Func<double, double> g = x => 4 * x - 3;
var result = f.Derivate(g, 1);
Supports Chain, Product, and Quotient rules via:
var result = f.Derivate(g, Numerics.Enums.DerivateOperator.Product);
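The quotient rule is used the same way. This is a hedged sketch: it assumes the DerivateOperator enum also exposes a Quotient member, matching the rule names listed above:
var quotient = f.Derivate(g, Numerics.Enums.DerivateOperator.Quotient); // d/dx [ f(x) / g(x) ]; Quotient member assumed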
Multiple variables:
Func<double[], double> func = vars => vars[0] * vars[1];
var dfdx = func.Derivate(new double[] { 2, 3 }, index: 0);
Or with vectors:
Func<Vector, double> func = v => v.x * v.y;
var dfdx = func.Derivate(new Vector(2, 3, 0), Cartesian.X);
Derivate series:
Func<double, double> displacement = t => 9.81 * Math.Pow(t, 2) / 2;
var velocity = displacement.GetSeries(0, 10, 1000).Derivate();
∫ Integrals
Trapezoidal rule:
Func<double, double> f = x => Math.Sin(x);
double integral = f.Integrate(0, Math.PI); // ≈ 2
Integrate a series or timeseries:
List<TimeSerie> ts = ...;
double total = ts.Integrate();
Monte Carlo Integration
Func<(double x, double y), double> func = p => p.x * p.y;
double result = func.Integrate((0, 1), (0, 1)); // ≈ 0.25, stochastic estimate of ∫∫ x·y over [0,1]²
🧩 Complex Numbers
var a = new ComplexNumber(3, 2);
var b = new ComplexNumber(5, 3);
var sum = a + b;
var product = a * b;
var power = a.Pow(2); // 5 + 12i
Exponential:
new ComplexNumber(0, Math.PI).Exponential(); // -1
🧭 Vector
var a = new Vector(5, 3, 0);
var b = new Vector(2, 6, 0);
var dot = a.Dot(b);
var cross = a.Cross(b);
From spherical coordinates:
double radius = 1, inclination = Math.PI / 2, azimuth = Math.PI / 4;
var v = Vector.FromSphericalCoordinates(radius, inclination, azimuth);
🧮 Matrix
var A = new Matrix(new double[,] { { 1, 3, 7 }, { 5, 2, 9 } });
var transpose = A.Transpose();
Determinant and inverse require a square matrix:
var S = new Matrix(new double[,] { { 1, 3 }, { 5, 2 } });
var det = S.Determinant();
var inv = S.Inverse();
Arithmetic:
var B = new Matrix(new double[,] { { 2, 5, 1 }, { 4, 3, 7 } });
var sum = A + B;                 // element-wise; same dimensions
var product = A * B.Transpose(); // 2x3 * 3x2; inner dimensions must match
With vector:
var x = new Vector(2, 1, 3);
var y = A * x;
🌐 Vector Field
Gradient
Func<Vector, double> f = p => Math.Pow(p.x, 2) * Math.Pow(p.y, 3);
var grad = f.Gradient((1, -2, 0)); // (-16, 12, 0)
Divergence
var field = new VectorField(
    p => Math.Sin(p.x * p.y),
    p => Math.Cos(p.x * p.y),
    p => Math.Exp(p.z));
double div = field.Divergence((1, 2, 2)); // y·cos(xy) - x·sin(xy) + e^z ≈ 5.65
Curl
var field = new VectorField(p => p.y, p => -p.x, p => 0);
var curl = field.Curl((1, 4, 2)); // (0, 0, -2)
⚙️ Transform
FFT
Func<double, double> f = t => Math.Exp(-t * t / 0.02);
var freq = f.FastFouriertransform(-0.5, 0.5, 100)
.ToFrequencyResolution(100);
Laplace Transform
double result = f.LaplaceTransform(2.0); // Laplace transform of f (defined above), evaluated at s = 2
📐 Differential Equations
Runge–Kutta (RK4)
Func<(double t, double y), double> f = v => Math.Tan(v.y) + 1; // right-hand side of dy/dt = tan(y) + 1
var result = f.RungeKutta(1, 1.1, 0.025, 1);
Linear Systems
var result = A.LinearSystemSolver(b);
var eigenValues = A.EigenValues();
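The snippet above assumes A is square and b is already defined. A minimal, self-contained sketch follows; passing the right-hand side as a Vector is an assumption about the solver's signature:
// 3x3 system A·x = b; a square matrix is required for both the solver and EigenValues
var A = new Matrix(new double[,] { { 4, 1, 0 }, { 1, 3, 1 }, { 0, 1, 2 } });
var b = new Vector(1, 2, 3);        // right-hand side (Vector argument assumed)
var x = A.LinearSystemSolver(b);
var eigenValues = A.EigenValues();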
📊 Statistics
var noise = new Random().GenerateNoise(4);
double median = ts.Median(p => p.Value);
double std = ts.StandardDeviation(p => p.Value);
Coefficient of determination:
var data = new[] { (1.0, 5.0), (2.0, 1.0), (3.0, 4.0), (4.0, 6.0) };
double r2 = data.CoefficientOfDetermination(p => (p.Item1, p.Item2));
Regression:
var (slope, intercept, corr) = serie.LinearRegression(p => (p.Index, p.Value));
var expFunc = serie.ExponentialRegression(p => (p.Index, p.Value));
K-nearest neighbors:
var data = new List<(double x, double y, int c)> { (7,7,0), (7,4,0), (3,4,1), (1,4,1) };
int classification = data.KnearestNeighbors(p => (p.x, p.y, p.c), (3,7), 3);
✨ Interpolation
CSharpNumerics provides a unified interpolation API supporting linear and logarithmic scales:
- Linear
- Log–Log (log x, log y)
- Lin–Log (lin x, log y)
- Log–Lin (log x, lin y)
All methods are routed through one central function.
public enum InterpolationType
{
    Linear,
    Logarithmic, // log–log
    LogLin,      // log x, linear y
    LinLog       // linear x, log y
}

double Interpolate<T>(
    this IEnumerable<T> source,
    Func<T, (double x, double y)> selector,
    double index,
    InterpolationType type);
Example:
var data = new List<Serie>
{
    new Serie { Index = 1, Value = 10 },
    new Serie { Index = 10, Value = 100 }
};

double y = data.Interpolate(
    p => (p.Index, p.Value),
    3.5,
    InterpolationType.Linear
);
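For power-law data, the Logarithmic (log–log) mode interpolates on log-transformed axes. In the sketch below both points lie on y = √x, so a log–log interpolation at x = 9 should return 3, whereas a linear fit between the same two points would give about 1.73:
var powerLaw = new List<Serie>
{
    new Serie { Index = 1, Value = 1 },
    new Serie { Index = 100, Value = 10 }   // both points lie on y = sqrt(x)
};

double yLogLog = powerLaw.Interpolate(
    p => (p.Index, p.Value),
    9,
    InterpolationType.Logarithmic           // log–log: ≈ 3
);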
🤖 Machine Learning
CSharpNumerics includes a lightweight, fully numerical machine learning framework designed for research, experimentation, and educational use. The focus is on transparency, mathematical clarity, and pipeline-based model evaluation — not black-box automation.
All models are implemented directly on top of the library’s Matrix and Vector primitives.
🧩 Pipelines
Models can be combined with:
- Scalers (e.g. StandardScaler)
- Feature selectors (e.g. SelectKBest)
- Cross-validation strategies
- Hyperparameter search grids
var pipelineGrid = new PipelineGrid()
    .AddModel<RandomForest>(g => g
        .Add("NumTrees", 50, 100, 200)
        .Add("MaxDepth", 5, 8, 10))
    .AddModel<Logistic>(g => g
        .Add("LearningRate", 0.05, 0.1)
        .Add("MaxIterations", 1000, 2000)
        .AddScaler<StandardScaler>(s => {})
        .AddSelector<SelectKBest>(s => s
            .Add("K", 1, 2)))
    .AddModel<DecisionTree>(g => g
        .Add("MaxDepth", 3, 5, 8))
    .AddModel<KNearestNeighbors>(g => g
        .Add("K", 3, 5, 7));
Rolling (time-aware) cross validation is supported for both classification and regression tasks.
var cv = new RollingCrossValidator(pipelineGrid, folds: 5);
var result = cv.Run(X, y);
var bestModel = result.BestPipeline;
var score = result.BestScore;
📊 Classification Models
Supported classifiers include:
- Logistic Regression
- Decision Tree
- Random Forest
- K-Nearest Neighbors
- Naive Bayes
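Any of these classifiers drops into the same pipeline machinery. A minimal single-model sketch, reusing only the types shown above (PipelineGrid, StandardScaler, SelectKBest, RollingCrossValidator); X and y stand for the same feature matrix and class labels passed to cv.Run in the earlier example:
var grid = new PipelineGrid()
    .AddModel<Logistic>(g => g
        .Add("LearningRate", 0.05, 0.1)
        .Add("MaxIterations", 1000)
        .AddScaler<StandardScaler>(s => {})
        .AddSelector<SelectKBest>(s => s
            .Add("K", 2)));

var cv = new RollingCrossValidator(grid, folds: 3);
var result = cv.Run(X, y);           // X: feature matrix, y: class labels
var best = result.BestPipeline;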
📈 Regression Models
Supported regression models:
- Linear Regression
- Ridge Regression (L2)
- Lasso Regression (L1)
- Elastic Net (L1 + L2)
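The regression models plug into the same grid and cross-validation API. The sketch below is only illustrative: the class names (Ridge, ElasticNet) and the hyperparameter keys ("Alpha", "L1Ratio") are assumptions and may differ from the actual names in the package:
var regressionGrid = new PipelineGrid()
    .AddModel<Ridge>(g => g              // class name assumed
        .Add("Alpha", 0.1, 1.0))         // hyperparameter key assumed
    .AddModel<ElasticNet>(g => g         // class name assumed
        .Add("Alpha", 0.5)
        .Add("L1Ratio", 0.2, 0.8));      // hyperparameter key assumed

var cv = new RollingCrossValidator(regressionGrid, folds: 5);
var result = cv.Run(X, y);               // y: continuous targets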
📎 Tips
- All methods are available as extension methods; just add using Numerics.Extensions.
- You can export data with .Save(path) for CSV visualization.
- Works with LINQ pipelines for composable scientific workflows (see the sketch below).
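A small sketch of the LINQ tip above; it assumes the statistics extensions accept any IEnumerable<T> with a selector, as the documented Interpolate signature does:
// Build a series with LINQ, then compose plain LINQ operators with the library's extensions
var serie = Enumerable.Range(0, 100)
    .Select(i => new Serie { Index = i, Value = Math.Sin(0.1 * i) })
    .ToList();

double std = serie
    .Where(p => p.Index >= 50)           // ordinary LINQ filtering
    .StandardDeviation(p => p.Value);    // library extension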
🧠 Example: Full Workflow
Func<double, double> func = x => Math.Sin(x);
var integral = func.Integrate(0, Math.PI);
var derivative = func.Derivate(Math.PI / 4);
var fft = func.FastFouriertransform(-1, 1, 100);
🧾 License
MIT License © 2025 — CSharpNumerics
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net10.0 is compatible. net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies
- net10.0: No dependencies.