Tensor 0.4.11

Target framework: .NET Standard 2.0

.NET CLI:
    dotnet add package Tensor --version 0.4.11

Package Manager:
    NuGet\Install-Package Tensor -Version 0.4.11
    This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
    <PackageReference Include="Tensor" Version="0.4.11" />
    For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
    paket add Tensor --version 0.4.11

Script & Interactive:
    #r "nuget: Tensor, 0.4.11"
    The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of the script to reference the package.

Cake:
    // Install Tensor as a Cake Addin
    #addin nuget:?package=Tensor&version=0.4.11

    // Install Tensor as a Cake Tool
    #tool nuget:?package=Tensor&version=0.4.11

Tensor (n-dimensional array) library for F#

     Core features (a short usage sketch follows this list):
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
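
     The sketch below illustrates several of these features on host tensors. It is a minimal example, not taken from the package itself; the HostTensor/Tensor function names and the slicing syntax follow the library's documentation and should be verified against this version.

       open Tensor

       // 3x4 matrix in host memory, initialized from its indices
       let a = HostTensor.init [3L; 4L] (fun [|i; j|] -> 4.0 * float i + float j)
       let b = HostTensor.ones<float> [3L; 4L]

       // element-wise operations
       let c = a + b
       let d = abs (a - b)

       // views and slicing: first two rows, all columns
       let firstRows = a.[0L..1L, *]

       // reductions: sum of all elements and sum along axis 0
       let total   = Tensor.sum a
       let colSums = Tensor.sumAxis 0 a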

     Data exchange (a sketch follows this list):
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
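
     A sketch of typical data exchange. The conversion helpers (HostTensor.ofList, HostTensor.ofArray2D, HostTensor.toArray2D) and the HDF5 wrapper (HDF5.OpenWrite/OpenRead together with HostTensor.write/read) are taken from the library documentation; exact names may differ in this version.

       open Tensor

       // build tensors from standard F# types ...
       let v = HostTensor.ofList [1.0; 2.0; 3.0]
       let m = HostTensor.ofArray2D (array2D [[1.0; 2.0]; [3.0; 4.0]])

       // ... and convert back
       let arr = HostTensor.toArray2D m

       // write to and read from an HDF5 (.h5) file
       do
           use hdf = HDF5.OpenWrite "demo.h5"
           HostTensor.write hdf "m" m
       let m2 =
           use hdf = HDF5.OpenRead "demo.h5"
           HostTensor.read<float> hdf "m"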

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - by default Intel MKL is used (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU and cuBLAS is used for matrix operations (see the sketch after this list)
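
     For the CUDA backend the usual pattern is to move tensors onto the GPU and apply the same operations there. The following sketch assumes the module names (CudaTensor.transfer, HostTensor.transfer) shown in the library documentation and requires a CUDA-capable GPU.

       open Tensor

       // a tensor in host memory
       let h = HostTensor.init [1000L; 1000L] (fun [|i; j|] -> float (i + j))

       // transfer it to the GPU; operations on it now run on the GPU,
       // with cuBLAS handling matrix operations
       let g  = CudaTensor.transfer h
       let g2 = g + g

       // transfer the result back to host memory
       let r = HostTensor.transfer g2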

     Requirements:
       - Linux, macOS or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Compatible and additional computed target framework versions, by product:
.NET: net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows (all computed)
.NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed)
.NET Standard: netstandard2.0 (compatible), netstandard2.1 (computed)
.NET Framework: net461, net462, net463, net47, net471, net472, net48, net481 (all computed)
MonoAndroid: monoandroid (computed)
MonoMac: monomac (computed)
MonoTouch: monotouch (computed)
Tizen: tizen40, tizen60 (computed)
Xamarin.iOS: xamarinios (computed)
Xamarin.Mac: xamarinmac (computed)
Xamarin.TVOS: xamarintvos (computed)
Xamarin.WatchOS: xamarinwatchos (computed)

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

DeepNet

Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.

RPlotTools

Tools for plotting using R from F#.

Tensor.Algorithm

Data types:
  - arbitrary precision rational numbers
Matrix algebra (integer, rational):
  - Row echelon form
  - Smith normal form
  - Kernel, cokernel and (pseudo-)inverse
Matrix decomposition (floating point):
  - Principal component analysis (PCA)
  - ZCA whitening
Misc:
  - Bezout's identity
  - Loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.4.11 6,192 5/8/2018
0.4.11-v0.4.11-215 606 5/8/2018
0.4.11-symtensor-core-242 969 11/15/2018
0.4.11-symtensor-core-241 938 11/15/2018
0.4.11-symtensor-core-240 940 11/15/2018
0.4.11-symtensor-core-239 925 11/15/2018
0.4.11-symtensor-core-238 920 11/15/2018
0.4.11-symtensor-core-237 1,001 11/15/2018
0.4.11-symtensor-core-236 871 11/14/2018
0.4.11-symtensor-core-235 941 11/14/2018
0.4.11-symtensor-core-234 894 11/14/2018
0.4.11-symtensor-core-231 997 11/9/2018
0.4.11-symtensor-core-230 945 11/9/2018
0.4.11-symtensor-core-229 930 11/8/2018
0.4.11-symtensor-core-228 958 11/8/2018
0.4.11-symtensor-core-227 953 10/30/2018
0.4.11-symtensor-core-226 1,025 10/30/2018
0.4.11-symtensor-core-225 939 10/30/2018
0.4.11-develop-216 1,133 5/8/2018
0.4.10-develop-213 1,146 5/8/2018
0.4.10-develop-212 1,101 5/7/2018
0.4.10-develop-211 1,181 5/7/2018
0.3.0.712-master 897 9/1/2017
0.3.0.711-master 902 9/1/2017
0.3.0.710-master 863 9/1/2017
0.3.0.709-master 880 8/31/2017
0.3.0.708-master 895 8/30/2017
0.3.0.707-master 866 8/30/2017
0.3.0.706-master 889 8/30/2017
0.3.0.701-master 915 6/26/2017
0.3.0.700-master 924 6/22/2017
0.3.0.699-master 908 6/22/2017
0.3.0.698-master 889 6/21/2017
0.3.0.697-master 880 6/21/2017
0.3.0.696-master 965 6/21/2017
0.3.0.695-master 921 6/21/2017
0.3.0.694-master 884 6/21/2017
0.3.0.693-master 908 6/20/2017
0.3.0.692-master 893 6/19/2017
0.3.0.691-master 929 6/19/2017
0.3.0.690-master 924 6/19/2017
0.3.0.689-master 904 5/14/2017
0.3.0.688 7,046 5/14/2017
0.3.0.686-master 916 5/14/2017
0.2.0.591-master 881 4/19/2017
0.2.0.565-master 877 4/11/2017
0.2.0.556-master 868 3/21/2017
0.2.0.551-master 934 3/17/2017
0.2.0.540-master 854 3/15/2017
0.2.0.536-master 844 3/14/2017
0.2.0.519-master 894 3/2/2017
0.2.0.516-master 874 3/2/2017
0.2.0.499-master 906 2/13/2017
0.2.0.494-master 878 2/7/2017
0.2.0.479-master 895 2/1/2017
0.2.0.463-master 895 1/17/2017
0.2.0.431-master 964 12/2/2016
0.2.0.422-master 1,253 11/9/2016
0.2.0.421-master 1,194 11/9/2016
0.2.0.411-master 938 10/26/2016
0.2.0.400-master 897 10/26/2016
0.2.0.394-master 902 10/25/2016
0.2.0.382-master 902 10/21/2016
0.2.0.377-master 888 10/20/2016
0.2.0.323-master 887 10/11/2016
0.2.0.262-master 921 9/29/2016
0.2.0.248-master 906 9/27/2016
0.2.0.174-master 901 9/16/2016
0.2.0.128-master 916 9/8/2016
0.2.0.122-master 913 9/8/2016
0.2.0.121-master 882 9/7/2016
0.2.0.111-master 880 9/7/2016
0.2.0.105-ci 936 9/5/2016
0.2.0.97-ci 948 8/30/2016
0.2.0.96-ci 892 8/29/2016
0.2.0.90-ci 910 8/25/2016
0.2.0.89-ci 868 8/24/2016
0.2.0.88-ci 908 8/24/2016
0.2.0.87-ci 908 8/24/2016
0.2.0.86-ci 903 8/23/2016
0.2.0.85-ci 909 8/22/2016
0.2.0.84-ci 919 8/22/2016
0.2.0.83-ci 932 8/22/2016
0.2.0.82 2,114 8/22/2016
0.2.0.81-ci 918 8/19/2016
0.2.0.80-ci 921 6/27/2016
0.2.0.79-ci 925 6/27/2016
0.2.0.77-ci 922 6/22/2016
0.2.0.76-ci 939 6/22/2016
0.2.0.75 1,589 6/15/2016
0.2.0.74-ci 1,273 6/15/2016
0.2.0.73 1,828 6/15/2016
0.2.0.72 1,820 6/15/2016
0.2.0.71 1,812 6/14/2016
0.2.0.70 1,701 6/9/2016
0.2.0.69 1,659 6/9/2016
0.2.0.68 1,494 6/9/2016
0.2.0.67 1,966 6/8/2016
0.2.0.66-ci 929 6/8/2016
0.2.0.65-ci 908 6/8/2016
0.2.0.64-ci 975 6/8/2016
0.2.0.63-ci 902 6/7/2016
0.2.0.62 1,476 6/7/2016
0.2.0.61 1,444 6/6/2016
0.2.0.60 1,465 6/6/2016
0.2.0.59 1,409 6/6/2016
0.2.0.57 1,482 6/3/2016
0.2.0.56 1,449 6/3/2016
0.2.0.55 1,532 6/3/2016
0.2.0.54 1,485 6/3/2016
0.2.0.53 1,802 6/3/2016
0.2.0.52-ci 900 6/2/2016
0.2.0.51-ci 927 6/2/2016
0.2.0.50-ci 917 6/2/2016
0.2.0.49 1,834 5/31/2016
0.2.0.48-ci 978 5/31/2016
0.2.0.46-ci 947 5/31/2016
0.2.0.45 1,666 5/31/2016
0.2.0.44 1,684 5/31/2016
0.2.0.43 1,637 5/31/2016
0.2.0.42 1,668 5/30/2016
0.2.0.41 1,644 5/30/2016
0.2.0.40 1,668 5/30/2016
0.2.0.39 1,710 5/30/2016
0.2.0.38 1,701 5/30/2016
0.2.0.37 1,629 5/30/2016
0.2.0.36 1,683 5/25/2016
0.2.0.35 1,681 5/24/2016
0.2.0.34 1,702 5/24/2016
0.2.0.33 2,497 5/24/2016
0.2.0.32-ci 902 5/24/2016
0.1.26-ci 938 5/24/2016
0.1.24-ci 930 5/24/2016
0.1.19-ci 905 5/24/2016