Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
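
     A short sketch of what these operations look like in F#. The calls below (HostTensor.init, Tensor.sum, Tensor.maxAxis, range slicing) follow the style of the library's documentation for the 0.4 series, but treat them as an illustrative sketch and check exact signatures against the API reference:

     ```fsharp
     open Tensor

     // create a 3x4 host tensor where element (i,j) = i*10 + j
     let a = HostTensor.init [3L; 4L] (fun idx -> float (idx.[0] * 10L + idx.[1]))

     // element-wise operations
     let b = a * 2.0 + 1.0
     let c = abs (a - b)

     // reductions
     let total  = Tensor.sum a         // sum over all elements
     let colMax = Tensor.maxAxis 0 a   // maximum of each column

     // views and slicing (no copy)
     let row1 = a.[1L, *]

     // broadcasting: add a row vector to every row of a
     let d = a + row1
     ```

     Note that shapes are given as int64 lists and slicing indices as int64 values, matching the library's convention.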

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
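
     A hedged sketch of the HDF5 round trip, assuming the HDF5.OpenWrite / HostTensor.write / HostTensor.read pattern shown in the library's documentation (the path "sample.h5" and dataset name "/a" are placeholders):

     ```fsharp
     open Tensor

     // build a tensor from a standard F# list
     let a = HostTensor.ofList [1.0; 2.0; 3.0]

     // write it to an HDF5 file
     do
         use hdf = HDF5.OpenWrite "sample.h5"
         HostTensor.write hdf "/a" a

     // read it back
     let b =
         use hdf = HDF5.OpenRead "sample.h5"
         HostTensor.read<float> hdf "/a"
     ```

     Files written this way can also be opened from Python via h5py, which makes HDF5 a convenient exchange format between F# and other ecosystems.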

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - Intel MKL is used by default (shipped with the NuGet package)
         - other BLAS libraries (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU, with cuBLAS used for matrix operations
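
     A sketch of moving work onto the GPU; CudaTensor.transfer and HostTensor.transfer follow the library's documented naming, and the .* operator is its matrix-product operator, but verify both against the reference documentation before relying on them:

     ```fsharp
     open Tensor

     // build a tensor in host memory
     let a = HostTensor.init [1000L; 1000L] (fun idx -> float idx.[0])

     // transfer to the GPU; subsequent operations run on the device
     let ca = CudaTensor.transfer a
     let cb = ca .* ca          // matrix product, executed via cuBLAS
     let r  = ca + cb           // element-wise addition, on the GPU

     // transfer the result back to host memory
     let rHost = HostTensor.transfer r
     ```

     Keeping intermediate results on the device, as above, avoids repeated host/GPU copies, which usually dominate the cost of small operations.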

     Requirements:
       - Linux, macOS, or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Installation

  Package Manager:  Install-Package Tensor -Version 0.4.11
  .NET CLI:         dotnet add package Tensor --version 0.4.11
  PackageReference: <PackageReference Include="Tensor" Version="0.4.11" />
                    (for projects that support PackageReference, copy this XML node into the project file)
  Paket:            paket add Tensor --version 0.4.11

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

Package Downloads
DeepNet
Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.
RPlotTools
Tools for plotting using R from F#.
Tensor.Algorithm
Data types:
  - arbitrary-precision rational numbers
Matrix algebra (integer, rational):
  - row echelon form
  - Smith normal form
  - kernel, cokernel, and (pseudo-)inverse
Matrix decomposition (floating point):
  - principal component analysis (PCA)
  - ZCA whitening
Misc:
  - Bezout's identity
  - loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version History

Version Downloads Last updated
0.4.11 1,369 5/8/2018
0.4.11-v0.4.11-215 227 5/8/2018
0.4.11-symtensor-core-242 243 11/15/2018
0.4.11-symtensor-core-241 224 11/15/2018
0.4.11-symtensor-core-240 223 11/15/2018
0.4.11-symtensor-core-239 203 11/15/2018
0.4.11-symtensor-core-238 224 11/15/2018
0.4.11-symtensor-core-237 249 11/15/2018
0.4.11-symtensor-core-236 202 11/14/2018
0.4.11-symtensor-core-235 203 11/14/2018
0.4.11-symtensor-core-234 204 11/14/2018
0.4.11-symtensor-core-231 238 11/9/2018
0.4.11-symtensor-core-230 238 11/9/2018
0.4.11-symtensor-core-229 205 11/8/2018
0.4.11-symtensor-core-228 204 11/8/2018
0.4.11-symtensor-core-227 239 10/30/2018
0.4.11-symtensor-core-226 237 10/30/2018
0.4.11-symtensor-core-225 208 10/30/2018
0.4.11-develop-216 340 5/8/2018
0.4.10-develop-213 341 5/8/2018
0.4.10-develop-212 333 5/7/2018
0.4.10-develop-211 335 5/7/2018
0.3.0.712-master 331 9/1/2017
0.3.0.711-master 332 9/1/2017
0.3.0.710-master 320 9/1/2017
0.3.0.709-master 303 8/31/2017
0.3.0.708-master 325 8/30/2017
0.3.0.707-master 336 8/30/2017
0.3.0.706-master 314 8/30/2017
0.3.0.701-master 353 6/26/2017
0.3.0.700-master 375 6/22/2017
0.3.0.699-master 347 6/22/2017
0.3.0.698-master 346 6/21/2017
0.3.0.697-master 347 6/21/2017
0.3.0.696-master 373 6/21/2017
0.3.0.695-master 346 6/21/2017
0.3.0.694-master 341 6/21/2017
0.3.0.693-master 351 6/20/2017
0.3.0.692-master 342 6/19/2017
0.3.0.691-master 365 6/19/2017
0.3.0.690-master 349 6/19/2017
0.3.0.689-master 348 5/14/2017
0.3.0.688 1,513 5/14/2017
0.3.0.686-master 350 5/14/2017
0.2.0.591-master 359 4/19/2017
0.2.0.565-master 368 4/11/2017
0.2.0.556-master 358 3/21/2017
0.2.0.551-master 410 3/17/2017
0.2.0.540-master 344 3/15/2017
0.2.0.536-master 339 3/14/2017
0.2.0.519-master 354 3/2/2017
0.2.0.516-master 344 3/2/2017
0.2.0.499-master 367 2/13/2017
0.2.0.494-master 353 2/7/2017
0.2.0.479-master 369 2/1/2017
0.2.0.463-master 366 1/17/2017
0.2.0.431-master 443 12/2/2016
0.2.0.422-master 379 11/9/2016
0.2.0.421-master 369 11/9/2016
0.2.0.411-master 421 10/26/2016
0.2.0.400-master 370 10/26/2016
0.2.0.394-master 391 10/25/2016
0.2.0.382-master 375 10/21/2016
0.2.0.377-master 367 10/20/2016
0.2.0.323-master 368 10/11/2016
0.2.0.262-master 384 9/29/2016
0.2.0.248-master 388 9/27/2016
0.2.0.174-master 384 9/16/2016
0.2.0.128-master 388 9/8/2016
0.2.0.122-master 390 9/8/2016
0.2.0.121-master 378 9/7/2016
0.2.0.111-master 371 9/7/2016
0.2.0.105-ci 419 9/5/2016
0.2.0.97-ci 408 8/30/2016
0.2.0.96-ci 379 8/29/2016
0.2.0.90-ci 383 8/25/2016
0.2.0.89-ci 370 8/24/2016
0.2.0.88-ci 376 8/24/2016
0.2.0.87-ci 392 8/24/2016
0.2.0.86-ci 378 8/23/2016
0.2.0.85-ci 376 8/22/2016
0.2.0.84-ci 389 8/22/2016
0.2.0.83-ci 392 8/22/2016
0.2.0.82 610 8/22/2016
0.2.0.81-ci 398 8/19/2016
0.2.0.80-ci 403 6/27/2016
0.2.0.79-ci 399 6/27/2016
0.2.0.77-ci 406 6/22/2016
0.2.0.76-ci 404 6/22/2016
0.2.0.75 465 6/15/2016
0.2.0.74-ci 395 6/15/2016
0.2.0.73 433 6/15/2016
0.2.0.72 446 6/15/2016
0.2.0.71 479 6/14/2016
0.2.0.70 432 6/9/2016
0.2.0.69 402 6/9/2016
0.2.0.68 429 6/9/2016
0.2.0.67 510 6/8/2016
0.2.0.66-ci 393 6/8/2016
0.2.0.65-ci 385 6/8/2016
0.2.0.64-ci 428 6/8/2016
0.2.0.63-ci 380 6/7/2016
0.2.0.62 430 6/7/2016
0.2.0.61 416 6/6/2016
0.2.0.60 413 6/6/2016
0.2.0.59 413 6/6/2016
0.2.0.57 431 6/3/2016
0.2.0.56 425 6/3/2016
0.2.0.55 459 6/3/2016
0.2.0.54 436 6/3/2016
0.2.0.53 472 6/3/2016
0.2.0.52-ci 387 6/2/2016
0.2.0.51-ci 392 6/2/2016
0.2.0.50-ci 397 6/2/2016
0.2.0.49 498 5/31/2016
0.2.0.48-ci 407 5/31/2016
0.2.0.46-ci 396 5/31/2016
0.2.0.45 430 5/31/2016
0.2.0.44 436 5/31/2016
0.2.0.43 453 5/31/2016
0.2.0.42 448 5/30/2016
0.2.0.41 447 5/30/2016
0.2.0.40 444 5/30/2016
0.2.0.39 451 5/30/2016
0.2.0.38 435 5/30/2016
0.2.0.37 435 5/30/2016
0.2.0.36 433 5/25/2016
0.2.0.35 454 5/24/2016
0.2.0.34 450 5/24/2016
0.2.0.33 598 5/24/2016
0.2.0.32-ci 385 5/24/2016
0.1.26-ci 408 5/24/2016
0.1.24-ci 396 5/24/2016
0.1.19-ci 385 5/24/2016