Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logical operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.); see the example after this list
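
     A minimal usage sketch of these core operations, based on the HostTensor
     API in the project documentation (exact function names are an assumption
     and may differ between versions):

        open Tensor

        // create a 3x3 host tensor, initialized from its (int64) indices
        let a = HostTensor.init [3L; 3L] (fun [|i; j|] -> float i + float j)

        // element-wise operations
        let b = a * a + abs a

        // views and slicing (similar to NumPy); the view shares memory with a
        let row0 = a.[0L, *]

        // reductions
        let total   = Tensor.sum a          // scalar sum over all elements
        let colSums = Tensor.sumAxis 0 a    // sum along the first axis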

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.); a sketch follows below
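
     For example, a tensor can be built from standard F# collections and
     round-tripped through an HDF5 file. A hedged sketch following the HDF5
     and conversion helpers shown in the project documentation (names such
     as HostTensor.ofArray2D and HDF5.OpenWrite are taken from the docs and
     may vary by version):

        open Tensor

        // from standard F# types
        let v = HostTensor.ofList [1.0; 2.0; 3.0]
        let m = HostTensor.ofArray2D (array2D [[1.0; 2.0]; [3.0; 4.0]])

        // write to an HDF5 file ...
        do
            use hdf = HDF5.OpenWrite "data.h5"
            HostTensor.write hdf "m" m

        // ... and read it back
        let m2 =
            use hdf = HDF5.OpenRead "data.h5"
            HostTensor.read<float> hdf "m"

        // back to a standard Array2D
        let arr = HostTensor.toArray2D m2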

     Performance:
       - host: SIMD and BLAS accelerated operations
         - by default Intel MKL is used (shipped with NuGet package)
          - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
        - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations (see the sketch after this list)
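
     A sketch of GPU usage, assuming the CudaTensor transfer helpers from the
     project documentation and the .* matrix-product operator it describes
     (requires a CUDA-capable GPU):

        open Tensor

        let a = HostTensor.init [1000L; 1000L] (fun [|i; j|] -> float (i + j))

        // transfer to the GPU; subsequent operations execute on the device
        let ca = CudaTensor.transfer a

        // element-wise work and a matrix product (cuBLAS) on the GPU
        let cb = ca + ca
        let cp = cb .* cb

        // transfer the result back to host memory
        let p = HostTensor.transfer cp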

     Requirements:
        - Linux, macOS, or Windows on x64
        - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

To install, use the NuGet Package Manager console:

    Install-Package Tensor -Version 0.4.11

or the .NET CLI:

    dotnet add package Tensor --version 0.4.11

or, for projects that support PackageReference, copy this XML node into the project file:

    <PackageReference Include="Tensor" Version="0.4.11" />

or Paket:

    paket add Tensor --version 0.4.11


Version History

Version Downloads Last updated
0.4.11 763 5/8/2018
0.4.11-v0.4.11-215 190 5/8/2018
0.4.11-symtensor-core-242 188 11/15/2018
0.4.11-symtensor-core-241 142 11/15/2018
0.4.11-symtensor-core-240 157 11/15/2018
0.4.11-symtensor-core-239 150 11/15/2018
0.4.11-symtensor-core-238 158 11/15/2018
0.4.11-symtensor-core-237 189 11/15/2018
0.4.11-symtensor-core-236 146 11/14/2018
0.4.11-symtensor-core-235 153 11/14/2018
0.4.11-symtensor-core-234 143 11/14/2018
0.4.11-symtensor-core-231 176 11/9/2018
0.4.11-symtensor-core-230 167 11/9/2018
0.4.11-symtensor-core-229 146 11/8/2018
0.4.11-symtensor-core-228 145 11/8/2018
0.4.11-symtensor-core-227 185 10/30/2018
0.4.11-symtensor-core-226 183 10/30/2018
0.4.11-symtensor-core-225 151 10/30/2018
0.4.11-develop-216 262 5/8/2018
0.4.10-develop-213 249 5/8/2018
0.4.10-develop-212 251 5/7/2018
0.4.10-develop-211 263 5/7/2018
0.3.0.712-master 272 9/1/2017
0.3.0.711-master 265 9/1/2017
0.3.0.710-master 262 9/1/2017
0.3.0.709-master 244 8/31/2017
0.3.0.708-master 271 8/30/2017
0.3.0.707-master 277 8/30/2017
0.3.0.706-master 262 8/30/2017
0.3.0.701-master 304 6/26/2017
0.3.0.700-master 324 6/22/2017
0.3.0.699-master 289 6/22/2017
0.3.0.698-master 290 6/21/2017
0.3.0.697-master 290 6/21/2017
0.3.0.696-master 314 6/21/2017
0.3.0.695-master 293 6/21/2017
0.3.0.694-master 284 6/21/2017
0.3.0.693-master 293 6/20/2017
0.3.0.692-master 287 6/19/2017
0.3.0.691-master 303 6/19/2017
0.3.0.690-master 286 6/19/2017
0.3.0.689-master 296 5/14/2017
0.3.0.688 550 5/14/2017
0.3.0.686-master 297 5/14/2017
0.2.0.591-master 300 4/19/2017
0.2.0.565-master 303 4/11/2017
0.2.0.556-master 305 3/21/2017
0.2.0.551-master 353 3/17/2017
0.2.0.540-master 285 3/15/2017
0.2.0.536-master 286 3/14/2017
0.2.0.519-master 301 3/2/2017
0.2.0.516-master 290 3/2/2017
0.2.0.499-master 305 2/13/2017
0.2.0.494-master 296 2/7/2017
0.2.0.479-master 313 2/1/2017
0.2.0.463-master 311 1/17/2017
0.2.0.431-master 380 12/2/2016
0.2.0.422-master 326 11/9/2016
0.2.0.421-master 308 11/9/2016
0.2.0.411-master 361 10/26/2016
0.2.0.400-master 315 10/26/2016
0.2.0.394-master 334 10/25/2016
0.2.0.382-master 321 10/21/2016
0.2.0.377-master 311 10/20/2016
0.2.0.323-master 314 10/11/2016
0.2.0.262-master 323 9/29/2016
0.2.0.248-master 323 9/27/2016
0.2.0.174-master 334 9/16/2016
0.2.0.128-master 325 9/8/2016
0.2.0.122-master 336 9/8/2016
0.2.0.121-master 324 9/7/2016
0.2.0.111-master 313 9/7/2016
0.2.0.105-ci 363 9/5/2016
0.2.0.97-ci 344 8/30/2016
0.2.0.96-ci 322 8/29/2016
0.2.0.90-ci 327 8/25/2016
0.2.0.89-ci 313 8/24/2016
0.2.0.88-ci 323 8/24/2016
0.2.0.87-ci 325 8/24/2016
0.2.0.86-ci 319 8/23/2016
0.2.0.85-ci 328 8/22/2016
0.2.0.84-ci 332 8/22/2016
0.2.0.83-ci 335 8/22/2016
0.2.0.82 423 8/22/2016
0.2.0.81-ci 338 8/19/2016
0.2.0.80-ci 343 6/27/2016
0.2.0.79-ci 340 6/27/2016
0.2.0.77-ci 349 6/22/2016
0.2.0.76-ci 342 6/22/2016
0.2.0.75 357 6/15/2016
0.2.0.74-ci 339 6/15/2016
0.2.0.73 339 6/15/2016
0.2.0.72 352 6/15/2016
0.2.0.71 392 6/14/2016
0.2.0.70 352 6/9/2016
0.2.0.69 322 6/9/2016
0.2.0.68 350 6/9/2016
0.2.0.67 347 6/8/2016
0.2.0.66-ci 334 6/8/2016
0.2.0.65-ci 328 6/8/2016
0.2.0.64-ci 364 6/8/2016
0.2.0.63-ci 327 6/7/2016
0.2.0.62 346 6/7/2016
0.2.0.61 336 6/6/2016
0.2.0.60 326 6/6/2016
0.2.0.59 326 6/6/2016
0.2.0.57 344 6/3/2016
0.2.0.56 336 6/3/2016
0.2.0.55 373 6/3/2016
0.2.0.54 346 6/3/2016
0.2.0.53 328 6/3/2016
0.2.0.52-ci 332 6/2/2016
0.2.0.51-ci 334 6/2/2016
0.2.0.50-ci 339 6/2/2016
0.2.0.49 353 5/31/2016
0.2.0.48-ci 344 5/31/2016
0.2.0.46-ci 328 5/31/2016
0.2.0.45 339 5/31/2016
0.2.0.44 337 5/31/2016
0.2.0.43 345 5/31/2016
0.2.0.42 358 5/30/2016
0.2.0.41 354 5/30/2016
0.2.0.40 341 5/30/2016
0.2.0.39 353 5/30/2016
0.2.0.38 342 5/30/2016
0.2.0.37 337 5/30/2016
0.2.0.36 344 5/25/2016
0.2.0.35 352 5/24/2016
0.2.0.34 348 5/24/2016
0.2.0.33 346 5/24/2016
0.2.0.32-ci 327 5/24/2016
0.1.26-ci 345 5/24/2016
0.1.24-ci 336 5/24/2016
0.1.19-ci 328 5/24/2016