Tensor 0.4.11

An F# tensor (n-dimensional array) library with SIMD and GPU acceleration.

     Core features (sketched in code below):
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, singular value decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logical operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
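A minimal sketch of the core operations listed above, based on the HostTensor API shown in the package documentation; Tensor.sumAxis and the scalar operator overloads are assumptions from the 0.4.x docs, not guarantees:

```fsharp
open Tensor

// 3x3 matrix in host memory, filled from an index function
let a = HostTensor.init [3L; 3L] (fun [| i; j |] -> 3.0 * float i + float j)

// element-wise operations; the scalar 1.0 is broadcast over all elements
let b = a * a + 1.0

// slicing creates views (similar to NumPy); indices are int64
let row0 = a.[0L, *]             // first row as a view
let blk  = a.[1L.., 0L..1L]      // sub-block view

// reductions
let total   = Tensor.sum b       // sum of all elements (a scalar)
let rowSums = Tensor.sumAxis 1 b // sums along axis 1
```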

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
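A sketch of the data-exchange features, assuming the HDF5.OpenWrite/OpenRead handles and the HostTensor.write/read helpers from the package documentation; HostTensor.ofArray2D is an assumed name for the Array2D interop function:

```fsharp
open Tensor

// build a host tensor from a standard F# 2D array
let arr = Array2D.init 2 3 (fun i j -> float (i * 3 + j))
let t   = HostTensor.ofArray2D arr

// write the tensor into an HDF5 (.h5) file
do
    use hdf = HDF5.OpenWrite "data.h5"
    HostTensor.write hdf "t" t

// read it back, specifying the expected element type
let t2 =
    use hdf = HDF5.OpenRead "data.h5"
    HostTensor.read<float> hdf "t"
```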

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - by default Intel MKL is used (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations (see the sketch below)
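A sketch of host-to-GPU transfer, assuming the CudaTensor.Dev / HostTensor.Dev device handles and the .* matrix-product operator described in the package documentation (a CUDA-capable GPU is required):

```fsharp
open Tensor

// create a tensor on the host, then transfer it to the CUDA device
let h = HostTensor.init [256L; 256L] (fun [| i; j |] -> float (i + j))
let d = Tensor.transfer CudaTensor.Dev h

// executed on the GPU; the matrix product is dispatched to cuBLAS
let p = d .* d

// transfer the result back into host memory
let r = Tensor.transfer HostTensor.Dev p
```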

     Requirements:
       - Linux, macOS, or Windows on x64
       - Linux requires libgomp.so.1 to be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Package Manager:  Install-Package Tensor -Version 0.4.11
.NET CLI:         dotnet add package Tensor --version 0.4.11
Paket:            paket add Tensor --version 0.4.11

For projects that support PackageReference, copy this XML node into the project file to reference the package:
<PackageReference Include="Tensor" Version="0.4.11" />

Version History

Version Downloads Last updated
0.4.11 532 5/8/2018
0.4.11-v0.4.11-215 143 5/8/2018
0.4.11-symtensor-core-242 124 11/15/2018
0.4.11-symtensor-core-241 105 11/15/2018
0.4.11-symtensor-core-240 115 11/15/2018
0.4.11-symtensor-core-239 106 11/15/2018
0.4.11-symtensor-core-238 112 11/15/2018
0.4.11-symtensor-core-237 145 11/15/2018
0.4.11-symtensor-core-236 106 11/14/2018
0.4.11-symtensor-core-235 107 11/14/2018
0.4.11-symtensor-core-234 101 11/14/2018
0.4.11-symtensor-core-231 132 11/9/2018
0.4.11-symtensor-core-230 123 11/9/2018
0.4.11-symtensor-core-229 104 11/8/2018
0.4.11-symtensor-core-228 102 11/8/2018
0.4.11-symtensor-core-227 138 10/30/2018
0.4.11-symtensor-core-226 137 10/30/2018
0.4.11-symtensor-core-225 107 10/30/2018
0.4.11-develop-216 183 5/8/2018
0.4.10-develop-213 177 5/8/2018
0.4.10-develop-212 184 5/7/2018
0.4.10-develop-211 189 5/7/2018
0.3.0.712-master 213 9/1/2017
0.3.0.711-master 197 9/1/2017
0.3.0.710-master 198 9/1/2017
0.3.0.709-master 183 8/31/2017
0.3.0.708-master 209 8/30/2017
0.3.0.707-master 208 8/30/2017
0.3.0.706-master 201 8/30/2017
0.3.0.701-master 245 6/26/2017
0.3.0.700-master 248 6/22/2017
0.3.0.699-master 229 6/22/2017
0.3.0.698-master 231 6/21/2017
0.3.0.697-master 231 6/21/2017
0.3.0.696-master 242 6/21/2017
0.3.0.695-master 233 6/21/2017
0.3.0.694-master 227 6/21/2017
0.3.0.693-master 233 6/20/2017
0.3.0.692-master 226 6/19/2017
0.3.0.691-master 244 6/19/2017
0.3.0.690-master 225 6/19/2017
0.3.0.689-master 237 5/14/2017
0.3.0.688 463 5/14/2017
0.3.0.686-master 237 5/14/2017
0.2.0.591-master 241 4/19/2017
0.2.0.565-master 239 4/11/2017
0.2.0.556-master 237 3/21/2017
0.2.0.551-master 274 3/17/2017
0.2.0.540-master 229 3/15/2017
0.2.0.536-master 230 3/14/2017
0.2.0.519-master 243 3/2/2017
0.2.0.516-master 232 3/2/2017
0.2.0.499-master 246 2/13/2017
0.2.0.494-master 236 2/7/2017
0.2.0.479-master 254 2/1/2017
0.2.0.463-master 253 1/17/2017
0.2.0.431-master 307 12/2/2016
0.2.0.422-master 265 11/9/2016
0.2.0.421-master 248 11/9/2016
0.2.0.411-master 302 10/26/2016
0.2.0.400-master 256 10/26/2016
0.2.0.394-master 275 10/25/2016
0.2.0.382-master 260 10/21/2016
0.2.0.377-master 246 10/20/2016
0.2.0.323-master 254 10/11/2016
0.2.0.262-master 257 9/29/2016
0.2.0.248-master 261 9/27/2016
0.2.0.174-master 273 9/16/2016
0.2.0.128-master 265 9/8/2016
0.2.0.122-master 272 9/8/2016
0.2.0.121-master 259 9/7/2016
0.2.0.111-master 252 9/7/2016
0.2.0.105-ci 287 9/5/2016
0.2.0.97-ci 277 8/30/2016
0.2.0.96-ci 262 8/29/2016
0.2.0.90-ci 267 8/25/2016
0.2.0.89-ci 251 8/24/2016
0.2.0.88-ci 260 8/24/2016
0.2.0.87-ci 259 8/24/2016
0.2.0.86-ci 261 8/23/2016
0.2.0.85-ci 267 8/22/2016
0.2.0.84-ci 265 8/22/2016
0.2.0.83-ci 271 8/22/2016
0.2.0.82 358 8/22/2016
0.2.0.81-ci 270 8/19/2016
0.2.0.80-ci 285 6/27/2016
0.2.0.79-ci 282 6/27/2016
0.2.0.77-ci 286 6/22/2016
0.2.0.76-ci 281 6/22/2016
0.2.0.75 288 6/15/2016
0.2.0.74-ci 280 6/15/2016
0.2.0.73 278 6/15/2016
0.2.0.72 289 6/15/2016
0.2.0.71 332 6/14/2016
0.2.0.70 290 6/9/2016
0.2.0.69 260 6/9/2016
0.2.0.68 284 6/9/2016
0.2.0.67 284 6/8/2016
0.2.0.66-ci 267 6/8/2016
0.2.0.65-ci 265 6/8/2016
0.2.0.64-ci 297 6/8/2016
0.2.0.63-ci 269 6/7/2016
0.2.0.62 284 6/7/2016
0.2.0.61 272 6/6/2016
0.2.0.60 264 6/6/2016
0.2.0.59 263 6/6/2016
0.2.0.57 280 6/3/2016
0.2.0.56 274 6/3/2016
0.2.0.55 307 6/3/2016
0.2.0.54 285 6/3/2016
0.2.0.53 265 6/3/2016
0.2.0.52-ci 272 6/2/2016
0.2.0.51-ci 274 6/2/2016
0.2.0.50-ci 281 6/2/2016
0.2.0.49 287 5/31/2016
0.2.0.48-ci 281 5/31/2016
0.2.0.46-ci 265 5/31/2016
0.2.0.45 282 5/31/2016
0.2.0.44 273 5/31/2016
0.2.0.43 282 5/31/2016
0.2.0.42 299 5/30/2016
0.2.0.41 287 5/30/2016
0.2.0.40 279 5/30/2016
0.2.0.39 289 5/30/2016
0.2.0.38 280 5/30/2016
0.2.0.37 279 5/30/2016
0.2.0.36 285 5/25/2016
0.2.0.35 291 5/24/2016
0.2.0.34 286 5/24/2016
0.2.0.33 286 5/24/2016
0.2.0.32-ci 265 5/24/2016
0.1.26-ci 280 5/24/2016
0.1.24-ci 274 5/24/2016
0.1.19-ci 269 5/24/2016