Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, singular value decomposition (SVD), matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
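     The features above can be sketched briefly. This is a hedged example: the names (`HostTensor.init`, `HostTensor.ones`, `Tensor.sum`, int64 shape lists, NumPy-style slicing with `*`) follow the library's documented conventions, but check the API reference for your version before relying on exact signatures:

     ```fsharp
     open Tensor

     // create a 3x4 host tensor with element [i; j] = 4*i + j
     let a = HostTensor.init [3L; 4L] (fun [|i; j|] -> 4.0 * float i + float j)
     let b = HostTensor.ones<float> [4L]

     // element-wise addition with broadcasting: b is broadcast over a's rows
     let c = a + b

     // reduction and slicing
     let total = Tensor.sum c      // sum over all elements
     let row0  = c.[0L, *]         // view of the first row, NumPy-style
     ```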

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
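     A sketch of the data-exchange path (hedged: `HostTensor.ofList`, `HDF5.OpenWrite`/`HDF5.OpenRead`, and `HostTensor.write`/`HostTensor.read` are taken from the library's documentation, but verify them against the API reference for your version):

     ```fsharp
     open Tensor

     // build a tensor from a standard F# list
     let t = HostTensor.ofList [1.0; 2.0; 3.0]

     // write it to an HDF5 file
     do
         use hdf = HDF5.OpenWrite "tensors.h5"
         HostTensor.write hdf "t" t

     // read it back
     use hdf2 = HDF5.OpenRead "tensors.h5"
     let t2 = HostTensor.read<float> hdf2 "t"
     ```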

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - by default, Intel MKL is used (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
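     Moving work to the GPU can be sketched as follows (hedged: `CudaTensor.transfer`, `HostTensor.transfer`, and the `.*` dot-product operator follow the library's documented conventions, but confirm them in the API reference):

     ```fsharp
     open Tensor

     let h = HostTensor.init [1000L; 1000L] (fun [|i; j|] -> float (i + j))

     // transfer to the GPU; subsequent operations run on the device,
     // with cuBLAS handling the matrix product
     let d    = CudaTensor.transfer h
     let dd   = d .* d                  // matrix multiplication on the GPU
     let back = HostTensor.transfer dd  // copy the result back to host memory
     ```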

     Requirements:
       - Linux, macOS, or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Package Manager:  Install-Package Tensor -Version 0.4.11
.NET CLI:         dotnet add package Tensor --version 0.4.11
PackageReference: <PackageReference Include="Tensor" Version="0.4.11" />
                  (for projects that support PackageReference, copy this XML node into the project file)
Paket:            paket add Tensor --version 0.4.11


Version History

Version Downloads Last updated
0.4.11 561 5/8/2018
0.4.11-v0.4.11-215 151 5/8/2018
0.4.11-symtensor-core-242 155 11/15/2018
0.4.11-symtensor-core-241 113 11/15/2018
0.4.11-symtensor-core-240 125 11/15/2018
0.4.11-symtensor-core-239 116 11/15/2018
0.4.11-symtensor-core-238 124 11/15/2018
0.4.11-symtensor-core-237 155 11/15/2018
0.4.11-symtensor-core-236 114 11/14/2018
0.4.11-symtensor-core-235 119 11/14/2018
0.4.11-symtensor-core-234 110 11/14/2018
0.4.11-symtensor-core-231 144 11/9/2018
0.4.11-symtensor-core-230 133 11/9/2018
0.4.11-symtensor-core-229 115 11/8/2018
0.4.11-symtensor-core-228 114 11/8/2018
0.4.11-symtensor-core-227 150 10/30/2018
0.4.11-symtensor-core-226 148 10/30/2018
0.4.11-symtensor-core-225 118 10/30/2018
0.4.11-develop-216 202 5/8/2018
0.4.10-develop-213 192 5/8/2018
0.4.10-develop-212 196 5/7/2018
0.4.10-develop-211 204 5/7/2018
0.3.0.712-master 226 9/1/2017
0.3.0.711-master 217 9/1/2017
0.3.0.710-master 215 9/1/2017
0.3.0.709-master 201 8/31/2017
0.3.0.708-master 225 8/30/2017
0.3.0.707-master 223 8/30/2017
0.3.0.706-master 217 8/30/2017
0.3.0.701-master 257 6/26/2017
0.3.0.700-master 277 6/22/2017
0.3.0.699-master 243 6/22/2017
0.3.0.698-master 245 6/21/2017
0.3.0.697-master 245 6/21/2017
0.3.0.696-master 269 6/21/2017
0.3.0.695-master 246 6/21/2017
0.3.0.694-master 238 6/21/2017
0.3.0.693-master 247 6/20/2017
0.3.0.692-master 243 6/19/2017
0.3.0.691-master 259 6/19/2017
0.3.0.690-master 241 6/19/2017
0.3.0.689-master 253 5/14/2017
0.3.0.688 480 5/14/2017
0.3.0.686-master 250 5/14/2017
0.2.0.591-master 254 4/19/2017
0.2.0.565-master 257 4/11/2017
0.2.0.556-master 251 3/21/2017
0.2.0.551-master 305 3/17/2017
0.2.0.540-master 243 3/15/2017
0.2.0.536-master 242 3/14/2017
0.2.0.519-master 258 3/2/2017
0.2.0.516-master 246 3/2/2017
0.2.0.499-master 258 2/13/2017
0.2.0.494-master 247 2/7/2017
0.2.0.479-master 264 2/1/2017
0.2.0.463-master 265 1/17/2017
0.2.0.431-master 335 12/2/2016
0.2.0.422-master 281 11/9/2016
0.2.0.421-master 264 11/9/2016
0.2.0.411-master 318 10/26/2016
0.2.0.400-master 272 10/26/2016
0.2.0.394-master 288 10/25/2016
0.2.0.382-master 274 10/21/2016
0.2.0.377-master 265 10/20/2016
0.2.0.323-master 269 10/11/2016
0.2.0.262-master 273 9/29/2016
0.2.0.248-master 277 9/27/2016
0.2.0.174-master 289 9/16/2016
0.2.0.128-master 277 9/8/2016
0.2.0.122-master 288 9/8/2016
0.2.0.121-master 275 9/7/2016
0.2.0.111-master 266 9/7/2016
0.2.0.105-ci 314 9/5/2016
0.2.0.97-ci 296 8/30/2016
0.2.0.96-ci 276 8/29/2016
0.2.0.90-ci 282 8/25/2016
0.2.0.89-ci 266 8/24/2016
0.2.0.88-ci 279 8/24/2016
0.2.0.87-ci 277 8/24/2016
0.2.0.86-ci 273 8/23/2016
0.2.0.85-ci 278 8/22/2016
0.2.0.84-ci 284 8/22/2016
0.2.0.83-ci 285 8/22/2016
0.2.0.82 373 8/22/2016
0.2.0.81-ci 286 8/19/2016
0.2.0.80-ci 298 6/27/2016
0.2.0.79-ci 296 6/27/2016
0.2.0.77-ci 299 6/22/2016
0.2.0.76-ci 293 6/22/2016
0.2.0.75 306 6/15/2016
0.2.0.74-ci 294 6/15/2016
0.2.0.73 294 6/15/2016
0.2.0.72 305 6/15/2016
0.2.0.71 344 6/14/2016
0.2.0.70 305 6/9/2016
0.2.0.69 274 6/9/2016
0.2.0.68 300 6/9/2016
0.2.0.67 297 6/8/2016
0.2.0.66-ci 285 6/8/2016
0.2.0.65-ci 283 6/8/2016
0.2.0.64-ci 316 6/8/2016
0.2.0.63-ci 283 6/7/2016
0.2.0.62 299 6/7/2016
0.2.0.61 287 6/6/2016
0.2.0.60 276 6/6/2016
0.2.0.59 276 6/6/2016
0.2.0.57 296 6/3/2016
0.2.0.56 286 6/3/2016
0.2.0.55 322 6/3/2016
0.2.0.54 296 6/3/2016
0.2.0.53 281 6/3/2016
0.2.0.52-ci 284 6/2/2016
0.2.0.51-ci 287 6/2/2016
0.2.0.50-ci 290 6/2/2016
0.2.0.49 303 5/31/2016
0.2.0.48-ci 296 5/31/2016
0.2.0.46-ci 280 5/31/2016
0.2.0.45 291 5/31/2016
0.2.0.44 289 5/31/2016
0.2.0.43 294 5/31/2016
0.2.0.42 308 5/30/2016
0.2.0.41 302 5/30/2016
0.2.0.40 291 5/30/2016
0.2.0.39 302 5/30/2016
0.2.0.38 292 5/30/2016
0.2.0.37 287 5/30/2016
0.2.0.36 295 5/25/2016
0.2.0.35 303 5/24/2016
0.2.0.34 300 5/24/2016
0.2.0.33 297 5/24/2016
0.2.0.32-ci 280 5/24/2016
0.1.26-ci 296 5/24/2016
0.1.24-ci 286 5/24/2016
0.1.19-ci 281 5/24/2016