DeepNet 0.2.0.34

Deep.Net

A deep learning library for F#. It provides symbolic model differentiation, automatic differentiation, and compilation to CUDA GPUs, and includes optimizers and model blocks commonly used in deep learning.
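
The feature list above (symbolic differentiation, automatic differentiation, CUDA compilation) can be made concrete with a toy example. The sketch below is plain F# and deliberately does not use DeepNet's own API; it only illustrates the idea of building an expression symbolically and differentiating it, which DeepNet does on tensor expressions and can compile for CUDA GPUs.

// Illustrative sketch only: NOT DeepNet's API.
// A minimal scalar expression type with symbolic differentiation.
type Expr =
    | Const of float
    | Var                     // the single variable x
    | Add of Expr * Expr
    | Mul of Expr * Expr

// Symbolic derivative d(expr)/dx.
let rec deriv expr =
    match expr with
    | Const _    -> Const 0.0
    | Var        -> Const 1.0
    | Add (a, b) -> Add (deriv a, deriv b)
    | Mul (a, b) -> Add (Mul (deriv a, b), Mul (a, deriv b))  // product rule

// Evaluate an expression at a given x.
let rec eval x expr =
    match expr with
    | Const c    -> c
    | Var        -> x
    | Add (a, b) -> eval x a + eval x b
    | Mul (a, b) -> eval x a * eval x b

// f(x) = 3*x^2 + 2*x, so f'(x) = 6*x + 2.
let f  = Add (Mul (Const 3.0, Mul (Var, Var)), Mul (Const 2.0, Var))
let f' = deriv f
printfn "f(2)  = %g" (eval 2.0 f)   // 16
printfn "f'(2) = %g" (eval 2.0 f')  // 14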

Make sure to set the platform of your project to x64.
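
The x64 requirement is most likely due to the package's native GPU-related dependencies (an assumption, not stated on this page). If you want to fail fast when the platform target is wrong, a standard .NET check works:

// Guard against running in a 32-bit process, e.g. when the project is
// still AnyCPU with "Prefer 32-bit" enabled instead of x64.
if not System.Environment.Is64BitProcess then
    failwith "DeepNet requires a 64-bit process; set the project's platform target to x64."

This check uses only plain .NET (System.Environment.Is64BitProcess) and is independent of DeepNet itself.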

There is a newer version of this package available.
See the version list below for details.
Package Manager: Install-Package DeepNet -Version 0.2.0.34
.NET CLI:        dotnet add package DeepNet --version 0.2.0.34
Paket CLI:       paket add DeepNet --version 0.2.0.34

Release Notes

Initial release.

Version History

Version Downloads Last updated
0.3.0.712-master 174 9/1/2017
0.3.0.711-master 144 9/1/2017
0.3.0.710-master 135 9/1/2017
0.3.0.709-master 137 8/31/2017
0.3.0.708-master 138 8/30/2017
0.3.0.707-master 137 8/30/2017
0.3.0.706-master 158 8/30/2017
0.3.0.701-master 169 6/26/2017
0.3.0.700-master 163 6/22/2017
0.3.0.699-master 163 6/22/2017
0.3.0.698-master 164 6/21/2017
0.3.0.697-master 165 6/21/2017
0.3.0.696-master 162 6/21/2017
0.3.0.695-master 165 6/21/2017
0.3.0.694-master 164 6/21/2017
0.3.0.693-master 184 6/20/2017
0.3.0.692-master 166 6/19/2017
0.3.0.691-master 242 6/19/2017
0.3.0.690-master 166 6/19/2017
0.3.0.689-master 173 5/14/2017
0.3.0.688 359 5/14/2017
0.3.0.686-master 170 5/14/2017
0.2.0.591-master 177 4/19/2017
0.2.0.565-master 195 4/11/2017
0.2.0.556-master 174 3/21/2017
0.2.0.551-master 183 3/17/2017
0.2.0.540-master 172 3/15/2017
0.2.0.536-master 168 3/14/2017
0.2.0.519-master 170 3/2/2017
0.2.0.516-master 166 3/2/2017
0.2.0.499-master 179 2/13/2017
0.2.0.494-master 194 2/7/2017
0.2.0.479-master 192 2/1/2017
0.2.0.463-master 197 1/17/2017
0.2.0.431-master 205 12/2/2016
0.2.0.422-master 189 11/9/2016
0.2.0.421-master 185 11/9/2016
0.2.0.411-master 220 10/26/2016
0.2.0.400-master 181 10/26/2016
0.2.0.394-master 189 10/25/2016
0.2.0.382-master 196 10/21/2016
0.2.0.377-master 249 10/20/2016
0.2.0.323-master 184 10/11/2016
0.2.0.262-master 188 9/29/2016
0.2.0.248-master 195 9/27/2016
0.2.0.174-master 195 9/16/2016
0.2.0.128-master 193 9/8/2016
0.2.0.122-master 200 9/8/2016
0.2.0.121-master 187 9/7/2016
0.2.0.111-master 188 9/7/2016
0.2.0.105-ci 187 9/5/2016
0.2.0.97-ci 196 8/30/2016
0.2.0.96-ci 197 8/29/2016
0.2.0.90-ci 200 8/25/2016
0.2.0.89-ci 190 8/24/2016
0.2.0.88-ci 191 8/24/2016
0.2.0.87-ci 196 8/24/2016
0.2.0.86-ci 191 8/23/2016
0.2.0.85-ci 193 8/22/2016
0.2.0.84-ci 193 8/22/2016
0.2.0.83-ci 195 8/22/2016
0.2.0.82 269 8/22/2016
0.2.0.81-ci 194 8/19/2016
0.2.0.80-ci 217 6/27/2016
0.2.0.79-ci 213 6/27/2016
0.2.0.77-ci 209 6/22/2016
0.2.0.76-ci 204 6/22/2016
0.2.0.75 235 6/15/2016
0.2.0.74-ci 204 6/15/2016
0.2.0.73 220 6/15/2016
0.2.0.72 216 6/15/2016
0.2.0.71 217 6/14/2016
0.2.0.70 237 6/9/2016
0.2.0.69 220 6/9/2016
0.2.0.68 215 6/9/2016
0.2.0.67 213 6/8/2016
0.2.0.66-ci 199 6/8/2016
0.2.0.65-ci 201 6/8/2016
0.2.0.64-ci 224 6/8/2016
0.2.0.63-ci 206 6/7/2016
0.2.0.62 210 6/7/2016
0.2.0.61 213 6/6/2016
0.2.0.60 202 6/6/2016
0.2.0.59 206 6/6/2016
0.2.0.57 250 6/3/2016
0.2.0.56 234 6/3/2016
0.2.0.55 207 6/3/2016
0.2.0.54 207 6/3/2016
0.2.0.53 208 6/3/2016
0.2.0.52-ci 216 6/2/2016
0.2.0.51-ci 200 6/2/2016
0.2.0.50-ci 206 6/2/2016
0.2.0.49 210 5/31/2016
0.2.0.48-ci 203 5/31/2016
0.2.0.46-ci 201 5/31/2016
0.2.0.45 208 5/31/2016
0.2.0.44 206 5/31/2016
0.2.0.43 206 5/31/2016
0.2.0.42 216 5/30/2016
0.2.0.41 214 5/30/2016
0.2.0.40 217 5/30/2016
0.2.0.39 209 5/30/2016
0.2.0.38 205 5/30/2016
0.2.0.37 207 5/30/2016
0.2.0.36 253 5/25/2016
0.2.0.35 211 5/24/2016
0.2.0.34 251 5/24/2016
0.2.0.33 206 5/24/2016
0.2.0.32-ci 227 5/24/2016
0.1.26-ci 202 5/24/2016
0.1.24-ci 197 5/24/2016
0.1.19-ci 228 5/24/2016