Gave a talk at:
www.meetup.com/SF-Bayarea-Machine-Learning/events/221739934/
It covers the basic architecture of a scientific computing library and my take on it with nd4j.
3. Different libraries:
JVM:
● Jblas (not maintained)
● Breeze (Scala)
● Spire (Scala)
● core.matrix (Clojure)
● Netlib BLAS
● N + 1 pure Java impls
Real scientific environments (no fragmentation!):
● Octave/MATLAB
● NumPy
● R
● Theano
4. BLAS and NDArrays
BLAS - Basic Linear Algebra Subprograms. Underlies EVERY fast matrix lib out there.
NDArrays - n-dimensional arrays (matrices, tensors); math is done along axes (e.g. rows/cols)
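A minimal sketch of what "math along axes" means, in plain Java (illustrative only, not the nd4j API): reducing over one dimension of a 2-D array, e.g. summing down the columns (axis 0) or across the rows (axis 1).

```java
import java.util.Arrays;

public class AxisSum {
    // Sum along axis 0: collapse the rows, one result per column.
    static double[] sumAxis0(double[][] m) {
        double[] out = new double[m[0].length];
        for (double[] row : m)
            for (int j = 0; j < row.length; j++)
                out[j] += row[j];
        return out;
    }

    // Sum along axis 1: collapse the columns, one result per row.
    static double[] sumAxis1(double[][] m) {
        double[] out = new double[m.length];
        for (int i = 0; i < m.length; i++)
            for (double v : m[i])
                out[i] += v;
        return out;
    }

    public static void main(String[] args) {
        double[][] m = {{1, 2, 3}, {4, 5, 6}};
        System.out.println(Arrays.toString(sumAxis0(m))); // [5.0, 7.0, 9.0]
        System.out.println(Arrays.toString(sumAxis1(m))); // [6.0, 15.0]
    }
}
```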
10. Core Concepts
● DataBuffer - a storage abstraction. The raw buffer could be an NIO buffer or a JCuda buffer. This allows for backend-optimal storage.
● FFTInstance - an instantiation of FFT. Every hardware maker has their own impls.
● ConvolutionInstance - an instantiation of convolution. Same idea as FFT.
● Complex Numbers - an abstraction over different libs.
● BlasWrapper - an interface to a BLAS backend (BLAS is a standard, for cripe sakes!)
● Op - some sort of mathematical operation (dot product, element-wise addition, negation, ...)
● Garbage Collector (WIP) - native resource handling where needed
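The storage-abstraction idea can be sketched as follows. This is a hypothetical interface for illustration, not nd4j's actual DataBuffer API: ops are written against the interface, so each backend can pick its optimal raw storage (heap array, NIO buffer, GPU memory handle, ...).

```java
// Hypothetical storage abstraction (illustrative, not nd4j's real interface).
interface DataBuffer {
    double get(int i);
    void put(int i, double value);
    int length();
}

// A CPU-backed implementation using a plain double array; a GPU backend
// would instead wrap device memory behind the same interface.
class HeapDataBuffer implements DataBuffer {
    private final double[] data;
    HeapDataBuffer(int length) { this.data = new double[length]; }
    public double get(int i) { return data[i]; }
    public void put(int i, double value) { data[i] = value; }
    public int length() { return data.length; }
}

public class DataBufferDemo {
    // A backend-agnostic op: element-wise addition of b into a.
    static void addInPlace(DataBuffer a, DataBuffer b) {
        for (int i = 0; i < a.length(); i++)
            a.put(i, a.get(i) + b.get(i));
    }

    public static void main(String[] args) {
        DataBuffer a = new HeapDataBuffer(3);
        DataBuffer b = new HeapDataBuffer(3);
        for (int i = 0; i < 3; i++) { a.put(i, i + 1); b.put(i, 10); }
        addInPlace(a, b);
        System.out.println(a.get(0) + " " + a.get(1) + " " + a.get(2)); // 11.0 12.0 13.0
    }
}
```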
11. Other Features
● Loss functions
● Adaptive learning rates
● Solvers (L-BFGS, conjugate gradient, Hessian-free, SGD)
● RNG (different distributions, based on Apache Commons Math)
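To show how a loss function and a solver pair up, here is a minimal sketch (illustrative only, not the library's solver API): plain SGD minimizing mean squared error for a one-parameter linear model y = w * x.

```java
public class SgdSketch {
    // MSE loss over the dataset for a given weight w.
    static double mse(double[] xs, double[] ys, double w) {
        double sum = 0;
        for (int i = 0; i < xs.length; i++) {
            double err = w * xs[i] - ys[i];
            sum += err * err;
        }
        return sum / xs.length;
    }

    // SGD: one gradient step per sample, fixed learning rate.
    static double fit(double[] xs, double[] ys, double lr, int epochs) {
        double w = 0.0;
        for (int e = 0; e < epochs; e++)
            for (int i = 0; i < xs.length; i++) {
                double grad = 2 * (w * xs[i] - ys[i]) * xs[i]; // d(MSE_i)/dw
                w -= lr * grad;
            }
        return w;
    }

    public static void main(String[] args) {
        double[] xs = {1, 2, 3, 4};
        double[] ys = {2, 4, 6, 8}; // generated with true w = 2
        double w = fit(xs, ys, 0.01, 200);
        System.out.printf("w = %.4f%n", w);
    }
}
```

Adaptive learning rates (also listed above) would replace the fixed `lr` with a per-step schedule; the loss/solver split stays the same.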