In this talk we give an overview of the Eclipse Deeplearning4j ecosystem. Skymind builds open source machine learning tools for the JVM, with a focus on deep learning. We give a quick introduction to the basics of deep learning and discuss a practical use case for illustration. Among other libraries, we introduce ND4J for powerful tensor operations and DL4J for building and deploying deep neural networks.
EclipseCon 2017 - Introduction to Machine Learning with Eclipse Deeplearning4j
1. Introduction to Machine Learning
with Eclipse Deeplearning4j
Max Pumperla
Ludwigsburg, October 25th 2017
2. AGENDA
● About Skymind
● Introduction to machine learning
● Deep Learning
● Example: classification with DL4J
● Model import for DL4J
● Scale-out with Spark
3. COMPANY OVERVIEW
● Founded 2014
● Clients: 14 enterprises
● 3,500 GH forks, 7,200 stars
● 300,000+ DL4J downloads/mo.
● Team ~35; mostly engineers; 6 PhDs
8. DEEP NEURAL NETWORKS
● Stacking simple layers to build complex architectures
● Feed data and labels
● Learn from difference between labels and predictions
● Neural network will learn representation
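The "learn from the difference between labels and predictions" step can be sketched in a few lines of plain Java. This is a toy illustration, not DL4J code: a single weight is nudged against the gradient of the squared error until the prediction matches the labels.

```java
// Toy gradient descent (hypothetical example, not DL4J): fit y = w * x
// to labels generated by y = 2x, so w should converge to 2.
public class GradientDescentSketch {
    public static void main(String[] args) {
        double w = 0.0;              // initial weight
        double lr = 0.1;             // learning rate
        double[] xs = {1, 2, 3};
        double[] ys = {2, 4, 6};     // labels: y = 2x
        for (int epoch = 0; epoch < 100; epoch++) {
            for (int i = 0; i < xs.length; i++) {
                double pred = w * xs[i];
                double error = pred - ys[i];  // difference between prediction and label
                w -= lr * error * xs[i];      // gradient of 0.5 * error^2 w.r.t. w
            }
        }
        System.out.println(Math.round(w * 1000.0) / 1000.0);
    }
}
```

A real network does the same thing for millions of weights at once, with the gradients computed layer by layer via backpropagation.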
9. ECLIPSE DEEPLEARNING4J
Eclipse Deeplearning4j: build, train, and deploy neural networks on the JVM
Eclipse ND4J: high-performance linear algebra libraries for CPU and GPU
Eclipse DataVec: data ingestion, normalization, and vectorization
Eclipse Arbiter: hyperparameter search for optimizing neural networks
Eclipse RL4J: reinforcement learning for the JVM
Eclipse ScalNet: Scala wrapper for Deeplearning4j
10. GETTING STARTED WITH ECLIPSE DL4J
pom.xml:

<dependencies>
  <dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-core</artifactId>
    <version>0.9.2-SNAPSHOT</version>
  </dependency>
</dependencies>

build.sbt:

libraryDependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.9.1"

● Maven as the recommended build tool
● Versions from Maven Central, or build locally
● mvn eclipse:eclipse to generate the Eclipse project files
11. BUILDING DEEP NETWORKS WITH ECLIPSE DL4J
1. Set parameters:

final int numRows = 28;
final int numColumns = 28;
int outputNum = 10;
int batchSize = 128;
int rngSeed = 123;
int numEpochs = 15;

2. Load data:

DataSetIterator mnistTrain = new MnistDataSetIterator(batchSize, true, rngSeed);
DataSetIterator mnistTest = new MnistDataSetIterator(batchSize, false, rngSeed);
12. BUILDING DEEP NETWORKS WITH ECLIPSE DL4J
3. Configure network:

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .seed(rngSeed)
    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
    .learningRate(0.006).updater(Updater.NESTEROVS)
    .regularization(true).l2(1e-4).list()
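The configuration above selects stochastic gradient descent with a Nesterov-momentum updater and learning rate 0.006. A plain-Java sketch of that update rule (not DL4J's internal implementation; the momentum coefficient of 0.9 is an assumption) on a simple quadratic shows the mechanics:

```java
// Sketch of Nesterov momentum (hypothetical, not DL4J internals),
// minimizing f(w) = (w - 3)^2, whose minimum is at w = 3.
public class NesterovSketch {
    static double grad(double w) { return 2 * (w - 3); } // f'(w)

    public static void main(String[] args) {
        double w = 0.0, v = 0.0;
        double lr = 0.006;  // learning rate from the slide
        double mu = 0.9;    // momentum coefficient (assumed value)
        for (int step = 0; step < 2000; step++) {
            double g = grad(w + mu * v); // gradient at the "look-ahead" point
            v = mu * v - lr * g;         // velocity update
            w += v;                      // apply velocity
        }
        System.out.println(Math.round(w * 1000.0) / 1000.0);
    }
}
```

The look-ahead evaluation is what distinguishes Nesterov momentum from classical momentum: the gradient is taken where the velocity is about to carry the weight, which damps overshooting.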
13. BUILDING DEEP NETWORKS WITH ECLIPSE DL4J
4. Add layers:

    .layer(0, new DenseLayer.Builder()
        .nIn(numRows * numColumns).nOut(1000)
        .activation("relu")
        .weightInit(WeightInit.XAVIER)
        .build())
    .layer(1, new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
        .nIn(1000).nOut(outputNum)
        .activation("softmax")
        .weightInit(WeightInit.XAVIER)
        .build())
    .pretrain(false).backprop(true)
    .build();

5. Initialize and run:

MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init();
for (int i = 0; i < numEpochs; i++) {
    model.fit(mnistTrain);
}
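The two activations named in the layers, "relu" for the hidden layer and "softmax" for the output layer, are simple functions. A plain-Java sketch (illustration only, not DL4J code):

```java
// relu clips negatives to zero; softmax turns raw scores into a
// probability distribution over the output classes.
public class ActivationSketch {
    static double relu(double x) { return Math.max(0, x); }

    static double[] softmax(double[] z) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : z) max = Math.max(max, v); // subtract max for numerical stability
        double sum = 0;
        double[] out = new double[z.length];
        for (int i = 0; i < z.length; i++) { out[i] = Math.exp(z[i] - max); sum += out[i]; }
        for (int i = 0; i < z.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        System.out.println(relu(-2.0));
        System.out.println(relu(1.5));
        double[] p = softmax(new double[]{1, 2, 3});
        double total = 0;
        for (double v : p) total += v;
        System.out.println(Math.round(total)); // probabilities sum to 1
    }
}
```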
14. BUILDING DEEP NETWORKS WITH ECLIPSE DL4J
6. Evaluate:

Evaluation eval = new Evaluation(outputNum);
while (mnistTest.hasNext()) {
    DataSet next = mnistTest.next();
    INDArray output = model.output(next.getFeatureMatrix());
    eval.eval(next.getLabels(), output);
}
log.info(eval.stats());
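At its core, the accuracy reported by the evaluation compares the index of the largest softmax output against the true label. A plain-Java sketch of that metric (illustration only, not the DL4J Evaluation class):

```java
// Accuracy sketch: count how often the argmax of a prediction row
// matches the true class index.
public class AccuracySketch {
    static int argmax(double[] row) {
        int best = 0;
        for (int i = 1; i < row.length; i++) if (row[i] > row[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        double[][] predictions = {
            {0.1, 0.8, 0.1},  // predicted class 1
            {0.7, 0.2, 0.1},  // predicted class 0
            {0.2, 0.3, 0.5},  // predicted class 2
            {0.6, 0.3, 0.1},  // predicted class 0
        };
        int[] labels = {1, 0, 2, 1};  // last prediction is a miss
        int correct = 0;
        for (int i = 0; i < labels.length; i++)
            if (argmax(predictions[i]) == labels[i]) correct++;
        System.out.println((double) correct / labels.length);
    }
}
```

eval.stats() additionally reports per-class precision, recall, and F1 derived from the same comparison.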
16. DISTRIBUTED TRAINING WITH SPARK
● DL4J scale-out module using Spark
● Data-parallel training procedure
● Parameter averaging on master
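The parameter-averaging step can be sketched in plain Java (illustration only, not DL4J's Spark implementation): each worker trains on its shard of the data, then the master averages the workers' parameter vectors element-wise.

```java
// Parameter averaging sketch: merge per-worker weight vectors by
// taking the element-wise mean.
public class ParameterAveragingSketch {
    static double[] average(double[][] workerParams) {
        double[] avg = new double[workerParams[0].length];
        for (double[] params : workerParams)
            for (int i = 0; i < params.length; i++) avg[i] += params[i];
        for (int i = 0; i < avg.length; i++) avg[i] /= workerParams.length;
        return avg;
    }

    public static void main(String[] args) {
        double[][] fromWorkers = {
            {1.0, 2.0},  // parameters from worker 1
            {3.0, 4.0},  // parameters from worker 2
        };
        double[] merged = average(fromWorkers);
        System.out.println(merged[0] + " " + merged[1]);
    }
}
```

The averaged parameters are then broadcast back to the workers for the next training round.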
17. SUMMARY
● Can build much more complex deep nets
● Variety of use cases
● Prototype with Python, deploy with Java
● Integration with your production stack
● Check out our Deep Learning book!
● Upcoming book: Deep Learning & the Game of Go (Manning)