Neuron allows users to prepare data, build models, train models (including via transfer learning), and evaluate them. Key features include deploying Jupyter or Zeppelin notebooks as workspaces, loading pre-built models from S3 or Git, re-training models through transfer learning, and evaluating trained models using various metrics on the Neuron platform. This document provides steps for launching a workbench, performing data preparation and transformation, choosing algorithms for model building, training models on Neuron (including through transfer learning), and evaluating trained models.
1. Neuron
Demo
The greatest value of data is when it forces us to notice what we never expected to see.
How does Neuron take care of my data?
2. Contents
❖ Preliminaries
➢ Requirements for Launching Workbench
➢ Best Practices for Using Workbench
➢ Running the code
➢ How to Launch Workbench
■ Launch Workbench
■ Features on Dashboard
■ Features in Software
❖ Data Preparation Process
➢ Data Selection
➢ Data Pre-processing
➢ Data Transformation
❖ Model Building
➢ How to choose your algorithm
❖ Model Training
➢ How to Train your model on Neuron
➢ Transfer Learning
➢ How to use Transfer Learning
■ Steps for PyTorch Example
■ Steps for TensorFlow Example
❖ Model Evaluation
➢ Evaluation Metrics
➢ How to Evaluate Model on Neuron
3. Preliminaries
1. Requirements for Launching Workbench
➢ Jupyter Notebook: from the Launcher, click on "Notebook", then click on Python 3.
➢ Jupyter Console: from the Launcher, click on "Console", then click on Python 3.
➢ Zeppelin Notebook: select "Create new note" and choose "Python 3".
4. 2. Best Practices
➢ Create a "work" directory; all work should be done in that directory.
➢ Create your own virtual environment using the console.
➢ Initialize (activate) the virtualenv using the console.
3. Running the Code
➢ Use the command python file_name.py to run a script in the console.
➢ You can run code cell by cell in Jupyter or Zeppelin.
➢ To run a full notebook as a script in the console (inside the virtualenv), type ipython notebook_name.ipynb.
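The setup steps above can be sketched as console commands; the directory names "work" and "venv" are just illustrative choices, not names Neuron requires.

```shell
# Create the working directory; all work is done inside it.
mkdir -p work && cd work

# Create your own virtual environment using the console.
python3 -m venv venv

# Initialize (activate) the virtualenv.
. venv/bin/activate

# The console now uses the virtualenv's Python interpreter.
python --version
```

From here, `python file_name.py` runs a script and `ipython notebook_name.ipynb` runs a full notebook inside the virtualenv.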
5. 1. How to Launch Workbench
➢ Dashboard - general options showing Available Softwares, Deployed Software, and Total Members.
➢ Softwares - provides two options: View Deployments and New Deployments.
➢ Members - options include View Members and Invite Members.
STEPS
➢ Create an account (manually or by invitation) and log in.
➢ Click on Softwares.
➢ Click on New Deployment.
➢ Select the option to choose an instance of a notebook, such as Jupyter or Zeppelin.
12. Your Workbench is Ready!
➢ For Zeppelin
➢ For Jupyter Console
➢ For Jupyter Notebook
13. Data Preparation Process
➢ Data Selection
■ Select a subset of data variables.
■ The selected sample should represent the entire population.
➢ Data Pre-processing
■ Handling null values
■ Handling categorical variables: One-Hot Encoding
■ Multicollinearity
➢ Data Transformation
■ Feature scaling
■ Aggregation
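The pre-processing and transformation steps above can be sketched with pandas; the DataFrame, its column names, and the fill strategies below are illustrative assumptions, not part of Neuron itself.

```python
import pandas as pd

# Toy dataset with nulls and a categorical column (illustrative only).
df = pd.DataFrame({
    "age": [25, None, 35, 45],
    "city": ["Pune", "Delhi", "Pune", None],
    "income": [50_000, 60_000, None, 80_000],
})

# Data pre-processing: handle null values.
df["age"] = df["age"].fillna(df["age"].mean())
df["income"] = df["income"].fillna(df["income"].median())
df["city"] = df["city"].fillna("Unknown")

# Handle the categorical variable with one-hot encoding.
df = pd.get_dummies(df, columns=["city"])

# Data transformation: min-max feature scaling of numeric columns.
for col in ("age", "income"):
    df[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

print(df.columns.tolist())
```

After these steps every column is numeric and scaled to [0, 1], which is the shape most model-building algorithms expect.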
17. Model Building
How to choose your algorithm
STEPS
➢ Categorize the problem: by input (supervised, unsupervised, semi-supervised, reinforcement) and by output (classification, regression, clustering).
➢ Understand the data: analyze, process, and transform it.
➢ Find available algorithms based on:
■ Accuracy, interpretability, scalability, and complexity
■ Time to build, train, and test the model
■ Time to make predictions using the model
■ The business goal
18. Model Selection
➢ Implement machine learning algorithms (Linear Regression, Logistic Regression, or any other), or load an existing model on Neuron.
➢ Optimize hyperparameters: learning rate, row sampling, column sampling, and so on.
➢ Run different models, evaluate their performance, and choose the relevant one for model building.
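The model-selection loop above (run different models, evaluate their performance, choose the relevant one) can be sketched with scikit-learn; the candidate models and the toy dataset are illustrative assumptions, not Neuron's API.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Toy classification dataset standing in for your prepared data.
X, y = make_classification(n_samples=200, random_state=0)

# Run different models...
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(max_depth=3, random_state=0),
}

# ...evaluate their performance (5-fold cross-validated accuracy)...
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}

# ...and choose the relevant one for model building.
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Hyperparameters such as `max_depth` above would themselves be tuned in the optimization step.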
20. Model Training
1. How to Train Your Model on Neuron
➢ Create a workbench (if not created in the steps above).
➢ Build the model using the above steps; save it to Git, S3, or another location, or bring it into the workspace if already saved.
➢ If building from scratch, you can choose to write the code (using a notebook or Python script) and train the model.
➢ Train your model using the CLI terminal of the workbench.
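A minimal training script of the kind described above might look like the following; the model, toy dataset, and `model.pkl` file name are illustrative assumptions, and the saved file is what you would then push to Git, S3, or another location.

```python
import pickle

from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

# Toy regression dataset standing in for your prepared data.
X, y = make_regression(n_samples=100, n_features=3, random_state=0)

# Build and train the model from scratch.
model = LinearRegression()
model.fit(X, y)

# Save the trained model so it can be stored in Git, S3, or elsewhere.
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

print(round(model.score(X, y), 3))
```

You would run such a script from the workbench's CLI terminal, e.g. `python train_model.py`.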
22. 2. Transfer Learning
➢ A model developed for one task is reused as the starting point to build a model for another task.
➢ The pre-built model, originally developed on its own dataset, is imported from S3, Git, or wherever it is located.
➢ The feature-extraction layers are re-used, and the main processing layers are re-trained.
➢ Advantage: less computational resource and training time.
➢ Can be done with every framework supported on Neuron.
23. How to Use Transfer Learning - examples are given for PyTorch and TensorFlow, but other frameworks can also be used.
Steps for PyTorch Example:
1. Save the data in the data folder (default: data_raw).
2. Run the datasetup command according to the dataset.
3. Run the re-training command according to the dataset.
4. Run the inference command according to the dataset.
24. Steps for TensorFlow Example:
1. Save the data in the data folder, divided by class/label name.
2. Run the python re-training command according to the dataset. Parameters include model_dir, output_graph, output_labels, and image_dir.
3. Run the python inference command according to your dataset folder.
25. Model Evaluation
1. Evaluation Metrics
➢ The parameters with which we evaluate the performance of a model, during and after training, are called metrics.
➢ A model built, trained, and deployed on Neuron can be evaluated on: Accuracy, F1 Score, Confusion Matrix, AUC, Mean Absolute Error, Mean Squared Error, and Logarithmic Loss.
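A few of the metrics listed above can be computed with scikit-learn as follows; `y_true` and `y_pred` are toy values for illustration.

```python
from sklearn.metrics import accuracy_score, confusion_matrix, f1_score

# Toy binary labels: true values vs. a model's predictions.
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]

acc = accuracy_score(y_true, y_pred)   # fraction of correct predictions
f1 = f1_score(y_true, y_pred)          # harmonic mean of precision and recall
cm = confusion_matrix(y_true, y_pred)  # rows: true class, columns: predicted

print(acc)
print(f1)
print(cm)
```

Regression models would use Mean Absolute Error and Mean Squared Error instead, and probabilistic classifiers add AUC and Logarithmic Loss.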