Denoising Autoencoder
Harish.R
Problem Statement
To extract feature representations from a Denoising
Autoencoder (DAE) and use them to enhance predictive accuracy
in recognizing handwritten digits from the MNIST
dataset. The pipeline is as follows:
 Load, reshape, scale and add noise to the data,
 Train the DAE on merged training and testing data,
 Get neuron outputs from the DAE as new features,
 Train a classification algorithm on the new features.
Motivation
 In many machine learning architectures the encoding of the data
plays no significant role, but in autoencoders encoding and
decoding are central to the model. This motivated me to work on
understanding and analyzing autoencoders and to apply what I
learned to denoising images.
Introduction
 Autoencoders (AE) are part of the neural network family. The algorithm is
fairly simple: an AE requires the output to be the same as the input, so
we can classify autoencoders as unsupervised machine learning
algorithms. The AE compresses the input data to a latent-space
representation and then reconstructs the output. We can divide the
algorithm into two parts:
 Encoder – compresses the input data to a lower-dimensional
representation, sometimes called the latent-space representation
 Decoder – decompresses the representation to reconstruct the
input as closely as possible
Encoder-Decoder architecture
Encoder: compresses the input and produces the code.
Decoder: reconstructs the input using only this code.
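A minimal sketch of this split in Keras, assuming 784-pixel MNIST vectors and a 32-value code (the sizes used later in this deck); the variable names are illustrative:

    from keras.layers import Input, Dense
    from keras.models import Model

    # encoder: compresses the 784-pixel input and produces the 32-value code
    input_img = Input(shape=(784,))
    code = Dense(32, activation='relu')(input_img)
    encoder = Model(input_img, code)

    # decoder: reconstructs the input using only the code
    code_input = Input(shape=(32,))
    reconstruction = Dense(784, activation='sigmoid')(code_input)
    decoder = Model(code_input, reconstruction)

    # full autoencoder: the decoder applied to the encoder's output
    autoencoder = Model(input_img, decoder(encoder(input_img)))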
Where Autoencoders are used
 AE are currently used in image or sound compression and
dimensionality reduction. In specific cases they can provide more
interesting/efficient data projections than other dimensionality
reduction techniques. Moreover, an extension of the AE, called the
Denoising Autoencoder, is used in representation learning,
which uses not only the training but also the testing data to engineer
features.
Denoising Autoencoder
 While the main purpose of a basic AE is to compress and reduce the
dimensionality of data, DAEs are used in another practical
application. Imagine a set of low-quality images corrupted by some
noise. Is it possible to clean these images using machine
learning algorithms?
 In the following example we will show how to clear handwritten
MNIST digits of Gaussian random noise.
Constructing the Denoising Autoencoder
To introduce Gaussian random noise we add the following code:
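A minimal sketch of this step, assuming the MNIST arrays have already been scaled to [0, 1]; the names x_train, x_test and noise_factor follow the text, and the value 0.5 is an assumption:

    import numpy as np

    # noise_factor controls the noisiness of the images
    noise_factor = 0.5
    x_train_noisy = x_train + noise_factor * np.random.normal(
        loc=0.0, scale=1.0, size=x_train.shape)
    x_test_noisy = x_test + noise_factor * np.random.normal(
        loc=0.0, scale=1.0, size=x_test.shape)

    # clip so that every element of the feature vector stays in [0, 1]
    x_train_noisy = np.clip(x_train_noisy, 0.0, 1.0)
    x_test_noisy = np.clip(x_test_noisy, 0.0, 1.0)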
 The noise factor controls the noisiness of the images, and we clip the
values to make sure that the elements of the feature vector
representing an image stay between 0 and 1.
 We use a basic neural network to encode the 784 input features into 32
neurons with a rectifier activation function, and then decode them back
to 784 neurons outputting values in the range [0, 1] thanks
to a sigmoid activation function. The only difference from a plain AE is that the
training is done on noisy input samples:
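Continuing the sketch above, we compile the autoencoder and train it on noisy inputs with the clean images as targets (the optimizer and epoch count are assumptions; the deck does not state them for this model):

    autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')

    # noisy inputs, clean targets
    autoencoder.fit(x_train_noisy, x_train,
                    epochs=40, batch_size=128,
                    shuffle=True,
                    validation_data=(x_test_noisy, x_test))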
 The results on the test data are not satisfying: the DAE reconstructed
the digit 4 as a digit similar to 9 (fourth image from the left).
 We could play around with epochs, batch size and other parameters,
trying to get better results. Instead, we will use Convolutional Neural
Networks, which are successful in the image processing area.
Convolutional Neural Networks
Constructing the Convolutional Denoising Autoencoder
 Now, we will merge the concept of the Denoising Autoencoder with
Convolutional Neural Networks.
The input data for the CNN is represented by matrices, not
vectors as in a standard fully-connected neural network:
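A sketch of the loading and reshaping step, assuming MNIST is loaded through keras.datasets:

    from keras.datasets import mnist
    import numpy as np

    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    x_train = x_train.astype('float32') / 255.0
    x_test = x_test.astype('float32') / 255.0

    # add the trailing depth/channel dimension: num_samples x 28 x 28 x 1
    x_train = np.reshape(x_train, (len(x_train), 28, 28, 1))
    x_test = np.reshape(x_test, (len(x_test), 28, 28, 1))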
 After this operation, the dimension of the x_train and x_test arrays
is num_samples x 28 x 28 x 1. The last dimension (depth/channel)
is added because convolutional layers in Keras expect 3D
tensors.
 Then, we add noise to the training and testing data in a similar way as
in the previous basic DAE.
 We define the CNN architecture with the following function:
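A sketch of such a function, assuming the Keras functional API; the filter counts and the LeakyReLU slope are assumptions:

    from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, LeakyReLU
    from keras.models import Model

    def make_conv_dae():
        input_img = Input(shape=(28, 28, 1))

        # encoder: each pooling layer downsamples the input by half
        x = Conv2D(32, (3, 3), padding='same')(input_img)
        x = LeakyReLU(alpha=0.1)(x)
        x = MaxPooling2D((2, 2), padding='same')(x)        # 28x28 -> 14x14
        x = Conv2D(32, (3, 3), padding='same')(x)
        x = LeakyReLU(alpha=0.1)(x)
        encoded = MaxPooling2D((2, 2), padding='same')(x)  # 14x14 -> 7x7

        # decoder: upsampling layers reconstruct the input size
        x = Conv2D(32, (3, 3), padding='same')(encoded)
        x = LeakyReLU(alpha=0.1)(x)
        x = UpSampling2D((2, 2))(x)                        # 7x7 -> 14x14
        x = Conv2D(32, (3, 3), padding='same')(x)
        x = LeakyReLU(alpha=0.1)(x)
        x = UpSampling2D((2, 2))(x)                        # 14x14 -> 28x28

        # the last convolutional layer with sigmoid outputs the decoded image
        decoded = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

        model = Model(input_img, decoded)
        model.compile(optimizer='adam', loss='binary_crossentropy')
        return model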
 The pooling layers in the encoder downsample the input by half
using the max-pooling operation.
 The upsampling layers in the decoder reconstruct the input size.
Finally, the last convolutional layer with a sigmoid activation
function outputs the decoded image.
 We train the model using 40 epochs and a batch size of 128. A
leaky rectifier is used to fix a limitation of the standard rectifier:
the standard rectifier sometimes causes a neuron unit to stay at
zero and never be activated in subsequent gradient descent
iterations. The leaky rectifier solves that problem with the
activation function f(x) = x for x > 0 and f(x) = αx otherwise,
where α is a small positive constant.
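Putting it together, a sketch of the training call (the epoch count and batch size follow the deck; monitoring on the noisy test pairs is an assumption):

    conv_dae = make_conv_dae()
    conv_dae.fit(x_train_noisy, x_train,
                 epochs=40, batch_size=128,
                 validation_data=(x_test_noisy, x_test))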
 The outputs of the Convolutional Denoising Autoencoder on the test data
are much better than previously.
Implementation
 We will extract feature representations from the Denoising
Autoencoder and try to enhance predictive accuracy in
recognizing handwritten digits from the MNIST dataset. The pipeline
is as follows:
 Load, reshape, scale and add noise to the data,
 Train the DAE on merged training and testing data,
 Get neuron outputs from the DAE as new features,
 Train a classification algorithm on the new features.
 We import the libraries, then load and preprocess the data.
 Autoencoders belong to the unsupervised machine learning
algorithms, in which we do not care about the labels in the data, so
we can use both the training and testing data in representation
learning.
 Then, we define the architecture of the Denoising Autoencoder. Let's
keep it simple: instead of Convolutional Neural Networks, we
use a deep network with 3 hidden layers, each containing 1024
neurons. We use the rectifier as the activation function and a sigmoid in the
output layer to produce pixel values in the range [0, 1].
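A sketch of this deep DAE, assuming the images are flattened to 784-length vectors, merged, and corrupted with noise as before; the layer names h1–h3 are assumptions reused in the feature-extraction sketch below:

    import numpy as np
    from keras.layers import Input, Dense
    from keras.models import Model

    # merge training and testing data for unsupervised representation learning
    x_all = np.concatenate([x_train, x_test])
    x_all_noisy = np.concatenate([x_train_noisy, x_test_noisy])

    # 784 -> 1024 -> 1024 -> 1024 -> 784
    input_img = Input(shape=(784,))
    h1 = Dense(1024, activation='relu', name='h1')(input_img)
    h2 = Dense(1024, activation='relu', name='h2')(h1)
    h3 = Dense(1024, activation='relu', name='h3')(h2)
    output_img = Dense(784, activation='sigmoid')(h3)  # pixel values in [0, 1]

    dae = Model(input_img, output_img)
    dae.compile(optimizer='adam', loss='binary_crossentropy')

    # 40 epochs, batch_size=128, no validation_data (see below)
    dae.fit(x_all_noisy, x_all, epochs=40, batch_size=128, shuffle=True)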
 We train the Denoising Autoencoder with 40 epochs and
batch_size=128. The validation_data argument for monitoring
validation loss is not provided, so we need to be careful not
to overfit the Autoencoder. On the other hand, since we are using
both training and testing data, which together are a good representation
of the population data, the chances of overfitting are smaller.
 The next step is to extract features from the pretrained Autoencoder.
As mentioned earlier, we take the outputs of the neurons located in
all hidden layers (encoder, bottleneck and decoder layers) as the new
representation of the data.
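A sketch of the feature-extraction step, reusing the named hidden layers from the sketch above; concatenating the three 1024-neuron outputs yields the 3072-dimensional representation described next:

    import numpy as np
    from keras.models import Model

    # a model that exposes the activations of all three hidden layers
    feature_model = Model(inputs=dae.input,
                          outputs=[dae.get_layer('h1').output,
                                   dae.get_layer('h2').output,
                                   dae.get_layer('h3').output])

    # pass the clean data through the trained network and stack the outputs
    features_train = np.concatenate(feature_model.predict(x_train), axis=1)
    features_test = np.concatenate(feature_model.predict(x_test), axis=1)
    # each sample now has 1024 + 1024 + 1024 = 3072 features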
 Note that the noisy data was used only during Autoencoder
training, to improve the quality of the representation. The clean versions
of the training and testing data are passed through the Autoencoder
network to produce the new representations, features_train
and features_test, respectively. These representations have a
higher dimensionality (1024+1024+1024 = 3072 > 784) than the original
data, allowing them to encode more information. Moreover, the
Autoencoder automatically decided for us which features are
important, because it was trained with the goal of reconstructing the
input as well as possible.
 We use these representations in the classification task of recognizing
handwritten digits. We can use the new features in any classification
algorithm (random forests, support vector machines, etc.).
 There are ten possible classes to predict (digits from
0 to 9), and we produce a one-hot encoding for the labels
with the np_utils.to_categorical function.
 We train the model with 20 epochs and
batch_size=128. The ModelCheckpoint callback is used
to monitor validation accuracy after each epoch and
save the model with the best performance.
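A sketch of the classifier on top of the extracted features; the hidden-layer size and the file name weights.best.hdf5 are assumptions (the deck specifies only the training settings):

    from keras.layers import Input, Dense
    from keras.models import Model
    from keras.callbacks import ModelCheckpoint
    from keras.utils import np_utils

    # one-hot encode the ten digit classes
    y_train_cat = np_utils.to_categorical(y_train, 10)
    y_test_cat = np_utils.to_categorical(y_test, 10)

    # simple dense classifier on the 3072-dimensional features (assumed architecture)
    inputs = Input(shape=(3072,))
    x = Dense(512, activation='relu')(inputs)
    outputs = Dense(10, activation='softmax')(x)
    clf = Model(inputs, outputs)
    clf.compile(optimizer='adam', loss='categorical_crossentropy',
                metrics=['accuracy'])

    # save the weights with the best validation accuracy after each epoch
    checkpoint = ModelCheckpoint('weights.best.hdf5', monitor='val_acc',
                                 save_best_only=True, verbose=1)
    clf.fit(features_train, y_train_cat,
            epochs=20, batch_size=128,
            validation_data=(features_test, y_test_cat),
            callbacks=[checkpoint])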
The Denoising Autoencoder fits the data nicely.
 Just after the first epoch our validation accuracy is
99.3%! Eventually, we end up with 99.71% accuracy,
compared to 99.5% accuracy for the same model
architecture trained on the original features. Obviously, the
MNIST dataset used here for presentation purposes is
relatively simple, and in more complex cases the gain
could be higher.
Conclusions
 In this work we constructed Denoising Autoencoders with
Convolutional Neural Networks and learned the purpose and
implementation of representation learning with Denoising
Autoencoders.