Implement LSTM.
Apply LSTM to stock
market analysis
Course Title: Deep Learning
Course Code: 18CS3074P
Skill number: 5
Group: 180030292 180030331
180030572 180031117
180031222 180031318
LSTM
• Introduced by Hochreiter and Schmidhuber (1997).
• Long short-term memory (LSTM) is an artificial recurrent neural
network (RNN) architecture used in the field of deep learning.
Applications
• Widely used for speech recognition,
language modeling, sentiment analysis and text prediction.
• Learning from long input sequences is a behavior required in complex
problem domains like machine translation, speech recognition, and more.
Advantages
• LSTMs were developed to deal with the vanishing gradient problem that
can be encountered when training traditional RNNs.
• Relative insensitivity to gap length is an advantage of LSTM over
RNNs, hidden Markov models and other sequence learning methods in
numerous applications.
Disadvantages
• LSTMs take longer to train.
• LSTMs require more memory to train.
• LSTMs are easy to overfit.
• Dropout is much harder to implement in LSTMs.
• LSTMs are sensitive to different random weight initializations.
Applications Of LSTM
Applications of LSTM include:
• Robot control.
• Time series prediction.
• Speech recognition.
• Rhythm learning.
• Music composition.
• Grammar learning.
• Handwriting recognition.
• Human action recognition.
Long Short Term Memory
• The vanishing-gradient drawback of plain RNNs pushed researchers to
develop a new variant of the RNN model, called Long Short Term Memory.
• LSTM solves this problem because it uses gates to control the
memorizing process.
• LSTMs are capable of learning long-term dependencies.
• Remembering information for long periods of time is practically their
default behavior.
LSTM
• The symbols used here have the following
meanings:
• a) ×: Scaling of information
• b) +: Adding information
• c) σ: Sigmoid layer
• d) tanh: tanh layer
• e) h(t-1): Output of the last LSTM unit
• f) c(t-1): Memory from the last LSTM unit
• g) X(t): Current input
• h) c(t): New updated memory
• i) h(t): Current output
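The gate computations these symbols describe can be sketched as a single LSTM step in NumPy. This is a minimal illustration, not the deck's own code; the stacked parameter layout and the sizes below are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b stack the four gate parameter sets
    (forget, input, candidate, output), each of hidden size n."""
    n = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b       # stacked pre-activations, shape (4n,)
    f = sigmoid(z[0*n:1*n])            # forget gate: what to erase from c(t-1)
    i = sigmoid(z[1*n:2*n])            # input gate: what to write
    g = np.tanh(z[2*n:3*n])            # candidate memory
    o = sigmoid(z[3*n:4*n])            # output gate
    c_t = f * c_prev + i * g           # "+" adds information into the memory
    h_t = o * np.tanh(c_t)             # "×" scales what the unit outputs
    return h_t, c_t

# smoke test with random parameters (illustrative sizes)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in))
U = rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.standard_normal(n_in),
                 np.zeros(n_hid), np.zeros(n_hid), W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```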
Dropout
• from keras.layers import Dropout
• The Dropout layer randomly sets input units to 0 with a frequency of rate
at each step during training time, which helps prevent overfitting.
• from sklearn.preprocessing import MinMaxScaler
• MinMaxScaler transforms features by scaling each feature to a given range,
here between zero and one.
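For example, scaling a toy price series into [0, 1] with MinMaxScaler (the sample values are made up for illustration):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy closing prices, scaled to [0, 1] as done before feeding an LSTM
prices = np.array([[10.0], [12.5], [15.0], [11.0]])
scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(prices)
print(scaled.min(), scaled.max())  # 0.0 1.0
```

The same fitted scaler can later map model predictions back to the original price scale with `scaler.inverse_transform`.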
Sequential
• A Sequential model is a linear stack of layers. You can create a Sequential
model by passing a list of layer instances to the constructor:
• from keras.models import Sequential
• from keras.layers import Dense, Activation
• model = Sequential([Dense(32, input_shape=(784,)), Activation('relu')])
Code Process
• # Importing the libraries
• # Visualizing the fetched data
• # Data preprocessing
• # Defining the LSTM recurrent model
• # Compiling and fitting the model
• # Fetching the test data and preprocessing
• # Making predictions on the test data
• # Visualizing the prediction
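The data-preprocessing step listed above (turning a scaled price series into supervised windows with the 3-D shape an LSTM layer expects) can be sketched as follows. The 60-step window and the synthetic sine "price" series are assumptions standing in for the fetched market data:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

def make_windows(series, window=60):
    """Slice a 1-D series into (samples, window, 1) inputs
    and next-step targets for sequence prediction."""
    X, y = [], []
    for i in range(window, len(series)):
        X.append(series[i - window:i])
        y.append(series[i])
    X = np.array(X).reshape(-1, window, 1)  # LSTM layers expect 3-D input
    return X, np.array(y)

# Synthetic closing-price series standing in for fetched market data
prices = np.sin(np.linspace(0, 20, 300)).reshape(-1, 1) + 2.0
scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(prices).ravel()

X_train, y_train = make_windows(scaled, window=60)
print(X_train.shape, y_train.shape)  # (240, 60, 1) (240,)
```

A Sequential model with LSTM and Dropout layers would then be compiled and fit on `X_train`, `y_train`, and test predictions mapped back to the price scale with `scaler.inverse_transform` before visualizing.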
Code Implementation: