
# Hidden Markov Model & Stock Prediction

16,724 views


Introducing how to apply HMM on stock prediction


### Hidden Markov Model & Stock Prediction

1. HMM & Stock Prediction. David Chiu @ ML/DM Monday. http://ywchiu-tw.appspot.com/
2. HIDDEN MARKOV MODEL
   • A finite state machine with a fixed number of states
   • Provides a probabilistic framework for modeling a time series of multivariate observations
3. STOCK PRICE PREDICTION
4. Every time, history repeats itself
   • Stock behavior in the past is similar to behavior on the current day
   • The next day's stock price should follow roughly the same past data pattern
5. BENEFITS OF USING HMM
   • Handles new data robustly
   • Computationally efficient to develop and evaluate
   • Able to predict similar patterns efficiently
6. HMM ON STOCK PREDICTION
   • Using the trained HMM, the likelihood value P for the current day's dataset is calculated
   • From the past dataset, the HMM is used to locate those instances that would produce the nearest likelihood value P
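The matching step described in slide 6 can be sketched in base R. This is a minimal illustration assuming the per-window likelihood scores have already been computed; `nearest_likelihood_day` and all of the numbers are hypothetical, not from the original talk:

```r
# Given log-likelihood scores of past windows under the trained HMM,
# find the past day whose score is nearest to the current day's score.
nearest_likelihood_day <- function(past_loglik, current_loglik) {
  which.min(abs(past_loglik - current_loglik))
}

past  <- c(-120.4, -118.9, -121.7, -119.2)  # made-up past scores
today <- -119.0                             # made-up current score
nearest_likelihood_day(past, today)         # day 2 is the closest match
```

The matched day's subsequent price behavior then serves as the template for the next day's forecast.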
7. CHARACTERIZING AN HMM
   • Number of states in the model: N
   • Number of observation symbols: M
   • Transition matrix A = {aij}, where aij is the transition probability from state i to state j
   • Observation emission matrix B = {bj(Ot)}, where bj(Ot) is the probability of observing Ot at state j
   • Initial state distribution π = {πi}
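The five ingredients above can be written down concretely. A minimal base-R sketch with N = 2 states and M = 3 observation symbols; every probability here is illustrative:

```r
N <- 2                          # number of hidden states
M <- 3                          # number of observation symbols
A <- matrix(c(0.8, 0.2,         # a_ij: P(state j at t+1 | state i at t)
              0.3, 0.7), N, N, byrow = TRUE)
B <- matrix(c(0.6, 0.3, 0.1,    # b_j(o): P(observing symbol o | state j)
              0.1, 0.3, 0.6), N, M, byrow = TRUE)
pi0 <- c(0.5, 0.5)              # initial state distribution

# Every row of A and of B, and pi0 itself, must sum to 1.
stopifnot(all(abs(rowSums(A) - 1) < 1e-12),
          all(abs(rowSums(B) - 1) < 1e-12),
          abs(sum(pi0) - 1) < 1e-12)
```

For stock data the emissions are continuous, so a package such as RHmm fits Gaussian (or mixture) emission densities instead of a discrete B matrix; the structure of A and π is the same.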
8. MODELING HMM
9. THE THREE PROBLEMS OF HMM
   1. The Evaluation Problem (Forward): what is the probability P(O|λ) that the given observations O = o1, o2, ..., oT are generated by a given model λ?
   2. The Decoding Problem (Viterbi): what is the most likely state sequence in the given model λ that produced the observations O = o1, o2, ..., oT?
   3. The Learning Problem (Baum-Welch): how should we adjust the model parameters {A, B, π} to maximize P(O|λ), when the model λ and a sequence of observations O = o1, o2, ..., oT are given?
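Of the three problems, evaluation has the most direct solution. Below is a minimal base-R sketch of the forward algorithm for a discrete-observation HMM; the toy model and observation sequence are made up for illustration, not taken from the demo:

```r
# Forward algorithm: computes P(O | lambda) for a discrete HMM.
# A: N x N transition matrix, B: N x M emission matrix,
# pi0: initial distribution, obs: integer symbols in 1..M (length >= 2).
forward_prob <- function(A, B, pi0, obs) {
  N <- nrow(A)
  T_len <- length(obs)
  alpha <- matrix(0, T_len, N)
  alpha[1, ] <- pi0 * B[, obs[1]]                       # initialisation
  for (t in 2:T_len)
    alpha[t, ] <- (alpha[t - 1, ] %*% A) * B[, obs[t]]  # induction
  sum(alpha[T_len, ])                                   # termination
}

# Toy two-state, two-symbol example (illustrative numbers only)
A   <- matrix(c(0.7, 0.3,
                0.4, 0.6), 2, 2, byrow = TRUE)
B   <- matrix(c(0.9, 0.1,
                0.2, 0.8), 2, 2, byrow = TRUE)
pi0 <- c(0.5, 0.5)
forward_prob(A, B, pi0, c(1, 2, 1))  # 0.099375
```

The forward pass costs O(N²T), versus O(Nᵀ) for naive enumeration of all state paths, which is why evaluation is tractable at all.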
10. BAUM-WELCH ALGORITHM
   • Finds the unknown parameters of a hidden Markov model (HMM)
   • A generalized expectation-maximization (GEM) algorithm
   • Computes maximum likelihood and posterior mode estimates of the parameters (transition and emission probabilities) of an HMM, given only emissions as training data
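The re-estimation step that Baum-Welch iterates can be sketched in base R for a discrete-observation HMM. This shows one EM iteration only, with illustrative numbers; practical implementations (such as what `HMMFit` runs internally) iterate to convergence and use scaling or log space to avoid numerical underflow:

```r
# One Baum-Welch re-estimation step for a discrete-observation HMM.
# A: N x N transitions, B: N x M emissions, pi0: initial distribution,
# obs: integer symbols in 1..M (obs must have length >= 2).
baum_welch_step <- function(A, B, pi0, obs) {
  N <- nrow(A); T_len <- length(obs)
  # E step: forward (alpha) and backward (beta) passes
  alpha <- matrix(0, T_len, N)
  beta  <- matrix(0, T_len, N)
  alpha[1, ] <- pi0 * B[, obs[1]]
  for (t in 2:T_len)
    alpha[t, ] <- (alpha[t - 1, ] %*% A) * B[, obs[t]]
  beta[T_len, ] <- 1
  for (t in (T_len - 1):1)
    beta[t, ] <- A %*% (B[, obs[t + 1]] * beta[t + 1, ])
  pO    <- sum(alpha[T_len, ])       # P(O | lambda)
  gamma <- alpha * beta / pO         # gamma[t, i] = P(state i at t | O)
  # M step: A from expected transition counts xi, summed over t
  A_new <- matrix(0, N, N)
  for (t in 1:(T_len - 1))
    A_new <- A_new +
      (alpha[t, ] %o% (B[, obs[t + 1]] * beta[t + 1, ])) * A / pO
  A_new <- A_new / colSums(gamma[-T_len, , drop = FALSE])
  # B from expected occupancy of each (state, symbol) pair
  B_new <- sapply(seq_len(ncol(B)), function(o)
    colSums(gamma[obs == o, , drop = FALSE]))
  B_new <- B_new / colSums(gamma)
  list(A = A_new, B = B_new, pi0 = gamma[1, ])
}

A   <- matrix(c(0.7, 0.3, 0.4, 0.6), 2, 2, byrow = TRUE)
B   <- matrix(c(0.9, 0.1, 0.2, 0.8), 2, 2, byrow = TRUE)
pi0 <- c(0.5, 0.5)
res <- baum_welch_step(A, B, pi0, c(1, 2, 1, 1, 2))
```

Each step provably does not decrease P(O|λ), which is the "generalized EM" guarantee the slide refers to.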
11. FIREARM
12. TOOLKIT
   • R packages: HMM, RHmm
   • Java: JHMM
   • Python: scikit-learn
13. DEMO
14. GET DATASET
   • library(quantmod)
   • getSymbols("^TWII")
   • chartSeries(TWII)
   • TWII_Subset <- window(TWII, start = as.Date("2012-01-01"))
   • TWII_Train <- cbind(TWII_Subset$TWII.Close - TWII_Subset$TWII.Open, TWII_Subset$TWII.Volume)
15. BUILD HMM MODEL
   • library(RHmm) # include the RHmm library
   • hm_model <- HMMFit(obs = TWII_Train, nStates = 5) # Baum-Welch algorithm
   • VitPath <- viterbi(hm_model, TWII_Train) # Viterbi algorithm
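The demo stops at the decoded state path. One simple way to turn that path into a forecast, sketched here with made-up data: predict tomorrow's close-minus-open as the historical mean of that quantity over past days assigned to today's state. In the demo the decoded path would come from the `states` component of the `viterbi` result; `predict_next_move` and all numbers below are hypothetical:

```r
# Naive state-conditional forecast: average the past moves observed
# while the chain was in the same hidden state as today.
predict_next_move <- function(states, moves) {
  today_state <- states[length(states)]
  mean(moves[states == today_state])
}

states <- c(1, 2, 2, 1, 1)        # illustrative decoded state path
moves  <- c(5, -3, -4, 6, 7)      # illustrative close - open values
predict_next_move(states, moves)  # mean over state-1 days: (5+6+7)/3 = 6
```

This is deliberately crude; the slides' likelihood-matching scheme (slide 6) is one refinement, and emission means per state are another.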