Hidden Markov Model & Stock Prediction

Introducing how to apply a hidden Markov model (HMM) to stock prediction

  • Slide notes (fitted model output):
    Transition matrix
    [[ 8.11812291e-01  2.64161406e-18  3.52349878e-02  2.81734681e-02  1.24779253e-01]
     [ 8.14457640e-18  9.18502509e-01  8.14974906e-02  8.07189607e-18  8.78456355e-18]
     [ 2.00105290e-02  2.44748522e-02  8.98883560e-01  2.65606045e-18  5.66310587e-02]
     [ 4.34984913e-07  3.52347211e-18  3.73302534e-18  8.62880547e-01  1.37119018e-01]
     [ 3.54215900e-01  6.07265466e-18  6.95345225e-02  1.44955842e-01  4.31293736e-01]]
    Means and variances of each hidden state
    0th hidden state: mean = [ 6.47102406e+00  1.77594314e+06], var = [ 2.08576453e+03  2.84056347e+10]
    1th hidden state: mean = [ 3.58949104e+01  3.52922834e+06], var = [ 6.49509351e+03  3.01606472e+11]
    2th hidden state: mean = [ 8.00347774e+00  2.38896636e+06], var = [ 2.97831204e+03  1.13694938e+11]
    3th hidden state: mean = [-1.56406476e+01  1.42048949e+06], var = [ 2.74139080e+03  2.90272908e+10]
    4th hidden state: mean = [-7.67653562e+00  2.03930314e+06], var = [ 1.46905928e+04  2.24122434e+11]

    1. HMM & Stock Prediction
       David Chiu @ ML/DM Monday
       http://ywchiu-tw.appspot.com/
    2. HIDDEN MARKOV MODEL
       • A finite-state machine with a fixed number of states
       • Provides a probabilistic framework for modeling a time series of multivariate observations
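       For reference, the standard factorization this framework rests on: a hidden state sequence Q = q1, ..., qT and an observation sequence O = o1, ..., oT have joint probability
       P(O, Q \mid \lambda) = \pi_{q_1}\, b_{q_1}(o_1) \prod_{t=2}^{T} a_{q_{t-1} q_t}\, b_{q_t}(o_t)
       with A = {aij}, B = {bj(.)}, and π as defined on slide 7.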
    3. STOCK PRICE PREDICTION
    4. Every time history repeats itself
       • The stock's past behavior is similar to its behavior on the current day
       • The next day's stock price should follow roughly the same pattern as the past data
    5. BENEFITS OF USING HMM
       • Handles new data robustly
       • Computationally efficient to develop and evaluate
       • Able to predict similar patterns efficiently
    6. HMM ON STOCK PREDICTION
       • Using the trained HMM, the likelihood value P of the current day's data is calculated
       • From the past dataset, using the HMM, we locate those instances that would produce the nearest likelihood value P
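       A minimal sketch of this likelihood-matching step, assuming hmmlearn's GaussianHMM and a synthetic (close - open, volume) series; the window length and all variable names here are illustrative, not taken from the slides:

       import numpy as np
       from hmmlearn.hmm import GaussianHMM  # assumption: hmmlearn, the successor of sklearn.hmm

       rng = np.random.default_rng(0)
       # synthetic stand-in for (close - open, volume) observations
       obs = np.column_stack([rng.normal(0, 50, 500), rng.normal(2e6, 5e5, 500)])

       model = GaussianHMM(n_components=5, covariance_type="diag", n_iter=100).fit(obs)

       window = 10                                   # hypothetical length of the "current day's dataset"
       cur_ll = model.score(obs[-window:])           # likelihood value P of the current window
       past_ll = np.array([model.score(obs[t:t + window])
                           for t in range(len(obs) - 2 * window)])
       nearest = int(np.argmin(np.abs(past_ll - cur_ll)))  # past window with the nearest likelihood value
       print("most similar past window starts at index", nearest)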
    7. CHARACTERIZE HMM
       • Number of states in the model: N
       • Number of observation symbols: M
       • Transition matrix A = {aij}, where aij is the probability of a transition from state i to state j
       • Observation emission matrix B = {bj(Ot)}, where bj(Ot) is the probability of observing Ot in state j
       • Initial state distribution π = {πi}
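       A toy illustration of these quantities (a hypothetical 2-state, 3-symbol discrete HMM, unrelated to the stock model in the demo):

       import numpy as np

       N, M = 2, 3                       # number of hidden states, number of observation symbols
       A = np.array([[0.9, 0.1],         # a_ij: transition probability from state i to state j
                     [0.2, 0.8]])
       B = np.array([[0.6, 0.3, 0.1],    # b_j(o): probability of emitting symbol o in state j
                     [0.1, 0.3, 0.6]])
       pi = np.array([0.5, 0.5])         # initial state distribution
       # each row of A and B, and pi itself, must sum to 1
       assert np.allclose(A.sum(axis=1), 1) and np.allclose(B.sum(axis=1), 1) and np.isclose(pi.sum(), 1)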
    8. MODELING HMM
    9. PROBLEMS OF HMM
       1. The Evaluation Problem (Forward): What is the probability that the given observations O = o1, o2, ..., oT are generated by the model, P{O|λ}, for a given HMM λ?
       2. The Decoding Problem (Viterbi): What is the most likely state sequence in the given model λ that produced the given observations O = o1, o2, ..., oT?
       3. The Learning Problem (Baum-Welch): How should we adjust the model parameters {A, B, π} in order to maximize P{O|λ}, given a model λ and a sequence of observations O = o1, o2, ..., oT?
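       For the evaluation problem, the standard forward recursion computes P{O|λ} without enumerating every state sequence:
       \alpha_1(i) = \pi_i\, b_i(o_1), \qquad \alpha_{t+1}(j) = \Big[ \sum_{i=1}^{N} \alpha_t(i)\, a_{ij} \Big] b_j(o_{t+1}), \qquad P(O \mid \lambda) = \sum_{i=1}^{N} \alpha_T(i)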
    10. BAUM-WELCH ALGORITHM
       • Finds the unknown parameters of a hidden Markov model (HMM)
       • A generalized expectation-maximization (GEM) algorithm
       • Computes maximum likelihood and posterior mode estimates for the parameters (transition and emission probabilities) of an HMM, given only emissions as training data
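       The standard re-estimation step uses the forward-backward posteriors γt(i) = P(qt = Si | O, λ) and ξt(i, j) = P(qt = Si, qt+1 = Sj | O, λ):
       \bar{\pi}_i = \gamma_1(i), \qquad \bar{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}, \qquad \bar{b}_j(k) = \frac{\sum_{t:\, o_t = v_k} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}
       With Gaussian emissions, as in the demo below, the emission update becomes a γ-weighted sample mean and variance per state instead of the symbol table.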
    11. FIREARM
    12. TOOLKIT
       • R packages: HMM, RHmm
       • Java: JHMM
       • Python: scikit-learn
    13. DEMO
    14. GET DATASET
       • library(quantmod)
       • getSymbols("^TWII")
       • chartSeries(TWII)
       • TWII_Subset <- window(TWII, start = as.Date("2012-01-01"))
       • TWII_Train <- cbind(TWII_Subset$TWII.Close - TWII_Subset$TWII.Open, TWII_Subset$TWII.Volume)
    15. BUILD HMM MODEL
       # Load the RHmm library
       • library(RHmm)
       # Baum-Welch algorithm
       • hm_model <- HMMFit(obs = TWII_Train, nStates = 5)
       # Viterbi algorithm
       • VitPath <- viterbi(hm_model, TWII_Train)
    16. SCATTER PLOT
       • TWII_Predict <- cbind(TWII_Subset$TWII.Close, VitPath$states)
       • chartSeries(TWII_Predict[,1])
       • addTA(TWII_Predict[TWII_Predict[,2]==1,1], on=1, type="p", col=5, pch=25)
       • addTA(TWII_Predict[TWII_Predict[,2]==2,1], on=1, type="p", col=6, pch=24)
       • addTA(TWII_Predict[TWII_Predict[,2]==3,1], on=1, type="p", col=7, pch=23)
       • addTA(TWII_Predict[TWII_Predict[,2]==4,1], on=1, type="p", col=8, pch=22)
       • addTA(TWII_Predict[TWII_Predict[,2]==5,1], on=1, type="p", col=10, pch=21)
    17. DATA VISUALIZATION
    18. SCIKIT-LEARN
       # Baum-Welch algorithm
       • from sklearn.hmm import GaussianHMM  # moved to hmmlearn.hmm in newer versions
       • n_components = 5
       • model = GaussianHMM(n_components, "diag")
       • model.fit([X], n_iter=1000)  # in hmmlearn, n_iter is passed to the constructor instead
       # Predict the optimal sequence of internal hidden states
       • hidden_states = model.predict(X)
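       To reproduce the kind of output shown in the slide notes above (transition matrix plus per-state means and variances), the fitted model's attributes can be printed; a brief sketch, assuming the model fitted on this slide:

       print("Transition matrix")
       print(model.transmat_)                      # row i, column j: transition probability from state i to state j
       print("means and vars of each hidden state")
       for i in range(model.n_components):
           # with "diag" covariance, covars_ may be reported per dimension or as full matrices,
           # depending on the library version
           print(f"{i}th hidden state: mean = {model.means_[i]}, var = {model.covars_[i]}")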
    19. MODELING SAMPLE
    20. MODELING SAMPLE
    21. PREDICTION
       # State prediction using scikit-learn
       • data_vec = [diff[last_day], volume[last_day]]
       • State = model.predict([data_vec])
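       One illustrative way to read the predicted state, assuming the per-state means learned earlier (first dimension = close - open, second = volume); this interpretation is a sketch, not a rule stated in the slides:

       predicted_state = int(State[-1])                  # model.predict returns an array of states
       expected_diff, expected_volume = model.means_[predicted_state]
       print("expected close - open change for this state:", expected_diff)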
    22. REFERENCES
       • Hassan, M. (2009). A combination of hidden Markov model and fuzzy model for stock market forecasting. Neurocomputing, 72(16), 3439-3446.
       • Gupta, A., & Dhingra, B. (2012, March). Stock market prediction using hidden Markov models. In Engineering and Systems (SCES), 2012 Students Conference on (pp. 1-4). IEEE.
