  1. Amnesic Neural Network for Classification: Application on Stock Trend Prediction* <br />Author: Qiang Ye, Bing Liang, Yijun Li<br /> Publication: ICSSSM 2005 <br /> Presenter: Yu-Hsiang Huang<br />2011.9.23<br />1<br />
  2. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br /> Amnesic Neural Network Model<br /> Training Algorithm<br />Experiment Data <br />Classification algorithm <br />Experiment Result<br />Outline<br />2<br />
  3. Artificial Neural Network models (ANN)<br /><ul><li>Based on the neural structure of the brain
  4. An ANN learns from experience
  5. Critical step: network training</li></ul>Two classes of stock market prediction methods<br /><ul><li>Fundamental analysis</li></ul>Macroeconomic data and the basic financial status of a company<br /><ul><li>Technical analysis</li></ul>History will repeat itself<br />The correlation between price and volume reveals market behavior<br />Prediction<br /><ul><li>By exploiting implications hidden in past trading activities
  6. By analyzing patterns and trends shown in price and volume charts</li></ul>Introduction & Literature review<br />3<br />
  7. Stock price prediction<br /><ul><li>Traditional assumption: customer behavior is consistent
  8. In the real world: customer behavior changes greatly
  9. The training data set may be time-variant
  10. Difficult to predict customer behavior from old data</li></ul>Two strategies<br /><ul><li>Select all data from different times: can’t represent current knowledge
  11. Select only the latest data: lose useful information hidden in early data</li></ul>Data selection in stock market prediction will influence the training result<br />Introduction & Literature review<br />4<br />
  12. Amnesic Neural Network (ANN*) model<br /><ul><li>Introduces the psychological notion of forgetting into the </li></ul>Back Propagation (BP) neural network<br /><ul><li>Solves the problem of cross-temporal data classification
  13. The effectiveness of data depends on time
  14. Present data is more useful than old data
  15. Old data has less effect on the training result, as if gradually forgotten</li></ul>5<br />Introduction (cont.)<br />
  16. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br /> Amnesic Neural Network Model<br /> Training Algorithm<br /> Experiment Data<br /> Classification algorithm<br />Experiment Result<br />6<br />
  17. Artificial Neural Network<br /><ul><li>Computational modeling tools for modeling complex real-world problems
  18. Capable of performing massively parallel computations for data processing and knowledge representation
  19. The feed-forward error-back-propagation learning algorithm is the best-known procedure for training ANNs</li></ul>Back Propagation Neural Network<br /><ul><li>Based on searching an error surface by gradient descent for points with minimum error
  20. Each iteration:
  21. Forward activation to produce a solution
  22. Backward propagation of the computed error to modify the weights</li></ul>7<br />Methodology<br />
  23. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br /> Amnesic Neural Network Model<br /> Training Algorithm<br /> Experiment Data<br /> Classification algorithm<br />Experiment Result<br />8<br />
  24. 9<br />Methodology (cont.)<br /> BP network model<br /><ul><li>Input:
  25. Output:
  26. Weight: connects node j in the previous layer to node k
  27. Activation function:
  28. Error signal:
  29. I(n) is a set of inputs
  30. Y(n) is the set of corresponding outputs
  31. D(n) is the expected output (given by the training example)</li></li></ul><li>10<br />Methodology (cont.)<br />Input<br />hidden<br />output<br />Input vector<br />output vector<br />weight<br />bias<br />
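The equation images on the BP-model slides did not survive extraction. In standard BP notation (an assumed reconstruction, not necessarily the paper's exact symbols), the quantities named above are typically:

```latex
% output of node k, with weight w_{jk} from node j and bias b_k,
% using the sigmoid activation common in BP networks
y_k(n) = f\Big(\sum_j w_{jk}\, y_j(n) + b_k\Big),
\qquad f(x) = \frac{1}{1 + e^{-x}}
% error signal of output node k for training example n
e_k(n) = d_k(n) - y_k(n)
```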
  32. 11<br />Methodology (cont.)<br />Square error:<br />Summation of square error:<br />Summation of square error for the entire training data set: <br />Objective of learning<br /><ul><li>Find W to minimize , which is </li></li></ul><li>Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br />Amnesic Neural Network Model<br /> Training Algorithm<br /> Experiment Data<br /> Classification algorithm<br />Experiment Result<br />12<br />
  33. 13<br />Methodology (cont.)<br />Amnesic Neural Network Model<br /><ul><li>The effectiveness of data depends on time.
  34. Present data is more useful than old data.
  35. Just like forgetting as time goes on.
  36. Assign different weights to data of different times ( new > old ).
  37. Original square error:
  38. Square error: </li></li></ul><li><ul><li>is the forgetting function
  39. is the forgetting coefficient
  40. is the benchmark time (current time or the time of the newest data)
  41. is the time of I(n)
  42. Summation of square error for the entire training data set:</li></ul>Objective of learning<br /><ul><li> Find W to minimize , which is
  43. becomes</li></ul>14<br />Methodology (cont.)<br />
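The symbols for the forgetting function, forgetting coefficient, and benchmark time were lost with the equation images. As a minimal sketch, assuming an exponential forgetting function (a common choice; the paper's exact form may differ), the time-weighted square error could look like:

```python
import math

def forgetting_weight(t, t0, beta):
    """Weight for an example observed at time t, relative to the benchmark
    time t0 (current time or the time of the newest data), with forgetting
    coefficient beta.  beta = 0 weights all data equally (ordinary BP);
    larger beta discounts older data more strongly.  The exponential form
    is an assumption, not taken from the paper."""
    return math.exp(-beta * (t0 - t))

def weighted_square_error(d, y, t, t0, beta):
    """Square error of one example, scaled by its forgetting weight."""
    err = 0.5 * sum((dk - yk) ** 2 for dk, yk in zip(d, y))
    return forgetting_weight(t, t0, beta) * err
```

With beta = 0 every example keeps its full error, so training reduces to ordinary BP, which matches the experimental observation reported later in the deck.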
  44. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br />Amnesic Neural Network Model<br /> Training Algorithm<br /> Classification algorithm<br /> Experiment Data<br />Experiment Result<br />15<br />
  45. 16<br />Methodology (cont.)<br /> Training Algorithm in ANN*<br /><ul><li>Step 1
  46. Initialize connection weights with random values
  47. drawn from the uniform distribution U(0,1)
  48. Step 2
  49. Input I(n) = , where d(n) is the expected output of example n.
  50. Step 3
  51. Compute the output according to Equation 1; compute the output of the neurons in each layer. = is the initial input.
  52. ...... (1)
  53. is the output of neuron j in the ith layer in the nth round of learning</li></li></ul><li>17<br />Methodology (cont.)<br /><ul><li>Step 4
  54. Modify weights </li></ul>Calculate δ backward <br />For a node in the output layer (1)<br />For a node in the hidden layer (2)<br />Modify the weights from the output layer back to the previous layer<br /><ul><li>(3)
  55. η is the learning step.
  56. Step 5
  57. n = n + 1; go to Step 2 until the training converges and the system error decreases below an acceptable threshold.</li></li></ul><li>18<br />Methodology (cont.)<br />Step 4<br />Modify weights<br />Step 2<br />Input I(n)<br />Step 3<br />Compute output<br />Step 1<br />Initialize weights<br />
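Steps 1–5 above can be sketched end to end for a single hidden layer. `train_amnesic_bp` and its signature are hypothetical helpers, and the exponential forgetting weight is an assumption rather than the paper's exact formula:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_amnesic_bp(samples, n_in, n_hidden, n_out,
                     eta=0.01, beta=0.1, max_cycles=10000, min_error=0.005):
    """samples: list of (x, d, t) = (input vector, expected output, timestamp)."""
    rnd = random.Random(0)
    # Step 1: initialize connection weights from U(0, 1)
    W1 = [[rnd.random() for _ in range(n_in)] for _ in range(n_hidden)]
    W2 = [[rnd.random() for _ in range(n_hidden)] for _ in range(n_out)]
    t0 = max(t for _, _, t in samples)           # benchmark time (newest data)
    for _ in range(max_cycles):
        total = 0.0
        for x, d, t in samples:                  # Step 2: input I(n) with d(n)
            # Step 3: forward activation, layer by layer
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
            y = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W2]
            a = math.exp(-beta * (t0 - t))       # forgetting weight (assumed form)
            # Step 4: calculate delta backward, then modify the weights
            delta_o = [a * (dk - yk) * yk * (1 - yk) for dk, yk in zip(d, y)]
            delta_h = [hi * (1 - hi) * sum(delta_o[k] * W2[k][j] for k in range(n_out))
                       for j, hi in enumerate(h)]
            for k in range(n_out):
                for j in range(n_hidden):
                    W2[k][j] += eta * delta_o[k] * h[j]
            for j in range(n_hidden):
                for i in range(n_in):
                    W1[j][i] += eta * delta_h[j] * x[i]
            total += a * 0.5 * sum((dk - yk) ** 2 for dk, yk in zip(d, y))
        if total < min_error:                    # Step 5: stop below the threshold
            break
    return W1, W2
```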
  58. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br />Amnesic Neural Network Model<br /> Training Algorithm<br /> Experiment Data<br /> Classification algorithm<br />Experiment Result<br />19<br />
  59. 20<br />Methodology (cont.)<br />Experiment data<br /><ul><li>About 900 stocks were selected, with data from 2001 to 2004
  60. 14 factors were considered
  61. Stock price, major profit ratio
  62. EPS, interest cover
  63. ROE, receivables turnover
  64. Liability/asset ratio, asset turnover
  65. Liquidity ratio, liquid market value
  66. PE, tax profit growth
  67. PB</li></li></ul><li><ul><li>Percentage figure of price change for each stock
  68. 1000 records were selected for training the ANN*
  69. The other 500 records formed a testing sample</li></ul>21<br />Methodology (cont.)<br />
  70. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br />Amnesic Neural Network Model<br /> Training Algorithm<br /> Experiment Data<br /> Classification algorithm<br />Experiment Result<br />22<br />
  71. 23<br />Methodology (cont.)<br />Classification algorithm<br /><ul><li>Input</li></ul>Patterns to be classified ;<br />Neural network weight matrix W;<br /><ul><li>Output
  72. Class of X</li></li></ul><li><ul><li>Algorithm
  73. Step 1</li></ul>Input as the output of layer 0<br /><ul><li>Step 2</li></ul>Compute the output in the hidden layer <br /><ul><li>Step 3</li></ul>Compute the output in the output layer <br /><ul><li>Step 4</li></ul>Assign output to , satisfying <br /><ul><li>Step 5</li></ul>Return class label , satisfying <br />24<br />Methodology (cont.)<br />
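The classification algorithm above amounts to a forward pass followed by an arg-max over the output nodes. `classify` and the sigmoid activation are assumptions for illustration, not the authors' code:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def classify(x, weights):
    """Step 1: treat the pattern x as the output of layer 0.
    Steps 2-3: propagate through the hidden and output layers,
    one weight matrix per layer.
    Steps 4-5: return the index of the output node with the
    largest activation as the class label."""
    out = x
    for W in weights:
        out = [sigmoid(sum(w * o for w, o in zip(row, out))) for row in W]
    return max(range(len(out)), key=lambda k: out[k])
```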
  74. Introduction & Literature review<br />Methodology<br />BP Neural Network Model<br />Amnesic Neural Network Model<br /> Training Algorithm<br /> Classification algorithm<br /> Experiment Data<br />Experiment Result<br />25<br />
  75. Parameters of ANN*<br /><ul><li>Neural nodes in the input layer: 21
  76. Neural nodes in the hidden layer: 25
  77. Neural nodes in the output layer: 3
  78. Training step (η): 0.01
  79. Threshold of maximum training cycles: 10000
  80. Threshold of minimum error rate: 0.005
  81. Forgetting coefficient: 10 different forgetting coefficients tested, from 0 to 1</li></ul>26<br />Experiment Result<br />
  82. Result<br /><ul><li>The highest correct classification ratio on the test sample is achieved when the forgetting coefficient is set to 0.1
  83. When the forgetting coefficient is 0, the ANN* reduces to ordinary BP
  84. The ANN* can do better than ordinary BP with careful selection of the forgetting coefficient</li></ul>27<br />Experiment Result (cont.)<br />BP ANN<br />Amnesic ANN<br />better<br />
  85. <ul><li>This paper introduced the psychological notion of ‘forgetting’ into the BP neural network model and established the Amnesic ANN model.
  86. The aim of the Amnesic ANN is to address the stochastic time-variation problem in customer behavior.
  87. With carefully selected forgetting coefficients, the Amnesic ANN can perform better than the common ANN (BP) in stock price prediction.
  88. Further research is needed, since the ratio of correctly classified stocks is still very low in this experiment.</li></ul>28<br />Conclusions<br />
  89. Thank you for listening.<br />Q & A<br />29<br />