Microsoft PowerPoint - ourfinal-1.ppt [Compatibility Mode]

1. Group EEG (Electroencephalogram) Anthony Hampton, Tony Nuth, Miral Patel (Portions credited to Jack Shelley-Tremblay and E. Keogh) 05/09/2008
2. Outline Introduction Goal Methodology Results Discussion Conclusion
3. Goals The goal of this project is to evaluate EEG data from 19 subjects using various mathematical techniques.
4. EEG A recording of the electrical waves that sweep over the brain's surface. It is measured by electrodes placed on top of the scalp.
5. Parts of the brain We are interested in the Primary Motor Cortex and the Pre-motor Cortex.
6. Understand the Waves In neurophysiology, an action potential (also known as a nerve impulse or spike) is a pulse-like wave of voltage that travels along several types of cell membranes.
7. Placement The international 10-20 system: place an electrode on each point.
8. Event-related Potentials An event-related potential (ERP) is any stereotyped electrophysiological response to an internal or external stimulus. More simply, it is any measured brain response that is directly the result of a thought or perception. By collecting multiple trials of the same type of stimulus, we can enhance the signal and reduce the noise using simple math (the approach neuroscientists typically use).
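The "simple math" here is trial averaging. A minimal sketch with synthetic data (the toy waveform, noise level, and all variable names are our assumptions, not the project's actual data): averaging N noisy trials keeps the time-locked signal while shrinking the noise by roughly a factor of sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_samples = 200, 256
t = np.linspace(0.0, 1.0, n_samples)
erp = np.sin(2 * np.pi * 5 * t) * np.exp(-4 * t)   # toy "true" brain response

# Each trial = signal + independent noise; the mean across trials is the ERP estimate.
trials = erp + rng.normal(0.0, 2.0, size=(n_trials, n_samples))
average = trials.mean(axis=0)

# The average is far closer to the true response than any single trial.
err_single = np.abs(trials[0] - erp).mean()
err_avg = np.abs(average - erp).mean()
print(err_single, err_avg)
```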
9. Defining an Epoch A series of time points locked to a significant point in time. The button press is our significant point in time.
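Cutting epochs out of a continuous recording can be sketched as follows. This is a hypothetical illustration (the sampling rate, event indices, and window lengths are made up): each epoch is a fixed window of samples around a button-press event.

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 256                                  # assumed sampling rate in Hz
recording = rng.normal(size=10 * fs)      # 10 s of one-channel "EEG"
event_samples = [512, 1500, 2200]         # sample indices of button presses
pre, post = 128, 256                      # samples kept before/after each event

# One row per button press, time-locked to the event.
epochs = np.stack([recording[s - pre : s + post] for s in event_samples])
print(epochs.shape)   # (3, 384)
```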
10. A subject's data Example of an epoch.
11. Same data At a different epoch.
12. Time Series A time series is a sequence of data points, typically measured at successive times spaced at (often uniform) time intervals.
13. Time Series What is a time series?
14. What EM clustering does How do we classify points and estimate the parameters of the models in a mixture at the same time? Adaptive soft clustering: EM. Data points are assigned to each group with a probability equal to the likelihood of that point belonging to that group.
15. What is EM - Expectation Maximization A statistical method that makes use of finite Gaussian mixture models. A set of parameters is recomputed until a desired value is reached. Initial variables are randomly initialized.
16. The methods of EM Initialization: pick start values for the parameters (for us, making random models and setting a sigma). Iteratively process until the parameters converge. Expectation (E) step: calculate weights for every data point and update the weights to affect further steps. Maximization (M) step: maximize a log-likelihood function with the weights given by the E step to update the parameters.
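The E/M loop above can be sketched for a two-component 1-D Gaussian mixture. This is our own minimal illustration, not the project's code: the synthetic data, the deterministic initialization (data extremes plus a unit sigma), and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

# Initialization: pick start values for the parameters.
mu = np.array([x.min(), x.max()])
sigma = np.ones(2)
pi = np.array([0.5, 0.5])

for _ in range(100):  # iterate until the parameters (approximately) converge
    # E step: weight (responsibility) of each component for every data point.
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    w = dens / dens.sum(axis=1, keepdims=True)
    # M step: parameter updates that maximize the weighted log-likelihood.
    n_k = w.sum(axis=0)
    mu = (w * x[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((w * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    pi = n_k / len(x)

print(np.sort(mu))   # approaches the true component means (-2 and 3)
```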
17. Evaluate Initialized 2 models of the data using the mean of the EEG: the first half of the data over time for the first model and the second half over time for the second model. Compared each model and created a weight matrix. Normalized the data.
18. OSB algorithm OSB: Optimal Subsequence Bijection. An algorithm that determines the optimal subsequence bijection between two sequences of real numbers. We were given code ('tsDAGjump4') that worked for only one channel, and we modified it to work with more than one channel (it works for 40 channels). Modification: created a difference matrix with each entry containing the differences of corresponding elements.
19. Why the OSB algorithm? OSB is efficient because we use a DAG (Directed Acyclic Graph) and its cheapest path to find the solution. By using a DAG in OSB, we get correct results on the Time Series dataset. The DAG helps us get rid of outlier elements and obtain a one-to-one or onto bijection of the sequences. Comparing OSB with DTW using a warping window shows that skipping elements improves results.
20. Directed Acyclic Graph This is a simple example of a DAG. By skipping over outlier elements we get a perfect result.
21. OSB algorithm This program is used to find: Ts - Time Series; DAG - Directed Acyclic Graph; the OSB between two sequences of real numbers. D finds the distance between two elements. C is the jump cost (the penalty for skipping an element). W - the weight of the edges.
22. OSB algorithm Find subsequences of the two sequences. Create a dissimilarity matrix. Use a shortest-path algorithm on the Directed Acyclic Graph. Find the jump cost. Nodes are index pairs of the matrix. The main task in the algorithm is to find the edge weights of the DAG.
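The steps above can be sketched as a small dynamic program. This is our own simplified reading of OSB, not the tsDAGjump4 code: DAG nodes are index pairs (i, j), an edge (i, j) -> (k, l) exists for k > i and l > j, and its weight is the dissimilarity at (k, l) plus a jump cost for every skipped element.

```python
import numpy as np

def osb_distance(a, b, jump_cost=1.0):
    m, n = len(a), len(b)
    diss = np.subtract.outer(a, b) ** 2        # dissimilarity matrix
    cost = np.full((m, n), np.inf)
    cost[0, :] = diss[0, :]                    # the match may start anywhere in b
    cost[:, 0] = diss[:, 0]
    for i in range(1, m):
        for j in range(1, n):
            # cheapest predecessor (k, l) with k < i, l < j, plus the skip penalty
            best = min(
                cost[k, l] + jump_cost * ((i - k - 1) + (j - l - 1))
                for k in range(i) for l in range(j)
            )
            cost[i, j] = diss[i, j] + best
    return cost[-1, :].min()                   # the match may end anywhere in b

# Skipping the outlier 50 keeps the two sequences close:
print(osb_distance([1, 2, 50, 3], [1, 2, 3], jump_cost=2.0))   # 2.0
```

With jump cost 2.0, the path skips the outlier at index 2 for a penalty of 2 instead of paying its huge dissimilarity, which is exactly the behavior slide 20 illustrates.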
23. Wavelet "Wavelets are mathematical functions that cut up data into different frequency components, and then study each component with a resolution matched to its scale." (IEEE Computational Science and Engineering, Summer 1995, vol. 2, num. 2, published by the IEEE Computer Society)
24. Discussion of Results We used OSB to obtain our results. Our results consist of two sequences a and b; we then find subsequences a' of a and b' of b so that a' matches best with b'. Results are divided into two parts: Cluster Precision - the x% of the time that you will cluster an epoch correctly. Cluster Recall - the y% of the time that you will cluster a known left or right button.
25. Discussion of Results Example: ltypeout1: [75 76 27 21 22] = a; rtypeout1: [77 72 24 22 24] = b; ltypeout2: [25 24 73 79 78] = a'; rtypeout2: [23 28 76 78 76] = b'. For Cluster Precision: left button cluster: 76, right button cluster: 72 (from a and b). Formula: 76 / (76 + 72) = 0.51351351 = 51%. For Cluster Recall: left button cluster: 76, right button cluster: 28 (from a and b'). Applying the same formula: 76 / (76 + 28) = 0.73076923 = 73%. Note: apply the same formula to find the right-cluster precision and recall.
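The worked example above is a one-line ratio; a small calculation confirms the two percentages (the helper name `cluster_ratio` is ours):

```python
def cluster_ratio(hits, other):
    # fraction of the combined count belonging to the cluster of interest
    return hits / (hits + other)

left_precision = cluster_ratio(76, 72)   # from a and b
left_recall = cluster_ratio(76, 28)      # from a and b'
print(round(left_precision, 8), round(left_recall, 8))   # 0.51351351 0.73076923
```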
26. Conclusion We were given a total of 19 subjects (EEG datasets) but derived correct results for only 10 of them. The other 9 subjects gave us all zeros as a result: ltypeout1: [0 0 0 0 0]; rtypeout1: [0 0 0 0 0]; ltypeout2: [0 0 0 0 0]; rtypeout2: [0 0 0 0 0]; ...error...
27. Extra References Yang Ran, "Expectation Maximization: An Approach to Parameter Estimation", www.umiacs.umd.edu/~shaohua/enee698a_f03/em.ppt. Andrew Blake, Bill Freeman, "Learning and Vision: Generative Methods", ppt, ICCV 2003, October 12, 2003.
28. Thank You