Artificial Intelligence and Learning Algorithms Presented By Brian M. Frezza 12/1/05
Game Plan
Hard Math
What’s a Learning Algorithm?
Why do I care?
Street Smarts
The Algorithms
Bayesian Networks: Basics
Bayesian Network Example
Bayesian Network: Trace
Hypotheses: H_a: 100% Redhead; H_b: 50% Redhead, 50% Not; H_c: 100% Not Redhead
History: 0 Redheads, 0 Not
Beliefs: P(H_a) = 1/3, P(H_b) = 1/3, P(H_c) = 1/3
Prediction: Will their next kid be a Redhead?
Likelihood = P(red|H_a)*P(H_a) + P(red|H_b)*P(H_b) + P(red|H_c)*P(H_c)
           = (1)*(1/3) + (1/2)*(1/3) + (0)*(1/3) = 1/2

Bayesian Network: Trace
History: 1 Redhead, 0 Not
Beliefs: P(H_a) = 1/2, P(H_b) = 1/2, P(H_c) = 0
Prediction: Will their next kid be a Redhead?
Likelihood = P(red|H_a)*P(H_a) + P(red|H_b)*P(H_b) + P(red|H_c)*P(H_c)
           = (1)*(1/2) + (1/2)*(1/2) + (0)*(0) = 3/4

Bayesian Network: Trace
History: 2 Redheads, 0 Not
Beliefs: P(H_a) = 3/4, P(H_b) = 1/4, P(H_c) = 0
Prediction: Will their next kid be a Redhead?
Likelihood = P(red|H_a)*P(H_a) + P(red|H_b)*P(H_b) + P(red|H_c)*P(H_c)
           = (1)*(3/4) + (1/2)*(1/4) + (0)*(0) = 7/8

Bayesian Network: Trace
History: 3 Redheads, 0 Not
Beliefs: P(H_a) = 7/8, P(H_b) = 1/8, P(H_c) = 0
Prediction: Will their next kid be a Redhead?
Likelihood = P(red|H_a)*P(H_a) + P(red|H_b)*P(H_b) + P(red|H_c)*P(H_c)
           = (1)*(7/8) + (1/2)*(1/8) + (0)*(0) = 15/16
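To make the bookkeeping concrete, here is a minimal sketch of a textbook Bayes'-rule update for this three-hypothesis setup. The slide's own bullet text is not preserved in this transcript, so the function names and the exact update scheme below are my assumptions rather than the author's code:

```python
# Three hypotheses about the couple, each fixing P(child is a redhead).
hypotheses = {"Ha": 1.0, "Hb": 0.5, "Hc": 0.0}     # P(red | hypothesis)

def update(beliefs, child_is_redhead):
    """Bayes' rule: reweight each hypothesis by how well it predicted the child."""
    like = lambda h: hypotheses[h] if child_is_redhead else 1.0 - hypotheses[h]
    unnormalized = {h: like(h) * p for h, p in beliefs.items()}
    total = sum(unnormalized.values())
    return {h: v / total for h, v in unnormalized.items()}

def p_next_redhead(beliefs):
    """Prediction = sum over hypotheses of P(red | H) * P(H)."""
    return sum(hypotheses[h] * p for h, p in beliefs.items())

beliefs = {"Ha": 1/3, "Hb": 1/3, "Hc": 1/3}        # uniform starting belief
print(p_next_redhead(beliefs))                      # 0.5, as on the first trace slide
for _ in range(3):                                  # observe three redheaded kids
    beliefs = update(beliefs, True)
    print(beliefs, p_next_redhead(beliefs))         # belief piles up on Ha; Hc drops to 0
```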
Bayesian Networks Notes
The Algorithms
Hidden Markov Models (HMM)
Hidden Markov Models: Take a Step Back
[Diagram: a four-state Markov chain Q1-Q4 with transition probabilities P1, P2, 1-P1-P2, P3, 1-P3, P4, 1-P4, and 1.]
1st Order Markov Model Setup
[Same four-state chain with parameter values P1 = 0.6, P2 = 0.2, P3 = 0.9, P4 = 0.4.]
1st Order Markov Model Trace
[The trace slides step through the chain above using P1 = 0.6, P2 = 0.2, P3 = 0.9, P4 = 0.4.]
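The transcript does not preserve which arrow in the slide's diagram carries which probability, so the wiring below is an assumption chosen only so that each row sums to 1; the point is the mechanics of stepping a first-order chain, where the next state depends only on the current one:

```python
import random

# Assumed wiring of the four-state chain, using the slide's parameter values
# P1 = 0.6, P2 = 0.2, P3 = 0.9, P4 = 0.4 (the actual arrow targets are not
# recoverable from this transcript).
transitions = {
    "Q1": {"Q2": 0.6, "Q3": 0.2, "Q4": 0.2},   # P1, P2, 1 - P1 - P2
    "Q2": {"Q3": 0.9, "Q1": 0.1},              # P3, 1 - P3
    "Q3": {"Q4": 1.0},                         # 1
    "Q4": {"Q1": 0.4, "Q2": 0.6},              # P4, 1 - P4
}

def step(state):
    """Sample the next state given only the current state (the 1st-order property)."""
    targets, probs = zip(*transitions[state].items())
    return random.choices(targets, weights=probs)[0]

state, path = "Q1", ["Q1"]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```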
What else can Markov do?
Hidden Markov Models (HMMs)
Hidden Markov Models: Example
[Diagram: hidden states are the secondary-structure classes (Unstructured, Alpha Helix, Beta Sheet); observable states are the 20 amino acids.]
Hidden Markov Models: Smaller Example
[Diagram: hidden states Exon (Ex), Intergenic (Ig), and Intron (It); observable states A, C, G, T. Each hidden state has transition probabilities to the others, e.g. P(Ig|Ex), and emission probabilities for each base, e.g. P(A|Ex).]
Hidden Markov Models: Smaller Example

Hidden state transition probabilities (row = from, column = to):
        Ex     Ig     It
  Ex    0.70   0.10   0.20
  Ig    0.49   0.50   0.01
  It    0.18   0.02   0.80

Observable state (emission) probabilities:
        A      T      G      C
  Ex    0.33   0.42   0.11   0.14
  Ig    0.25   0.25   0.25   0.25
  It    0.14   0.16   0.50   0.20

Starting distribution: Ex = 0.10, Ig = 0.89, It = 0.01
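The same tables as plain Python dictionaries, for reuse in the Viterbi sketch that follows the trace below (the variable names states, start, trans, and emit are mine, not from the slides):

```python
# Gene-finding HMM from the tables above: hidden states Exon (Ex),
# Intergenic (Ig), Intron (It); observable symbols A, T, G, C.
states = ["Ex", "Ig", "It"]

start = {"Ex": 0.10, "Ig": 0.89, "It": 0.01}           # starting distribution

trans = {                                              # trans[from][to]
    "Ex": {"Ex": 0.70, "Ig": 0.10, "It": 0.20},
    "Ig": {"Ex": 0.49, "Ig": 0.50, "It": 0.01},
    "It": {"Ex": 0.18, "Ig": 0.02, "It": 0.80},
}

emit = {                                               # emit[state][symbol]
    "Ex": {"A": 0.33, "T": 0.42, "G": 0.11, "C": 0.14},
    "Ig": {"A": 0.25, "T": 0.25, "G": 0.25, "C": 0.25},
    "It": {"A": 0.14, "T": 0.16, "G": 0.50, "C": 0.20},
}
```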
Hidden Markov Model
Viterbi Algorithm: Trace
Example Sequence: ATAATGGCGAGATG
(Hidden-state transition, emission, and starting-distribution tables as on the previous slides.)

Initialization (first base, A):
Exon       = P(A|Ex) * Start(Ex) = 0.33 * 0.10 = 3.3*10^-2
Intergenic = P(A|Ig) * Start(Ig) = 0.25 * 0.89 = 2.2*10^-1
Intron     = P(A|It) * Start(It) = 0.14 * 0.01 = 1.4*10^-3

Recursion (each following base x, using the previous column P_n-1):
Exon       = Max( P(Ex|Ex)*P_n-1(Ex), P(Ex|Ig)*P_n-1(Ig), P(Ex|It)*P_n-1(It) ) * P(x|Ex)
Intergenic = Max( P(Ig|Ex)*P_n-1(Ex), P(Ig|Ig)*P_n-1(Ig), P(Ig|It)*P_n-1(It) ) * P(x|Ig)
Intron     = Max( P(It|Ex)*P_n-1(Ex), P(It|Ig)*P_n-1(Ig), P(It|It)*P_n-1(It) ) * P(x|It)

For example, at the second base (T):
Exon = 4.6*10^-2, Intergenic = 2.8*10^-2, Intron = 1.1*10^-3

Completed trace (best score per hidden state at each base):
Base   Exon         Intergenic   Intron
A      3.3*10^-2    2.2*10^-1    1.4*10^-3
T      4.6*10^-2    2.8*10^-2    1.1*10^-3
A      1.1*10^-2    3.5*10^-3    1.3*10^-3
A      2.4*10^-3    4.3*10^-4    2.9*10^-4
T      7.2*10^-4    6.1*10^-5    7.8*10^-5
G      5.5*10^-5    1.8*10^-5    7.2*10^-5
G      4.3*10^-6    2.2*10^-6    2.9*10^-5
C      7.2*10^-7    2.8*10^-7    4.6*10^-6
G      9.1*10^-8    3.5*10^-8    1.8*10^-6
A      1.1*10^-7    9.1*10^-9    2.0*10^-7
G      8.4*10^-9    2.7*10^-9    8.2*10^-8
A      4.9*10^-9    4.1*10^-10   9.2*10^-9
T      1.4*10^-9    1.2*10^-10   1.2*10^-9
G      1.1*10^-10   3.6*10^-11   4.7*10^-10
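A compact sketch of the algorithm behind this trace, reusing the states/start/trans/emit dictionaries defined after the probability tables above (the function name and return values are my own framing; practical implementations work with log-probabilities to avoid the underflow visible in the shrinking numbers above):

```python
def viterbi(seq, states, start, trans, emit):
    """Most likely hidden-state path for seq, plus the per-base score columns."""
    scores = [{s: emit[s][seq[0]] * start[s] for s in states}]   # initialization
    back = [{}]
    for x in seq[1:]:                                            # recursion
        prev, col, ptr = scores[-1], {}, {}
        for s in states:
            best = max(states, key=lambda r: prev[r] * trans[r][s])
            col[s] = prev[best] * trans[best][s] * emit[s][x]
            ptr[s] = best
        scores.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: scores[-1][s])              # termination
    path = [last]
    for ptr in reversed(back[1:]):                               # backtrace
        path.append(ptr[path[-1]])
    return list(reversed(path)), scores

# states, start, trans, emit: the dictionaries defined after the probability tables.
path, scores = viterbi("ATAATGGCGAGATG", states, start, trans, emit)
print(scores[1])   # second column: Ex ~ 4.6e-2, Ig ~ 2.8e-2, It ~ 1.1e-3, as in the trace
print(path)        # one Ex / Ig / It label per base: the most likely hidden-state path
```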
Hidden Markov Models
CENSORED
The Algorithms
Genetic Algorithms
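The bullet text of the genetic-algorithm slides is not preserved in this transcript, so as a generic illustration of the idea, a population of candidate solutions evolved by selection, crossover, and mutation, here is a minimal sketch; every name and parameter below is illustrative rather than the author's:

```python
import random

def evolve(fitness, length=20, pop_size=50, generations=100, mutation_rate=0.01):
    """Minimal genetic algorithm over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]  # mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy objective ("one-max"): maximize the number of 1s in the string.
best = evolve(fitness=sum)
print(best, sum(best))    # best bit string found and its score
```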
The Algorithms
Neural Networks
Neural Networks: A Neuron, Node, or Unit
[Diagram: inputs a and b arrive with weights W_a,c and W_b,c; the unit sums the weighted inputs, subtracts the bias weight W_0,c, passes the result Σ(W) - W_0,c through an activation function, and sends its output onward via weights W_c,n.]
Neural Networks: Activation Functions
[Plots: a hard threshold function and a sigmoid (logistic) function, each mapping the summed input to an output between 0 and +1, with the zero point set by the bias.]
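A minimal sketch of one such unit, following the weighted-sum-minus-bias form in the diagram (the function name and argument order are mine):

```python
import math

def unit(inputs, weights, bias, activation="threshold"):
    """One neuron: activation( sum of weighted inputs minus the bias weight )."""
    a = sum(w * x for w, x in zip(weights, inputs)) - bias
    if activation == "threshold":
        return 1.0 if a > 0 else 0.0
    return 1.0 / (1.0 + math.exp(-a))        # sigmoid (logistic) alternative

print(unit([1, 1], [1, 1], 1.5))              # 1.0: the AND unit on the next slides
print(unit([1, 0], [1, 1], 1.5, "sigmoid"))   # ~0.38: a soft version of the same decision
```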
Threshold Functions can make Logic Gates with Neurons!
Logical And: W_a,c = 1, W_b,c = 1, bias W_0,c = 1.5
If ( Σ(w) - W_0,c > 0 ) Then FIRE Else Don't
  A  B  A AND B
  0  0  0
  0  1  0
  1  0  0
  1  1  1

And Gate: Trace (A = Off, B = Off): 0 - 1.5 = -1.5 < 0, output Off
And Gate: Trace (A = On,  B = Off): 1 - 1.5 = -0.5 < 0, output Off
And Gate: Trace (A = Off, B = On):  1 - 1.5 = -0.5 < 0, output Off
And Gate: Trace (A = On,  B = On):  2 - 1.5 =  0.5 > 0, output On

Threshold Functions can make Logic Gates with Neurons!
Logical Or: W_a,c = 1, W_b,c = 1, bias W_0,c = 0.5
If ( Σ(w) - W_0,c > 0 ) Then FIRE Else Don't
  A  B  A OR B
  0  0  0
  0  1  1
  1  0  1
  1  1  1

Or Gate: Trace (A = Off, B = Off): 0 - 0.5 = -0.5 < 0, output Off
Or Gate: Trace (A = On,  B = Off): 1 - 0.5 =  0.5 > 0, output On
Or Gate: Trace (A = Off, B = On):  1 - 0.5 =  0.5 > 0, output On
Or Gate: Trace (A = On,  B = On):  2 - 0.5 =  1.5 > 0, output On

Threshold Functions can make Logic Gates with Neurons!
Logical Not: W_a,c = -1, bias W_0,c = -0.5
If ( Σ(w) - W_0,c > 0 ) Then FIRE Else Don't
  A  NOT A
  0  1
  1  0

Not Gate: Trace (A = Off):  0 - (-0.5) =  0.5 > 0, output On
Not Gate: Trace (A = On):  -1 - (-0.5) = -0.5 < 0, output Off
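The same three gates as code, using the weights and biases from the slides (the helper name threshold_unit is mine):

```python
def threshold_unit(weights, bias):
    """Return a unit that fires (1) if sum(w * x) - bias > 0, else outputs 0."""
    return lambda *inputs: int(sum(w * x for w, x in zip(weights, inputs)) - bias > 0)

AND = threshold_unit([1, 1], 1.5)    # weights and biases from the slides
OR  = threshold_unit([1, 1], 0.5)
NOT = threshold_unit([-1], -0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))   # reproduces the AND / OR traces above
print(NOT(0), NOT(1))                      # 1 0
```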
Feed-Forward vs. Recurrent Networks
Feed-Forward Networks
Layers
[Diagram: an input layer feeding a hidden layer feeding an output layer.]
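To show why the hidden layer earns its keep, here is a tiny feed-forward network of threshold units computing XOR, something no single threshold unit can do. The layer sizes and weights are my own choices, not from the slides:

```python
def layer(weights, biases, inputs):
    """One feed-forward layer of threshold units (signals only flow forward)."""
    return [int(sum(w * x for w, x in zip(ws, inputs)) - b > 0)
            for ws, b in zip(weights, biases)]

def xor(a, b):
    hidden = layer([[1, 1], [-1, -1]], [0.5, -1.5], [a, b])   # an OR unit and a NAND unit
    return layer([[1, 1]], [1.5], hidden)[0]                  # AND of the two hidden outputs

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))    # 0, 1, 1, 0: XOR needs the hidden layer
```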
Perceptron Learning
CENSORED
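The slide's derivation is censored, but the classic perceptron update is short enough to sketch; treat the specifics below (learning rate, epoch count, function name) as illustrative assumptions rather than the slide's own procedure:

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Classic perceptron rule: nudge each weight by (target - output) * input."""
    n = len(data[0][0])
    w, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in data:
            out = int(sum(wi * xi for wi, xi in zip(w, x)) - bias > 0)
            err = target - out
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            bias -= lr * err
    return w, bias

# Learn OR from its truth table; converges to weights/bias implementing OR.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
print(train_perceptron(data))
```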
Hidden Network Learning
CENSORED
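Likewise censored on the slide; as a hedged sketch of how the error signal reaches the hidden layer, here is plain gradient descent with backpropagation on a small 2-2-1 sigmoid network learning XOR (all sizes, rates, and names are my assumptions, and a network this small can occasionally stall in a local minimum):

```python
import math, random

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def train_xor(epochs=20000, lr=0.5, seed=0):
    """Backpropagation on a 2-2-1 sigmoid network; each unit carries a bias weight."""
    rng = random.Random(seed)
    w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # 2 hidden units: inputs + bias
    w_o = [rng.uniform(-1, 1) for _ in range(3)]                      # output unit: hidden + bias
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(epochs):
        for (x1, x2), t in data:
            x = (x1, x2, 1.0)                                         # append bias input
            h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_h]
            hb = h + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(w_o, hb)))
            d_o = (t - y) * y * (1 - y)                               # output delta
            d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]  # hidden deltas
            w_o = [w + lr * d_o * hi for w, hi in zip(w_o, hb)]
            w_h = [[w + lr * d_h[j] * xi for w, xi in zip(w_h[j], x)] for j in range(2)]
    return w_h, w_o

w_h, w_o = train_xor()
def predict(x1, x2):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, (x1, x2, 1.0)))) for ws in w_h]
    return sigmoid(sum(w * hi for w, hi in zip(w_o, h + [1.0])))

# Typically close to [0, 1, 1, 0] after training.
print([round(predict(a, b), 2) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))])
```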
They don’t get it either: Issues that aren’t well understood
How Are Neural Nets Different From My Brain?
“Fraser’s” Rules: “In theory one can, of course, implement biologically realistic neural networks, but this is a mammoth task. All kinds of details have to be gotten right, or you end up with a network that completely decays to unconnectedness, or one that ramps up its connections until it basically has a seizure.”
Frontiers in AI
 
