I tried using Foils.

An example of casually trying out the Foils class.
The subject matter is machine learning and related topics.

  1. Recently: keywords
      • Artificial Intelligence
      • Pattern Recognition
      • Machine Learning
      • Mathematics
      • OCaml, Haskell, ...
      • etc.
  2.
  3. Haskell / OCaml
  4. Haskell
      • Haskell 1.0, Haskell 98, Haskell 2010
      • quicksort in Haskell:

          qsort [] = []
          qsort (p:xs) = qsort lt ++ [p] ++ qsort gteq
            where lt   = [x | x <- xs, x < p]
                  gteq = [x | x <- xs, x >= p]
  5. Haskell
      • the quicksort from slide 4 can be crammed into two lines (lol):

          qs [] = []
          qs (p:xs) = qs [x | x <- xs, x < p] ++ [p] ++ qs [x | x <- xs, x >= p]
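
For reference, either version behaves as expected once loaded into GHCi:

    ghci> qsort [3,1,4,1,5,9,2,6]
    [1,1,2,3,4,5,6,9]
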
  6. Haskell
  7. OCaml
      • (compared with Haskell)
      • unlike Haskell, OCaml has imperative for / while loops
      • Microsoft's F# descends from OCaml
  8. quicksort in OCaml:

          let rec qsort = function
            | [] -> []
            | pivot :: rest ->
                let il x = x < pivot in
                let left, right = List.partition il rest in
                qsort left @ [pivot] @ qsort right
  9. ML
  10.
  11.
  12. Introduction to Machine Learning
      • ML = Machine Learning
      • (definition: see Wikipedia)
  13. Introduction to Machine Learning
      • two broad categories:
        - Supervised learning
        - Unsupervised learning
  14. Introduction to Machine Learning — Supervised learning
      • examples:
        - SVM — Support Vector Machine
        - Regression
        - Back propagation
        - Decision tree
  15. Introduction to Machine Learning — Unsupervised learning
      • examples:
        - Clustering
        - Frequent pattern analysis
  16. Introduction to Machine Learning
      • statistical machine learning
      • computational learning (theory)
  17. Introduction to Machine Learning
  18. Introduction to Machine Learning (1)
  19. Introduction to Machine Learning (2)
  20. Introduction to Machine Learning (3)
  21. Introduction to Machine Learning
  22. Introduction to Machine Learning — PRML
  23. @
  24. SVM — Support Vector Machine
      • two-class classification with labels (+1, −1)
  25. SVM
      • constraint on each training point: $y_k (w \cdot x_k - \theta) \ge 1$
      • maximize margin ≡ minimize $|w|^2$
      • introducing Lagrange multipliers $\alpha_k$ gives the dual objective
        $L = \sum_k \alpha_k - \frac{1}{2} \sum_k \sum_l \alpha_k \alpha_l \, y_k y_l \, (x_k \cdot x_l)$
  26. SVM
      • solve for the $\alpha_k$ that optimize L
      • what about data that is not linearly separable?
        - map the inputs into a feature space: x → ψ(x) (see the note below)
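
The map ψ on slide 26 is the usual kernel trick: the dual objective touches the data only through inner products $x_k \cdot x_l$, so a kernel can be substituted without ever computing ψ explicitly. A standard formulation (not verbatim from the slides):

    K(x, y) = \psi(x) \cdot \psi(y)
    L = \sum_k \alpha_k - \frac{1}{2} \sum_k \sum_l \alpha_k \alpha_l \, y_k y_l \, K(x_k, x_l)

For instance, the polynomial kernel $K(x, y) = (x \cdot y + 1)^d$ corresponds to a ψ whose components are all monomials of degree up to d.
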
  27. Clustering
  28. Clustering
  29. Clustering
      • distance measures between points (a Haskell sketch follows below):
        - squared Euclidean distance: $\|x - y\|^2$
        - Minkowski distance: $d_k(x, y) = \left( \sum_{i=1}^{n} |x_i - y_i|^k \right)^{1/k}$
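
As a quick illustration of $d_k$, a minimal Haskell sketch (assuming points are plain lists of coordinates of equal length; the names are mine, not the slides'):

    -- Minkowski distance of order k between two points.
    minkowski :: Double -> [Double] -> [Double] -> Double
    minkowski k xs ys =
      (sum [abs (x - y) ** k | (x, y) <- zip xs ys]) ** (1 / k)

    -- k = 2 gives the Euclidean distance, k = 1 the Manhattan distance.
    euclidean :: [Double] -> [Double] -> Double
    euclidean = minkowski 2
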
  30.
  31. Categorical attribute values, e.g.:
        - Black, Blonde, Brown, Red, ...
        - Safe, Normal, Danger
  32. Decision tree
      • example training data:

          Height  Color     Eatable
          small   colorful  can't
          middle  colorful  can't
  33. Decision tree
      • how to build a good tree?
      • finding a smallest consistent tree is NP-hard
  34. Decision tree
  35. Decision tree (a minimal classification sketch follows after this item)
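
To make the decision-tree idea concrete, a minimal Haskell sketch; the Mushroom record and the tree itself are hypothetical illustrations consistent with the two rows on slide 32, not the presenter's actual example:

    -- Internal nodes test an attribute; leaves carry the class label.
    data Tree a = Leaf Bool | Node (a -> Bool) (Tree a) (Tree a)

    classify :: Tree a -> a -> Bool
    classify (Leaf l)        _ = l
    classify (Node test t f) x = classify (if test x then t else f) x

    data Mushroom = Mushroom { height :: String, color :: String }

    eatable :: Tree Mushroom
    eatable = Node ((== "colorful") . color)
                   (Leaf False)  -- colorful  -> can't eat (matches the table)
                   (Leaf True)   -- otherwise -> eatable (an assumption)

A real tree would be induced from data, e.g. by ID3/C4.5-style splits on the attribute with the highest information gain.
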
  36. Frequent pattern
      • association rules: if A then B
      • transactions encoded as 1/0 (item present / item absent)
  37. Frequent pattern
      • example: 1 = the transaction contains the item, 0 = it does not

          A B C D E F
          1 1 0 1 0 1
          1 0 1 1 1 1
          0 0 1 0 0 1
          1 0 1 0 1 1
          0 1 0 1 0 1
  38. Frequent pattern
      • itemsets in the table, e.g. {A, C} and {A, C, D}
      • Algorithms:
        - Apriori
        - FP-growth, ZDD-growth
  39. Frequent pattern
      • the set of items $I = \{i_1, i_2, \cdots, i_n\}$; here $I = \{A, B, C, D, E, F\}$
      • an itemset is an $X \in 2^I$
      • a minimum-support threshold $\sigma$
      • supp(X): the number of transactions that contain X
      • X is frequent iff supp(X) ≥ σ
  40. Frequent pattern
      • association rule A → α, where A is a pattern and α ∈ I
        - read A, B → C as: transactions containing A and B also contain C
      • confidence: conf(A → α) ≡ supp(A ∪ {α}) / supp(A)
      • the rule is adopted when conf ≥ θ
        - e.g. if 100 transactions contain A and 10 of those also contain α, conf = 10/100
  41. Frequent pattern
      • mining algorithms (a naive Haskell sketch follows below):
        - Apriori
        - FP-tree and FP-growth
        - ZBDD and ZBDD-growth
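
The definitions on slides 39 and 40 are short enough to run directly against the table on slide 37; a naive Haskell sketch (exhaustive enumeration is exponential in |I|, which is exactly the search that Apriori and FP-growth prune):

    import Data.List (nub, subsequences)

    type Item        = Char
    type Transaction = [Item]

    -- The five transactions from slide 37.
    db :: [Transaction]
    db = ["ABDF", "ACDEF", "CF", "ACEF", "BDF"]

    -- supp(X): number of transactions containing every item of X.
    supp :: [Item] -> Int
    supp xs = length [t | t <- db, all (`elem` t) xs]

    -- X is frequent iff supp(X) >= sigma.
    frequent :: Int -> [[Item]]
    frequent sigma =
      [ xs | xs <- subsequences (nub (concat db))
           , not (null xs), supp xs >= sigma ]

    -- conf(A -> alpha) = supp(A ∪ {alpha}) / supp(A).
    conf :: [Item] -> Item -> Double
    conf a alpha = fromIntegral (supp (alpha : a)) / fromIntegral (supp a)

For example, supp "DF" is 3 and conf "D" 'F' is 1.0: every transaction containing D also contains F.
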
  42. add-up
      • neighboring field: Data Mining
  43. add-up
      • NN
  44.
  45. COLT (Computational Learning Theory)
      • how does it relate to DM (data mining)?
  46. COLT
      • keywords
  47. COLT
  48. Automaton and language
      • automaton ↔ grammar correspondence (the Chomsky hierarchy):
        - type 0: TM (Turing machine)
        - type 2 (context-free): PDA (pushdown automaton)
        - type 3 (regular): FA (finite automaton)
  49. Learning in the limit
      • target language $A = \{(aa)^n \mid n \ge 0\} = \{\varepsilon, aa, aaaa, \cdots\}$
        - A is a language over the alphabet Σ = {a}
      • a learning machine M is shown examples of A and outputs guesses
  50. Learning in the limit
      • deterministic finite automaton (DFA) $D = (Q, \Sigma, \delta, s_0, F)$ (a Haskell sketch follows below)
      • a grammar γ and the language L(γ) it defines
      • target: $A = \{(aa)^n \mid n \ge 0\} = \{\varepsilon, aa, aaaa, \cdots\}$
      • examples of A are pairs E = (w, l) with w ∈ Σ*, l ∈ {1, 0}
      • E = (w, l) states whether w ∈ A: l = 1 (positive) or l = 0 (negative)
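
The tuple $D = (Q, \Sigma, \delta, s_0, F)$ translates directly into Haskell; below, a sketch with the DFA for $A = \{(aa)^n \mid n \ge 0\}$, where Q and Σ live implicitly in the types and F is a predicate (the names are mine, not the slides'):

    -- A DFA: transition function, start state, and accepting-state test.
    data DFA q s = DFA
      { delta     :: q -> s -> q
      , start     :: q
      , accepting :: q -> Bool
      }

    accepts :: DFA q s -> [s] -> Bool
    accepts d = accepting d . foldl (delta d) (start d)

    -- Two states tracking the parity of a's read: 0 = even (accept), 1 = odd.
    -- Over the one-letter alphabet {a}, delta can ignore the input symbol.
    evenAs :: DFA Int Char
    evenAs = DFA { delta = \q _ -> 1 - q, start = 0, accepting = (== 0) }

    -- accepts evenAs ""    == True
    -- accepts evenAs "aa"  == True
    -- accepts evenAs "aaa" == False
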
  51. Learning in the limit
      • a presentation of A is an infinite sequence σ:
        - σ = (w_1, l_1) (w_2, l_2) (w_3, l_3) (w_4, l_4) ···
        - ∀i, w_i ∈ Σ*
        - if w_i ∈ A then l_i = 1 (and l_i = 0 otherwise)
      • example:
        - A = {(aa)^n | n ≥ 0} = {ε, aa, aaaa, ···}
        - σ = (ε, 1) (a, 0) (aa, 1) (aaa, 0) (aaaa, 1) ···
  52. Learning in the limit
      • a target language L ∈ C, where C is a class of languages over Σ
      • a hypothesis space R of grammars γ
      • M identifies L in the limit if, on every presentation σ of L, its guesses
        g_1, g_2, ··· (each g_i a grammar in R) eventually converge to a fixed g with L(g) = L
  53. Learning in the limit
      • how hard is this computationally? (class P? NP-complete? NP-hard?)
      • (an identification-by-enumeration sketch follows below)
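
One classical way to identify a language in the limit is identification by enumeration: after each example, output the first hypothesis in a fixed enumeration that is consistent with everything seen so far. A Haskell sketch over a tiny hypothetical hypothesis list (a real learner would enumerate the grammars of the class C):

    import Data.List (inits)

    type Example = (String, Bool)

    -- Hypotheses as membership predicates; "even length" recognizes
    -- A = {(aa)^n | n >= 0} over the one-letter alphabet {a}.
    hyps :: [(String, String -> Bool)]
    hyps = [ ("empty only",  null)
           , ("even length", even . length)
           , ("all strings", const True)
           ]

    consistent :: (String -> Bool) -> [Example] -> Bool
    consistent h = all (\(w, l) -> h w == l)

    -- The guess after each prefix of the presentation. Once the data has
    -- ruled out every earlier hypothesis, the guess never changes again.
    guesses :: [Example] -> [String]
    guesses examples =
      [ name | seen <- tail (inits examples)
             , (name, _) <- take 1 [ p | p@(_, h) <- hyps, consistent h seen ] ]

On the presentation from slide 51, guesses [("", True), ("a", False), ("aa", True), ("aaa", False)] converges to "even length".
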
  54. PAC — probably approximately correct
      • an accuracy parameter ε and a confidence parameter δ
      • a target f, a hypothesis g, and a distance d(f, g) between them
      • g PAC-learns f if, for all ε and δ:
        $\Pr(d(f, g) \le \varepsilon) \ge 1 - \delta$
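
Not on the slide, but a standard consequence of this definition: for a finite hypothesis class H, any learner that outputs a hypothesis consistent with its sample is a PAC learner once the sample size m satisfies

    m \ge \frac{1}{\varepsilon} \left( \ln |H| + \ln \frac{1}{\delta} \right)

since then, with probability at least 1 − δ, no hypothesis with error greater than ε survives as consistent with the sample.
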
  55. appendix
      • etc.
  56. PAC — Weak PAC learning
      • (strong) PAC demands success for every ε, δ
      • weak PAC only demands some fixed $\varepsilon_0, \delta_0$ with
        $\Pr(d(f, g) \le \varepsilon_0) \ge 1 - \delta_0$
      • is weak PAC really weaker than PAC?
        - it fixes ε, δ rather than quantifying over them
  57. Boosting
      • start from a weak learner M_0
      • combine runs of M_0 into a stronger learner M_w
      • a weak PAC learner can be boosted into a (strong) PAC learner
  58. Boosting
      • the best-known boosting algorithm: AdaBoost (a one-step sketch follows below)
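
To make the weak-to-strong direction concrete, one round of AdaBoost-style reweighting in Haskell (a sketch of a single step, not the full algorithm; adaboostStep is my name):

    -- Misclassified examples get heavier weights, so the next weak
    -- hypothesis is forced to concentrate on them.
    adaboostStep
      :: [(x, Bool)]        -- labelled training examples
      -> [Double]           -- current example weights (summing to 1)
      -> (x -> Bool)        -- weak hypothesis h from this round
      -> ([Double], Double) -- updated weights and the vote alpha for h
    adaboostStep examples ws h = (normalize ws', alpha)
      where
        err   = sum [w | ((x, y), w) <- zip examples ws, h x /= y]
        alpha = 0.5 * log ((1 - err) / err)  -- assumes 0 < err < 1/2
        ws'   = [ w * exp (if h x /= y then alpha else negate alpha)
                | ((x, y), w) <- zip examples ws ]
        normalize vs = map (/ sum vs) vs

The final strong hypothesis is then a weighted vote, the sign of Σ_t alpha_t · h_t(x), over the weak hypotheses collected across rounds.
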
