Recent Keywords

                              Keywords

•   Artificial Intelligence

•   Pattern Recognition

•   Machine Learning

•   Mathematics

•   Functional languages (OCaml, Haskell, ...)

• etc.

                                                       1
2
Haskell


•
                  OCaml
    1

•

•



                      3
•

• Haskell 1.0 → Haskell 98 → Haskell 2010

• Example: quicksort

    qsort []     = []
    qsort (p:xs) = qsort lt ++ [p] ++ qsort gteq
                      where
                        lt   = [x | x <- xs, x < p]
                        gteq = [x | x <- xs, x >= p]
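
    As a quick check, the qsort above can be dropped into a minimal module and run;
    the Main wrapper and the test list are my additions, not part of the slide.

    -- Minimal harness for the qsort above (assumed wrapper, not from the slide).
    module Main where

    qsort :: Ord a => [a] -> [a]
    qsort []     = []
    qsort (p:xs) = qsort lt ++ [p] ++ qsort gteq
      where
        lt   = [x | x <- xs, x < p]
        gteq = [x | x <- xs, x >= p]

    main :: IO ()
    main = print (qsort [3, 1, 4, 1, 5, 9, 2, 6])   -- prints [1,1,2,3,4,5,6,9]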

                                                       4
•        The quicksort above takes 4 lines

•        It can be squeezed into 2 lines:

    qs []     = []
    qs (p:xs) = qs [x|x<-xs,x<p]++[p]++qs[x|x<-xs,x>=p]



                                   (lol)


                                                          5
•


•
    Haskell             ( )


              Haskell



                              6
OCaml

• Haskell

•                             Haskell
    Haskell

• OCaml has for and while loops
  (Haskell has no built-in for or while)


          (OCaml is the basis of Microsoft's F#)

                                         7
(* Quicksort in OCaml, for comparison with the Haskell version. *)
let rec qsort = function
  | [] -> []
  | pivot :: rest ->
      (* il x: is x less than the pivot? *)
      let il x = x < pivot in
      let left, right = List.partition il rest in
      qsort left @ [pivot] @ qsort right





                                                    8
ML

•

•




         9
10
11
Introduction to Machine Learning




• ML=Machine Learning

•

•

• wikipedia




                                                     12
Introduction to Machine Learning




•




•       Two broad categories
    – Supervised learning
    – Unsupervised learning




                                                                   13
Introduction to Machine Learning


                        Supervised learning

•   Learning from examples whose correct outputs (labels) are given

•   Typical methods:
    – SVM (Support Vector Machine)
    – Regression
    – Back propagation
    – Decision trees




                                                                      14
Introduction to Machine Learning


            Unsupervised learning

•   Learning from examples without given labels

•   Typical methods:
    – Clustering
    – Frequent pattern analysis




                                                               15
Introduction to Machine Learning




•         Statistical machine learning
    –
    –

•       Computational learning
    –
    –



                                                                        16
Introduction to Machine Learning




•
    –
    –
    –     &

•
    –
    –   (?)


                                           17
Introduction to Machine Learning


                (1)

•
    –

•
    –   =

•
    –       /


                                                   18
Introduction to Machine Learning


        (2)

•
    –
    –
    –

•
    –   ?
    –


                                           19
Introduction to Machine Learning


               (3)

•
    –    ? —
    –   —
    –
    –

•




                                                  20
Introduction to Machine Learning




•
    – symbol

•              (?)
    –
    –




                                                  21
Introduction to Machine Learning

PRML




                                    22
@




    23
SVM

    SVM — Support Vector Machine

•   Binary classification: separate examples into two classes (+1, -1)




•       ( )

                                    24
SVM




•   Find the separating hyperplane w · x = θ with the largest margin

•   Constraint on each training example (x_k, y_k):

                         y_k (w · x_k − θ) ≥ 1

•   Maximize margin  ≡  Minimize |w|²

•   Introducing Lagrange multipliers α_k gives the dual objective L:

               L = Σ_k α_k − (1/2) Σ_k Σ_l α_k α_l y_k y_l (x_k · x_l)



                                                               25
SVM




•           Optimize L over the α_k (a quadratic programming problem)

•

•   What if the data are not linearly separable?
    – Map each input x into a higher-dimensional feature space

                           x → ψ(x)

    – Only inner products appear in L, so ψ never has to be computed
      explicitly (the kernel trick); see the sketch below
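
    Because only the inner products x_k · x_l appear in L, the map ψ never has to
    be materialized: it is enough to supply a kernel K(x, y) = ψ(x) · ψ(y).
    A minimal Haskell sketch of that idea; the kernel choices and the classify
    signature are illustrative assumptions, not taken from the slides.

    -- Kernel trick sketch: replace the inner product x · y in the dual problem
    -- with a kernel K(x, y) = psi(x) · psi(y), without building psi explicitly.
    type Vec = [Double]

    dot :: Vec -> Vec -> Double
    dot x y = sum (zipWith (*) x y)

    -- Linear kernel: the plain inner product used in the basic SVM dual.
    linearK :: Vec -> Vec -> Double
    linearK = dot

    -- Polynomial kernel of degree d: the feature map of all monomials up to
    -- degree d, which we never construct.
    polyK :: Int -> Vec -> Vec -> Double
    polyK d x y = (1 + dot x y) ^ d

    -- Gaussian (RBF) kernel with width parameter gamma.
    rbfK :: Double -> Vec -> Vec -> Double
    rbfK gamma x y = exp (negate gamma * sum (map (^ 2) (zipWith (-) x y)))

    -- A trained SVM classifies by the sign of sum_k alpha_k y_k K(x_k, x) - theta.
    classify :: (Vec -> Vec -> Double) -> [(Double, Double, Vec)] -> Double -> Vec -> Double
    classify k svs theta x =
      signum (sum [a * y * k xk x | (a, y, xk) <- svs] - theta)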


                                           26
clustering


        — Clustering

•   &




•   (

                              27
clustering




•   =




•       (   )



                       28
clustering




•   Similarity between points is measured by a distance
    – Squared Euclidean distance:

                 ||x − y||²

    – Minkowski distance (both are sketched in Haskell below):

        d_k(x, y) = ( Σ_{i=1}^{n} |x_i − y_i|^k )^{1/k}

    –
    –
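
    A small Haskell sketch of these two distances, together with the
    nearest-centroid assignment step that clustering algorithms such as k-means
    build on (the assignment step is my illustration; the slide only defines
    the distances).

    -- Distance measures for clustering, plus a nearest-centroid assignment step.
    import Data.List (minimumBy)
    import Data.Ord (comparing)

    type Point = [Double]

    -- Squared Euclidean distance ||x - y||^2.
    sqEuclid :: Point -> Point -> Double
    sqEuclid x y = sum [(xi - yi) ^ 2 | (xi, yi) <- zip x y]

    -- Minkowski distance d_k(x, y) = (sum_i |x_i - y_i|^k)^(1/k).
    minkowski :: Double -> Point -> Point -> Double
    minkowski k x y = (sum [abs (xi - yi) ** k | (xi, yi) <- zip x y]) ** (1 / k)

    -- Assign a point to the index of its nearest centroid under a given distance.
    assign :: (Point -> Point -> Double) -> [Point] -> Point -> Int
    assign dist centroids p =
      fst (minimumBy (comparing snd) (zip [0 ..] (map (dist p) centroids)))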



                                                    29
30
•   Attributes may be categorical rather than numeric
    – e.g. Black, Blonde, Brown, Red, ...
    – e.g. Safe, Normal, Danger

•

•




                                           31
decision tree


         Decision tree

•

•

•

    Height   Color      Eatable
    small    colorful   can’t
    middle   colorful   can’t
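
    A tiny Haskell sketch of what a decision tree looks like as a data structure,
    with a hand-built tree consistent with the two table rows above. Reading the
    table as describing mushrooms, and the concrete attribute types and example
    tree, are my illustrative assumptions.

    -- A minimal decision-tree data type and classifier.
    data Height  = Small | Middle | Tall  deriving (Eq, Show)
    data Color   = Colorful | Plain       deriving (Eq, Show)
    data Eatable = Can | Can't            deriving (Eq, Show)

    data Mushroom = Mushroom { height :: Height, color :: Color }

    -- Internal nodes test one attribute; leaves carry the predicted class.
    data Tree a = Leaf Eatable
                | Node (a -> Bool) (Tree a) (Tree a)   -- test, then-branch, else-branch

    classify :: Tree a -> a -> Eatable
    classify (Leaf c)           _ = c
    classify (Node test yes no) x = classify (if test x then yes else no) x

    -- A hand-built tree consistent with the table: colorful mushrooms can't be eaten.
    exampleTree :: Tree Mushroom
    exampleTree = Node ((== Colorful) . color) (Leaf Can't) (Leaf Can)

    -- classify exampleTree (Mushroom Small Colorful)  ==>  Can't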


                                      32
decision tree




•

•

•           ?

•   —

•       Finding the optimal (smallest) tree? — NP-hard




                           33
decision tree




•




•   ?
                  34
decision tree




•   ( )

•

•         ?

•             ?

•




                            35
frequent pattern




•

•                Co-occurrence rules such as "customers who buy A also buy B"

•   Data is a 0/1 table (1 = present / 0 = absent)




                                       36
frequent pattern




•

•   Each row is a transaction; item present = 1, absent = 0

                   A B C D E F
              t1   1 1 0 1 0 1
              t2   1 0 1 1 1 1
              t3   0 0 1 0 0 1
              t4   1 0 1 0 1 1
              t5   0 1 0 1 0 1


                                              37
frequent pattern




•
    –         (           )

•
    – A           C
    – A,C,D

• Algorithm
    – Apriori
    – FP-growth, ZDD-growth

                                           38
frequent pattern




•   Item set  I = {i1, i2, · · · , in}
    – here I = {A, B, C, D, E, F}

•   A pattern (itemset) is a subset  X ∈ 2^I

•   A minimum support threshold σ is given

•   Support  supp(X)
    – supp(X) = the number of transactions that contain X
    – X is frequent iff supp(X) ≥ σ   (a brute-force sketch follows below)
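
    A brute-force Haskell sketch of these definitions over the 0/1 table above:
    enumerate every itemset and keep those with supp(X) ≥ σ. This is exponential
    and only meant to pin the definitions down; Apriori and FP-growth exist
    precisely to avoid it.

    -- Naive frequent-itemset mining over the example table (definitions only).
    import Data.List (subsequences)

    type Item = Char

    items :: [Item]
    items = "ABCDEF"

    -- The five transactions of the table, written as the sets of present items.
    transactions :: [[Item]]
    transactions = ["ABDF", "ACDEF", "CF", "ACEF", "BDF"]

    -- supp(X): number of transactions containing every item of X.
    supp :: [Item] -> Int
    supp xs = length [t | t <- transactions, all (`elem` t) xs]

    -- All frequent itemsets, i.e. non-empty X with supp(X) >= sigma.
    frequent :: Int -> [[Item]]
    frequent sigma = [xs | xs <- subsequences items, not (null xs), supp xs >= sigma]

    -- e.g. frequent 3 yields A, C, D, F, AF, CF, DF for this table.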

                                                               39
frequent pattern




•   Association rule  A → α,  where A is a pattern and α ∈ I
    – e.g. {A, B} → C
    – transactions that contain A and B tend to also contain C

•   The strength of a rule is its confidence, conf

•   conf(A → α) ≡ supp(A ∪ {α}) / supp(A)


•           Report the rules whose conf is at least a threshold θ
    – e.g. if 100 transactions contain A and 10 of them also contain α,
      then conf(A → α) = 0.1
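
    Confidence is then just a ratio of two support counts. A generic Haskell
    helper, written so it can take the supp function from the previous sketch
    (the names are mine):

    -- conf(A -> alpha) = supp(A ∪ {alpha}) / supp(A), for any support function.
    confidence :: ([item] -> Int) -> [item] -> item -> Double
    confidence supp a alpha =
      fromIntegral (supp (alpha : a)) / fromIntegral (supp a)

    -- With the table above:  confidence supp "AC" 'F'  ==  1.0
    -- (both transactions that contain A and C also contain F).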

                                                           40
frequent pattern




•
    – Apriori
    – FP tree and FP-growth
    – ZBDD and ZBDD-growth

•
    –




                                           41
add-up




•

•

•
    –                         etc.

•
    –         — Data Mining
    –   (?)

                                     42
add-up




•
    –
    –
    –

•            (?)
    –              (   )
    –   NN
    –


                           43
44
COLT




•      —

•      —

•




• DM       ?


                 45
COLT




•

•
    –
    –
    –
    –            (   )

•
    – keyword,

                           46
COLT




•
    –
    –

•

•       (   )

•



                  47
automaton and language




• automaton

• grammar

•

• Correspondence between automata and grammars
    – Type 0 (phrase structure)   ↔  TM (Turing machine)
    – Type 2 (context-free)       ↔  PDA (pushdown automaton)
    – Type 3 (regular)            ↔  FA (finite automaton)

                                                              48
learning in the limit




•

•   Example target language  A = {(aa)^n | n ≥ 0} = {ε, aa, aaaa, · · ·}
    – A is a language over the alphabet Σ = {a}

•           M(
    –         (
    –
    – M

                                                                                  49
learning in the limit




•   Deterministic finite automaton (DFA)  D = (Q, Σ, δ, s0, F)

•   A representation γ (here: a DFA)

•   The language represented by γ is written L(γ)

•   Target (regular) language  A = {(aa)^n | n ≥ 0} = {ε, aa, aaaa, · · ·}

•   A labelled example of A is  E = (w, l),  w ∈ Σ∗,  l ∈ {1, 0}

•   In E = (w, l),  l = 1  iff  w ∈ A  (a positive example)
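
    These objects are easy to write down concretely. A Haskell sketch with a DFA
    type, its acceptance function, the two-state machine whose language is
    exactly A = {(aa)^n | n ≥ 0}, and the labelling that produces examples
    (w, l); the encoding is my choice, the definitions are from the slide.

    -- D = (Q, Sigma, delta, s0, F), encoded as a record.
    import Data.List (foldl')

    data DFA q c = DFA
      { states :: [q]
      , sigma  :: [c]
      , delta  :: q -> c -> q
      , start  :: q
      , finals :: [q]
      }

    -- w is in L(D) iff running delta from the start state over w ends in a final state.
    accepts :: Eq q => DFA q c -> [c] -> Bool
    accepts d w = foldl' (delta d) (start d) w `elem` finals d

    -- Two-state DFA for A = {(aa)^n | n >= 0} over Sigma = {a}.
    evenAs :: DFA Int Char
    evenAs = DFA { states = [0, 1], sigma = "a", delta = step, start = 0, finals = [0] }
      where step q 'a' = 1 - q
            step q _   = q

    -- A labelled example E = (w, l): l = 1 iff w is in A.
    label :: String -> (String, Int)
    label w = (w, if accepts evenAs w then 1 else 0)

    -- map label ["", "a", "aa", "aaa", "aaaa"]
    --   ==> [("",1),("a",0),("aa",1),("aaa",0),("aaaa",1)]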

                                                                                50
learning in the limit




•   A (complete) presentation of A is an infinite sequence σ of labelled examples
    – σ = (w1, l1) (w2, l2) (w3, l3) (w4, l4) · · ·
    – ∀i, wi ∈ Σ∗
    – if wi ∈ A then li = 1

•   Example
    – A = {(aa)^n | n ≥ 0} = {ε, aa, aaaa, · · ·}
    – σ = (ε, 1) (a, 0) (aa, 1) (aaa, 0) (aaaa, 1) · · ·



                                                                            51
learning in the limit




•   Target: a language L ∈ C, where C is a class of languages over Σ

•   Hypothesis space: a class R of representations γ

•   A learner M identifies L in the limit using R if:
    whenever M is fed a presentation σ of L, it outputs a sequence of guesses
    g1, g2, · · · (each gi a representation in R), and from some point on the
    sequence {gi} stays at a fixed g with L(g) = L
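
    One classical way to satisfy this definition is identification by
    enumeration: fix an enumeration g1, g2, · · · of the representations and,
    after each new example of σ, output the first gi that is consistent with
    everything seen so far. A Haskell sketch in which hypotheses are modelled
    simply as membership predicates (that modelling, and the toy enumeration,
    are my assumptions).

    -- Identification by enumeration: after each labelled example, guess the first
    -- hypothesis in a fixed enumeration consistent with all examples seen so far.
    type Hypothesis = String -> Bool       -- membership predicate for a language

    consistent :: Hypothesis -> [(String, Int)] -> Bool
    consistent g seen = all ok seen
      where ok (w, l) = g w == (l == 1)

    -- Guesses (as indices into the enumeration) after 1, 2, 3, ... examples of sigma.
    -- (Assumes some hypothesis in the enumeration is consistent at every step.)
    guesses :: [Hypothesis] -> [(String, Int)] -> [Int]
    guesses gs sigma =
      [ head [i | (i, g) <- zip [1 ..] gs, consistent g seen]
      | n <- [1 .. length sigma]
      , let seen = take n sigma ]

    -- Toy enumeration: all strings / even number of a's / odd number of a's.
    enumeration :: [Hypothesis]
    enumeration = [const True, even . length, odd . length]

    -- guesses enumeration [("", 1), ("a", 0), ("aa", 1)]  ==>  [1, 2, 2]
    -- i.e. the guesses converge to hypothesis 2, which is exactly A = {(aa)^n}.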



                                                                                                   52
learning in the limit




•
    –

•
    –   σ
    –
    –       (class P? NP complete? NP hard?)

•
    –

                                                             53
Probably Approximately Correct Learning


    PAC          — probably approximately correct

•   Allow a small error ε and a small failure probability δ

•

•   d(f, g): distance (error) between the target f and the hypothesis g

•          PAC learning
    ∀ε, δ:  if the learner outputs a g with  Pr(d(f, g) ≤ ε) ≥ 1 − δ,
    then f is PAC learnable

                                                                                 54
appendix




•           (?)
    –
    –
    –
    –
    –
    –
    – etc


                       55
Weak Probably Approximately Correct Learning


               PAC          — Weak PAC learning

• Instead of ∀ε, δ, fix some particular ε0, δ0

•          Weak PAC learning
    ∃ε0, δ0:  if the learner outputs a g with  Pr(d(f, g) ≤ ε0) ≥ 1 − δ0,
    then f is weakly PAC learnable

•
    –                   ?
    – ,δ

                                                                                         56
boosting

                  Boosting

•

•      Start from a weak learner M0

• M0 is combined (many times) into a learner Mw

• Mw becomes a strong learner

•      Weak learnability (weak PAC)  ⇒  strong learnability (PAC);
       an AdaBoost-style sketch follows below
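
    The best-known construction behind this weak-to-strong step is AdaBoost:
    train weak hypotheses one at a time, reweighting the examples that the
    previous ones got wrong, and combine them by a weighted vote. A compact
    Haskell sketch; using 1-D threshold stumps as the weak learner M0 is my
    choice for illustration.

    -- AdaBoost sketch: combine weak hypotheses (1-D threshold stumps) into a
    -- strong weighted-majority classifier.
    import Data.List (minimumBy)
    import Data.Ord (comparing)

    type Example = (Double, Double)      -- (feature x, label y in {-1, +1})
    type Hyp = Double -> Double          -- hypothesis: x -> {-1, +1}

    -- Weak learner: the threshold/sign stump with the lowest weighted error.
    trainStump :: [(Double, Example)] -> (Hyp, Double)
    trainStump wex = minimumBy (comparing snd)
      [ (h, err h) | t <- map (fst . snd) wex
                   , s <- [1, -1]
                   , let h x = if x < t then s else negate s ]
      where err h = sum [w | (w, (x, y)) <- wex, h x /= y]

    -- One round: train a stump, give it vote weight alpha, reweight the examples
    -- so that the ones it misclassified become heavier.
    boostRound :: [(Double, Example)] -> ((Double, Hyp), [(Double, Example)])
    boostRound wex = ((alpha, h), normalize reweighted)
      where
        (h, e)     = trainStump wex
        alpha      = 0.5 * log ((1 - e) / max e 1e-10)
        reweighted = [ (w * exp (negate alpha * y * h x), (x, y)) | (w, (x, y)) <- wex ]
        normalize ws = let z = sum (map fst ws) in [ (w / z, ex) | (w, ex) <- ws ]

    -- T rounds; the strong classifier is sign(sum_t alpha_t * h_t(x)).
    adaBoost :: Int -> [Example] -> Hyp
    adaBoost t examples = \x -> signum (sum [ a * h x | (a, h) <- rounds ])
      where
        w0       = [ (1 / fromIntegral (length examples), ex) | ex <- examples ]
        rounds   = go t w0
        go 0 _   = []
        go n wex = let (ah, wex') = boostRound wex in ah : go (n - 1) wex'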


                                        57
boosting




•
    – Boosting
    –
    –

•                ?

• Boosting           AdaBoost   (?)




                                           58

I tried using Foils.
