Machine Learning 1
Lecture 6.3 - Supervised Learning
Classification - Discriminative Models
Erik Bekkers
(Bishop 4.0, 4.1.1, 4.1.2)
Image credit: Kirillm | Getty Images
Slide credits: Patrick Forré and Rianne van den Berg
Discriminant Functions: Two Classes

‣ Input: x ∈ ℝᴰ
‣ Targets: t ∈ {C₁, C₂}
‣ Discriminant functions: a direct mapping of the input to the target.
‣ Generalized Linear Models (GLM):
  y(x, w̃) = f(w̃ᵀφ(x)),
  with basis functions φ(x) = (φ₀(x), φ₁(x), ..., φ_{M−1}(x))ᵀ and f a (generally nonlinear) activation function applied to a linear function of the features.
‣ Decision boundary: y(x, w̃) = const, which for monotonic f is equivalent to the linear constraint w̃ᵀφ(x) = const.
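The GLM discriminant above can be sketched in a few lines of Python. The logistic activation, the identity-plus-bias basis, and the weight values below are illustrative assumptions, not choices fixed by the slides.

```python
import numpy as np

def sigmoid(a):
    """Logistic activation f(a) = 1 / (1 + exp(-a))."""
    return 1.0 / (1.0 + np.exp(-a))

def phi(x):
    """Illustrative basis: phi_0(x) = 1 (bias), phi_j(x) = x_j."""
    return np.concatenate(([1.0], x))

def y(x, w_tilde, f=sigmoid):
    """GLM discriminant y(x, w~) = f(w~^T phi(x))."""
    return f(w_tilde @ phi(x))

w_tilde = np.array([-1.0, 2.0, 0.5])   # (w_0, w_1, w_2), arbitrary example
x = np.array([0.5, 1.0])
print(y(x, w_tilde))                   # > 0.5 -> assign C1, else C2
```

With a monotonic f such as the sigmoid, thresholding y at 0.5 is the same as thresholding the linear score w̃ᵀφ(x) at 0, which is why the decision boundary stays linear in feature space.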
Discriminant Functions: Two Classes

‣ Simplest discriminant function: take the canonical basis φ(x) = x, i.e. φ₀(x) = 1 and φⱼ(x) = xⱼ for j = 1, ..., D, with the identity activation:
  y(x, w̃) = wᵀx + w₀
‣ Decision boundary: y(x, w) = 0, i.e. wᵀx + w₀ = 0.
‣ Consider two datapoints x_A and x_B on the decision boundary:
  y(x_A) = y(x_B) = 0  ⇒  wᵀ(x_A − x_B) = 0,
  so w is orthogonal to every vector lying within the boundary: w determines the orientation of the decision boundary.

Figure: 2-class linear discriminant in two dimensions (Bishop 4.1), showing w, x⊥, the boundary y = 0, the regions R₁ (y > 0) and R₂ (y < 0), and the offset −w₀/‖w‖.
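A small numeric check of this orthogonality argument, with made-up weights w and w₀ chosen for illustration:

```python
import numpy as np

w = np.array([2.0, 1.0])   # illustrative example weights
w0 = -3.0

def y(x):
    """Linear discriminant y(x) = w^T x + w0."""
    return w @ x + w0

# Two points constructed to lie on the boundary y(x) = 0:
# pick x_1 freely, solve 2*x_1 + x_2 - 3 = 0 for x_2.
xA = np.array([0.0, 3.0])
xB = np.array([1.0, 1.0])
assert abs(y(xA)) < 1e-12 and abs(y(xB)) < 1e-12

# w is orthogonal to any vector lying in the boundary:
print(w @ (xA - xB))   # -> 0.0
```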
Discriminant Functions: Two Classes

‣ Take x′ a point on the decision surface, y(x′) = 0, so wᵀx′ = −w₀.
‣ Normal (signed) distance d from the origin to the decision surface:
  d = wᵀx′/‖w‖ = −w₀/‖w‖,
  so w₀ shifts the boundary away from the origin.
‣ Normal (signed) distance r from a general point x to the decision surface: write x = x⊥ + r w/‖w‖, where x⊥ is the orthogonal projection of x onto the surface, y(x⊥) = 0. Then
  y(x) = wᵀx + w₀ = wᵀx⊥ + w₀ + r wᵀw/‖w‖ = r‖w‖,
  ⇒ r = y(x)/‖w‖,
  so y(x) determines the signed distance to the surface.

Figure: 2-class decision boundary (Bishop 4.1).
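Both distance formulas can be verified numerically; the weights below are an illustrative choice (with ‖w‖ = 5), not taken from the slides:

```python
import numpy as np

w = np.array([3.0, 4.0])   # illustrative weights, ||w|| = 5
w0 = -10.0

def y(x):
    """Linear discriminant y(x) = w^T x + w0."""
    return w @ x + w0

norm_w = np.linalg.norm(w)

# Signed distance from the origin to the surface y(x) = 0:
d = -w0 / norm_w           # -> 2.0

# Signed distance from a general point x to the surface:
x = np.array([2.0, 2.0])
r = y(x) / norm_w          # y(x) = 6 + 8 - 10 = 4, so r = 0.8

# Consistency check: projecting x back by r along w/||w||
# must land on the surface.
x_perp = x - r * w / norm_w
print(d, r, y(x_perp))     # y(x_perp) should be ~0
```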
Discriminant Functions: Multiple Classes

‣ K-class discriminant: one linear function per class,
  y_k(x) = w_kᵀx + w_{k0}
‣ Assign x to C_k if y_k(x) > y_j(x) for all j ≠ k.
‣ Decision boundary between R_k and R_j: y_k(x) = y_j(x), i.e.
  w_kᵀx + w_{k0} = w_jᵀx + w_{j0},
  so the decision boundaries are again linear.
‣ Decision regions (for this GLM) are convex: if x_A and x_B both lie inside the same decision region R_k, then by linearity every point x̂ on the line connecting them also satisfies y_k(x̂) > y_j(x̂), so x̂ must lie in R_k as well.

Figure: decision regions for a multiclass linear discriminant (Bishop 4.1), with regions R_i, R_j, R_k and points x_A, x_B, x̂.
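A minimal sketch of the K-class rule, with illustrative weight vectors; `classify` simply takes the argmax of the K linear scores:

```python
import numpy as np

# Rows of W are the class weight vectors w_k; b holds the biases w_k0.
# Values are illustrative, not from the slides.
W = np.array([[ 1.0,  0.0],
              [ 0.0,  1.0],
              [-1.0, -1.0]])
b = np.array([0.0, 0.0, 0.5])

def classify(x):
    """Assign x to the class k with the largest y_k(x) = w_k^T x + w_k0."""
    return int(np.argmax(W @ x + b))

print(classify(np.array([2.0, 0.0])))   # -> 0
print(classify(np.array([0.0, 2.0])))   # -> 1

# Convexity of decision regions: if xA and xB get the same class,
# so does any convex combination of them.
xA, xB = np.array([3.0, 1.0]), np.array([2.0, 0.0])
assert classify(xA) == classify(xB) == classify(0.5 * xA + 0.5 * xB)
```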
