Dynamic Score Combination:
a supervised and unsupervised score combination method

R. Tronci, G. Giacinto, F. Roli
In two-class score-based problems, the combination of scores from an ensemble of experts is generally used to obtain distributions for positive and negative patterns that exhibit a larger degree of separation than those of the scores to be combined. Typically, combination is carried out by a "static" linear combination of scores, whose weights are computed by maximising a performance function. These weights are equal for all the patterns, as one weight is assigned to each of the experts to be combined. In this paper we propose a "dynamic" formulation where the weights are computed individually for each pattern. Results reported on a biometric dataset show the effectiveness of the proposed combination methodology with respect to "static" linear combinations and trained combination rules.

  1. Dynamic Score Combination: a supervised and unsupervised score combination method
     R. Tronci, G. Giacinto, F. Roli
     DIEE - University of Cagliari, Italy
     Pattern Recognition and Applications Group, http://prag.diee.unica.it
     MLDM 2009 - Leipzig, July 23-25, 2009
  2. Outline
     • Goal of score combination mechanisms
     • Dynamic Score Combination
     • Experimental evaluation
     • Conclusions
     Giorgio Giacinto, MLDM 2009 - July 23-25, 2009
  3. Behavior of biometric experts
     Genuine scores should produce a positive outcome; impostor scores should produce a negative outcome.
     $\mathrm{FNMR}_j(th) = \int_{-\infty}^{th} p(s_j \mid s_j \in \text{positive})\,ds_j = P(s_j \le th \mid s_j \in \text{positive})$
     $\mathrm{FMR}_j(th) = \int_{th}^{+\infty} p(s_j \mid s_j \in \text{negative})\,ds_j = P(s_j > th \mid s_j \in \text{negative})$
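Concretely, both error rates can be estimated from finite samples of genuine and impostor scores as simple counting fractions. A minimal Python sketch (function names and toy score values are illustrative, not from the paper):

```python
# Empirical FNMR/FMR at a threshold, following the definitions above.
# FNMR(th): fraction of genuine (positive) scores at or below th.
# FMR(th):  fraction of impostor (negative) scores above th.

def fnmr(genuine_scores, th):
    return sum(s <= th for s in genuine_scores) / len(genuine_scores)

def fmr(impostor_scores, th):
    return sum(s > th for s in impostor_scores) / len(impostor_scores)

# Toy scores: genuine scores tend to be high, impostor scores low.
genuine = [0.9, 0.8, 0.85, 0.6, 0.95]
impostor = [0.1, 0.3, 0.2, 0.7, 0.15]

print(fnmr(genuine, 0.5))  # 0.0 -> no genuine score rejected at th = 0.5
print(fmr(impostor, 0.5))  # 0.2 -> one impostor score in five accepted
```

Sweeping `th` over the score range traces the trade-off between the two rates that the ROC analysis on the next slide summarizes.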
  4. Performance assessment
     • True Positive Rate = 1 - FNMR
  5. Goal of score combination
     • To improve system reliability, different experts are combined
       • different sensors, different features, different matching algorithms
     • Combination is typically performed at the matching score level
  6. Goal of score combination
     (figure; label: "Combined score")
  7. Goal of score combination
     • The aim is to maximize the separation between the classes, e.g.
       $FD = \dfrac{(\mu_{gen} - \mu_{imp})^2}{\sigma_{gen}^2 + \sigma_{imp}^2}$
     • Thus the distributions have to be shifted far apart, and the spread of the scores reduced
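The Fisher distance above is straightforward to compute from genuine and impostor score samples; a short sketch with population statistics (the toy scores below are illustrative only):

```python
# Fisher distance between genuine and impostor score distributions,
# as defined on the slide: FD = (mu_gen - mu_imp)^2 / (var_gen + var_imp).
from statistics import mean, pvariance

def fisher_distance(genuine_scores, impostor_scores):
    mu_g, mu_i = mean(genuine_scores), mean(impostor_scores)
    var_g, var_i = pvariance(genuine_scores), pvariance(impostor_scores)
    return (mu_g - mu_i) ** 2 / (var_g + var_i)

# Toy scores: well-separated, low-spread classes give a large FD.
genuine = [0.9, 0.8, 0.85, 0.95]
impostor = [0.1, 0.3, 0.2, 0.15]
print(fisher_distance(genuine, impostor))  # ~55: large separation
```

Shifting the two means apart or shrinking either variance increases FD, which is exactly the behaviour the slide asks of a good combiner.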
  8. Static combination
     • Let E = {E_1, E_2, ..., E_j, ..., E_N} be a set of N experts
     • Let X = {x_i} be the set of patterns
     • Let f_j(.) be the function associated to expert E_j that produces a score s_ij = f_j(x_i) for each pattern x_i
     Static linear combination: $s_i^* = \sum_{j=1}^{N} \alpha_j \cdot s_{ij}$
     • The weights are computed so as to maximize some measure of class separability on a training set
     • The combination is static with respect to the test pattern to be classified
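The static rule above applies one fixed weight vector to every pattern; a minimal sketch (function name and the weight values are illustrative, not taken from the paper, where weights would be trained):

```python
# Static linear combination: one weight per expert, shared by all patterns.
def static_combination(scores, weights):
    # scores: one row of N expert scores per pattern; weights: N fixed weights
    return [sum(w * s for w, s in zip(weights, row)) for row in scores]

# Toy example with N = 3 experts (weights are made-up illustration values).
scores = [[0.9, 0.7, 0.8],   # pattern x1
          [0.2, 0.4, 0.1]]   # pattern x2
weights = [0.5, 0.3, 0.2]
print([round(v, 4) for v in static_combination(scores, weights)])  # [0.82, 0.24]
```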
  9. Dynamic combination
     The weights of the combination also depend on the test pattern to be classified:
     $s_i^* = \sum_{j=1}^{N} \alpha_{ij} \cdot s_{ij}$
     The local estimation of the combination parameters may yield better results than the global estimation, in terms of separation between the distributions of the scores $s_i^*$.
 10. Estimation of the parameters for the dynamic combination
     • Let us suppose, without loss of generality, that $s_{i1} \le s_{i2} \le \ldots \le s_{iN}$
     • The linear combination of three experts
       $\alpha_{i1} s_{i1} + \alpha_{i2} s_{i2} + \alpha_{i3} s_{i3}$, with $\alpha_{ij} \in [0,1]$,
       can also be written as
       $\alpha'_{i1} s_{i1} + s_{i2} + \alpha'_{i3} s_{i3}$
     • which is equivalent to
       $\alpha''_{i1} s_{i1} + \alpha''_{i3} s_{i3}$
 11. Estimation of the parameters for the dynamic combination
     • This reasoning can be extended to N experts, so we can get
       $s_i^* = \lambda_{i1} \min_j(s_{ij}) + \lambda_{i2} \max_j(s_{ij})$
     • Thus, for each pattern we have to estimate two parameters
     • If we set the constraint $\lambda_{i1} + \lambda_{i2} = 1$, only one parameter has to be estimated, and $s_i^* \in [\min_j(s_{ij}), \max_j(s_{ij})]$
 12. Properties of the Dynamic Score Combination
     $s_i^* = \lambda_i \max_j(s_{ij}) + (1 - \lambda_i) \min_j(s_{ij})$
     • This formulation embeds the typical static combination rules:
       Linear combination: $\lambda_i = \dfrac{\sum_{j=1}^{N} \alpha_j s_{ij} - \min_j(s_{ij})}{\max_j(s_{ij}) - \min_j(s_{ij})}$
       Mean rule: $\lambda_i = \dfrac{\frac{1}{N}\sum_{j=1}^{N} s_{ij} - \min_j(s_{ij})}{\max_j(s_{ij}) - \min_j(s_{ij})}$
     • Max rule for $\lambda_i = 1$ and Min rule for $\lambda_i = 0$
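The embedding can be checked numerically: with the Mean-rule choice of $\lambda_i$ above, the DSC score collapses exactly to the mean of the expert scores, while $\lambda_i = 1$ and $\lambda_i = 0$ give the Max and Min rules. A small sketch (function names and toy scores are illustrative):

```python
# DSC score for one pattern: s* = lam * max(s) + (1 - lam) * min(s).
def dsc(scores_i, lam):
    return lam * max(scores_i) + (1 - lam) * min(scores_i)

# Mean-rule choice of lambda from the slide:
# lam = (mean(s) - min(s)) / (max(s) - min(s)).
def lam_mean_rule(scores_i):
    lo, hi = min(scores_i), max(scores_i)
    return (sum(scores_i) / len(scores_i) - lo) / (hi - lo)

s = [0.2, 0.5, 0.8]                 # toy scores from N = 3 experts
print(round(dsc(s, lam_mean_rule(s)), 10))  # 0.5, i.e. the mean of s
print(dsc(s, 1.0), dsc(s, 0.0))             # 0.8 0.2, i.e. Max and Min rules
```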
 13. Properties of the Dynamic Score Combination
     $s_i^* = \lambda_i \max_j(s_{ij}) + (1 - \lambda_i) \min_j(s_{ij})$
     • This formulation also embeds the Dynamic Score Selection (DSS):
       $\lambda_i = 1$ if $x_i$ belongs to the positive class; $\lambda_i = 0$ if $x_i$ belongs to the negative class
     • DSS clearly maximizes class separability if the estimation of the class of $x_i$ is reliable
       • e.g., by a classifier trained on the outputs of the experts E
 14. Supervised estimation of $\lambda_i$
     $s_i^* = \lambda_i \max_j(s_{ij}) + (1 - \lambda_i) \min_j(s_{ij})$
     • $\lambda_i = P(pos \mid x_i, E)$, where $P(pos \mid x_i, E)$ can be estimated by a classifier trained on the outputs of the experts E
     • $\lambda_i$ is estimated by a supervised procedure
     • This formulation can also be seen as a soft version of DSS
       • $P(pos \mid x_i, E)$ accounts for the uncertainty in the class estimation
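One simple way to obtain $P(pos \mid x_i, E)$, in the spirit of the k-NN option listed in the experimental setup, is the fraction of positive patterns among the k nearest training score vectors. A minimal sketch; the function names, k value, and training data are made-up illustrations, not the paper's actual classifiers:

```python
# Supervised lambda_i = P(pos | x_i, E), here estimated with a toy k-NN:
# the fraction of positives among the k nearest training score vectors.
import math

def knn_posterior(train_scores, train_labels, x, k=3):
    # train_scores: score vectors (one per training pattern)
    # train_labels: 1 for positive (genuine), 0 for negative (impostor)
    dists = sorted(
        (math.dist(t, x), y) for t, y in zip(train_scores, train_labels)
    )
    neighbours = [y for _, y in dists[:k]]
    return sum(neighbours) / k

def dsc_supervised(x, lam):
    return lam * max(x) + (1 - lam) * min(x)

# Made-up training score vectors from two experts.
train = [(0.9, 0.8), (0.85, 0.9), (0.7, 0.95), (0.2, 0.1), (0.3, 0.2), (0.1, 0.3)]
labels = [1, 1, 1, 0, 0, 0]

x = (0.8, 0.85)                  # test pattern's expert scores
lam = knn_posterior(train, labels, x, k=3)
print(lam)                       # 1.0 -> all 3 nearest neighbours are genuine
print(dsc_supervised(x, lam))    # 0.85 -> the combined score is pushed to max
```

When the posterior is uncertain (e.g. 0.6), the combined score lands between min and max, which is the "soft DSS" behaviour described above.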
 15. Unsupervised estimation of $\lambda_i$
     $s_i^* = \lambda_i \max_j(s_{ij}) + (1 - \lambda_i) \min_j(s_{ij})$
     • $\lambda_i$ is estimated by an unsupervised procedure
       • the estimation does not depend on a training set
     Mean rule: $\lambda_i = \frac{1}{N}\sum_{j=1}^{N} s_{ij}$
     Max rule: $\lambda_i = \max_j(s_{ij})$
     Min rule: $\lambda_i = \min_j(s_{ij})$
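The unsupervised rules need nothing but the test pattern's own scores; since $\lambda_i$ is itself a score, this assumes the scores are normalised to [0, 1]. A minimal sketch of the three variants (function name and toy scores are illustrative):

```python
# Unsupervised DSC: lambda_i is computed from the test pattern's own scores,
# which are assumed normalised to [0, 1] so that lambda_i is a valid weight.
def dsc_unsupervised(scores_i, rule="mean"):
    if rule == "mean":
        lam = sum(scores_i) / len(scores_i)
    elif rule == "max":
        lam = max(scores_i)
    elif rule == "min":
        lam = min(scores_i)
    else:
        raise ValueError(rule)
    return lam * max(scores_i) + (1 - lam) * min(scores_i)

s = [0.25, 0.5, 0.75]   # toy normalised scores from three experts
print(dsc_unsupervised(s, "mean"))  # 0.5
print(dsc_unsupervised(s, "max"))   # 0.625
print(dsc_unsupervised(s, "min"))   # 0.375
```

Note how the Max-rule $\lambda_i$ pushes the combined score towards the maximum and the Min rule towards the minimum, while the Mean rule stays in between.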
 16. Dataset
     • The dataset used is the Biometric Scores Set Release 1 of NIST
       http://www.itl.nist.gov/iad/894.03/biometricscores/
     • This dataset contains scores from 4 experts related to face and fingerprint recognition systems
     • The experiments were performed using all the possible combinations of 3 and 4 experts
     • The dataset has been divided into four parts, each one used for training and the remaining three for testing
 17. Experimental Setup
     • Experiments aimed at assessing the performance of
       • the unsupervised Dynamic Score Combination (DSC), with $\lambda_i$ estimated by the Mean, Max, and Min rules
       • the supervised Dynamic Score Combination, with $\lambda_i$ estimated by k-NN, LDC, QDC, and SVM classifiers
     • Comparisons with
       • the Ideal Score Selector (ISS)
       • the Optimal static Linear Combination (Opt LC)
       • the Mean, Max, and Min rules
       • the linear combination whose coefficients are estimated by LDA
 18. Performance assessment
     • Area Under the ROC Curve (AUC)
     • Equal Error Rate (EER)
     • $d' = \dfrac{\mu_{gen} - \mu_{imp}}{\sqrt{\frac{\sigma_{gen}^2}{2} + \frac{\sigma_{imp}^2}{2}}}$
     • FNMR at 1% and 0% FMR
     • FMR at 1% and 0% FNMR
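The $d'$ statistic above can be computed directly from genuine and impostor score samples; a short sketch using population statistics (function name and toy scores are illustrative):

```python
# d' statistic from the slide:
# d' = (mu_gen - mu_imp) / sqrt(var_gen / 2 + var_imp / 2)
import math
from statistics import mean, pvariance

def d_prime(genuine_scores, impostor_scores):
    mu_g, mu_i = mean(genuine_scores), mean(impostor_scores)
    var_g, var_i = pvariance(genuine_scores), pvariance(impostor_scores)
    return (mu_g - mu_i) / math.sqrt(var_g / 2 + var_i / 2)

# Toy scores: widely separated classes give a large d'.
genuine = [0.8, 0.9, 1.0]
impostor = [0.1, 0.2, 0.3]
print(d_prime(genuine, impostor))  # ~8.57
```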
 19. Combination of three experts

                  AUC                EER                d'
     ISS          1.0000 (±0.0000)   0.0000 (±0.0000)   25.4451 (±8.7120)
     Opt LC       0.9997 (±0.0004)   0.0050 (±0.0031)    3.1231 (±0.2321)
     Mean         0.9982 (±0.0013)   0.0096 (±0.0059)    3.6272 (±0.4850)
     Max          0.9892 (±0.0022)   0.0450 (±0.0048)    3.0608 (±0.3803)
     Min          0.9708 (±0.0085)   0.0694 (±0.0148)    2.0068 (±0.1636)
     DSC Mean     0.9986 (±0.0011)   0.0064 (±0.0030)    3.8300 (±0.5049)
     DSC Max      0.9960 (±0.0015)   0.0214 (±0.0065)    3.8799 (±0.2613)
     DSC Min      0.9769 (±0.0085)   0.0634 (±0.0158)    2.3664 (±0.2371)
     LDA          0.9945 (±0.0040)   0.0296 (±0.0123)    2.3802 (±0.2036)
     DSC k-NN     0.9987 (±0.0016)   0.0104 (±0.0053)    6.9911 (±0.9653)
     DSC ldc      0.9741 (±0.0087)   0.0642 (±0.0149)    2.7654 (±0.2782)
     DSC qdc      0.9964 (±0.0039)   0.0147 (±0.0092)    9.1452 (±3.1002)
     DSC svm      0.9996 (±0.0004)   0.0048 (±0.0026)    4.8972 (±0.4911)
 20. DSC Mean vs. Mean rule (combination of three experts)

                  AUC      EER      d'
     DSC Mean     0.9991   0.0052   4.4199
     Mean rule    0.9986   0.0129   4.0732
 21. Unsupervised DSC vs. fixed rules: AUC (figure)
 22. Unsupervised DSC vs. fixed rules: EER (figure)
 23. Unsupervised DSC vs. fixed rules: FMR at 0% FNMR (figure)
 24. DSC Mean vs. supervised DSC: AUC (figure)
 25. DSC Mean vs. supervised DSC: EER (figure)
 26. DSC Mean vs. supervised DSC: FMR at 0% FNMR (figure)
 27. Conclusions
     • The Dynamic Score Combination mechanism embeds different combination modalities
     • Experiments show that the unsupervised DSC usually outperforms the related "fixed" combination rules
     • The use of a classifier in the supervised DSC allows attaining better performance, at the expense of an increased computational complexity
     • Depending on the classifier, the performance is very close to that of the optimal linear combiner