Advanced Digital Image Processing
(EE-836)
National University of Sciences and
Technology (NUST)
School of Electrical Engineering and Computer Science
(SEECS)
Khawar Khurshid, 2012
Intro to Pattern
Recognition
What is a Pattern?
“A pattern is the opposite of a chaos; it is an entity vaguely
defined, that could be given a name.”
PR Definitions from Literature
“The assignment of a physical object or event to one of several pre-
specified categories” – Duda and Hart
Pattern Recognition is concerned with answering the question
“What is this?” – Morse
Examples of PR Applications
Optical Character Recognition (OCR)
– Sorting letters by postal code.
– Reconstructing text from printed
materials.
– Handwritten character recognition
Analysis and identification of human
patterns (Biometric Classification)
– Face recognition
– Handwriting recognition
– Fingerprints and DNA mapping
– Iris scan identification
Examples of Pattern Recognition Problems
Computer-aided diagnosis
– Medical imaging, EEG, ECG signal analysis
– Designed to assist (not replace) physicians
Prediction systems
– Weather forecasting (based on satellite data).
– Analysis of seismic patterns
Other Examples
Speech and voice recognition/speaker identification.
Banking and insurance applications
– Credit-card applicants classified by income, creditworthiness,
mortgage amount, # of dependents, etc.
– Car insurance (pattern including make of car, # of accidents,
age, gender, driving habits, location, etc).
A Pattern Classification Example
A fish processing plant wants to automate the process of sorting incoming fish
according to species (salmon or sea bass)
The automation system consists of
– A conveyor belt for incoming products
– A vision system with an overhead CCD camera
– A computer to analyze images and control the robot arm
Recognize real-world objects using computer vision, image processing, and PR techniques:
– Image acquisition by sensor
– Image enhancement and restoration: filtering
– Segmentation: impose some order on groups of pixels to separate them from each other
– Feature extraction: edge detection, texture, interest points
– Classification and recognition: classification and interpretation of objects based on selected features, using probabilistic techniques
A Pattern Classification Example
“Sorting incoming Fish on a conveyor according to species using
optical sensing”
Purpose:
To classify the future samples based on the data of extracted
features from the training samples
Problem Analysis
Set up a camera and take some training samples to extract features
• Length
• Lightness
• Width
• Number and shape of fins
• Position of the mouth, etc…
– This is the set of all suggested features to explore for use in our
classifier!
Selection Criterion
Select the length of the fish as a possible feature for discrimination
The length is a poor feature alone!
Select the lightness as a possible feature.
Selection Criterion
Length or lightness: which one is a better feature?
No value of either feature alone will “classify” all fish correctly
Fish feature vector: xT = [x1, x2], where x1 = lightness and x2 = width
Selection Criterion and Decision Boundary
Adopt the lightness and add the width of the fish
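The two-feature idea can be sketched in code. A minimal illustration, where the feature values, weights, and bias are all made-up numbers for demonstration: a fish is represented by the vector [lightness, width] and assigned a class by which side of a linear decision boundary it falls on.

```python
# Hypothetical linear decision boundary w.x + b = 0 in the 2-D feature space
# x = [lightness, width]. The weights and bias here are illustrative, not fitted.

def classify(x, w=(1.0, -2.0), b=0.5):
    """Return 'sea bass' if w.x + b > 0, else 'salmon'."""
    score = w[0] * x[0] + w[1] * x[1] + b
    return "sea bass" if score > 0 else "salmon"

print(classify([4.0, 1.0]))  # score = 4 - 2 + 0.5 = 2.5 > 0  -> sea bass
print(classify([1.0, 2.5]))  # score = 1 - 5 + 0.5 = -3.5 < 0 -> salmon
```

In practice the weights would be learned from the training samples rather than chosen by hand; the point is only that a boundary over two features can separate fish that neither feature separates alone.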
Decision Boundary Selection
Which of the boundaries would you
choose?
Simple linear boundary (training error > 0)
Nonlinear complex boundary (training error = 0)
Simpler nonlinear boundary (training error > 0)
Ideally, the best decision boundary should be the one which provides
an optimal performance such as in the following figure:
However, the central aim of designing a classifier is to correctly
classify novel input (classify future samples)
Issue of generalization
Generalization and Decision Boundary
Selected decision boundary
A simpler but generalized boundary in this case may be the
simple nonlinear curve as shown below:
Further features can be added to enhance the performance;
however, redundant information may not be desirable
A classifier, intuitively, is designed to minimize classification error,
the total number of instances (fish) classified incorrectly.
– Is this the best objective (cost) function to minimize? What kinds
of error can be made? Are they all equally bad? What is the real
cost of making an error?
• Sea bass misclassified as salmon: Pleasant surprise for the
consumer, tastier fish
• Salmon misclassified as sea bass: Customer upset, paid too
much for inferior fish
– We may want to adjust our decision boundary to minimize overall
risk. In this case, the second type of error is more costly, so we may
want to minimize this error.
Adjustment of decision boundary
Error is minimized by moving the threshold to the left
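The effect of shifting the threshold can be checked numerically. The sketch below assumes Gaussian class-conditional lightness densities, salmon ~ N(2, 1) and sea bass ~ N(4, 1), equal priors, and a 10:1 cost ratio; all of these numbers are illustrative, and which direction the threshold should move depends on which class sits on which side of the axis.

```python
# Expected cost of a lightness threshold t: decide "sea bass" when x > t.
# Salmon misclassified as sea bass is charged more than the reverse error.
from math import erf, sqrt

def norm_cdf(x, mu, sigma=1.0):
    """Gaussian CDF via the error function (standard identity)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def expected_cost(t, cost_salmon_as_bass=10.0, cost_bass_as_salmon=1.0):
    # Assumed densities: salmon lightness ~ N(2, 1), sea bass ~ N(4, 1);
    # equal priors of 0.5 each.
    p_salmon_as_bass = 1.0 - norm_cdf(t, mu=2.0)  # salmon falls above t
    p_bass_as_salmon = norm_cdf(t, mu=4.0)        # sea bass falls below t
    return 0.5 * (cost_salmon_as_bass * p_salmon_as_bass
                  + cost_bass_as_salmon * p_bass_as_salmon)

# With asymmetric costs, the cheaper threshold shifts toward the sea-bass side:
print(expected_cost(3.0) > expected_cost(4.0))  # True
```

With symmetric costs the midpoint t = 3 would be preferred; the asymmetric loss moves the optimum so that the expensive error (salmon as sea bass) becomes rarer.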
Terminologies in PR
Features: a set of variables believed to carry discriminating and
characterizing information about the objects under consideration
Feature vector: A collection of d features, ordered in some
meaningful way into a d-dimensional column vector, that
represents the signature of the object to be identified.
Feature space: The d-dimensional space in which the feature
vectors lie. A d-dimensional vector in a d-dimensional space
constitutes a point in that space.
Terminologies in PR
Decision boundary: A boundary in the d-dimensional feature space
that separates patterns of different classes from each other.
Training Data: Data used during training of a classifier for which
the correct labels are known
Field Test Data: Unknown data to be classified, consisting of the
same kind of measurements on which the classifier was trained. The
correct classes of these data are not known.
Terminologies in PR
Cost Function: A quantitative measure that represents the cost of
making an error. The classifier is trained to minimize this
function.
Classifier: An algorithm which adjusts its parameters to find the
correct decision boundaries –through a learning algorithm using
a training dataset –such that a cost function is minimized.
Error: Incorrect labeling of the data by the classifier
Cost of error: Cost of making a decision, in particular an
incorrect one –not all errors are equally costly!
Training Performance: The ability / performance of the classifier
in correctly identifying the classes of the training data, which it
has already seen. It may not be a good indicator of the
generalization performance.
Cooperative vs. Uncooperative data
Good Features vs. Bad Features
Ideally, for a given group of patterns coming from the same class,
feature values should all be similar
For patterns coming from different classes, the feature values
should be different
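One way to quantify “good” versus “bad” features, sketched here with made-up samples: compare the gap between the class means to the spread of values within each class. A large ratio means the feature discriminates well.

```python
# Illustrative separability score: between-class mean gap divided by the
# average within-class spread. All sample values below are invented.
from statistics import mean, pstdev

def separability(class_a, class_b):
    gap = abs(mean(class_a) - mean(class_b))
    spread = (pstdev(class_a) + pstdev(class_b)) / 2
    return gap / spread

good = separability([1.0, 1.1, 0.9], [3.0, 3.2, 2.8])  # tight, well-separated
bad = separability([1.0, 3.0, 2.0], [1.5, 2.5, 2.0])   # heavily overlapping
print(good > bad)  # True: the first feature is far more discriminative
```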
Features Correlation
Components of Pattern Recognition System
Sensing: a CCD camera over a conveyor belt captures images of the fish
Segmentation: separate and isolate the fish from the background image
Feature extraction: identify and extract features relevant to distinguishing sea
bass from salmon, but invariant to noise, occlusion, rotation, scale, and shift.
What if we already have irrelevant features? Dimensionality reduction?
Classification: determine the right type of classifier, the right type of training
algorithm, and the right classifier parameters. How to choose a classifier that
minimizes the risk and maximizes the performance? Can we add prior
knowledge, context to the classifier?
Cost analysis: what is the cost of making a decision, and what is the risk of
an incorrect decision?
Example of PR
Consider the problem of recognizing the letters L, P, O, E, Q
– Determine a sufficient set of features
– Design a tree-structured classifier
Bayesian Decision Theory
Bayes Rule
We now pose the following question: given that event A has
occurred, what is the probability that any particular one of the
events Bi occurred?
This is known as the Bayes rule:
P(Bi | A) = P(A | Bi) P(Bi) / Σj P(A | Bj) P(Bj)
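As a numeric sanity check of Bayes’ rule, with illustrative probabilities for two mutually exclusive, exhaustive events B1 and B2:

```python
# Bayes' rule check: posterior(Bi) = P(A|Bi) P(Bi) / sum_j P(A|Bj) P(Bj).
# All probability values here are invented for illustration.
priors = {"B1": 0.7, "B2": 0.3}
p_a_given = {"B1": 0.2, "B2": 0.6}  # P(A | Bi)

p_a = sum(p_a_given[b] * priors[b] for b in priors)  # total probability of A
posterior = {b: p_a_given[b] * priors[b] / p_a for b in priors}

print(round(p_a, 2))               # 0.32
print(round(posterior["B1"], 4))   # 0.4375 -- and the posteriors sum to 1
```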
The Real World: The World of Many Dimensions
In most practical applications, we have more then one feature, and
therefore the random variable x must be replaced with a random
vector x. P(x) P(x)
The joint probability mass function P(x) still satisfies the axioms of
probability
The Bayes rule is then
While the notation changes only slightly, the implications are quite
substantial
Bayes Decision theory
Based on quantifying the trade-offs between various classification
decisions using a probabilistic approach
The theory assumes:
– Decision problem can be posed in probabilistic terms
– All relevant probability values are known (in practice this is not
true)
The sea bass/salmon example
State of nature: w1: Sea bass, w2: Salmon
Assume that we know the probabilities of observing sea bass and
salmons, P(w1) and P(w2) (prior probabilities), for a particular
location of fishing and time of year
Based on this information, how would you guess the type of
the next fish to be caught?
• Decide w1 if P(w1) > P(w2) otherwise decide w2
A reasonable decision rule?
If the catch of salmon and sea bass is equiprobable, i.e.
P(w1) = P(w2) (uniform priors), with
P(w1) + P(w2) = 1 (exclusivity and exhaustivity),
this rule will not be applicable: it gives no basis for preferring either class
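The prior-only decision rule can be written directly (illustrative prior values; with equal priors the choice is arbitrary, which is why the rule breaks down there):

```python
# Decide by priors alone: with no feature measurement, always pick the class
# with the larger prior probability. Ties (uniform priors) are arbitrary.
def decide_by_prior(p_w1, p_w2):
    assert abs(p_w1 + p_w2 - 1.0) < 1e-9  # exclusivity and exhaustivity
    return "w1" if p_w1 > p_w2 else "w2"

print(decide_by_prior(0.7, 0.3))  # w1: sea bass is the better blind guess
```

Note this rule always predicts the same class; it is only reasonable when no measurement of the fish is available.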
Class Conditional Probabilities
p(x|w1) and p(x|w2) describe the difference in lightness between
populations of sea bass and salmon
Posterior Probability
Suppose, we know P(w1), P(w2), p(x|w1) and p(x|w2) and that we have observed
the value of the feature (a random variable) x
– How would you decide on the “state of nature” (type of fish) based on this
information?
– Bayes theory allows us to compute the posterior probabilities from prior and
class-conditional probabilities
P(wj | x) = p(x | wj) P(wj) / p(x)
Posterior Probability: The (conditional) probability of the
correct class being wj, given that feature value x has been
observed
Likelihood: The (class-conditional) probability of
observing a feature value of x, given that the correct class is
wj. All things being equal, the category with the higher class-
conditional probability is more “likely” to be the correct
class.
Prior Probability: The total probability of the correct class
being wj, determined based on prior experience
Evidence: The total probability of observing the feature
value as x
Posterior Probabilities
Bayes rule allows us to compute the posterior probability (difficult to
determine) from prior probabilities, likelihood and the evidence (easier to
determine).
Posterior probabilities for
priors P(w1) = 2/3 and P(w2)
= 1/3. For example, given
that a pattern is measured
to have feature value x =14,
the probability it is in
category w2 is roughly 0.08,
and that it is in w1 is 0.92. At
every x, the posteriors sum
to 1.0.
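The figure’s numbers can be reproduced as follows. The likelihood values below are hypothetical stand-ins chosen to match the stated posteriors, since the slide’s density curves are not given numerically; with priors P(w1) = 2/3 and P(w2) = 1/3 they yield posteriors 0.92 and 0.08, which sum to one.

```python
# posterior = likelihood * prior / evidence, with the slide's priors.
priors = (2 / 3, 1 / 3)
likelihoods = (0.46, 0.08)  # assumed p(x|w1), p(x|w2) at the observed x

evidence = sum(l * p for l, p in zip(likelihoods, priors))
posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]

print(round(posteriors[0], 2))      # 0.92 -> decide w1 at this x
print(round(sum(posteriors), 10))   # 1.0  -> posteriors always sum to one
```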
Which class would you choose now?
What is probability of making an error with this decision?
Bayes Decision Rule
Decision given the posterior probabilities
X is an observation for which:
if P(w1 | x) > P(w2 | x) True state of nature = w1
if P(w1 | x) < P(w2 | x) True state of nature = w2
That is, choose the class that has the larger posterior probability!
If there are multiple features, x = {x1, x2, …, xd}, and multiple classes:
Choose wi if P(wi | x) > P(wj | x) for all j ≠ i, where i, j ∈ {1, 2, …, c}
Error:
whenever we observe a particular x, the probability of error is :
P(error | x) = P(w1 | x) if we decide w2
P(error | x) = P(w2 | x) if we decide w1
Therefore:
P(error | x) = min [P(w1 | x), P(w2 | x)]
(Bayes decision)
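The decision-plus-error computation above, as a short sketch: whichever class we pick, we are wrong with the posterior probability of the other class, so always picking the larger posterior makes P(error | x) the smaller of the two.

```python
# Bayes decision for two classes given their posteriors at an observed x.
def bayes_decide(p_w1_x, p_w2_x):
    decision = "w1" if p_w1_x > p_w2_x else "w2"
    p_error = min(p_w1_x, p_w2_x)  # probability of the class we did not pick
    return decision, p_error

print(bayes_decide(0.92, 0.08))  # ('w1', 0.08)
```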
Bayesian Decision Theory – Generalization
Generalization of the preceding ideas
– Use of more than one feature
– Use more than two states of nature
– Allowing other actions and not only decide on the state of
nature
– Unequal error cost
Allowing actions other than classification primarily allows the
possibility of rejection: refusing to make a decision in close or bad
cases!
The loss function states how costly each action taken is
Two-category classification
Definitions
a1 : deciding w1
a2 : deciding w2
eij = e(ai | wj), loss incurred for deciding wi when the true state of nature
is wj
Conditional risk:
R(a1 | x) = e11P(w1 | x) + e12P(w2 | x)
R(a2 | x) = e21P(w1| x) + e22P(w2 | x)
Our rule is then the following:
if R(a1 | x) < R(a2 | x), action a1: “decide w1” is taken; otherwise action a2:
“decide w2” is taken
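The two-category risk comparison can be run with illustrative numbers (the losses e12 and e21 below are assumptions, and correct decisions are charged zero). Note how a large e12 can overturn the plain posterior comparison:

```python
# Conditional risk of each action given the posteriors at an observed x:
#   R(a1|x) = e11 P(w1|x) + e12 P(w2|x),  R(a2|x) = e21 P(w1|x) + e22 P(w2|x).
def conditional_risks(post_w1, post_w2, e12=10.0, e21=1.0, e11=0.0, e22=0.0):
    r_a1 = e11 * post_w1 + e12 * post_w2  # risk of deciding w1
    r_a2 = e21 * post_w1 + e22 * post_w2  # risk of deciding w2
    return r_a1, r_a2

r1, r2 = conditional_risks(post_w1=0.7, post_w2=0.3)
# Even though P(w1|x) = 0.7 > P(w2|x), the heavy loss e12 makes a2 cheaper:
print("decide w1" if r1 < r2 else "decide w2")  # decide w2
```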
End

More Related Content

What's hot

Support Vector machine
Support Vector machineSupport Vector machine
Support Vector machine
Anandha L Ranganathan
 
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
mohmedahmed23
 
Machine Learning and Real-World Applications
Machine Learning and Real-World ApplicationsMachine Learning and Real-World Applications
Machine Learning and Real-World Applications
MachinePulse
 
bag-of-words models
bag-of-words models bag-of-words models
bag-of-words models
Xiaotao Zou
 
An introduction to Machine Learning
An introduction to Machine LearningAn introduction to Machine Learning
An introduction to Machine Learningbutest
 
Image recognition
Image recognitionImage recognition
Image recognition
Nikhil Singh
 
A survey of deep learning approaches to medical applications
A survey of deep learning approaches to medical applicationsA survey of deep learning approaches to medical applications
A survey of deep learning approaches to medical applications
Joseph Paul Cohen PhD
 
PDF OCR
PDF OCRPDF OCR
Machine learning
Machine learningMachine learning
Machine learning
Wes Eklund
 
Unsupervised learning
Unsupervised learningUnsupervised learning
Unsupervised learning
amalalhait
 
Content Based Image Retrieval
Content Based Image Retrieval Content Based Image Retrieval
Content Based Image Retrieval
Swati Chauhan
 
Machine Learning in Cyber Security
Machine Learning in Cyber SecurityMachine Learning in Cyber Security
Machine Learning in Cyber Security
Rishi Kant
 
Birch Algorithm With Solved Example
Birch Algorithm With Solved ExampleBirch Algorithm With Solved Example
Birch Algorithm With Solved Example
kailash shaw
 
Medical image processing studies
Medical image processing studiesMedical image processing studies
Medical image processing studies
Bằng Nguyễn Kim
 
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
Tanjina Thakur
 
Machine learning
Machine learningMachine learning
Machine learning
Rajib Kumar De
 
OCR (Optical Character Recognition)
OCR (Optical Character Recognition) OCR (Optical Character Recognition)
OCR (Optical Character Recognition)
IstiaqueBinIslam
 
Markov Random Field (MRF)
Markov Random Field (MRF)Markov Random Field (MRF)
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
Shailesh Thakur
 

What's hot (20)

Support Vector machine
Support Vector machineSupport Vector machine
Support Vector machine
 
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
 
Machine Learning and Real-World Applications
Machine Learning and Real-World ApplicationsMachine Learning and Real-World Applications
Machine Learning and Real-World Applications
 
bag-of-words models
bag-of-words models bag-of-words models
bag-of-words models
 
An introduction to Machine Learning
An introduction to Machine LearningAn introduction to Machine Learning
An introduction to Machine Learning
 
Image recognition
Image recognitionImage recognition
Image recognition
 
A survey of deep learning approaches to medical applications
A survey of deep learning approaches to medical applicationsA survey of deep learning approaches to medical applications
A survey of deep learning approaches to medical applications
 
PDF OCR
PDF OCRPDF OCR
PDF OCR
 
Machine learning
Machine learningMachine learning
Machine learning
 
Unsupervised learning
Unsupervised learningUnsupervised learning
Unsupervised learning
 
Content Based Image Retrieval
Content Based Image Retrieval Content Based Image Retrieval
Content Based Image Retrieval
 
Machine Learning in Cyber Security
Machine Learning in Cyber SecurityMachine Learning in Cyber Security
Machine Learning in Cyber Security
 
Birch Algorithm With Solved Example
Birch Algorithm With Solved ExampleBirch Algorithm With Solved Example
Birch Algorithm With Solved Example
 
Medical image processing studies
Medical image processing studiesMedical image processing studies
Medical image processing studies
 
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
 
Machine learning
Machine learningMachine learning
Machine learning
 
OCR (Optical Character Recognition)
OCR (Optical Character Recognition) OCR (Optical Character Recognition)
OCR (Optical Character Recognition)
 
Markov Random Field (MRF)
Markov Random Field (MRF)Markov Random Field (MRF)
Markov Random Field (MRF)
 
Pattern recognition
Pattern recognitionPattern recognition
Pattern recognition
 
Pattern Recognition
Pattern RecognitionPattern Recognition
Pattern Recognition
 

Similar to 12 pattern recognition

Introduction to Machine Learning
Introduction to Machine LearningIntroduction to Machine Learning
Introduction to Machine Learningbutest
 
Artificial Neural Network Lect4 : Single Layer Perceptron Classifiers
Artificial Neural Network Lect4 : Single Layer Perceptron ClassifiersArtificial Neural Network Lect4 : Single Layer Perceptron Classifiers
Artificial Neural Network Lect4 : Single Layer Perceptron Classifiers
Mohammed Bennamoun
 
Quality of Multimedia Experience: Past, Present and Future
Quality of Multimedia Experience: Past, Present and FutureQuality of Multimedia Experience: Past, Present and Future
Quality of Multimedia Experience: Past, Present and Future
Touradj Ebrahimi
 
PPT s09-machine vision-s2
PPT s09-machine vision-s2PPT s09-machine vision-s2
PPT s09-machine vision-s2
Binus Online Learning
 
[UMAP2013] Recommendation with Differential Context Weighting
[UMAP2013] Recommendation with Differential Context Weighting[UMAP2013] Recommendation with Differential Context Weighting
[UMAP2013] Recommendation with Differential Context Weighting
YONG ZHENG
 
Introduction to Data Mining
Introduction to Data MiningIntroduction to Data Mining
Introduction to Data Mining
Kai Koenig
 
NIPS2007: structured prediction
NIPS2007: structured predictionNIPS2007: structured prediction
NIPS2007: structured predictionzukun
 
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
Simplilearn
 
talalalsubaie-1220737011220266-9.pdf
talalalsubaie-1220737011220266-9.pdftalalalsubaie-1220737011220266-9.pdf
talalalsubaie-1220737011220266-9.pdf
someyamohsen2
 
Recognition
RecognitionRecognition
Recognition
Neelam Soni
 
Clustering
ClusteringClustering
Automatic System for Detection and Classification of Brain Tumors
Automatic System for Detection and Classification of Brain TumorsAutomatic System for Detection and Classification of Brain Tumors
Automatic System for Detection and Classification of Brain Tumors
Fatma Sayed Ibrahim
 
A General Framework for Accurate and Fast Regression by Data Summarization in...
A General Framework for Accurate and Fast Regression by Data Summarization in...A General Framework for Accurate and Fast Regression by Data Summarization in...
A General Framework for Accurate and Fast Regression by Data Summarization in...Yao Wu
 
Lec14: Evaluation Framework for Medical Image Segmentation
Lec14: Evaluation Framework for Medical Image SegmentationLec14: Evaluation Framework for Medical Image Segmentation
Lec14: Evaluation Framework for Medical Image Segmentation
Ulaş Bağcı
 
Datamining intro-iep
Datamining intro-iepDatamining intro-iep
Datamining intro-iep
aaryarun1999
 
Using rule based classifiers for the predictive analysis of breast cancer rec...
Using rule based classifiers for the predictive analysis of breast cancer rec...Using rule based classifiers for the predictive analysis of breast cancer rec...
Using rule based classifiers for the predictive analysis of breast cancer rec...
Alexander Decker
 

Similar to 12 pattern recognition (20)

Pattern-Analysis (1).pdf
Pattern-Analysis (1).pdfPattern-Analysis (1).pdf
Pattern-Analysis (1).pdf
 
Classification
ClassificationClassification
Classification
 
Introduction to Machine Learning
Introduction to Machine LearningIntroduction to Machine Learning
Introduction to Machine Learning
 
Artificial Neural Network Lect4 : Single Layer Perceptron Classifiers
Artificial Neural Network Lect4 : Single Layer Perceptron ClassifiersArtificial Neural Network Lect4 : Single Layer Perceptron Classifiers
Artificial Neural Network Lect4 : Single Layer Perceptron Classifiers
 
Quality of Multimedia Experience: Past, Present and Future
Quality of Multimedia Experience: Past, Present and FutureQuality of Multimedia Experience: Past, Present and Future
Quality of Multimedia Experience: Past, Present and Future
 
PPT s09-machine vision-s2
PPT s09-machine vision-s2PPT s09-machine vision-s2
PPT s09-machine vision-s2
 
forest-cover-type
forest-cover-typeforest-cover-type
forest-cover-type
 
[UMAP2013] Recommendation with Differential Context Weighting
[UMAP2013] Recommendation with Differential Context Weighting[UMAP2013] Recommendation with Differential Context Weighting
[UMAP2013] Recommendation with Differential Context Weighting
 
Introduction to Data Mining
Introduction to Data MiningIntroduction to Data Mining
Introduction to Data Mining
 
NIPS2007: structured prediction
NIPS2007: structured predictionNIPS2007: structured prediction
NIPS2007: structured prediction
 
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
Machine Learning Tutorial Part - 2 | Machine Learning Tutorial For Beginners ...
 
talalalsubaie-1220737011220266-9.pdf
talalalsubaie-1220737011220266-9.pdftalalalsubaie-1220737011220266-9.pdf
talalalsubaie-1220737011220266-9.pdf
 
Recognition
RecognitionRecognition
Recognition
 
Es1
Es1Es1
Es1
 
Clustering
ClusteringClustering
Clustering
 
Automatic System for Detection and Classification of Brain Tumors
Automatic System for Detection and Classification of Brain TumorsAutomatic System for Detection and Classification of Brain Tumors
Automatic System for Detection and Classification of Brain Tumors
 
A General Framework for Accurate and Fast Regression by Data Summarization in...
A General Framework for Accurate and Fast Regression by Data Summarization in...A General Framework for Accurate and Fast Regression by Data Summarization in...
A General Framework for Accurate and Fast Regression by Data Summarization in...
 
Lec14: Evaluation Framework for Medical Image Segmentation
Lec14: Evaluation Framework for Medical Image SegmentationLec14: Evaluation Framework for Medical Image Segmentation
Lec14: Evaluation Framework for Medical Image Segmentation
 
Datamining intro-iep
Datamining intro-iepDatamining intro-iep
Datamining intro-iep
 
Using rule based classifiers for the predictive analysis of breast cancer rec...
Using rule based classifiers for the predictive analysis of breast cancer rec...Using rule based classifiers for the predictive analysis of breast cancer rec...
Using rule based classifiers for the predictive analysis of breast cancer rec...
 

Recently uploaded

Student information management system project report ii.pdf
Student information management system project report ii.pdfStudent information management system project report ii.pdf
Student information management system project report ii.pdf
Kamal Acharya
 
AP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specificAP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specific
BrazilAccount1
 
ethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.pptethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.ppt
Jayaprasanna4
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
SamSarthak3
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
karthi keyan
 
space technology lecture notes on satellite
space technology lecture notes on satellitespace technology lecture notes on satellite
space technology lecture notes on satellite
ongomchris
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
MdTanvirMahtab2
 
Final project report on grocery store management system..pdf
Final project report on grocery store management system..pdfFinal project report on grocery store management system..pdf
Final project report on grocery store management system..pdf
Kamal Acharya
 
Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024
Massimo Talia
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
Robbie Edward Sayers
 
DESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docxDESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docx
FluxPrime1
 
Railway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdfRailway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdf
TeeVichai
 
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
H.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdfH.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdf
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
MLILAB
 
Planning Of Procurement o different goods and services
Planning Of Procurement o different goods and servicesPlanning Of Procurement o different goods and services
Planning Of Procurement o different goods and services
JoytuBarua2
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
AJAYKUMARPUND1
 
Fundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptxFundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptx
manasideore6
 
English lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdfEnglish lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdf
BrazilAccount1
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
Neometrix_Engineering_Pvt_Ltd
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
AmarGB2
 
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
zwunae
 

Recently uploaded (20)

Student information management system project report ii.pdf
Student information management system project report ii.pdfStudent information management system project report ii.pdf
Student information management system project report ii.pdf
 
AP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specificAP LAB PPT.pdf ap lab ppt no title specific
AP LAB PPT.pdf ap lab ppt no title specific
 
ethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.pptethical hacking-mobile hacking methods.ppt
ethical hacking-mobile hacking methods.ppt
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
 
space technology lecture notes on satellite
space technology lecture notes on satellitespace technology lecture notes on satellite
space technology lecture notes on satellite
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
 
Final project report on grocery store management system..pdf
Final project report on grocery store management system..pdfFinal project report on grocery store management system..pdf
Final project report on grocery store management system..pdf
 
Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
 
DESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docxDESIGN A COTTON SEED SEPARATION MACHINE.docx
DESIGN A COTTON SEED SEPARATION MACHINE.docx
 
Railway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdfRailway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdf
 
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
H.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdfH.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdf
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
 
Planning Of Procurement o different goods and services
Planning Of Procurement o different goods and servicesPlanning Of Procurement o different goods and services
Planning Of Procurement o different goods and services
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
 
Fundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptxFundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptx
 
English lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdfEnglish lab ppt no titlespecENG PPTt.pdf
English lab ppt no titlespecENG PPTt.pdf
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
 
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单专业办理
 

12 pattern recognition

  • 1. Advanced Digital Image Processing (EE-836) National University of Sciences and Technology (NUST) School of Electrical Engineering and Computer Science (SEECS) Khawar Khurshid, 20121
  • 3. What is a Pattern? “A pattern is the opposite of a chaos; it is an entity vaguely defined, that could be given a name.”
  • 4. PR Definitions from Literature “The assignment of a physical object or event to one of several pre- specified categories” – Duda and Hart Pattern Recognition is concerned with answering the question “What is this?” –Morse
  • 5. Examples of PR Applications Optical Character Recognition (OCR) – Sorting letters by postal code. – Reconstructing text from printed materials. – Handwritten character recognition Analysis and identification of human patterns (Biometric Classification) – Face recognition – Handwriting recognition – Fingerprints and DNA mapping – Iris scan identification
  • 6. Examples of Pattern Recognition Problems Computer aided diagnosis – Medical imaging, EEG, ECG signal analysis – Designed to assist (not replace) physicians Prediction systems – Weather forecasting (based on satellite data). – Analysis of seismic patterns
  • 7. Other Examples Speech and voice recognition / speaker identification. Banking and insurance applications – Credit card applicants classified by income, credit worthiness, mortgage amount, # of dependents, etc. – Car insurance (pattern including make of car, # of accidents, age, gender, driving habits, location, etc.).
  • 8. A Pattern Classification Example A fish processing plant wants to automate the process of sorting incoming fish according to species (salmon or sea bass) The automation system consists of – A conveyor belt for incoming products – A vision system with an overhead CCD camera – A computer to analyze images and control the robot arm
  • 9. Recognizing real-world objects using computer vision, image processing and PR probabilistic techniques – the pipeline: Image acquisition by sensor → Image enhancement and restoration (filtering) → Segmentation (impose some order on groups of pixels to separate them from each other) → Feature extraction (edge detection, texture, interest points) → Classification and recognition (classification and interpretation of objects based on selected features)
  • 10. A Pattern Classification Example “Sorting incoming Fish on a conveyor according to species using optical sensing”
  • 11. Purpose: To classify future samples based on features extracted from the training samples Problem Analysis Set up a camera and take some training samples to extract features • Length • Lightness • Width • Number and shape of fins • Position of the mouth, etc… – This is the set of all suggested features to explore for use in our classifier!
  • 12. Selection Criterion Select the length of the fish as a possible feature for discrimination
  • 13. The length is a poor feature alone! Select the lightness as a possible feature. Selection Criterion Length or Lightness (weight), which one is a better feature? No value of either feature will “classify” all fish correctly
  • 14. Fish xT = [x1, x2] Lightness Width Selection Criterion and Decision Boundary Adopt the lightness and add the width of the fish
  • 15. Decision Boundary Selection Which of the boundaries would you choose? Simple linear boundary (training error > 0) Nonlinear complex boundary (tr. error = 0) Simpler nonlinear boundary (tr. error > 0)
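A simple linear boundary like the first option above can be sketched as a thresholded weighted sum of the two features. The weights and bias below are hypothetical, chosen only to illustrate the idea, not fitted to any data.

```python
# Sketch of a linear decision boundary in (lightness, width) space.
# The weights and bias are hypothetical, for illustration only.

def classify_linear(lightness, width, w1=1.0, w2=0.8, bias=-10.0):
    """Decide 'sea bass' or 'salmon' from a linear function of the features."""
    score = w1 * lightness + w2 * width + bias
    return "sea bass" if score > 0 else "salmon"

print(classify_linear(8.0, 2.0))   # low lightness/width falls on the salmon side
print(classify_linear(12.0, 5.0))  # high lightness/width falls on the sea bass side
```

The nonlinear boundaries on the slide would simply replace the linear score with a more complex function of the two features; the trade-off between training error and complexity is the same.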
  • 16. Ideally, the best decision boundary should be the one which provides an optimal performance such as in the following figure: However, the central aim of designing a classifier is to correctly classify novel input (classify future samples) Issue of generalization Generalization and Decision Boundary
  • 17. Selected decision boundary A simpler but generalized boundary in this case may be the simple nonlinear curve as shown below: Further features can be added to enhance the performance However redundancy in information may not be desirable
  • 18. Adjustment of decision boundary A classifier, intuitively, is designed to minimize classification error, the total number of instances (fish) classified incorrectly. – Is this the best objective (cost) function to minimize? What kinds of error can be made? Are they all equally bad? What is the real cost of making an error? • Sea bass misclassified as salmon: pleasant surprise for the consumer, tastier fish • Salmon misclassified as sea bass: customer upset, paid too much for inferior fish – We may want to adjust our decision boundary to minimize overall risk – in this case the second type of error is more costly, so we may want to minimize this error.
  • 19. Adjustment of decision boundary Error is minimized by moving the threshold to the left
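The threshold shift can be illustrated with a one-feature rule. The feature values below, and the assumption that sea bass tend to have the smaller values, are made up purely for illustration.

```python
# Sketch of shifting a one-feature decision threshold to reduce the
# costlier error type. Feature values and the convention that sea bass
# have the smaller values are hypothetical.

def error_counts(threshold, salmon, sea_bass):
    """Rule: x <= threshold -> sea bass, else salmon. Count both error types."""
    salmon_as_bass = sum(x <= threshold for x in salmon)   # costly error
    bass_as_salmon = sum(x > threshold for x in sea_bass)  # cheaper error
    return salmon_as_bass, bass_as_salmon

sea_bass = [2, 3, 4, 5, 6]
salmon = [5, 6, 7, 8, 9]
print(error_counts(5.5, salmon, sea_bass))  # (1, 1): one error of each kind
print(error_counts(4.5, salmon, sea_bass))  # (0, 2): costly error removed
```

Moving the threshold removes the costly error (salmon sold as sea bass) at the price of more of the cheap one, which is exactly the risk-based adjustment the slide describes.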
  • 20. Terminologies in PR Features: a set of variables believed to carry discriminating and characterizing information about the objects under consideration Feature vector: A collection of d features, ordered in some meaningful way into a d-dimensional column vector, that represents the signature of the object to be identified. Feature space: The d-dimensional space in which the feature vectors lie. A d-dimensional vector in a d-dimensional space constitutes a point in that space.
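As a minimal sketch of these terms, a feature vector is just an ordered collection of d measurements, and such vectors are points in the d-dimensional feature space between which distances can be computed. The numbers below are invented.

```python
# Sketch: a feature vector is an ordered tuple of d measurements; here the
# feature space is 2-D (lightness, width). Values are hypothetical.

fish_a = (4.2, 1.9)   # (lightness, width) feature vector for one fish
fish_b = (7.8, 3.4)   # feature vector for another fish

def euclidean(u, v):
    """Distance between two points in the d-dimensional feature space."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

print(euclidean(fish_a, fish_b))  # how far apart the two fish lie in feature space
```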
  • 21. Terminologies in PR Decision boundary: A boundary in the d-dimensional feature space that separates patterns of different classes from each other. Training Data: Data used during training of a classifier, for which the correct labels are known Field Test Data: Unknown data to be classified, drawn from the same kind of measurements on which the classifier was trained. The correct classes of these data are not known
  • 22. Terminologies in PR Cost Function: A quantitative measure that represents the cost of making an error. The classifier is trained to minimize this function. Classifier: An algorithm which adjusts its parameters to find the correct decision boundaries –through a learning algorithm using a training dataset –such that a cost function is minimized. Error: Incorrect labeling of the data by the classifier Cost of error: Cost of making a decision, in particular an incorrect one –not all errors are equally costly! Training Performance: The ability / performance of the classifier in correctly identifying the classes of the training data, which it has already seen. It may not be a good indicator of the generalization performance.
  • 24. Good Features vs. Bad Features Ideally, for a given group of patterns coming from the same class, feature values should all be similar For patterns coming from different classes, the feature values should be different
  • 26. Components of a Pattern Recognition System Sensing: a CCD camera over the conveyor belt captures the fish Segmentation: separate and isolate the fish from the background image Feature extraction: identify and extract features relevant to distinguishing sea bass from salmon, but invariant to noise, occlusion, rotation, scale and shift – what if we already have irrelevant features? Dimensionality reduction? Classification: determine the right type of classifier, the right training algorithm and the right classifier parameters – how to choose a classifier that minimizes the risk and maximizes the performance? Can we add prior knowledge, context to the classifier? Decision: what is the cost of making a decision – what is the risk of an incorrect decision?
  • 27. Example of PR Consider the problem of recognizing the letters L, P, O, E, Q – Determine a sufficient set of features – Design a tree-structured classifier
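One plausible feature set for these five letters is three binary shape features, which yields a small tree-structured classifier. The feature choice below (vertical stroke, closed loop, extra stroke/tail) is a hypothetical illustration, not the lecture's official answer.

```python
# Hedged sketch of a tree-structured classifier for {L, P, O, E, Q}
# using three assumed binary shape features.

def classify_letter(has_vertical, has_loop, has_extra_stroke):
    """has_extra_stroke: a tail (Q) or extra horizontal strokes (E)."""
    if has_loop:
        if has_vertical:
            return "P"                              # loop attached to a stem
        return "Q" if has_extra_stroke else "O"     # loop with/without a tail
    # no loop: L and E both have a vertical stroke
    return "E" if has_extra_stroke else "L"

print(classify_letter(True, True, False))    # P
print(classify_letter(False, True, True))    # Q
print(classify_letter(True, False, True))    # E
```

Each internal node of the tree tests one feature, so at most three cheap binary tests separate the five classes.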
  • 29. Bayes Rule We now pose the following question: Given that the event A has occurred, what is the probability that any single one of the events Bi occurs? P(Bi | A) = P(A | Bi) P(Bi) / Σj P(A | Bj) P(Bj) This is known as the Bayes rule
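The rule reads off directly in code from the priors P(Bi) and likelihoods P(A | Bi); the probability values below are made up for illustration.

```python
# Minimal sketch of Bayes' rule for discrete events: given priors P(B_i)
# and likelihoods P(A | B_i), compute the posteriors P(B_i | A).

def bayes(priors, likelihoods):
    """Return posteriors P(B_i | A) from priors P(B_i) and likelihoods P(A | B_i)."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))  # P(A)
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

priors = [0.5, 0.3, 0.2]        # P(B_1), P(B_2), P(B_3) -- hypothetical
likelihoods = [0.1, 0.4, 0.5]   # P(A | B_i) -- hypothetical
posteriors = bayes(priors, likelihoods)
print(posteriors)
print(sum(posteriors))  # posteriors always sum to 1
```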
  • 30. The Real World: The World of Many Dimensions In most practical applications, we have more than one feature, and therefore the scalar random variable x must be replaced with a random vector x The joint probability mass function P(x) still satisfies the axioms of probability The Bayes rule then has the same form, with x replaced by the vector x While the notation changes only slightly, the implications are quite substantial
  • 31. Bayes Decision theory Based on quantifying the trade-offs between various classification decisions using a probabilistic approach The theory assumes: – The decision problem can be posed in probabilistic terms – All relevant probability values are known (in practice this is not true)
  • 32. The sea bass/salmon example State of nature: w1: Sea bass, w2: Salmon Assume that we know the probabilities of observing sea bass and salmon, P(w1) and P(w2) (prior probabilities), for a particular fishing location and time of year Based on this information alone, how would you guess the type of the next fish to be caught? • Decide w1 if P(w1) > P(w2), otherwise decide w2 A reasonable decision rule? If the catch of salmon and sea bass is equiprobable, P(w1) = P(w2) (uniform priors), with P(w1) + P(w2) = 1 (exclusivity and exhaustivity), this rule gives no useful guidance
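The prior-only rule is trivial to state in code; the prior values below are hypothetical.

```python
# Sketch of the prior-only decision rule: with no feature measured,
# always pick the class with the larger prior. Priors are hypothetical.

def decide_by_prior(p_w1, p_w2):
    """Decide w1 if P(w1) > P(w2), otherwise w2."""
    return "w1 (sea bass)" if p_w1 > p_w2 else "w2 (salmon)"

print(decide_by_prior(0.7, 0.3))  # these priors always yield sea bass
# With uniform priors P(w1) = P(w2) = 0.5 the rule gives no useful guidance.
```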
  • 33. Class Conditional Probabilities p(x|w1) and p(x|w2) describe the difference in lightness between populations of sea bass and salmon
  • 34. Posterior Probability Suppose we know P(w1), P(w2), p(x|w1) and p(x|w2), and that we have observed the value of the feature (a random variable) x – How would you decide on the “state of nature” – type of fish – based on this info? – Bayes theory allows us to compute the posterior probabilities from prior and class-conditional probabilities Posterior Probability: The (conditional) probability of the correct class being wj, given that feature value x has been observed Likelihood: The (class-conditional) probability of observing a feature value of x, given that the correct class is wj. All things being equal, the category with higher class-conditional probability is more “likely” to be the correct class. Prior Probability: The total probability of the correct class being wj, determined based on prior experience Evidence: The total probability of observing the feature value as x
  • 35. Posterior Probabilities Bayes rule allows us to compute the posterior probability (difficult to determine) from prior probabilities, likelihood and the evidence (easier to determine). Posterior probabilities for priors P(w1) = 2/3 and P(w2) = 1/3. For example, given that a pattern is measured to have feature value x =14, the probability it is in category w2 is roughly 0.08, and that it is in w1 is 0.92. At every x, the posteriors sum to 1.0. Which class would you choose now? What is probability of making an error with this decision?
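Computing posteriors from priors and class-conditional densities can be sketched as below. The Gaussian means and standard deviations are assumptions; they do not reproduce the lecture's exact curves, so the 0.92/0.08 values above are not reproduced here.

```python
import math

# Sketch: posteriors P(w_j | x) from priors and hypothetical Gaussian
# class-conditional densities p(x | w_j).

def gaussian(x, mean, std):
    """Gaussian probability density at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def posteriors(x, priors, params):
    """params: list of (mean, std) per class; returns P(w_j | x)."""
    likes = [gaussian(x, m, s) for m, s in params]
    evidence = sum(p * l for p, l in zip(priors, likes))  # p(x)
    return [p * l / evidence for p, l in zip(priors, likes)]

priors = [2 / 3, 1 / 3]               # P(w1), P(w2) as on the slide
params = [(12.0, 2.0), (16.0, 2.0)]   # hypothetical (mean, std) per class
post = posteriors(14.0, priors, params)
print(post)       # the two posteriors at x = 14
print(sum(post))  # at every x, the posteriors sum to 1
```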
  • 36. Bayes Decision Rule Decision given the posterior probabilities X is an observation for which: if P(w1 | x) > P(w2 | x) True state of nature = w1 if P(w1 | x) < P(w2 | x) True state of nature = w2 That is Choose the class that has the larger posterior probability! If there are multiple features, x = {x1, x2, …, xd}, and multiple classes Choose wi if P(wi | x) > P(wj | x) for all j ≠ i, where i, j = 1, 2, …, c Error: whenever we observe a particular x, the probability of error is: P(error | x) = P(w1 | x) if we decide w2 P(error | x) = P(w2 | x) if we decide w1 Therefore: P(error | x) = min [P(w1 | x), P(w2 | x)] (Bayes decision)
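For the two-class case this rule is one comparison: pick the class with the larger posterior, and the probability of error at x is the smaller posterior. The posterior values below are hypothetical.

```python
# Sketch of the two-class Bayes decision rule and its error probability.
# The posterior values passed in are hypothetical.

def bayes_decide(post_w1, post_w2):
    """Return the decided class and P(error | x) = min of the posteriors."""
    decision = "w1" if post_w1 > post_w2 else "w2"
    p_error = min(post_w1, post_w2)
    return decision, p_error

print(bayes_decide(0.92, 0.08))  # ('w1', 0.08)
```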
  • 37. Bayesian Decision Theory – Generalization Generalization of the preceding ideas – Use of more than one feature – Use more than two states of nature – Allowing other actions and not only decide on the state of nature – Unequal error cost Allowing actions other than classification primarily allows the possibility of rejection: refusing to make a decision in close or bad cases! The loss function states how costly each action taken is
  • 38. Two-category classification Definitions a1 : deciding w1 a2 : deciding w2 λij = λ(ai | wj), loss incurred for deciding wi when the true state of nature is wj Conditional risk: R(a1 | x) = λ11 P(w1 | x) + λ12 P(w2 | x) R(a2 | x) = λ21 P(w1 | x) + λ22 P(w2 | x) Our rule is the following: if R(a1 | x) < R(a2 | x), action a1: “decide w1” is taken
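The two conditional risks can be computed directly from the posteriors and the loss table. The loss values below (lam[i][j], the loss for deciding ai when the true class is wj) are hypothetical.

```python
# Sketch of the two-category conditional risk. lam[i][j] is the loss for
# deciding a_i when the true class is w_j; the values here are hypothetical.

def conditional_risks(post_w1, post_w2, lam):
    r1 = lam[0][0] * post_w1 + lam[0][1] * post_w2  # R(a1 | x)
    r2 = lam[1][0] * post_w1 + lam[1][1] * post_w2  # R(a2 | x)
    return r1, r2

lam = [[0.0, 2.0],   # deciding w1: no loss if truly w1, loss 2 if truly w2
       [1.0, 0.0]]   # deciding w2: loss 1 if truly w1, no loss if truly w2
r1, r2 = conditional_risks(0.7, 0.3, lam)
print("decide w1" if r1 < r2 else "decide w2")  # choose the lower-risk action
```

With unequal off-diagonal losses the minimum-risk decision can differ from the minimum-error decision, which is exactly why the loss function matters.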