Feature Selection in Machine Learning
Presented by
R.Sridevi
Assistant Professor,
Department of Computer Science,
Tagore Govt. Arts & Science College,
Puducherry.
Agenda
Feature Selection
Study on Feature Selection Methods
Feature Selection in Machine Learning
Application of Feature Selection
Conclusion
Feature Selection
Technique for identifying the most important features for learning.
Increases learning performance by reducing the dimensionality of the data.
[Diagram: feature subset generation and validation cycle]
Filter Approach
Filtering methods are independent of the induction algorithm.
Evaluate each feature individually based on its correlation with the target function.
Filter out irrelevant attributes before induction occurs.
[Diagram: Input Features → Feature Subset Selection → Induction Algorithm]
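The filter idea can be sketched in plain Python. The helper names and the use of Pearson correlation as the per-feature score are illustrative assumptions; any correlation-style measure would do:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def filter_select(features, target, k):
    """Rank features by |correlation with the target| and keep the top k.
    The induction algorithm is never consulted -- a pure filter."""
    ranked = sorted(features,
                    key=lambda name: abs(pearson(features[name], target)),
                    reverse=True)
    return ranked[:k]

# Toy data: f1 tracks the target, f2 is noise, f3 is anti-correlated.
features = {
    "f1": [1, 2, 3, 4, 5],
    "f2": [5, 1, 4, 2, 3],
    "f3": [5, 4, 3, 2, 1],
}
target = [1, 2, 3, 4, 5]
print(filter_select(features, target, 2))  # ['f1', 'f3']
```

Note that f3 is kept despite being anti-correlated: a filter scores predictive strength, and |r| = 1 is as informative as r = 1.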
Wrapper Approach
[Diagram: Input Features → feature subset search and evaluation (calling the induction algorithm) → Induction Algorithm]
A generic approach in which feature selection occurs outside the basic induction method but uses that method as a subroutine.
Searches the same space of feature subsets as filter methods, but evaluates alternative subsets by running the induction algorithm on the training data.
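A minimal wrapper sketch, assuming a 1-nearest-neighbour classifier as the induction algorithm and greedy forward search as the subset search strategy (both choices are illustrative, not prescribed by the slides):

```python
def one_nn_loo_accuracy(X, y, cols):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier
    restricted to the feature columns in cols. This plays the role
    of the induction algorithm the wrapper calls as a subroutine."""
    correct = 0
    for i in range(len(X)):
        best_d, pred = None, None
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][c] - X[j][c]) ** 2 for c in cols)
            if best_d is None or d < best_d:
                best_d, pred = d, y[j]
        correct += pred == y[i]
    return correct / len(X)

def forward_wrapper(X, y, n_features):
    """Greedy forward search over feature subsets: at each step, add
    the feature whose inclusion maximises induction accuracy."""
    selected, remaining = [], list(range(len(X[0])))
    while remaining and len(selected) < n_features:
        best = max(remaining,
                   key=lambda c: one_nn_loo_accuracy(X, y, selected + [c]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: column 0 separates the classes, column 1 is noise.
X = [[0, 9], [1, 2], [2, 7], [10, 3], [11, 8], [12, 1]]
y = [0, 0, 0, 1, 1, 1]
print(forward_wrapper(X, y, 1))  # [0]
```

The key contrast with a filter is visible in the inner loop: every candidate subset is scored by actually training and testing the learner, which is more faithful to final performance but far more expensive.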
Relevant, Irrelevant & Redundant Features
The purpose of feature selection is to identify relevant features according to a definition of relevance.
Relevance with respect to the target
Strong relevance to the sample
Weak relevance to the sample
The presence of irrelevant attributes can considerably slow the rate of learning.
Redundant features exist whenever one feature can take the role of another.
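Redundancy can be illustrated in a few lines: a feature that is (nearly) a linear copy of an already-kept feature is dropped. Using pairwise correlation as the redundancy test is a simplifying assumption, as is the toy data:

```python
import math

def corr(a, b):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def drop_redundant(features, threshold=0.95):
    """Keep a feature only if it is not (nearly) a linear copy of one
    already kept -- a simple pairwise-correlation redundancy test."""
    kept = []
    for name, values in features.items():
        if all(abs(corr(values, features[k])) < threshold for k in kept):
            kept.append(name)
    return kept

features = {
    "height_cm": [150, 160, 170, 180],
    "height_in": [59.1, 63.0, 66.9, 70.9],  # same quantity in inches: redundant
    "weight_kg": [55, 70, 60, 80],
}
print(drop_redundant(features))  # ['height_cm', 'weight_kg']
```

Here height_in carries no information beyond height_cm, so it "takes the role of" the other feature and is eliminated; weight_kg is correlated with height but below the threshold, so it survives.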
Existing Feature Selection Methods
Feature weighting / ranking algorithms.
Floating Search Methods in Feature Selection.
Mutual Information Based Algorithms.
Correlation based Algorithms.
Feature Interaction based Algorithms.
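The mutual-information family above scores each feature by I(X; Y) with the class. A minimal sketch of that score for discrete values (the function name and toy data are assumptions, not from any specific method listed):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete value sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

target = [0, 0, 1, 1, 0, 0, 1, 1]
f1 = [0, 0, 1, 1, 0, 0, 1, 1]  # mirrors the target: I = 1 bit
f2 = [0, 1, 0, 1, 0, 1, 0, 1]  # independent of the target: I = 0 bits
print(mutual_information(f1, target), mutual_information(f2, target))
# prints: 1.0 0.0
```

Ranking features by this score and keeping the highest is the simplest member of the family; methods such as mRMR additionally penalise mutual information *between* the selected features.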
Categorization of Feature Selection Methods
FSA                                  Supporting Aspects
Relief, Relief-F, ABB, LVF           Noise Tolerant
FOCUS, FCBF, Consistency, mRMR       Eliminate Redundant Features
Filter Model based on GA             Eliminate Irrelevant Features
MIFS, FSBAR, ABB, CBFS, FRFS, IWFS   Eliminate Redundant & Irrelevant Features
SAGA, INTERACT, FSBAR, FRFS, IWFS    Feature Interaction
Neural Network based Machine Learning
Supervised-Learning NN
  Feed-Forward NN
    Perceptron
    Backpropagation (BP)
    Learning Vector Quantization (LVQ)
  Recurrent NN
    Fuzzy Cognitive Map (FCM)
    Boltzmann Machine (BM)
Unsupervised-Learning NN
  Feed-Forward NN
    Learning Matrix (LM)
    Fuzzy Associative Memory (FAM)
  Recurrent NN
    Binary Adaptive Resonance Theory (ART1)
    Kohonen Self-Organizing Map (SOM)
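As a small illustration of the first entry in the taxonomy, the classic perceptron learning rule in plain Python (a minimal sketch of one model, not of the whole family listed; the AND example is an assumption chosen because it is linearly separable):

```python
def train_perceptron(X, y, epochs=20, lr=1.0):
    """Perceptron rule: on each mistake, w += lr * (target - prediction) * x.
    Converges on linearly separable data such as logical AND."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            err = yi - pred
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

def predict(w, b, xi):
    """Threshold unit: fires iff the weighted sum plus bias is positive."""
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

# Logical AND: linearly separable, so the perceptron learns it exactly.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, xi) for xi in X])  # [0, 0, 0, 1]
```

Backpropagation generalises this idea to multi-layer networks by propagating the error term backwards through differentiable units instead of a hard threshold.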
Conclusion
Feature Selection improves accuracy during classification.
Enables the machine learning algorithm to train faster.
Avoids the curse of high dimensionality.