PhD Defense -- Ashish Mangalampalli

My PhD defense slide-deck.

Title: A Fuzzy Associative Rule-based Approach for Pattern Mining and Pattern-based Classification

Advisor: Dr. Vikram Pudi

    Presentation Transcript

    • A Fuzzy Associative Rule-based Approach for Pattern Mining and Pattern-based Classification
      Ashish Mangalampalli
      Advisor: Dr. Vikram Pudi
      Centre for Data Engineering, International Institute of Information Technology (IIIT) Hyderabad
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        FACISME – Fuzzy Adaptation of ACME (Maximum Entropy Associative Classifier)
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Introduction
      Associative classification
        Mines huge amounts of data
        Integrates Association Rule Mining (ARM) with Classification: A = a, B = b, C = c → X = x (sketch below)
      Associative classifiers have several advantages
        Frequent itemsets capture dominant relationships between items/features
        Statistically significant associations make the classification framework robust
        Low-frequency patterns (noise) are eliminated during ARM
        Rules are very transparent and easily understood
          Unlike the black-box approach used in popular classifiers such as SVMs and Artificial Neural Networks
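    The rule form above can be pictured with a minimal sketch (illustrative only, not code from the thesis): a classification association rule is an antecedent itemset plus a predicted class, and a record satisfies the rule when it contains every antecedent item.

        # Minimal sketch (not from the thesis): a classification association rule
        # "A = a, B = b, C = c -> X = x" as an antecedent itemset plus a predicted class.
        def matches(antecedent, record):
            """A record matches a rule when it contains every antecedent item."""
            return all(record.get(attr) == value for attr, value in antecedent.items())

        rule = ({"A": "a", "B": "b", "C": "c"}, "x")        # (antecedent, class X = x)
        record = {"A": "a", "B": "b", "C": "c", "D": "d"}
        antecedent, predicted_class = rule
        if matches(antecedent, record):
            print("record satisfies the rule; vote for class", predicted_class)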
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Crisp Associative Classification
      Most associative classifiers are crisp
        Most real-life datasets contain binary and numerical attributes
        Use sharp partitioning
        Transform numerical attributes into binary ones, e.g. Income = [100K and above]
      Drawbacks of sharp partitioning
        Introduces uncertainty, especially at partition boundaries
        Small changes in intervals lead to misleading results
        Gives rise to polysemy and synonymy
        Intervals generally have no clear semantics associated with them
      For example, sharp partitions for the attribute Income (sketch below)
        Up to 20K, 20K-100K, 100K and above
        Income = 50K would fit in the second partition
        But so would Income = 99K
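    A small illustrative snippet of the Income example above (the interval boundaries come from the slide; everything else is a toy sketch), showing how sharp partitioning treats 50K and 99K identically while a tiny change across a boundary lands in a different partition:

        # Illustrative sketch of sharp partitioning for Income (boundaries from the slide).
        def crisp_income_partition(income):
            if income < 20_000:
                return "up to 20K"
            elif income < 100_000:
                return "20K-100K"
            else:
                return "100K and above"

        print(crisp_income_partition(50_000))   # 20K-100K
        print(crisp_income_partition(99_000))   # 20K-100K (treated the same as 50K)
        print(crisp_income_partition(101_000))  # 100K and above (a small change crosses the boundary)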
    • Fuzzy Associative Classification
      Fuzzy logic
        Used to convert numerical attributes into fuzzy attributes (e.g. Income = High)
        Maintains the integrity of the information conveyed by numerical attributes
        Attribute values belong to partitions with some membership in the interval [0, 1] (sketch below)
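    One common way to realize such memberships is a triangular membership function; the cut-off points below are made-up values for illustration and are not necessarily the partitions used in the thesis:

        # Sketch: triangular fuzzy membership, values in [0, 1].
        def triangular(x, a, b, c):
            """Membership that rises from a to a peak at b and falls to zero at c."""
            if x <= a or x >= c:
                return 0.0
            if x <= b:
                return (x - a) / (b - a)
            return (c - x) / (c - b)

        income = 90_000
        memberships = {
            "Low":    triangular(income, 0, 10_000, 30_000),
            "Medium": triangular(income, 20_000, 60_000, 100_000),
            "High":   triangular(income, 80_000, 150_000, 250_000),
        }
        print(memberships)  # Income = 90K is partly Medium (0.25) and partly High (~0.14)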
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Pre-Processing and Mining
      Fuzzy pre-processing
        Converts a crisp dataset (binary and numerical attributes) into a fuzzy dataset (binary and fuzzy attributes)
        FPrep algorithm used
      Efficient and robust fuzzy ARM algorithms
        Web-scale datasets mandate such algorithms
        Fuzzy Apriori is the most popular
        Many efficient crisp ARM algorithms exist, such as ARMOR and FP-Growth
        Algorithms used
          FAR-Miner for normal transactional datasets
          FAR-HD for high-dimensional datasets
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Associative Classification – OurApproach AC algorithms like CPAR and CMAR only mine frequent itemsets  Processed using additional (greedy) algorithms like FOIL and PRM  Overhead in running time; process more complex Association rules directly used for training and scoring  Exhaustive approach  Controlled by appropriate support  Not a time-intensive process  Rule pruning and ranking take care of huge volume and redundancy Classifier built in a two-phased manner  Global rule-mining and training  Local rule-mining and training  Provides better accuracy and representation/coverage 14
    • Associative Classification – Our Approach (cont’d)
      Pre-processing to generate a fuzzy dataset (for fuzzy associative classifiers) using FPrep
      Classification Association Rules (CARs) mined using FAR-Miner or FAR-HD
      CARs pruned and the classifier trained using SEAC or FSEAC
      Rule ranking and application (scoring) techniques
    • Simple and Effective Associative Classifier (SEAC)
      Direct mining of CARs – faster and simpler training
      CARs used directly through effective pruning and sorting
      Pruning and rule-ranking based on (IG sketch below)
        Information gain
        Rule length
      Two-phased manner
        Global rule-mining and training
        Local rule-mining and training
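    A plausible reading of the information-gain criterion (a sketch using the standard IG definition over "rule matches" vs. "rule does not match"; the thesis may compute it differently):

        import math
        from collections import Counter

        def entropy(labels):
            n = len(labels)
            return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values()) if n else 0.0

        def information_gain(labels, rule_matches):
            """IG of splitting the training records into 'matches rule' vs 'does not match'."""
            matched = [y for y, m in zip(labels, rule_matches) if m]
            unmatched = [y for y, m in zip(labels, rule_matches) if not m]
            n = len(labels)
            split_entropy = (len(matched) / n) * entropy(matched) + (len(unmatched) / n) * entropy(unmatched)
            return entropy(labels) - split_entropy

        # Hypothetical example: rules would be ranked by (IG, rule length) and the best kept.
        labels = [1, 1, 2, 2, 2, 1]
        rule_matches = [True, True, False, False, False, True]   # records a rule covers
        print(information_gain(labels, rule_matches))            # 1.0: the rule separates the classes perfectly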
    • SEAC – Example
      [Slide figures: Example Dataset, Scoring Example, Ruleset]
      Unlabeled: B=2, C=2
        X=1 → 16, 17, 19 (IG=0.534)
        X=2 → 13, 14, 20 (IG=0.657)
    • Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Amalgamates fuzzy logic with associative classification
      Pre-processed using FPrep
      CARs mined using FAR-Miner / FAR-HD
      CARs pruned based on Fuzzy Information Gain (FIG) and rule length – no sorting required
      Scoring – rules applied taking µ into account (sketch below)
        Sorting done at this stage
        Final score computed
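    The µ-aware scoring step could look roughly like the following sketch; the min combination of memberships and the FIG-weighted sum are assumptions on my part, since the slide only states that rules are applied taking µ into account:

        # Sketch (assumed combination, not the thesis code): score an unlabeled fuzzy record.
        # The record maps fuzzy items like ("Income", "High") to a membership value in [0, 1].
        def rule_membership(antecedent_items, record):
            """Degree to which the record satisfies the antecedent (min over its fuzzy items)."""
            return min(record.get(item, 0.0) for item in antecedent_items)

        def score(rules, record):
            """Aggregate per-class scores; each rule is (antecedent_items, class_label, fig)."""
            scores = {}
            for antecedent, label, fig in rules:
                mu = rule_membership(antecedent, record)
                if mu > 0:
                    scores[label] = scores.get(label, 0.0) + fig * mu
            return max(scores, key=scores.get) if scores else None

        rules = [((("Income", "High"), ("Age", "Young")), "buyer", 0.62),
                 ((("Income", "Low"),), "non-buyer", 0.41)]
        record = {("Income", "High"): 0.7, ("Age", "Young"): 0.9, ("Income", "Low"): 0.1}
        print(score(rules, record))  # "buyer": 0.62*0.7 outweighs 0.41*0.1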
    • FSEAC – Example
      [Slide figures: Format for Fuzzy Version of Dataset, Example Dataset, Fuzzy Version of Example Dataset]
    • FSEAC – Example (cont’d)
      [Slide figure: Ruleset]
    • SEAC and FSEAC Experimental Setup
      SEAC
        12 classifiers (associative and non-associative)
        14 UCI ML datasets
        100-5000 records per dataset
        2-10 classes per dataset
        Up to 20 features per dataset
        10-fold cross-validation
      FSEAC
        17 classifiers (associative and non-associative; fuzzy and crisp)
        23 UCI ML datasets
        100-5000 records per dataset
        2-10 classes per dataset
        Up to 60 features per dataset
        10-fold cross-validation
    • SEAC – Results (10-fold CV) [results shown as figures; continued]
    • SEAC – Results (10-fold CV) [results shown as figures]
    • FSEAC – Results (10-fold CV) [results shown as figures; continued]
    • FSEAC – Results (10-fold CV) [results shown as figures]
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC) [ICPR 2010]
      Adapts fuzzy associative classification for object class detection in images
        Speeded-Up Robust Features (SURF) – interest point detector and descriptor for images
        Fuzzy clusters used, as opposed to the hard clustering used in Bag-of-Words
      Only positive-class (CP) examples used for mining
        The negative class (CN) in object class detection is very vague
        CN = U – CP
      Rules are pruned and ranked based on Information Gain
        Other AC algorithms use third-party algorithms for rule generation from frequent itemsets
        Top k rules are used for scoring and classification
    • I-FAC SURF points extracted from positive class images  FCM applied to derive clusters  Clusters (with µs) used to generate dataset for mining  100 fuzzy clusters as opposed to1000-2000 crisp clusters-based algorithms ARM generates Classification Association Rules (CARs) associated with positive class CARs are pruned and sorted using  Fuzzy Information Gain (FIG) of each rule  Length of each rule i.e. number of attributes in each rule Scoring based on rule-match and FIG 28 ICPR 2010
    • I-FAC – Performance Study [ICPR 2010]
      Performs well when compared to BOW or SVM
        Very well at low FPRs (≤ 0.3)
      Fuzzy nature helps avoid polysemy and synonymy
      Uses only the positive class for training
    • Visual Concept Detection on MIR Flickr
      Revamped version of I-FAC
      Multi-class detection
        38 visual concepts
        e.g. car, sky, clouds, water, building, sea, face
      Experimental evaluation
        First 10K images of the MIR Flickr dataset
        AUC values reported for each concept
    • Experimental Results (3-fold CV) [results shown as figures; continued]
    • Experimental Results (3-fold CV) [results shown as figures]
    • Look-alike Modeling using Feature-Pair-based Associative Classification [WWW 2011]
      Display-ad targeting is currently done using methods that rely on publisher-defined segments, such as Behavioral Targeting (BT)
      Look-alike model trained to identify similar users
        Similarity is based on historical user behavior
        Model iteratively rebuilt as more users are added
        Advertiser supplies a seed list of users
      Approach for building advertiser-specific audience segments
        Complements publisher-defined segments such as BT
        Provides advertisers control over the audience definition
      Given a list of target users (e.g., people who clicked or converted on a particular category or ad campaign), find other similar users
    • Look-alike Modeling using Feature-Pair-based Associative Classification – cont’d [WWW 2011]
      Enumerate all feature-pairs in the training set occurring in at least 5 positive-class records
        Feature-pairs modelled as AC rules
        Only rules for the positive class are used
        Works well in tail campaigns
      Affinity measured by Frequency-weighted LLR (F-LLR)
        F-LLR(f) = P(f) · log( P(f | conv) / P(f | non-conv) )
        Rules sorted in descending order of F-LLR (sketch below)
      Scoring – top k rules are applied
        Cumulative score from all matching rules used for classification
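    A hedged sketch of the F-LLR ranking, using the formula on the slide; the smoothing constant, counting scheme, and feature-pair names are assumptions for illustration:

        import math

        def f_llr(n_f, n_f_conv, n_f_nonconv, n_total, n_conv, n_nonconv, eps=1e-6):
            """F-LLR(f) = P(f) * log( P(f | conv) / P(f | non-conv) ), per the slide.
            n_f = records containing feature-pair f; the eps smoothing is an assumption."""
            p_f = n_f / n_total
            p_f_conv = (n_f_conv + eps) / (n_conv + eps)
            p_f_nonconv = (n_f_nonconv + eps) / (n_nonconv + eps)
            return p_f * math.log(p_f_conv / p_f_nonconv)

        # Hypothetical feature-pair counts (n_f, n_f_conv, n_f_nonconv); rules are then
        # sorted by descending F-LLR and only the top-k applied at scoring time.
        pairs = {("sports_page", "male"): (120, 30, 90), ("finance_page", "age_25_34"): (80, 5, 75)}
        n_total, n_conv, n_nonconv = 300_000, 1_000, 299_000
        ranked = sorted(pairs, key=lambda f: f_llr(*pairs[f], n_total, n_conv, n_nonconv), reverse=True)
        print(ranked)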
    • Performance Study [WWW 2011]
      Two pilot campaigns
        300K records each
        One record per user
        Training window – 14 days
        Scoring window – seven days
      Works very well for tail campaigns
        Can find meaningful associations in extremely sparse and skewed data
      SVM and GBDT work well for head campaigns

      Results on a Tail Campaign
        Method              Lift (Conversion Rate)   Baseline Lift (AUC)
        Random Targeting    82%                      –
        Linear SVM          301%                     11%
        GBDT                100%                     2%

      Results on a Head Campaign
        Method              Lift (Conversion Rate)   Baseline Lift (AUC)
        Random Targeting    48%                      –
        Linear SVM          -12%                     -6%
        GBDT                -40%                     -14%
    • Outline
      Introduction
      Crisp and Fuzzy Associative Classification
      Pre-Processing and Mining
        Fuzzy Pre-Processing – FPrep
        Fuzzy ARM – FAR-Miner and FAR-HD
      Associative Classification – Our Approach
        Simple and Effective Associative Classifier (SEAC)
        Fuzzy Simple and Effective Associative Classifier (FSEAC)
      Associative Classification – Applications
        Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
        Associative Classifier for Ad-targeting
      Conclusions
    • Conclusions
      Fuzzy pre-processing for dataset transformation
      Fuzzy ARM for various types of datasets
      Fuzzy and crisp associative classifiers for various domains
        Customizations required for different domains
          Pre-processing
          Pruning
          Rule ranking techniques
          Rule application (scoring) techniques
    • References
      Ashish Mangalampalli, Adwait Ratnaparkhi, Andrew O. Hatch, Abraham Bagherjeiran, Rajesh Parekh, and Vikram Pudi. A Feature-Pair-based Associative Classification Approach to Look-alike Modeling for Conversion-Oriented User-Targeting in Tail Campaigns. In International World Wide Web Conference (WWW), 2011.
      Ashish Mangalampalli, Vineet Chaoji, and Subhajit Sanyal. I-FAC: Efficient Fuzzy Associative Classifier for Object Classes in Images. In International Conference on Pattern Recognition (ICPR), 2010.
      Ashish Mangalampalli and Vikram Pudi. FPrep: Fuzzy Clustering Driven Efficient Automated Pre-processing for Fuzzy Association Rule Mining. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2010.
      Ashish Mangalampalli and Vikram Pudi. FACISME: Fuzzy Associative Classification Using Iterative Scaling and Maximum Entropy. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2010.
      Ashish Mangalampalli and Vikram Pudi. Fuzzy Association Rule Mining Algorithm for Fast and Efficient Performance on Very Large Datasets. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2009.
    • Thank You, and Questions