1.
Outline
1. Data Mining (DM) ~ KDD [Definition]
2. DM Techniques
-> Association rules [support & confidence]
3. Example
(4. Apriori Algorithm)
2.
1. Data Mining ~ KDD [Definition]
- "Data mining (DM), also called KnowledgeDiscovery in Databases (KDD), is the process
of automatically searching large volumes of
data for patterns using specific DM
technique."
- [more formal definition] KDD ~ "the non-trivial
extraction of implicit, previously unknown
and potentially useful knowledge from data"
3.
1. Data Mining ~ KDD [Definition]
Data Mining techniques
• Information Visualization
• k-nearest neighbor
• decision trees
• neural networks
• association rules
• …
4.
2. Association rules
Support
Every association rule has a support and a confidence.
“The support is the percentage of transactions that demonstrate the rule.”
Example: Database with transactions (customer_# : item_a1, item_a2, …)
1: 1, 3, 5
2: 1, 8, 14, 17, 12
3: 4, 6, 8, 12, 9, 104
4: 2, 1, 8
support {8, 12} = 2 (or 50% ~ 2 of 4 customers)
support {1, 5} = 1 (or 25% ~ 1 of 4 customers)
support {1} = 3 (or 75% ~ 3 of 4 customers)
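The support counts above can be reproduced with a minimal sketch (the transaction data is the example from this slide; the `support` helper is an illustrative name, not part of any library):

```python
# Minimal sketch: computing itemset support over the example transactions.
transactions = [
    {1, 3, 5},
    {1, 8, 14, 17, 12},
    {4, 6, 8, 12, 9, 104},
    {2, 1, 8},
]

def support(itemset, transactions):
    """Number of transactions containing every item of the itemset."""
    return sum(1 for t in transactions if set(itemset) <= t)

print(support({8, 12}, transactions))  # 2  (50% ~ 2 of 4 customers)
print(support({1, 5}, transactions))   # 1  (25%)
print(support({1}, transactions))      # 3  (75%)
```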
5.
2. Association rules
Support
An itemset is called frequent if its support is equal to or greater than an agreed-upon minimal value – the support threshold.
To continue the previous example:
if the threshold is 50%,
then the itemsets {8, 12} and {1} are called frequent.
6.
2. Association rules
Confidence
Every association rule has a support and a confidence.
An association rule is of the form:
X => Y
• X => Y: if someone buys X, he also buys Y
The confidence is the conditional probability that, given X present in a transaction, Y will also be present.
Confidence measure, by definition:
confidence(X => Y) = support(X, Y) / support(X)
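The definition translates directly into code. A minimal sketch, reusing the transaction data from the support example (the `confidence` helper is an illustrative name, not a library function):

```python
# Minimal sketch of the confidence measure on the earlier example data.
transactions = [
    {1, 3, 5},
    {1, 8, 14, 17, 12},
    {4, 6, 8, 12, 9, 104},
    {2, 1, 8},
]

def support(itemset, transactions):
    return sum(1 for t in transactions if set(itemset) <= t)

def confidence(X, Y, transactions):
    """confidence(X => Y) = support(X union Y) / support(X)."""
    return support(set(X) | set(Y), transactions) / support(X, transactions)

print(confidence({8}, {12}, transactions))  # 2/3: item 8 appears 3 times, {8, 12} twice
```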
7.
2. Association rules
Confidence
We should only consider rules that are derived from itemsets with high support and that also have high confidence.
“A rule with low confidence is not meaningful.”
Rules don’t explain anything; they just point out hard facts in large data volumes.
12.
3. Example
Example: Database with transactions ( customer_# : item_a1, item_a2, … )
Conf( {9} => {3} ) = 100%. Done.
Notice: High Confidence, Low Support.
-> Rule ( {9} => {3} ) not meaningful
13.
Apriori Algorithm
• In computer science and data mining, Apriori is
a classic algorithm for learning association rules.
• Apriori is designed to operate on databases
containing transactions (for example, collections
of items bought by customers, or details of a
website frequentation).
• The algorithm attempts to find itemsets which are common to at least a minimum number C (the cutoff, or support threshold) of the transactions.
14.
Definition (contd.)
• Apriori uses a "bottom up" approach, where frequent subsets are extended one item at a time (a step known as candidate generation), and groups of candidates are tested against the data.
• The algorithm terminates when no further
successful extensions are found.
• Apriori uses breadth-first search and a hash
tree structure to count candidate item sets
efficiently.
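The level-wise search described above can be sketched in a few lines. This is a minimal illustration of the candidate-generation and pruning steps, not an efficient implementation (the real algorithm uses a hash tree to count candidates; here supports are counted by a simple scan):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise search: frequent k-itemsets generate (k+1)-candidates."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    # Level 1: frequent single items.
    items = {i for t in transactions for i in t}
    frequent = {frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support}
    all_frequent = {}
    k = 1
    while frequent:
        all_frequent.update({fs: support(fs) for fs in frequent})
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        candidates = {a | b for a in frequent for b in frequent
                      if len(a | b) == k + 1}
        # Prune candidates with an infrequent k-subset (the Apriori property),
        # then test the survivors against the data.
        frequent = {c for c in candidates
                    if all(frozenset(s) in frequent for s in combinations(c, k))
                    and support(c) >= min_support}
        k += 1  # no successful extensions -> loop terminates
    return all_frequent
```

On the shoe-store transactions from the example slide, `apriori(ts, 0.5)` returns exactly the four frequent itemsets listed there: {Shoes} (75%), {Shirt}, {Jacket}, and {Shoes, Jacket} (50% each).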
17.
Apriori Algorithm Examples
Problem Decomposition
Transaction ID   Items Bought
1                Shoes, Shirt, Jacket
2                Shoes, Jacket
3                Shoes, Jeans
4                Shirt, Sweatshirt
If the minimum support is 50%, then {Shoes, Jacket} is the only 2-itemset that satisfies the minimum support.
Frequent Itemset    Support
{Shoes}             75%
{Shirt}             50%
{Jacket}            50%
{Shoes, Jacket}     50%
If the minimum confidence is 50%, then the only two rules generated from this 2-itemset that have confidence greater than 50% are:
Shoes ⇒ Jacket   Support = 50%, Confidence = 66%
Jacket ⇒ Shoes   Support = 50%, Confidence = 100%
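The rule-generation step for this example can be sketched as follows: each non-empty side of the frequent 2-itemset becomes an antecedent, and the rule's confidence is the itemset's support divided by the antecedent's support (a minimal sketch of this one slide, not a general rule generator):

```python
# Sketch: deriving the two rules of the example from the frequent 2-itemset.
transactions = [
    {"Shoes", "Shirt", "Jacket"},
    {"Shoes", "Jacket"},
    {"Shoes", "Jeans"},
    {"Shirt", "Sweatshirt"},
]

def support(itemset):
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

pair = {"Shoes", "Jacket"}
for antecedent in ("Shoes", "Jacket"):
    consequent = (pair - {antecedent}).pop()
    conf = support(pair) / support({antecedent})
    # Shoes => Jacket: 50% / 75% ~ 66%; Jacket => Shoes: 50% / 50% = 100%
    print(f"{antecedent} => {consequent}: "
          f"support={support(pair):.0%}, confidence={conf:.0%}")
```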
20.
Apriori
Advantages/Disadvantages
• Advantages
– Uses large itemset property
– Easily parallelized
– Easy to implement
• Disadvantages
– Assumes transaction database is memory
resident.
– Requires many database scans.
21.
Summary
• Association Rules form a widely applied data mining approach.
• Association Rules are derived from frequent itemsets.
• The Apriori algorithm is an efficient algorithm for finding all frequent itemsets.
• The Apriori algorithm implements level-wise search using the frequent-itemset property.
• The Apriori algorithm can be additionally optimized.
• There are many measures for association rules.