# Quick Look At Classification


1. Quick Look at CLASSIFICATION
2. Classification
   - Each object is assigned to exactly one class.
   - Naïve Bayes classifiers use probability theory to find the most likely class.
   - Nearest-neighbor classification is mainly used when all attribute values are continuous; it is also called k-nearest-neighbor (k-NN) classification.
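As a rough illustration of the Naïve Bayes idea, the sketch below estimates class priors and per-class attribute-value frequencies from categorical training data, then picks the class with the highest posterior. The function names and the add-one smoothing over values seen per class are illustrative choices, not part of the slides:

```python
from collections import Counter, defaultdict
import math

def train_naive_bayes(instances, labels):
    """Estimate class priors and per-class attribute-value counts."""
    priors = Counter(labels)
    # counts[class][attribute index][value] -> frequency
    counts = defaultdict(lambda: defaultdict(Counter))
    for x, y in zip(instances, labels):
        for i, v in enumerate(x):
            counts[y][i][v] += 1
    return priors, counts

def nb_classify(x, priors, counts):
    """Return the most likely class (log-space, add-one smoothing)."""
    total = sum(priors.values())
    best, best_score = None, float("-inf")
    for c, n in priors.items():
        score = math.log(n / total)  # prior P(class)
        for i, v in enumerate(x):
            # smoothed estimate of P(attribute value | class)
            score += math.log((counts[c][i][v] + 1) / (n + len(counts[c][i]) + 1))
        if score > best_score:
            best, best_score = c, score
    return best

# Toy example: (outlook, temperature) -> play?
X = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cool")]
y = ["no", "no", "yes", "yes"]
priors, counts = train_naive_bayes(X, y)
print(nb_classify(("rainy", "mild"), priors, counts))  # prints "yes"
```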
3. Basic k-NN Classification Algorithm
   - Find the k training instances that are closest to the unknown instance.
   - Take the most commonly occurring classification among these k instances.
   - The neighbors can be weighted to improve classification.
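The two steps above can be sketched in a few lines; this minimal version assumes numeric attribute vectors and plain Euclidean distance (unweighted majority vote):

```python
import math
from collections import Counter

def knn_classify(unknown, training, k=3):
    """Majority vote among the k training instances closest to `unknown`.

    `training` is a list of (attribute_vector, class_label) pairs.
    """
    # Step 1: find the k nearest training instances.
    neighbors = sorted(training, key=lambda pair: math.dist(unknown, pair[0]))[:k]
    # Step 2: take the most common class among them.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((4.8, 5.2), "B")]
print(knn_classify((1.1, 0.9), train, k=3))  # prints "A"
```

A weighted variant, as the slide notes, would replace the simple vote with one where each neighbor contributes in proportion to (say) the inverse of its distance.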
4. Normalization
   - Attributes with large magnitudes dominate the distance calculation, so the nearest neighbors are not chosen properly.
   - Normalization ensures that the units chosen do not affect the selection of nearest neighbors.
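One common way to do this (an illustrative choice; the slide does not name a specific method) is min-max normalization, which rescales every attribute to the range [0, 1]:

```python
def min_max_normalize(instances):
    """Rescale each attribute column to [0, 1] so no attribute dominates
    distance calculations merely because of its units."""
    lows = [min(col) for col in zip(*instances)]
    highs = [max(col) for col in zip(*instances)]
    return [
        tuple((v - lo) / (hi - lo) if hi > lo else 0.0
              for v, lo, hi in zip(row, lows, highs))
        for row in instances
    ]

# Age spans ~25 units, salary spans ~30000: raw Euclidean distance
# would be driven almost entirely by salary. After normalization,
# both attributes contribute on the same scale.
data = [(25, 30000.0), (50, 60000.0)]
print(min_max_normalize(data))  # prints [(0.0, 0.0), (1.0, 1.0)]
```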
5. Eager and Lazy Learning
   - Eager learning: training data is "eagerly" generalized into a representation model before any unknown instance arrives, e.g. the Naïve Bayes algorithm.
   - Lazy learning: training data is not converted into a representation model until an unknown instance is presented for classification, e.g. the nearest-neighbor algorithm.
6. Visit More Self-Help Tutorials
   - Pick a tutorial of your choice and browse through it at your own pace.
   - The tutorials section is free and self-guiding, and does not include additional support.
   - Visit us at www.dataminingtools.net