Classification
In classification, each object is assigned to precisely one class.
Naïve Bayes classifiers use probability theory to find the most likely class for an object.
Nearest neighbor classification, also called k-nearest neighbor (k-NN) classification, is mainly used when all attribute values are continuous.
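The Naïve Bayes idea can be sketched for categorical attributes as follows. This is a minimal illustration, not the slides' own implementation: the function names and the toy weather data are assumptions, and no Laplace smoothing is applied, so an attribute value never seen for a class zeroes that class's score.

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Estimate class priors and per-attribute value counts."""
    priors = {c: n / len(labels) for c, n in Counter(labels).items()}
    counts = {c: defaultdict(Counter) for c in priors}  # counts[class][attr][value]
    for row, c in zip(rows, labels):
        for attr, value in enumerate(row):
            counts[c][attr][value] += 1
    return priors, counts

def classify(priors, counts, row):
    """Return the class maximizing P(class) * product of P(value | class)."""
    def score(c):
        class_total = sum(counts[c][0].values())  # training rows in class c
        s = priors[c]
        for attr, value in enumerate(row):
            s *= counts[c][attr][value] / class_total
        return s
    return max(priors, key=score)

# Hypothetical toy training data: (outlook, temperature) -> play?
rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]
priors, counts = train_naive_bayes(rows, labels)
print(classify(priors, counts, ("sunny", "hot")))  # → no
```

Because "sunny" never occurs with class "yes" in this toy data, the "yes" score collapses to zero and "no" wins; a practical implementation would smooth the counts to avoid this.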
Basic k-NN Classification Algorithm
1. Find the 'k' training instances that are closest to the unknown instance.
2. Take the most commonly occurring classification among these 'k' instances.
The neighbors can also be weighted (for example, by the inverse of their distance) to improve classification.
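The two steps above can be sketched directly in code. This is a minimal illustration assuming continuous attributes and Euclidean distance; the function name and sample points are hypothetical.

```python
import math
from collections import Counter

def knn_classify(train, unknown, k):
    """train is a list of (attribute_vector, class_label) pairs."""
    # Step 1: find the k training instances closest to the unknown instance.
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], unknown))[:k]
    # Step 2: take the most commonly occurring class among those k instances.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical training set with two well-separated classes.
train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]
print(knn_classify(train, (0.5, 0.5), k=3))  # → a
```

A weighted variant would replace step 2 by summing a weight such as 1/distance for each neighbor's class instead of counting plain votes.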
Normalization
Attributes with large magnitudes get more weight in the distance calculation, so the nearest neighbors are not chosen properly. Normalization ensures that the units chosen do not affect the selection of nearest neighbors.
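One common way to do this is min-max normalization, sketched below; the attribute names are hypothetical and other schemes (such as z-score standardization) would serve the same purpose.

```python
def min_max_normalize(values):
    """Rescale a list of numbers to the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Salary spans tens of thousands while age spans tens, so without
# normalization salary would dominate any Euclidean distance.
salaries = [30000, 50000, 90000]
ages = [25, 35, 45]
print(min_max_normalize(ages))  # → [0.0, 0.5, 1.0]
```

After normalization both attributes span [0, 1], so neither dominates the choice of nearest neighbors.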
Eager & Lazy Learning
Eager learning: the training data is 'eagerly' generalized into a representation model before any unknown instance is seen, e.g. the Naïve Bayes algorithm.
Lazy learning: the training data is not converted into a representation model until an unknown instance is presented for classification, e.g. the nearest neighbor algorithm.
Visit more self-help tutorials. Pick a tutorial of your choice and browse through it at your own pace. The tutorials section is free and self-guided, and does not include additional support. Visit us at www.dataminingtools.net