RapidMiner: Learning Schemes in RapidMiner



Published in: Technology

  1. RapidMiner 5
     2.5 - Learning Schemes
  2. Learning Schemes
     Acquiring knowledge is fundamental to the development of intelligent systems. The operators described in this section were designed to automatically discover hypotheses that can be used for future decisions.
  3. Learning Schemes
     These operators learn models from the given data and apply them to new data, predicting a label for each observation in a previously unlabelled example set. The ModelApplier operator can be used to apply these models to unlabelled data.
  4. Learning Schemes
     In addition to the learning schemes and meta learning schemes implemented directly in RapidMiner, all learning operators provided by Weka are also available as RapidMiner learning operators.
  5. Examples
     1. AdaBoost
        This AdaBoost implementation can be used with all learners available in RapidMiner, not only those that are originally part of the Weka package.
     2. AdditiveRegression
        This operator uses a regression learner as its base learner. It starts with a default model (the mean or mode) as the first prediction model. In each iteration it learns a new base model and applies it to the example set; the residuals of the labels are then calculated and the next base model is learned on them. The learned meta model predicts the label by adding up all base model predictions.
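The additive regression scheme described above can be sketched in plain Python (the helper names are hypothetical, not RapidMiner code): the first model predicts the label mean, and each further base model, here a least-squares threshold stump on one numeric feature, is fitted to the current residuals.

```python
def fit_stump(xs, ys):
    """Fit a one-split regression stump: predict the mean of ys on
    each side of the threshold with the lowest squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - (lm if x <= t else rm)) ** 2 for x, y in zip(xs, ys))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def additive_regression(xs, ys, iterations=3):
    mean = sum(ys) / len(ys)            # default model: the label mean
    models = [lambda x: mean]
    residuals = [y - mean for y in ys]
    for _ in range(iterations):
        stump = fit_stump(xs, residuals)  # new base model on residuals
        models.append(stump)
        residuals = [r - stump(x) for x, r in zip(xs, residuals)]
    # the meta model adds up all base model predictions
    return lambda x: sum(m(x) for m in models)
```

With more iterations the summed predictions fit the training labels increasingly closely, which is why RapidMiner pairs this scheme with weak, fast base learners.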
  6. Examples
     3. AgglomerativeClustering
        This operator implements agglomerative clustering and provides three linkage strategies: SingleLink, CompleteLink, and AverageLink (the last is also called UPGMA). The result is a hierarchical cluster model that provides the distance information needed to plot a dendrogram.
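A minimal pure-Python sketch of the three linkage strategies on 1-D points (hypothetical helpers, not the RapidMiner operator itself): clusters start as single points and the closest pair, under the chosen linkage, is merged until one cluster remains. The merge history returned is the distance information a dendrogram plot needs.

```python
def linkage_distance(a, b, strategy):
    dists = [abs(x - y) for x in a for y in b]
    if strategy == "single":      # SingleLink: closest pair
        return min(dists)
    if strategy == "complete":    # CompleteLink: farthest pair
        return max(dists)
    return sum(dists) / len(dists)  # AverageLink (UPGMA)

def agglomerate(points, strategy="single"):
    clusters = [[p] for p in points]
    merges = []  # (cluster, cluster, distance) history = dendrogram data
    while len(clusters) > 1:
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: linkage_distance(
                clusters[ij[0]], clusters[ij[1]], strategy),
        )
        d = linkage_distance(clusters[i], clusters[j], strategy)
        merges.append((clusters[i], clusters[j], d))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return merges
```

For example, `agglomerate([1, 2, 10])` first merges the points 1 and 2 at distance 1, then merges that pair with 10.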
  7. Examples
     4. Bagging
        This Bagging implementation can be used with all learners available in RapidMiner, not only those that are originally part of the Weka package.
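The idea behind bagging (bootstrap aggregating) can be sketched around any base learner, which is why the RapidMiner version is learner-agnostic. This is a hypothetical helper, not RapidMiner code; the trivial majority-class "learner" is only there to keep the example self-contained.

```python
import random

def majority_learner(examples):
    """A deliberately weak base learner: always predict the
    most common label in its training sample."""
    labels = [label for _, label in examples]
    winner = max(set(labels), key=labels.count)
    return lambda x: winner

def bagging(examples, base_learner, rounds=5, seed=0):
    rng = random.Random(seed)
    models = []
    for _ in range(rounds):
        # bootstrap: sample with replacement, same size as the original set
        sample = [rng.choice(examples) for _ in examples]
        models.append(base_learner(sample))
    def vote(x):  # combine by majority vote over the ensemble
        preds = [m(x) for m in models]
        return max(set(preds), key=preds.count)
    return vote
```

Each model sees a slightly different bootstrap sample, so the majority vote averages out the variance of unstable base learners.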
  8. Examples
     5. BasicRuleLearner
        This operator builds an unpruned set of classification rules. It is based on Cendrowska, 1987: "PRISM: An algorithm for inducing modular rules".
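A compact sketch of PRISM-style rule induction (hypothetical helper names, assuming consistent nominal data): for each class, greedily add the attribute=value condition with the best precision until the rule covers only that class, then remove the covered examples and repeat.

```python
def covers(rule, example):
    return all(example[a] == v for a, v in rule)

def prism(examples, label_key="label"):
    rules = []  # list of (condition list, class) pairs, unpruned
    for cls in sorted({e[label_key] for e in examples}):
        remaining = list(examples)
        while any(e[label_key] == cls for e in remaining):
            rule, used = [], set()
            covered = list(remaining)
            while any(e[label_key] != cls for e in covered):
                conds = {(a, e[a]) for e in covered for a in e
                         if a != label_key and a not in used}
                if not conds:
                    break  # attributes exhausted; accept the impure rule
                def precision(cond):
                    sub = [e for e in covered if e[cond[0]] == cond[1]]
                    return sum(e[label_key] == cls for e in sub) / len(sub)
                best = max(conds, key=precision)  # most precise condition
                rule.append(best)
                used.add(best[0])
                covered = [e for e in covered if e[best[0]] == best[1]]
            rules.append((rule, cls))
            remaining = [e for e in remaining if not covers(rule, e)]
    return rules
```

Classification then takes the first rule whose conditions all match, which is what makes the induced rules "modular" in Cendrowska's sense.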
  9. Examples
     6. BayesianBoosting
        This operator trains an ensemble of classifiers for boolean target attributes. In each iteration the training set is reweighted so that previously discovered patterns and other kinds of prior knowledge are "sampled out". An inner classifier, typically a rule or decision tree induction algorithm, is sequentially applied several times, and the models are combined into a single global model.
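The reweighting idea can be illustrated with a generic AdaBoost-style sketch for a boolean (±1) label; this is an assumption-laden stand-in, not RapidMiner's exact BayesianBoosting update. Examples the current model already explains are down-weighted, so the next inner classifier focuses on what is still unexplained.

```python
import math

def stump_learner(examples, weights):
    """Weighted decision stump on a scalar feature (toy inner classifier)."""
    best = None
    for t in [x for x, _ in examples]:
        for sign in (1, -1):
            err = sum(w for w, (x, y) in zip(weights, examples)
                      if (sign if x <= t else -sign) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign if x <= t else -sign

def boost(examples, base_learner, rounds=3):
    # examples: list of (x, label) with label in {+1, -1}
    weights = [1.0 / len(examples)] * len(examples)
    ensemble = []  # (model weight, model) pairs
    for _ in range(rounds):
        model = base_learner(examples, weights)
        err = sum(w for w, (x, y) in zip(weights, examples) if model(x) != y)
        err = min(max(err, 1e-9), 1 - 1e-9)      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, model))
        # reweight: shrink the weight of examples the model explains
        weights = [w * math.exp(-alpha * y * model(x))
                   for w, (x, y) in zip(weights, examples)]
        total = sum(weights)
        weights = [w / total for w in weights]
    def predict(x):  # the single combined global model
        return 1 if sum(a * m(x) for a, m in ensemble) >= 0 else -1
    return predict
```

The sequential structure matches the description above: each round produces one inner model, and the weighted vote at the end is the single global model.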
  10. Examples
      7. CHAID
         The CHAID decision tree learner works like the DecisionTree operator with one exception: it uses a chi-squared-based criterion instead of the information gain or gain ratio criteria.
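The chi-squared criterion itself is easy to sketch (hypothetical helper names): for each nominal attribute, compare the observed label counts per attribute value against the counts expected if attribute and label were independent; the attribute with the largest statistic gives the best split.

```python
from collections import Counter

def chi_squared(examples, attribute, label_key="label"):
    n = len(examples)
    attr_counts = Counter(e[attribute] for e in examples)
    label_counts = Counter(e[label_key] for e in examples)
    cell = Counter((e[attribute], e[label_key]) for e in examples)
    chi2 = 0.0
    for a in attr_counts:
        for l in label_counts:
            # expected count under independence of attribute and label
            expected = attr_counts[a] * label_counts[l] / n
            chi2 += (cell[(a, l)] - expected) ** 2 / expected
    return chi2

def best_split(examples, attributes, label_key="label"):
    # CHAID-style choice: split on the attribute with the highest chi^2
    return max(attributes, key=lambda a: chi_squared(examples, a, label_key))
```

An attribute that perfectly separates the labels gets the maximum statistic, while one that is independent of the label scores zero.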
  11. More Questions?
      Reach us at support@dataminingtools.net
      Visit: www.dataminingtools.net