2. Parametric Machine Learning Algorithms
A learning model that summarizes data with a set of parameters of fixed size (independent of the
number of training examples) is called a parametric model. No matter how much data you throw at
a parametric model, it won’t change its mind about how many parameters it needs.
3. The algorithms involve two steps:
Select a form for the function.
Learn the coefficients for the function from the training data.
4. An easy-to-understand functional form for the mapping function is a line, as is used in linear
regression:
y = b0 + b1*x1 + b2*x2
where b0, b1 and b2 are the coefficients of the line that control the intercept and slope, and x1 and
x2 are two input variables.
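The two steps above (select a form, then learn the coefficients) can be sketched in pure Python for the one-variable case, using the closed-form least-squares solution. This is an illustrative sketch, not code from the original text:

```python
# Sketch: fit the line y = b0 + b1*x by ordinary least squares.
# For a single input variable the coefficients have a closed form:
#   b1 = covariance(x, y) / variance(x),  b0 = mean(y) - b1 * mean(x)
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
         sum((x - mean_x) ** 2 for x in xs)
    b0 = mean_y - b1 * mean_x  # intercept
    return b0, b1

# Data lying exactly on y = 1 + 2x recovers those coefficients:
b0, b1 = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(b0, b1)  # 1.0 2.0
```

However much training data is supplied, the model still has exactly two parameters, b0 and b1 — which is what makes it parametric.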
5. Some more examples of parametric machine learning algorithms include:
Logistic Regression
Linear Discriminant Analysis
Perceptron
Naive Bayes
Simple Neural Networks
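As a minimal sketch of one algorithm from this list, here is logistic regression with a single input, trained by stochastic gradient descent. The learning rate and epoch count are arbitrary choices for illustration, not values from the original text:

```python
import math

# Sketch: logistic regression with a fixed set of parameters (b0, b1),
# learned by per-example gradient descent on the log-loss.
def predict(b0, b1, x):
    # Sigmoid of the linear combination gives P(y = 1 | x)
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def fit_logistic(xs, ys, lr=0.1, epochs=1000):
    b0 = b1 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = predict(b0, b1, x)
            b0 += lr * (y - p)       # gradient step for the intercept
            b1 += lr * (y - p) * x   # gradient step for the slope
    return b0, b1

# Toy data: class 0 at x = 0, 1 and class 1 at x = 2, 3.
b0, b1 = fit_logistic([0, 1, 2, 3], [0, 0, 1, 1])
print(predict(b0, b1, 0) < 0.5, predict(b0, b1, 3) > 0.5)
```

Like linear regression, the parameter count (two) is fixed in advance regardless of how many training examples are seen.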
6. Benefits of Parametric Machine Learning Algorithms:
Simpler: These methods are easier to understand, and their results are easier to interpret.
Speed: Parametric models are very fast to learn from data.
Less Data: They do not require as much training data and can work well even if the fit to the data is
not perfect.
7. Limitations of Parametric Machine Learning Algorithms:
Constrained: By choosing a functional form, these methods are restricted to that specified form.
Limited Complexity: The methods are more suited to simpler problems.
Poor Fit: In practice, the methods are unlikely to match the underlying mapping function exactly.
8. Nonparametric Machine Learning Algorithms
Nonparametric methods are good when you have a lot of data and no prior knowledge, and when
you don’t want to worry too much about choosing just the right features.
9. An easy-to-understand nonparametric model is the k-nearest neighbors algorithm, which makes
predictions for a new data instance based on the k most similar training patterns. The method
assumes nothing about the form of the mapping function other than that patterns that are close are
likely to have a similar output variable.
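A minimal pure-Python sketch of k-nearest neighbors for classification, illustrating the point above: the "model" is just the stored training data, with no fixed parameter set. The example data and labels are made up for illustration:

```python
from collections import Counter

# Sketch: k-nearest neighbors classification.
# train is a list of (feature_vector, label) pairs; the whole training
# set is retained and consulted at prediction time.
def knn_predict(train, query, k=3):
    # Sort training points by squared Euclidean distance to the query
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2
                                              for a, b in zip(p[0], query)))
    # Majority vote among the k closest labels
    labels = [label for _, label in nearest[:k]]
    return Counter(labels).most_common(1)[0][0]

train = [((1, 1), "a"), ((1, 2), "a"), ((5, 5), "b"), ((6, 5), "b")]
print(knn_predict(train, (1.5, 1.5), k=3))  # a
```

Note that adding more training examples grows the model itself — the opposite of the fixed-size parametric case.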
10. Some more examples of popular nonparametric machine learning algorithms are:
k-Nearest Neighbors
Decision Trees like CART and C4.5
Support Vector Machines
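To illustrate the decision tree entry above, here is a one-level tree (a decision stump) found by exhaustive search over split points, in the spirit of CART. This is a simplified sketch using misclassification error rather than the Gini index that CART actually uses:

```python
# Sketch: fit a decision stump (a one-split tree) on a single numeric
# feature by trying every observed value as a threshold and keeping the
# split with the fewest misclassified training examples.
def fit_stump(xs, ys):
    best = None
    for split in sorted(set(xs)):
        for left_label in (0, 1):
            right_label = 1 - left_label
            errors = sum(
                (left_label if x <= split else right_label) != y
                for x, y in zip(xs, ys)
            )
            if best is None or errors < best[0]:
                best = (errors, split, left_label, right_label)
    # Return (threshold, label if x <= threshold, label otherwise)
    return best[1:]

print(fit_stump([1, 2, 3, 4], [0, 0, 1, 1]))  # (2, 0, 1)
```

A full tree algorithm applies this search recursively to each resulting partition, so the tree's size is driven by the data rather than fixed in advance — which is why trees are nonparametric.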
11. Benefits of Nonparametric Machine Learning Algorithms:
Flexibility: Capable of fitting a large number of functional forms.
Power: No assumptions (or weak assumptions) about the underlying function.
Performance: Can result in higher performance models for prediction.
12. Limitations of Nonparametric Machine Learning Algorithms:
More data: Require a lot more training data to estimate the mapping function.
Slower: Much slower to train, as they often have far more parameters to estimate.
Overfitting: There is a greater risk of overfitting the training data, and it is harder to explain why
specific predictions are made.