PARAMETRIC & NON-PARAMETRIC SUPERVISED MACHINE LEARNING
Rehan Guha
Senior Machine Learning Researcher
Pramati Technologies Pvt. Ltd.
! Pramati: A culture of building the agile enterprise
! Founded in 1998
! “Product DNA”: Pramati has built and scaled several independent product companies
! Imaginea : Engineering Services wing of Pramati
! WaveMaker: Flagship product
! More than 350 open-source commits
! Serving from 5 global locations
! Agile methodology
! 13 Home Grown Products
! Over 200 product companies as customers
! Design Exploration Incubation Lab
! M&As of leading global products
! More than 23 open-source contributions
! Unique products & services
TYPES OF MACHINE LEARNING
PARAMETRIC SUPERVISED LEARNING
A parametric algorithm has a fixed number of parameters.
A parametric algorithm is typically computationally faster than a non-parametric one, but makes stronger
assumptions about the data; the algorithm may work well if the
assumptions turn out to be correct, but it may perform badly if they are wrong.
A parametric model summarises the data with a set of parameters of fixed
size, i.e. a predefined mapping function whose parameter count is
independent of the number of training examples. No matter how much data
you throw at a parametric model, it won’t change its mind about how many
parameters it needs.
Common examples of parametric algorithms include Linear Regression,
Linear Support Vector Machines, the Perceptron, and Logistic Regression.
y = m·x + b
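As a minimal sketch of the fixed-parameter idea (assuming NumPy and scikit-learn, with synthetic data), a fitted linear regression is always described by exactly one slope and one intercept, whether it was trained on 10 or 100,000 examples:

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
for n in (10, 1_000, 100_000):
    x = rng.uniform(0, 10, size=(n, 1))
    y = 3.0 * x[:, 0] + 2.0 + rng.normal(scale=0.5, size=n)  # y = m*x + b + noise
    model = LinearRegression().fit(x, y)
    # The fitted model is fully described by one slope (m) and one intercept (b),
    # no matter how many examples it was trained on.
    print(n, model.coef_, model.intercept_)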
HOW TO CALCULATE “M” OR “GRADIENT”?
Ordinary Least Squares (OLS)
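For a single feature, the OLS slope and intercept have a closed form: m = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² and b = ȳ − m·x̄. A minimal NumPy sketch with made-up data (not from the slides):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.9, 5.1, 7.2, 8.8, 11.1])   # roughly y = 2x + 1

x_mean, y_mean = x.mean(), y.mean()
m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)  # slope
b = y_mean - m * x_mean                                              # intercept

print(m, b)                 # closed-form OLS estimates of m and b
print(np.polyfit(x, y, 1))  # NumPy's least-squares fit should agree: [m, b]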
“What if the data does not follow the pre-defined function?”
NON-PARAMETRIC SUPERVISED LEARNING
In contrast, a non-parametric algorithm uses a flexible number of
parameters, and the number of parameters often grows as it learns
from more data.
A non-parametric algorithm is computationally slower, but
makes fewer assumptions about the data.
Non-parametric methods are good when you have a lot of data and no
prior knowledge, and when you don’t want to worry too much about
choosing just the right features.
Common examples of non-parametric algorithms include K-Nearest
Neighbours, Decision Trees, Artificial Neural Networks, and Support
Vector Machines with Gaussian kernels.
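A minimal sketch of the growing-parameter idea, using a hypothetical hand-rolled KNNRegressor on synthetic data: k-nearest neighbour has no fixed parameter vector, because the stored training set itself acts as the model.

import numpy as np

class KNNRegressor:
    """Hypothetical minimal k-NN regressor; the stored training set *is* the model."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # "Training" only memorises the data, so model size grows with the data.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y, dtype=float)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, dtype=float):
            dists = np.linalg.norm(self.X_train - x, axis=1)  # distance to every stored point
            nearest = np.argsort(dists)[: self.k]             # indices of the k closest points
            preds.append(self.y_train[nearest].mean())        # average their targets
        return np.array(preds)

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

model = KNNRegressor(k=5).fit(X, y)
print(model.predict([[2.0], [7.5]]))
print("stored training points:", len(model.X_train))  # grows with the training set

Doubling the training set doubles what this model has to store and search at prediction time, which is exactly why non-parametric methods tend to be slower but more flexible.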
THE END
