Matlab: Regression


  • 1. Matlab: Regression analysis
  • 2. Regression analysis
     Regression analysis includes any technique for modeling and analyzing several variables when the focus is on the relationship between a dependent variable and one or more independent variables.
  • 3. Linear Regression Models
    In statistics, linear regression models take the general form
    y = β1·f1(x) + β2·f2(x) + … + βp·fp(x) + ε,
    where the terms fj(x) are fixed functions of the predictor x, the βj are coefficients to be estimated, and ε is a random error.
  • 4. Linear Regression Models
    Examples of linear regression models with a scalar predictor variable x include:
    Linear additive (straight-line) models — Terms are f1(x) = 1 and f2(x) = x.
    Polynomial models — Terms are f1(x) = 1, f2(x) = x, …, fp(x) = x^(p–1).
    Chebyshev orthogonal polynomial models — Terms are f1(x) = 1, f2(x) = x, …, fp(x) = 2x·fp–1(x) – fp–2(x).
    Fourier trigonometric polynomial models — Terms are f1(x) = 1/2 and sines and cosines of different frequencies.
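For the straight-line case, a fit can be sketched directly in MATLAB by building a design matrix from the terms f1(x) = 1 and f2(x) = x and solving the least-squares problem with the backslash operator (the data below are made up for illustration):

```matlab
% Illustrative data: y roughly linear in x with noise
x = (1:10)';
y = 2 + 3*x + randn(10,1);

% Design matrix whose columns are the terms f1(x) = 1 and f2(x) = x
X = [ones(size(x)) x];

% Least-squares coefficient estimates: beta(1) ~ intercept, beta(2) ~ slope
beta = X \ y;
```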
  • 5. Linear Regression Models
    Examples of linear regression models with a vector of predictor variables x = (x1, ..., xN) include:
    Linear additive (hyperplane) models — Terms are f1(x) = 1 and fk(x) = xk (k = 1, ..., N).
    Pairwise interaction models — Terms are the linear additive terms plus gk1k2(x) = xk1·xk2 (k1, k2 = 1, ..., N, k1 ≠ k2).
    Quadratic models — Terms are the pairwise interaction terms plus hk(x) = xk^2 (k = 1, ..., N).
    Pure quadratic models — Terms are the quadratic terms minus the gk1k2(x) interaction terms.
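The Statistics Toolbox function x2fx converts a matrix of predictors into design matrices for exactly these model types; a brief sketch with illustrative data:

```matlab
% Two illustrative predictor columns
X = [1 2; 3 4; 5 6; 7 8];

D1 = x2fx(X, 'linear');        % constant + linear terms
D2 = x2fx(X, 'interaction');   % ... plus the pairwise product x1*x2
D3 = x2fx(X, 'quadratic');     % ... plus the squared terms x1^2, x2^2
D4 = x2fx(X, 'purequadratic'); % constant, linear, and squared terms only
```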
  • 6. Multiple Linear Regression
    Multiple linear regression models are useful for:
    Understanding which terms fj(x) have the greatest effect on the response (the coefficients βj with the greatest magnitude)
    Finding the direction of the effects (signs of the βj)
    Predicting unobserved values of the response (y(x) for new x)
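A minimal multiple linear regression with the Statistics Toolbox function regress might look like this (synthetic data for illustration):

```matlab
n = 50;
X1 = randn(n,1); X2 = randn(n,1);
y  = 1 + 2*X1 - 3*X2 + 0.1*randn(n,1);

% regress does not add an intercept, so include a column of ones
X = [ones(n,1) X1 X2];

% b holds the coefficient estimates; bint their 95% confidence intervals
[b, bint] = regress(y, X);
```

The signs and magnitudes of b answer the first two questions above, and y(xnew) is predicted by [1 x1new x2new]*b.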
  • 7. Robust Regression
    If the distribution of errors is asymmetric or prone to outliers, model assumptions are invalidated, and parameter estimates, confidence intervals, and other computed statistics become unreliable. The Statistics Toolbox function robustfit is useful in these cases. The function implements a robust fitting method that is less sensitive than ordinary least squares to large changes in small parts of the data.
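A sketch of robustfit on data containing a gross outlier (note that, unlike regress, robustfit adds the intercept column automatically):

```matlab
x = (1:10)';
y = 2*x + 1 + randn(10,1);
y(10) = 100;                           % inject a gross outlier

bOLS    = regress(y, [ones(10,1) x]);  % ordinary least squares
bRobust = robustfit(x, y);             % iteratively reweighted least squares
% bRobust should be far less affected by the single outlier than bOLS
```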
  • 8. Stepwise Regression
    Stepwise regression is a systematic method for adding and removing terms from a multilinear model based on their statistical significance in a regression. The method begins with an initial model and then compares the explanatory power of incrementally larger and smaller models.
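The Statistics Toolbox function stepwisefit implements this procedure; a sketch with made-up data in which one predictor is irrelevant:

```matlab
n = 100;
X = randn(n,3);
y = 4*X(:,1) - 2*X(:,3) + 0.5*randn(n,1);   % X(:,2) carries no signal

% inmodel is a logical vector flagging which terms the procedure keeps
[b, se, pval, inmodel] = stepwisefit(X, y);
% inmodel is expected to retain columns 1 and 3 here
```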
  • 9. Ridge Regression
    Multicollinearity (strong correlation among the predictor variables) can arise, for example, when data are collected without an experimental design.
    Ridge regression addresses the problem by estimating the regression coefficients as
    β̂ = (X'X + kI)^(–1) X'y,
    where k is the ridge parameter and I is the identity matrix. Small positive values of k shrink the coefficients and stabilize the estimates.
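MATLAB's ridge function implements this estimator; a brief sketch with nearly collinear predictors:

```matlab
n  = 100;
x1 = randn(n,1);
x2 = x1 + 0.01*randn(n,1);   % nearly collinear with x1
X  = [x1 x2];
y  = x1 + x2 + 0.1*randn(n,1);

k = 0:0.1:1;                 % a range of ridge parameters to compare
B = ridge(y, X, k);          % one column of coefficients per value of k
% By default ridge centers and scales X; pass a fourth argument of 0
% to get coefficients (with intercept) on the original data scale.
```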
  • 10. Partial Least Squares
    Partial least-squares (PLS) regression is a technique used with data that contain correlated predictor variables. This technique constructs new predictor variables, known as components, as linear combinations of the original predictor variables.
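The Statistics Toolbox function plsregress constructs these components; a sketch retaining two of them:

```matlab
n = 50;
X = randn(n,5);
y = X(:,1) + X(:,2) + 0.1*randn(n,1);

ncomp = 2;
% BETA holds the intercept followed by the regression coefficients;
% PCTVAR reports the variance explained by each component
[XL, YL, XS, YS, BETA, PCTVAR] = plsregress(X, y, ncomp);
yfit = [ones(n,1) X] * BETA;   % fitted responses
```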
  • 11. Polynomial Models
    Polynomial models have the advantages of being simple, familiar in their properties, and reasonably flexible for following data trends. They are also robust with respect to changes in the location and scale of the data. However, polynomial models may be poor predictors of new values.
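Base MATLAB provides polyfit and polyval for fitting and evaluating polynomial models; a sketch with illustrative data:

```matlab
x = linspace(0, 1, 20)';
y = sin(2*pi*x) + 0.1*randn(20,1);

p    = polyfit(x, y, 3);   % cubic fit: p holds 4 coefficients, highest power first
yhat = polyval(p, x);      % evaluate the fitted polynomial at x
```

Evaluating polyval far outside the range of x illustrates the extrapolation weakness noted above.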
  • 12. Response Surface Models
    Response surface models are multivariate polynomial models. They typically arise in the design of experiments, where they are used to determine a set of design variables that optimize a response. Linear terms alone produce models with response surfaces that are hyperplanes.
  • 13. Generalized Linear Models
    Linear regression models describe a linear relationship between a response and one or more predictive terms. Many times, however, a nonlinear relationship exists. Nonlinear Regression describes general nonlinear models. A special class of nonlinear models, known as generalized linear models, makes use of linear methods.
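The Statistics Toolbox functions glmfit and glmval cover this class; a logistic-regression sketch with made-up binary responses:

```matlab
x = (1:20)';
y = double(x + randn(20,1) > 10);   % illustrative 0/1 responses

% Binomial GLM with the default logit link: linear methods fit the
% coefficients even though the response relationship is nonlinear
b    = glmfit(x, y, 'binomial');
phat = glmval(b, x, 'logit');       % fitted probabilities
```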
  • 14. Multivariate Regression
    Whether or not the predictor x is a vector of predictor variables, multivariate regression refers to the case where the response y = (y1,..., yM) is a vector of M response variables.
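The Statistics Toolbox function mvregress fits this case. One sketch, using the cell-array design pattern from the mvregress documentation so that each response dimension gets its own coefficients (data and coefficients here are illustrative):

```matlab
n = 50; d = 2; p = 2;
X = randn(n,p);
B = [1 2; 0.5 -1];                     % illustrative p-by-d coefficients
Y = X*B + 0.1*randn(n,d);              % d = 2 response columns

% One d-by-(p*d) design matrix per observation; the block-diagonal
% structure assigns a separate coefficient vector to each dimension
Xcell = cell(n,1);
for i = 1:n
    Xcell{i} = kron(eye(d), X(i,:));
end
beta = mvregress(Xcell, Y);            % (p*d)-by-1 stacked estimate of B(:)
```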
  • 15. Visit more self-help tutorials
    Pick a tutorial of your choice and browse through it at your own pace.
    The tutorials section is free and self-guided, and does not include additional support.
    Visit us at