Matlab: Regression Presentation Transcript

  • Matlab: Regression analysis
  • Regression analysis
    Regression analysis includes any technique for modeling and analyzing several variables when the focus is on the relationship between a dependent variable and one or more independent variables.
  • Linear Regression Models
    In statistics, linear regression models often take the form
    y = β1f1(x) + β2f2(x) + … + βpfp(x) + ε
    where the fj(x) are fixed functions of the predictor x, the βj are unknown coefficients, and ε is a random error term.
  • Linear Regression Models
    Examples of linear regression models with a scalar predictor variable x include:
    Linear additive (straight-line) models — Terms are f1(x) = 1 and f2(x) = x.
    Polynomial models — Terms are f1(x) = 1, f2(x) = x, …, fp(x) = x^(p–1).
    Chebyshev orthogonal polynomial models — Terms are f1(x) = 1, f2(x) = x, …, fp(x) = 2x·fp–1(x) – fp–2(x).
    Fourier trigonometric polynomial models — Terms are f1(x) = 1/2 and sines and cosines of different frequencies.
  • Linear Regression Models
    Examples of linear regression models with a vector of predictor variables x = (x1, ..., xN) include:
    Linear additive (hyperplane) models — Terms are f1(x) = 1 and fk(x) = xk (k = 1, ..., N).
    Pairwise interaction models — Terms are the linear additive terms plus gk1k2(x) = xk1·xk2 (k1, k2 = 1, ..., N, k1 ≠ k2).
    Quadratic models — Terms are the pairwise interaction terms plus hk(x) = xk² (k = 1, ..., N).
    Pure quadratic models — Terms are quadratic terms minus the gk1k2(x) terms.
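    A minimal MATLAB sketch of fitting such a model, using synthetic data: build a design matrix whose columns are the terms fj(x) and estimate the coefficients by least squares with the backslash operator.

      % Quadratic model with terms f1(x) = 1, f2(x) = x, f3(x) = x^2
      % (synthetic data, for illustration only).
      x = linspace(-1, 1, 40)';
      y = 1 - 2*x + 3*x.^2 + 0.2*randn(40,1);
      X = [ones(size(x)) x x.^2];   % columns are the terms fj(x)
      beta = X \ y;                 % least-squares estimates of the coefficients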
  • Multiple Linear Regression
    Multiple linear regression models are useful for:
    Understanding which terms fj(x) have the greatest effect on the response (the coefficients βj with the greatest magnitude)
    Finding the direction of the effects (signs of the βj)
    Predicting unobserved values of the response (y(x) for new x)
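    A short sketch with the Statistics Toolbox function regress, on synthetic data; regress returns the coefficient estimates together with their 95% confidence intervals.

      % Two predictors plus the intercept term f1(x) = 1 (synthetic data).
      n = 100;
      x1 = rand(n,1); x2 = rand(n,1);
      y = 2 + 3*x1 - 1.5*x2 + 0.1*randn(n,1);
      X = [ones(n,1) x1 x2];
      [b, bint] = regress(y, X);   % estimates and 95% confidence intervals
      % signs of b give the direction of each effect; magnitudes, its strength
      ynew = [1, 0.5, 0.5] * b;    % predicted response at a new point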
  • Robust Regression
    If the distribution of errors is asymmetric or prone to outliers, model assumptions are invalidated, and parameter estimates, confidence intervals, and other computed statistics become unreliable. The Statistics Toolbox function robustfit is useful in these cases. The function implements a robust fitting method that is less sensitive than ordinary least squares to large changes in small parts of the data.
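    A sketch of robustfit on synthetic data containing one gross outlier; note that robustfit adds the intercept term itself, so only the predictor columns are passed.

      n = 50;
      x = rand(n,1);
      y = 1 + 2*x + 0.1*randn(n,1);
      y(1) = 10;                         % a single gross outlier
      bls  = regress(y, [ones(n,1) x]);  % least squares is pulled by the outlier
      brob = robustfit(x, y);            % the robust fit down-weights it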
  • Stepwise Regression
    Stepwise regression is a systematic method for adding and removing terms from a multilinear model based on their statistical significance in a regression. The method begins with an initial model and then compares the explanatory power of incrementally larger and smaller models.
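    A sketch with the Statistics Toolbox function stepwisefit, on synthetic data where only the first candidate term matters; the inmodel output flags the terms the procedure retains.

      n = 100;
      x1 = rand(n,1); x2 = rand(n,1);
      y = 2 + 3*x1 + 0.1*randn(n,1);      % x2 and x1.*x2 are irrelevant here
      [b, se, pval, inmodel] = stepwisefit([x1 x2 x1.*x2], y);
      inmodel                             % logical row vector: terms kept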
  • Ridge Regression
    Coefficient estimates for multiple linear regression rely on the independence of the model terms; when terms are correlated (multicollinearity), least-squares estimates become unstable. This situation can arise, for example, when data are collected without an experimental design.
    Ridge regression addresses the problem by estimating the regression coefficients as
    β̂ = (XᵀX + kI)⁻¹Xᵀy
    where k is the ridge parameter and I is the identity matrix. Small positive values of k improve the conditioning of the problem and reduce the variance of the estimates.
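    A sketch with the Statistics Toolbox function ridge, on synthetic nearly collinear data; the final argument 0 requests coefficients on the original data scale, including the intercept.

      n = 100;
      x1 = rand(n,1); x2 = x1 + 0.01*randn(n,1);   % nearly collinear predictors
      y = 1 + x1 + x2 + 0.1*randn(n,1);
      k = 0:0.01:0.1;                % candidate ridge parameters (an assumed range)
      B = ridge(y, [x1 x2], k, 0);   % one coefficient column per value of k
      plot(k, B(2:end,:)')           % trace of the slope estimates against k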
  • Partial Least Squares
    Partial least-squares (PLS) regression is a technique used with data that contain correlated predictor variables. This technique constructs new predictor variables, known as components, as linear combinations of the original predictor variables.
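    A sketch with the Statistics Toolbox function plsregress, asking for two components on synthetic correlated predictors; the returned beta includes an intercept row.

      n = 100;
      x1 = rand(n,1); x2 = x1 + 0.05*randn(n,1); x3 = rand(n,1);
      y = 1 + x1 + x2 + 0.1*randn(n,1);
      [XL, YL, XS, YS, beta] = plsregress([x1 x2 x3], y, 2);
      yfit = [ones(n,1) x1 x2 x3] * beta;   % fitted values from the PLS model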
  • Polynomial Models
    Polynomial models have the advantages of being simple, familiar in their properties, and reasonably flexible for following data trends. They are also robust with respect to changes in the location and scale of the data. However, polynomial models may be poor predictors of new values.
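    A sketch with the base MATLAB functions polyfit and polyval, fitting a cubic to synthetic data.

      x = linspace(0, 1, 50)';
      y = sin(2*pi*x) + 0.1*randn(50,1);   % synthetic data
      p = polyfit(x, y, 3);                % coefficients, highest power first
      yhat = polyval(p, x);                % evaluate the fitted polynomial
      plot(x, y, 'o', x, yhat, '-')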
  • Response Surface Models
    Response surface models are multivariate polynomial models. They typically arise in the design of experiments, where they are used to determine a set of design variables that optimize a response. Linear terms alone produce models whose response surfaces are hyperplanes.
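    A sketch with the Statistics Toolbox GUI rstool, which fits a quadratic response surface in two design variables and opens an interactive plot; the data are synthetic.

      n = 100;
      x1 = rand(n,1); x2 = rand(n,1);
      y = 1 + x1 + 2*x2 + x1.*x2 - x2.^2 + 0.1*randn(n,1);
      rstool([x1 x2], y, 'quadratic')   % interactive quadratic response surface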
  • Generalized Linear Models
    Linear regression models describe a linear relationship between a response and one or more predictive terms. Often, however, the relationship is nonlinear. General nonlinear models are handled by nonlinear regression; a special class of them, known as generalized linear models, makes use of linear methods.
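    A sketch of one generalized linear model, logistic regression, with the Statistics Toolbox functions glmfit and glmval on synthetic binary data.

      n = 100;
      x = rand(n,1);
      p = 1 ./ (1 + exp(-(4*x - 2)));   % true success probabilities
      ybin = binornd(1, p);             % synthetic binary response
      b = glmfit(x, ybin, 'binomial', 'link', 'logit');
      phat = glmval(b, x, 'logit');     % fitted probabilities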
  • Multivariate Regression
    Whether or not the predictor x is a vector of predictor variables, multivariate regression refers to the case where the response y = (y1,..., yM) is a vector of M response variables.
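    A minimal base MATLAB sketch, assuming every response shares the same design matrix: the backslash operator then fits all M responses in one call, one coefficient column per response. The data are synthetic.

      n = 100;
      x1 = rand(n,1); x2 = rand(n,1);
      X = [ones(n,1) x1 x2];                                      % shared design matrix
      Y = [1 + 2*x1 + 0.1*randn(n,1), 3 - x2 + 0.1*randn(n,1)];   % M = 2 responses
      B = X \ Y;                                                  % 3-by-2 coefficient matrix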
  • Visit more self-help tutorials
    Pick a tutorial of your choice and browse through it at your own pace.
    The tutorials section is free and self-guided; it does not include additional support.
    Visit us at www.dataminingtools.net