Matlab: Regression

Transcript

  • 1. Matlab: Regression analysis
  • 2. Regression analysis
    Regression analysis includes any technique for modeling and analyzing several variables when the focus is on the relationship between a dependent variable and one or more independent variables.
  • 3. Linear Regression Models
    In statistics, linear regression models often take the form

        y = β1·f1(x) + β2·f2(x) + … + βp·fp(x) + ε

    where the terms fj(x) are fixed functions of the predictor x, the coefficients βj are estimated from the data, and ε is a random error term.
  • 4. Linear Regression Models
    Examples of linear regression models with a scalar predictor variable x include:
    - Linear additive (straight-line) models: terms are f1(x) = 1 and f2(x) = x.
    - Polynomial models: terms are f1(x) = 1, f2(x) = x, …, fp(x) = x^(p–1).
    - Chebyshev orthogonal polynomial models: terms are f1(x) = 1, f2(x) = x, …, fp(x) = 2x·fp–1(x) – fp–2(x).
    - Fourier trigonometric polynomial models: terms are f1(x) = 1/2 and sines and cosines of different frequencies.
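    For instance, a straight-line model can be fitted by collecting the terms fj(x) as columns of a matrix and solving the least-squares problem with MATLAB's backslash operator. A minimal sketch on invented data:

        % Straight-line model: terms f1(x) = 1 and f2(x) = x
        x = linspace(0,1,30)';
        y = 1 + 2*x + 0.1*randn(30,1);   % invented noisy data
        F = [ones(size(x)) x];           % columns are the terms fj(x)
        beta = F \ y;                    % least-squares estimates of the coefficients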
  • 5. Linear Regression Models
    Examples of linear regression models with a vector of predictor variables x = (x1, ..., xN) include:
    - Linear additive (hyperplane) models: terms are f1(x) = 1 and fk(x) = xk (k = 1, ..., N).
    - Pairwise interaction models: terms are the linear additive terms plus gk1k2(x) = xk1·xk2 (k1, k2 = 1, ..., N, k1 ≠ k2).
    - Quadratic models: terms are the pairwise interaction terms plus hk(x) = xk^2 (k = 1, ..., N).
    - Pure quadratic models: terms are the quadratic terms minus the gk1k2(x) interaction terms.
  • 6. Multiple Linear Regression
    Multiple linear regression models are useful for:
    - Understanding which terms fj(x) have the greatest effect on the response (the coefficients βj with the greatest magnitude)
    - Finding the direction of the effects (the signs of the βj)
    - Predicting unobserved values of the response (y(x) for new x)
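    A minimal sketch with the Statistics Toolbox function regress; the predictors and coefficients here are invented for illustration:

        % Hypothetical data: y depends linearly on two predictors
        x1 = (1:20)';
        x2 = rand(20,1);
        y  = 2 + 3*x1 - 4*x2 + randn(20,1);
        X  = [ones(size(x1)) x1 x2];   % design matrix: f1 = 1, f2 = x1, f3 = x2
        b  = regress(y, X);            % b(1) is the intercept; signs of b(2:3) give effect directions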
  • 7. Robust Regression
    If the distribution of errors is asymmetric or prone to outliers, model assumptions are invalidated, and parameter estimates, confidence intervals, and other computed statistics become unreliable. The Statistics Toolbox function robustfit is useful in these cases. The function implements a robust fitting method that is less sensitive than ordinary least squares to large changes in small parts of the data.
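    A sketch with one injected outlier (the data are invented); note that robustfit adds the intercept term itself, so the predictor matrix contains no column of ones:

        x = (1:10)';
        y = 2 + 3*x + randn(10,1);
        y(10) = 100;                            % inject a gross outlier
        bOLS = regress(y, [ones(size(x)) x]);   % ordinary least squares is pulled toward the outlier
        bRob = robustfit(x, y);                 % iteratively reweighted fit downweights it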
  • 8. Stepwise Regression
    Stepwise regression is a systematic method for adding and removing terms from a multilinear model based on their statistical significance in a regression. The method begins with an initial model and then compares the explanatory power of incrementally larger and smaller models.
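    A sketch with the Statistics Toolbox function stepwisefit on invented data in which only two of five candidate predictors carry signal:

        X = randn(100,5);                         % five candidate predictor columns
        y = X(:,2) - 2*X(:,4) + randn(100,1);     % only columns 2 and 4 matter
        [b,se,pval,inmodel] = stepwisefit(X, y);  % inmodel flags the terms kept in the final model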
  • 9. Ridge Regression
    When the model terms are correlated, ordinary least-squares coefficient estimates become unstable. This situation of multicollinearity can arise, for example, when data are collected without an experimental design. Ridge regression addresses the problem by estimating the regression coefficients using

        β = (XᵀX + kI)⁻¹ Xᵀy

    where k is the ridge parameter and I is the identity matrix. Small positive values of k improve the conditioning of the problem and reduce the variance of the estimates.
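    A sketch with two deliberately collinear predictors (invented data); note that ridge takes the response first, and the trailing 0 requests coefficients on the original data scale:

        x1 = randn(50,1);
        x2 = x1 + 0.01*randn(50,1);     % nearly collinear with x1
        y  = x1 + x2 + randn(50,1);
        k  = 0:0.1:1;                   % grid of ridge parameters
        b  = ridge(y, [x1 x2], k, 0);   % row 1 is the intercept; one column per value of k
        plot(k, b(2:end,:)')            % ridge trace: coefficients shrink as k grows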
  • 10. Partial Least Squares
    Partial least-squares (PLS) regression is a technique used with data that contain correlated predictor variables. This technique constructs new predictor variables, known as components, as linear combinations of the original predictor variables.
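    A sketch with the Statistics Toolbox function plsregress, keeping three components (the data are invented):

        X = randn(50,10);
        X(:,6:10) = X(:,1:5) + 0.1*randn(50,5);    % second block nearly duplicates the first
        y = X(:,1) + X(:,2) + randn(50,1);
        [XL,YL,XS,YS,BETA] = plsregress(X, y, 3);  % three PLS components
        yfit = [ones(50,1) X] * BETA;              % BETA carries the intercept in its first row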
  • 11. Polynomial Models
    Polynomial models have the advantages of being simple, familiar in their properties, and reasonably flexible for following data trends. They are also robust with respect to changes in the location and scale of the data. However, polynomial models may be poor predictors of new values.
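    A sketch with the base MATLAB functions polyfit and polyval on invented data; evaluating the fit outside the range of the data illustrates the poor-prediction caveat:

        x = linspace(0,1,20)';
        y = sin(2*pi*x) + 0.1*randn(20,1);   % invented noisy data
        p = polyfit(x, y, 3);                % cubic fit; coefficients, highest power first
        yhat = polyval(p, x);                % evaluation inside the data range
        yext = polyval(p, 2);                % extrapolation at x = 2 can be far off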
  • 12. Response Surface Models
    Response surface models are multivariate polynomial models. They typically arise in the design of experiments, where they are used to determine a set of design variables that optimize a response. Linear terms alone produce models with response surfaces that are hyperplanes.
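    A sketch of a full quadratic response surface in two design variables, built with x2fx and fitted with regress (the data are invented):

        X = rand(30,2);                               % two design variables
        y = 1 + 2*X(:,1) - X(:,2) + X(:,1).*X(:,2) + X(:,1).^2 + 0.1*randn(30,1);
        D = x2fx(X, 'quadratic');   % constant, linear, interaction, and squared terms
        b = regress(y, D);          % second-order response surface coefficients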
  • 13. Generalized Linear Models
    Linear regression models describe a linear relationship between a response and one or more predictive terms. Many times, however, a nonlinear relationship exists. Nonlinear regression describes general nonlinear models. A special class of nonlinear models, known as generalized linear models, makes use of linear methods.
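    A sketch of one common generalized linear model, binomial counts with a logit link, using the Statistics Toolbox functions glmfit and glmval on simulated data:

        x = (-3:0.5:3)';
        n = 20*ones(size(x));                % 20 trials at each x
        p = 1 ./ (1 + exp(-(0.5 + 1.5*x)));  % invented true success probabilities
        y = binornd(n, p);                   % simulated success counts
        b = glmfit(x, [y n], 'binomial', 'link', 'logit');  % logistic regression fit
        phat = glmval(b, x, 'logit');        % fitted probabilities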
  • 14. Multivariate Regression
    Whether or not the predictor x is a vector of predictor variables, multivariate regression refers to the case where the response y = (y1, ..., yM) is a vector of M response variables.
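    A sketch with the Statistics Toolbox function mvregress, assuming the same design matrix applies to every response dimension (in that case a plain n-by-p matrix can be passed and one coefficient column is returned per response); the data are invented:

        n = 50;
        x = randn(n,1);
        Y = [1 + 2*x, -1 + 0.5*x] + randn(n,2);  % two response variables (M = 2)
        X = [ones(n,1) x];                       % shared design matrix with intercept
        beta = mvregress(X, Y);                  % one coefficient column per response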
  • 15. Visit more self-help tutorials
    Pick a tutorial of your choice and browse through it at your own pace. The tutorials section is free, self-guiding and will not involve any additional support. Visit us at www.dataminingtools.net
