Join CMT Level 1, 2 & 3 Program Courses & become a professional Technical Analyst, CMT USA Best COACHING CLASSES. CMT Institute Live Classes by Expert Faculty. Exams are available in India. Best Career in Financial Market.
https://www.ptaindia.com/chartered-market-technician/
3. Learning Objective Statements
• Assess values generated by regression, multiple regression, and tolerance calculations
• Select meaningful predictor variables for multiple regression studies based on correlation values among them and with the dependent variable
4. Regression Analysis
▪ Regression applies the concept of correlation to predict future values of one security (the dependent variable) in terms of another (the independent, or predictor, variable).
The Regression Equation
▪ The general form of a simple linear regression is:
y = a + bx
where a (the regression constant) is the intercept and b (the regression coefficient) is the slope of the line; y is the dependent variable and x is the independent variable.
▪ The regression coefficient b can alternatively be expressed in terms of Pearson's correlation coefficient r:
b = r (sy / sx)
where sx is the standard deviation of the predictor variable x and sy is the standard deviation of the dependent variable y.
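The relationship above can be sketched in code. A minimal example, assuming NumPy is available; the data are illustrative, chosen so y is exactly 2 + 3x:

```python
import numpy as np

def simple_regression(x, y):
    """Fit y = a + b*x using Pearson's r and standard deviations:
    b = r * (s_y / s_x), a = mean(y) - b * mean(x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]                  # Pearson correlation coefficient
    b = r * (y.std(ddof=1) / x.std(ddof=1))      # slope (regression coefficient)
    a = y.mean() - b * x.mean()                  # intercept (regression constant)
    return a, b

# Example: a perfectly linear relationship y = 2 + 3x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 3.0 * x
a, b = simple_regression(x, y)
```

On this noiseless data the fit recovers the intercept 2 and slope 3 exactly.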
5. Regression Analysis
Multiple Regression
▪ The concept of linear regression with a single predictor variable can be extended to more than one variable, combined into the following equation:
y = a + b1x1 + b2x2 + … + bnxn
▪ We want to include predictor variables that are highly correlated with the dependent variable but have low correlations among themselves.
▪ As a rule of thumb, intercorrelation among the independents above 0.80 signals a possible problem and may cause the model to become unstable. The statistically preferred method of assessing multicollinearity is to calculate the tolerance coefficient for each independent variable.
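Fitting a multiple regression reduces to ordinary least squares on a design matrix. A minimal sketch, assuming NumPy; the two predictors and their coefficients (2.0 and −0.5) are synthetic and drawn with low correlation between them:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)          # independent draws, so low correlation with x1
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.1, size=n)

# Design matrix with a leading column of ones for the intercept a
X = np.column_stack([np.ones(n), x1, x2])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coefs
```

With low noise and uncorrelated predictors, the estimated coefficients land close to the true values used to generate y.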
6. Regression Analysis
Multiple Regression
▪ Tolerance is 1 − R² (one minus the coefficient of determination) for the regression of each independent variable on all the other independents, ignoring the dependent. There will be as many tolerance coefficients as there are independents.
▪ The higher the intercorrelation of the independents, the more the tolerance will
approach zero. As a rule of thumb, if tolerance is less than 0.20, a problem with
multicollinearity is indicated.
▪ High multicollinearity can be ignored when two or more independents are
components of an index, and high intercorrelation among them is intentional and
desirable.
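The tolerance calculation described above can be sketched directly. A minimal example, assuming NumPy; the data are synthetic, with x1 and x2 built from a shared factor so their tolerance falls below the 0.20 rule of thumb, while x3 is independent:

```python
import numpy as np

def tolerance(X):
    """Tolerance for each independent variable: 1 - R^2 from regressing
    that variable on all the other independents (dependent ignored)."""
    X = np.asarray(X, float)
    n, k = X.shape
    tols = []
    for j in range(k):
        target = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, target, rcond=None)
        resid = target - others @ beta
        r_squared = 1.0 - (resid ** 2).sum() / ((target - target.mean()) ** 2).sum()
        tols.append(1.0 - r_squared)     # tolerance = 1 - R^2
    return np.array(tols)

# x1 and x2 share a common factor z (high intercorrelation); x3 does not
rng = np.random.default_rng(1)
z = rng.normal(size=500)
x1 = z + rng.normal(scale=0.1, size=500)
x2 = z + rng.normal(scale=0.1, size=500)
x3 = rng.normal(size=500)
tols = tolerance(np.column_stack([x1, x2, x3]))
```

Here the tolerances of x1 and x2 come out well under 0.20, flagging multicollinearity, while the tolerance of x3 stays near 1.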
▪ The first step is to calculate the correlation of the S&P 500 with major international equity, commodity, and financial indices.
7. Regression Analysis
Assumptions
▪ Verifying that the linearity assumption is met is an essential task before using a regression model.
▪ Simple inspection of scatterplots is a common, if non-statistical,
method of determining if nonlinearity exists in a relationship.
▪ Normality can be visually assessed by looking at a histogram of
frequencies.
▪ Transforming one of the variables (for example by taking differences or
logs) can sometimes help linearize a nonlinear relationship between
two financial series.
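The log-transform idea above can be shown concretely. A minimal sketch, assuming NumPy; the series is synthetic exponential growth y = 5·exp(0.3x), which is nonlinear in x but linear after taking logs:

```python
import numpy as np

# Exponential growth is nonlinear in x, but log(y) = log(c) + g*x is linear,
# so an ordinary linear regression on log(y) recovers the growth rate g.
x = np.arange(1.0, 21.0)
y = 5.0 * np.exp(0.3 * x)

ly = np.log(y)
X = np.column_stack([np.ones_like(x), x])
(intercept, slope), *_ = np.linalg.lstsq(X, ly, rcond=None)
# slope estimates the growth rate g; exp(intercept) estimates the scale c
```

On this noiseless series the regression on log(y) recovers g = 0.3 and c = 5 to numerical precision.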
8. Regression Analysis
Nonparametric Regression
▪ Nonparametric regression relaxes the usual assumption of
linearity and makes no assumptions about the population
distribution.
▪ Two common methods of nonparametric regression are kernel
regression and smoothing splines
▪ The smoothing splines method minimizes the sum of squared
residuals, adding a term which penalizes the roughness of the fit.
▪ The kernel regression algorithm falls into a class of algorithms called "SVMs," or "Support Vector Machines."
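One common form of kernel regression is the Nadaraya-Watson estimator: each prediction is a weighted average of the observed y values, with Gaussian weights that decay with distance from the query point. A minimal sketch, assuming NumPy; the bandwidth value and the sine-wave data are illustrative:

```python
import numpy as np

def kernel_regression(x_train, y_train, x_query, bandwidth=0.5):
    """Nadaraya-Watson estimator: Gaussian-kernel weighted average of y,
    with weights decaying as query points move away from training points."""
    diffs = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * diffs ** 2)
    return (weights * y_train).sum(axis=1) / weights.sum(axis=1)

# Fit a sine wave with no linearity or distributional assumptions
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x)
xq = np.array([np.pi / 2])
est = kernel_regression(x, y, xq, bandwidth=0.3)
```

At the peak of the sine wave the estimate comes out close to (slightly below) 1, since kernel smoothing averages over the local curvature.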
9. Points to Remember
Regression analysis is a set of statistical methods used
for the estimation of relationships between a dependent
variable and one or more independent variables.
It can be utilized to assess the strength of the
relationship between variables and for modeling the
future relationship between them.
Simple Linear Regression - Simple linear regression is a
model that assesses the relationship between a
dependent variable and an independent variable.
Multiple Linear Regression analysis is essentially similar
to the simple linear model, with the exception that
multiple independent variables are used in the model.
10. Points to Remember
Multiple linear regression follows the same conditions as
the simple linear model. However, since there are
several independent variables in multiple linear analysis,
there is another mandatory condition for the model:
Non-collinearity: Independent variables should show
a minimum correlation with each other. If the
independent variables are highly correlated with each
other, it will be difficult to assess the true relationships
between the dependent and independent variables.
Regression Analysis in Finance
- Beta and CAPM
- Forecasting Revenues and Expenses
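In the CAPM context, beta is the slope of a regression of a stock's returns on market returns, equivalently cov(stock, market) / var(market). A minimal sketch, assuming NumPy; the return series are simulated with a true beta of 1.4:

```python
import numpy as np

rng = np.random.default_rng(2)
market = rng.normal(scale=0.01, size=500)                        # market returns
stock = 0.0002 + 1.4 * market + rng.normal(scale=0.005, size=500)  # stock returns

# Beta is the regression slope of stock returns on market returns,
# which equals cov(stock, market) / var(market)
beta = np.cov(stock, market)[0, 1] / np.var(market, ddof=1)
```

With 500 simulated observations the estimate lands close to the true beta of 1.4.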
11. Points to Remember
Nonlinear regression fits an equation to data using a generated line, typically a curve rather than a straight line.
The sum of squares is used to determine how well a regression model fits the data; it is computed by summing the squared differences between the mean and every data point.
Nonparametric regression is a category of regression
analysis in which the predictor does not take a
predetermined form but is constructed according to
information derived from the data.
Kernel regression is a non-parametric technique to estimate the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables X and Y.
The smoothing splines method minimizes the sum of squared residuals,
adding a term which penalizes the roughness of the fit.