Time Series
Vitalii Radchenko
What is Econometrics?
• Econometrics is the application of statistical methods to economic data;
it is the branch of economics that aims to give empirical content to
economic relations
• Basic tools:
• linear regression models
• statistical theory
The main goals
• To find estimators that have desirable statistical properties:
• unbiasedness
• efficiency
• consistency
• Applied for:
• assessing economic theories
• forecasting macroeconomic indicators
• predicting revenue
• estimating the impact of a policy or intervention
Interpretability and statistical robustness
Basic approach
Linear regression
• Residual assumptions:
1. Mean equals 0
2. Variance is constant (homoscedasticity)
3. Independent residuals (covariance equals 0)
4. Independence of residuals and regressors
5. Residuals are normally distributed
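Below is a minimal sketch (not part of the original slides) of how these residual assumptions can be checked in Python with statsmodels; the data and coefficients are synthetic and only illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson, jarque_bera

# Synthetic data: two regressors plus i.i.d. normal noise
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200)

fit = sm.OLS(y, sm.add_constant(X)).fit()
resid = fit.resid

print(resid.mean())                                  # 1: mean close to 0
print(het_breuschpagan(resid, fit.model.exog)[1])    # 2: p-value, H0 = homoscedasticity
print(durbin_watson(resid))                          # 3: values near 2 mean no autocorrelation
print(jarque_bera(resid)[1])                         # 5: p-value, H0 = normal residuals
```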
Gauss-Markov theorem
If all five assumptions are satisfied for a simple linear
regression, then the variance of the OLS estimates will be
the smallest among all unbiased estimators
Hypothesis testing
• If the residual-normality assumption is satisfied, we test the
hypotheses by comparing the test statistics with values of the
Fisher (F) distribution

• If it is not satisfied – with the Chi-square distribution
Hypotheses
• The adequacy of the model (H0: R^2 = 0)
• The significance of the correlation between the variables
(H0: rxy = 0)
• The significance of the regression coefficients (H0: β = 0)
• Multicollinearity (VIF, Farrar-Glauber test)
• Check of the functional form (RESET test)
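A minimal sketch (assumed, not from the slides) of the last two checks with statsmodels: variance inflation factors for multicollinearity and the Ramsey RESET test for the functional form; the synthetic data are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.diagnostic import linear_reset

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
X[:, 2] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=300)   # make two regressors collinear
y = 0.5 + X @ np.array([1.0, -2.0, 0.0]) + rng.normal(size=300)

exog = sm.add_constant(X)
fit = sm.OLS(y, exog).fit()

# VIF per regressor (values above roughly 5-10 usually signal multicollinearity)
print([variance_inflation_factor(exog, i) for i in range(1, exog.shape[1])])

# Ramsey RESET test (H0: the linear functional form is adequate)
print(linear_reset(fit, power=2, use_f=True))
```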
Heteroskedasticity
• Identification: Goldfeld-Quandt test, White test,
Breusch-Pagan test, Glejser test
• The second assumption (constant residual variance) is violated
• Remedy: weighted least squares
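A minimal sketch (assumed): detecting heteroskedasticity with the Breusch-Pagan test and refitting by weighted least squares; the synthetic series and the chosen weights are only illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=400)
y = 3 + 2 * x + rng.normal(scale=x)              # error variance grows with x

exog = sm.add_constant(x)
ols_fit = sm.OLS(y, exog).fit()
print(het_breuschpagan(ols_fit.resid, exog)[1])  # small p-value -> heteroskedasticity

# WLS with weights inversely proportional to the assumed error variance
wls_fit = sm.WLS(y, exog, weights=1.0 / x**2).fit()
print(wls_fit.params)
```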
Autocorrelation
• Correlated residuals
• Identification: the Durbin-Watson test, the Breusch-Godfrey test
• Remedy: the generalized least-squares method
• Selecting the correlation coefficient:
• Durbin-Watson estimate
• Cochrane-Orcutt method
• Hildreth-Lu method
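A minimal sketch (assumed): detecting residual autocorrelation and refitting with statsmodels' GLSAR, whose iterative fit is close in spirit to the Cochrane-Orcutt procedure; the AR(1) error process below is synthetic.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):                        # AR(1) errors with rho = 0.7
    e[t] = 0.7 * e[t - 1] + rng.normal()
y = 1 + 2 * x + e

exog = sm.add_constant(x)
ols_fit = sm.OLS(y, exog).fit()
print(durbin_watson(ols_fit.resid))                 # well below 2 -> positive autocorrelation
print(acorr_breusch_godfrey(ols_fit, nlags=2)[1])   # LM-test p-value

gls_fit = sm.GLSAR(y, exog, rho=1).iterative_fit(maxiter=10)
print(gls_fit.params)
```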
Time Series Patterns
Trend
• A trend exists when there is a long-term increase or decrease in the data. It does
not have to be linear. Sometimes we will refer to a trend “changing direction”
when it might go from an increasing trend to a decreasing trend.
Seasonal
• A seasonal pattern exists when a series is influenced by seasonal factors (e.g.,
the quarter of the year, the month, or day of the week). Seasonality is always of a
fixed and known period.
Cyclic
• A cyclic pattern exists when data exhibit rises and falls that are not of fixed
period. The duration of these fluctuations is usually of at least 2 years.
Time series decomposition
• additive model: yt = St + Tt + Et
• multiplicative model: yt = St × Tt × Et
St – seasonal component, Tt – trend-cycle component, Et – remainder
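A minimal sketch (assumed): classical additive and multiplicative decomposition with statsmodels on a synthetic monthly series; the series itself and its period are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2015-01-01", periods=96, freq="MS")
rng = np.random.default_rng(3)
y = pd.Series(np.linspace(100, 200, 96)                      # trend
              + 10 * np.sin(2 * np.pi * np.arange(96) / 12)  # yearly seasonality
              + rng.normal(scale=3, size=96), index=idx)

additive = seasonal_decompose(y, model="additive", period=12)
multiplicative = seasonal_decompose(y, model="multiplicative", period=12)
print(additive.seasonal.head(12))             # repeating seasonal pattern
print(multiplicative.resid.dropna().head())   # remainder component
```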
Time Series Decomposition
• What is the problem here? The seasonal component is not meaningful
Stationarity and differencing
• Stationary data:
• mean is a constant
• variance is a constant
• covariance is not a function of time
• Tests:
• Augmented Dickey-Fuller (H0: a unit root is present)
• KPSS (H0: the series is trend-stationary)
• Making the data stationary:
• log-transformation
• differencing
• log-transformation and differencing
The problem is that the absence of a unit root is not proof of stationarity
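A minimal sketch (assumed): running both tests and the transformations above with statsmodels and pandas on a synthetic series; the series and the chosen test options are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(4)
y = pd.Series(np.exp(np.cumsum(rng.normal(0.01, 0.05, size=500))))  # random walk in logs

def report(series, name):
    adf_p = adfuller(series.dropna())[1]                             # H0: unit root
    kpss_p = kpss(series.dropna(), regression="c", nlags="auto")[1]  # H0: stationary
    print(f"{name}: ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")

report(y, "raw")
report(np.log(y), "log")
report(np.log(y).diff(), "log + diff")   # usually the only clearly stationary variant here
```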
Autoregressive models
• forecast the variable of interest using a linear combination of past
values of the variable
yt = c + φ1yt−1 + φ2yt−2 + … + φpyt−p + et
• For an AR(1) model:
• When φ1 = 0, yt is equivalent to white noise
• When φ1 = 1 and c = 0, yt is equivalent to a random walk
• When φ1 = 1 and c ≠ 0, yt is equivalent to a random walk with drift
• When φ1 < 0, yt tends to oscillate between positive and negative values
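A minimal sketch (assumed): simulating the AR(1) special cases above and fitting an autoregression with statsmodels; the parameters are illustrative.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)

def simulate_ar1(phi, c, n=300):
    # y_t = c + phi * y_{t-1} + e_t with standard normal noise
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = c + phi * y[t - 1] + rng.normal()
    return y

white_noise = simulate_ar1(phi=0.0, c=0.0)
random_walk = simulate_ar1(phi=1.0, c=0.0)
random_walk_with_drift = simulate_ar1(phi=1.0, c=0.2)

fit = AutoReg(simulate_ar1(phi=0.6, c=1.0), lags=1).fit()
print(fit.params)   # roughly [c, phi] = [1.0, 0.6]
```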
Moving average models
• Rather than using past values of the forecast variable in a
regression, a moving average model uses past forecast
errors in a regression-like model
yt = c + et + θ1et−1 + θ2et−2 + … + θqet−q
ARIMA
• Combining differencing with autoregression and a moving average model, we obtain
a non-seasonal ARIMA model (y′t is the differenced series)
y′t = c + φ1y′t−1 + … + φpy′t−p + θ1et−1 + … + θqet−q + et
• Information criteria (k = 1 if c ≠ 0, otherwise k = 0; T is the number of observations):
• Akaike's Information Criterion (AIC): AIC = −2log(L) + 2(p + q + k + 1)
• Bayesian Information Criterion (BIC): BIC = AIC + (log(T) − 2)(p + q + k + 1)
• corrected Akaike's Information Criterion (AICc):
AICc = AIC + 2(p + q + k + 1)(p + q + k + 2) / (T − p − q − k − 2)
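A minimal sketch (assumed): fitting non-seasonal ARIMA(p, d, q) models with statsmodels and choosing the order by AIC; the grid and the synthetic series are illustrative (in practice, automated search tools can do this for you).

```python
import itertools
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
y = np.cumsum(rng.normal(0.1, 1.0, size=400))    # integrated series, so d = 1 is sensible

best = None
for p, q in itertools.product(range(3), range(3)):
    fit = ARIMA(y, order=(p, 1, q)).fit()
    if best is None or fit.aic < best[0]:
        best = (fit.aic, (p, 1, q), fit.bic)

print("best order by AIC:", best[1], "AIC:", round(best[0], 1), "BIC:", round(best[2], 1))
```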
Other econometric models
• SARIMA – ARIMA with a seasonal component
• ARFIMA – ARIMA allowing non-integer values of the differencing
parameter
• VAR – vector autoregression (predicts multiple target
variables and learns the dynamic relationships between them)
• ARCH (GARCH) – assumes heteroskedasticity and models the
mean and the variance separately
• Hierarchical time series (forecast both high- and low-level series)
Forecasting: Principles and Practice (Rob J Hyndman)
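A minimal sketch (assumed): a seasonal ARIMA fit with statsmodels' SARIMAX on a synthetic monthly series; the orders are illustrative. VAR lives in statsmodels.tsa.api and GARCH models in the separate arch package.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

idx = pd.date_range("2016-01-01", periods=120, freq="MS")
rng = np.random.default_rng(7)
y = pd.Series(0.5 * np.arange(120)                             # trend
              + 10 * np.sin(2 * np.pi * np.arange(120) / 12)   # yearly seasonality
              + rng.normal(scale=2, size=120), index=idx)

sarima = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
print(sarima.forecast(steps=12))     # one year ahead
```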
Why are econometric models good?
Simple
Statistically robust
Interpretable
Why are econometric models bad?
Simple
• mostly linear and cannot capture non-linear dependencies
• it takes a long time to optimize and train the models
• accuracy is often not very good
• the same features could be generated manually and used
with more complex models
Neural Networks
• RNNs are a hyped topic
• Sequence-to-Sequence modeling
• Experiment with the number of lag features and with adding “external” data
• Works with multiple time series with a long history
• Usually works worse than linear or boosting models
[Diagram: lagged inputs (…, 3, 2, 1) → LSTM → outputs (1, 2, 3, …); a second
variant adds a Dense layer fed with external features]
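Below is a minimal Keras sketch (assumed, not the slide's exact architecture): a window of lagged values feeds a small LSTM that predicts the next value; external features could be concatenated before the Dense layer as the diagram suggests. The synthetic series, window size, and layer sizes are all illustrative.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(8)
series = np.sin(np.arange(1000) / 20) + rng.normal(scale=0.1, size=1000)

# Sliding window of lagged values -> next value
window = 24
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]                       # (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[-1:], verbose=0))      # one-step-ahead forecast
```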
Stacking
[Diagram: validation scheme combining linear regressions and boosting]
Stacking
[Diagram: four time-ordered train/test splits – Fold 1, Fold 2, Fold 3, Fold 4]
Step 1
• train linear regression and optimize parameters on Fold 1
Step 2
• predict train and test on Fold 2; use the predictions as a new feature and apply boosting
Step 3
• train linear regression and optimize parameters on Fold 2
Step 4
• predict train and test on Fold 3; use the predictions as a new feature and apply boosting
…
Last step
• validate the final results on Fold 2, Fold 3 and Fold 4
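A minimal sketch (assumed): a time-ordered stacking scheme in scikit-learn, where a linear model's prediction is added as a feature for a boosting model on each later fold. It only approximates the step-by-step scheme above; the model choices, fold count, and synthetic data are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(9)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + np.sin(3 * X[:, 0]) + rng.normal(scale=0.3, size=500)

scores = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=4).split(X):
    linear = LinearRegression().fit(X[train_idx], y[train_idx])
    # add the linear model's prediction as an extra feature for boosting
    X_train = np.column_stack([X[train_idx], linear.predict(X[train_idx])])
    X_test = np.column_stack([X[test_idx], linear.predict(X[test_idx])])
    booster = GradientBoostingRegressor().fit(X_train, y[train_idx])
    scores.append(mean_absolute_error(y[test_idx], booster.predict(X_test)))

print(np.mean(scores))   # average MAE over the time-ordered folds
```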
Interpretation
• Linear regression – the easiest way to interpret features
• MARS (Earth) – a flexible regression method that automatically searches for interactions and non-linear
relationships
• parameters: number of interactions, regularization, smoothing, etc.
• ELI5 – supports the LightGBM and XGBoost sklearn APIs
• shows weights and explains predictions
• as it is a linear approximation, the <BIAS> term is usually large (which can be misleading)
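A minimal sketch (assumed): inspecting global feature weights and a single prediction with ELI5 for a LightGBM sklearn-API model; the data, model settings, and printed output format are illustrative.

```python
import numpy as np
import eli5
from lightgbm import LGBMRegressor

rng = np.random.default_rng(10)
X = rng.normal(size=(400, 4))
y = 2 * X[:, 0] - X[:, 2] + rng.normal(scale=0.2, size=400)

model = LGBMRegressor(n_estimators=100).fit(X, y)

# global feature weights and a single-prediction explanation as plain text
print(eli5.format_as_text(eli5.explain_weights(model)))
print(eli5.format_as_text(eli5.explain_prediction(model, X[0])))
```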
Summary
• Choose the metric and validation approach based on the business task
• Do basic EDA and start with a log-transformation
• Create a simple baseline
• Generate default features
• Try linear and boosting models (and RNNs)
• Add more features
• Don’t forget about ensembles and stacking :)
• Check feature weights and prediction explanations
Contacts
• ODS-slack: @vradchenko
• Email: radchenko.vitaliy.o@gmail.com
• Facebook: https://www.facebook.com/vitaliyradchenko127
Thank you

More Related Content

Similar to Time Series, Vitalii Radchenko

Qm 0809
Qm 0809 Qm 0809
Qm 0809
8430025979
 
3. Statistical Analysis.pptx
3. Statistical Analysis.pptx3. Statistical Analysis.pptx
3. Statistical Analysis.pptx
jeyanthisivakumar
 
O M Unit 3 Forecasting
O M Unit 3 ForecastingO M Unit 3 Forecasting
O M Unit 3 Forecasting
RASHMIPANWAR10
 
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher TrainingONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
Office for National Statistics
 
Demand forecasting
Demand forecastingDemand forecasting
Demand forecasting
PT Education, Indore
 
Regression
Regression Regression
Regression
JayeshGadhave1
 
Chapter34
Chapter34Chapter34
Chapter34
Ying Liu
 
The Strange World of Bibliometric Numbers: Implications for Professional Prac...
The Strange World of Bibliometric Numbers: Implications for Professional Prac...The Strange World of Bibliometric Numbers: Implications for Professional Prac...
The Strange World of Bibliometric Numbers: Implications for Professional Prac...
Sam Gray
 
Understanding statistics in research
Understanding statistics in researchUnderstanding statistics in research
Understanding statistics in research
Dr. Senthilvel Vasudevan
 
12 rhl gta
12 rhl gta12 rhl gta
12 rhl gta
Nabhoneil Basu
 
prediction of_inventory_management
prediction of_inventory_managementprediction of_inventory_management
prediction of_inventory_management
FEG
 
Economic NotesLipsey ppt ch02
Economic NotesLipsey ppt ch02Economic NotesLipsey ppt ch02
Economic NotesLipsey ppt ch02
Thangarajah Kopiram
 
Regression analysis made easy
Regression analysis made easyRegression analysis made easy
Regression analysis made easy
Weam Banjar
 
Biostatistics
BiostatisticsBiostatistics
Intro to Statistics.pptx
Intro to Statistics.pptxIntro to Statistics.pptx
Intro to Statistics.pptx
Elyada Wigati Pramaresti
 
FORECASTING ERRORS (2) (2).pptx
FORECASTING ERRORS (2) (2).pptxFORECASTING ERRORS (2) (2).pptx
FORECASTING ERRORS (2) (2).pptx
PradipDulal2
 
Day 1_ Introduction.pptx
Day 1_ Introduction.pptxDay 1_ Introduction.pptx
Day 1_ Introduction.pptx
AyushiAgarwal265370
 
A presentation for Multiple linear regression.ppt
A presentation for Multiple linear regression.pptA presentation for Multiple linear regression.ppt
A presentation for Multiple linear regression.ppt
vigia41
 
ststs nw.pptx
ststs nw.pptxststs nw.pptx
ststs nw.pptx
MrymNb
 
Time Series Analysis and Forecasting.ppt
Time Series Analysis and Forecasting.pptTime Series Analysis and Forecasting.ppt
Time Series Analysis and Forecasting.ppt
ssuser220491
 

Similar to Time Series, Vitalii Radchenko (20)

Qm 0809
Qm 0809 Qm 0809
Qm 0809
 
3. Statistical Analysis.pptx
3. Statistical Analysis.pptx3. Statistical Analysis.pptx
3. Statistical Analysis.pptx
 
O M Unit 3 Forecasting
O M Unit 3 ForecastingO M Unit 3 Forecasting
O M Unit 3 Forecasting
 
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher TrainingONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
ONS Guide to Social and Economic Research – Welsh Baccalaureate Teacher Training
 
Demand forecasting
Demand forecastingDemand forecasting
Demand forecasting
 
Regression
Regression Regression
Regression
 
Chapter34
Chapter34Chapter34
Chapter34
 
The Strange World of Bibliometric Numbers: Implications for Professional Prac...
The Strange World of Bibliometric Numbers: Implications for Professional Prac...The Strange World of Bibliometric Numbers: Implications for Professional Prac...
The Strange World of Bibliometric Numbers: Implications for Professional Prac...
 
Understanding statistics in research
Understanding statistics in researchUnderstanding statistics in research
Understanding statistics in research
 
12 rhl gta
12 rhl gta12 rhl gta
12 rhl gta
 
prediction of_inventory_management
prediction of_inventory_managementprediction of_inventory_management
prediction of_inventory_management
 
Economic NotesLipsey ppt ch02
Economic NotesLipsey ppt ch02Economic NotesLipsey ppt ch02
Economic NotesLipsey ppt ch02
 
Regression analysis made easy
Regression analysis made easyRegression analysis made easy
Regression analysis made easy
 
Biostatistics
BiostatisticsBiostatistics
Biostatistics
 
Intro to Statistics.pptx
Intro to Statistics.pptxIntro to Statistics.pptx
Intro to Statistics.pptx
 
FORECASTING ERRORS (2) (2).pptx
FORECASTING ERRORS (2) (2).pptxFORECASTING ERRORS (2) (2).pptx
FORECASTING ERRORS (2) (2).pptx
 
Day 1_ Introduction.pptx
Day 1_ Introduction.pptxDay 1_ Introduction.pptx
Day 1_ Introduction.pptx
 
A presentation for Multiple linear regression.ppt
A presentation for Multiple linear regression.pptA presentation for Multiple linear regression.ppt
A presentation for Multiple linear regression.ppt
 
ststs nw.pptx
ststs nw.pptxststs nw.pptx
ststs nw.pptx
 
Time Series Analysis and Forecasting.ppt
Time Series Analysis and Forecasting.pptTime Series Analysis and Forecasting.ppt
Time Series Analysis and Forecasting.ppt
 

More from Sigma Software

Fast is Best. Using .NET MinimalAPIs
Fast is Best. Using .NET MinimalAPIsFast is Best. Using .NET MinimalAPIs
Fast is Best. Using .NET MinimalAPIs
Sigma Software
 
"Are you developing or declining? Don't become an IT-dinosaur"
"Are you developing or declining? Don't become an IT-dinosaur""Are you developing or declining? Don't become an IT-dinosaur"
"Are you developing or declining? Don't become an IT-dinosaur"
Sigma Software
 
Michael Smolin, "Decrypting customer's cultural code"
Michael Smolin, "Decrypting customer's cultural code"Michael Smolin, "Decrypting customer's cultural code"
Michael Smolin, "Decrypting customer's cultural code"
Sigma Software
 
Max Kunytsia, “Why is continuous product discovery better than continuous del...
Max Kunytsia, “Why is continuous product discovery better than continuous del...Max Kunytsia, “Why is continuous product discovery better than continuous del...
Max Kunytsia, “Why is continuous product discovery better than continuous del...
Sigma Software
 
Marcelino Moreno, "Product Management Mindset"
Marcelino Moreno, "Product Management Mindset"Marcelino Moreno, "Product Management Mindset"
Marcelino Moreno, "Product Management Mindset"
Sigma Software
 
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
Sigma Software
 
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
Sigma Software
 
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
Sigma Software
 
Stoyan Atanasov “How crucial is the BA role in an IT Project"
Stoyan Atanasov “How crucial is the BA role in an IT Project"Stoyan Atanasov “How crucial is the BA role in an IT Project"
Stoyan Atanasov “How crucial is the BA role in an IT Project"
Sigma Software
 
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
Sigma Software
 
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
Sigma Software
 
VOLVO x HACK SPRINT
VOLVO x HACK SPRINTVOLVO x HACK SPRINT
VOLVO x HACK SPRINT
Sigma Software
 
Business digitalization trends and challenges
Business digitalization trends and challengesBusiness digitalization trends and challenges
Business digitalization trends and challenges
Sigma Software
 
Дмитро Терещенко, "How to secure your application with Secure SDLC"
Дмитро Терещенко, "How to secure your application with Secure SDLC"Дмитро Терещенко, "How to secure your application with Secure SDLC"
Дмитро Терещенко, "How to secure your application with Secure SDLC"Sigma Software
 
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
Sigma Software
 
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
Sigma Software
 
Training solutions and content creation
Training solutions and content creationTraining solutions and content creation
Training solutions and content creation
Sigma Software
 
False news - false truth: tips & tricks how to avoid them
False news - false truth: tips & tricks how to avoid themFalse news - false truth: tips & tricks how to avoid them
False news - false truth: tips & tricks how to avoid them
Sigma Software
 
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
Sigma Software
 
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
Sigma Software
 

More from Sigma Software (20)

Fast is Best. Using .NET MinimalAPIs
Fast is Best. Using .NET MinimalAPIsFast is Best. Using .NET MinimalAPIs
Fast is Best. Using .NET MinimalAPIs
 
"Are you developing or declining? Don't become an IT-dinosaur"
"Are you developing or declining? Don't become an IT-dinosaur""Are you developing or declining? Don't become an IT-dinosaur"
"Are you developing or declining? Don't become an IT-dinosaur"
 
Michael Smolin, "Decrypting customer's cultural code"
Michael Smolin, "Decrypting customer's cultural code"Michael Smolin, "Decrypting customer's cultural code"
Michael Smolin, "Decrypting customer's cultural code"
 
Max Kunytsia, “Why is continuous product discovery better than continuous del...
Max Kunytsia, “Why is continuous product discovery better than continuous del...Max Kunytsia, “Why is continuous product discovery better than continuous del...
Max Kunytsia, “Why is continuous product discovery better than continuous del...
 
Marcelino Moreno, "Product Management Mindset"
Marcelino Moreno, "Product Management Mindset"Marcelino Moreno, "Product Management Mindset"
Marcelino Moreno, "Product Management Mindset"
 
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
Andrii Pastushok, "Product Discovery in Outsourcing - What, When, and How"
 
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
Elena Turkenych “BA vs PM: Who' the right person, for the right job, with the...
 
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
Eleonora Budanova “BA+PM+DEV team: how to build the synergy”
 
Stoyan Atanasov “How crucial is the BA role in an IT Project"
Stoyan Atanasov “How crucial is the BA role in an IT Project"Stoyan Atanasov “How crucial is the BA role in an IT Project"
Stoyan Atanasov “How crucial is the BA role in an IT Project"
 
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
Olexandra Kovalyova, "Equivalence Partitioning, Boundary Values ​​Analysis, C...
 
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
Yana Lysa — "Decision Tables, State-Transition testing, Pairwase Testing"
 
VOLVO x HACK SPRINT
VOLVO x HACK SPRINTVOLVO x HACK SPRINT
VOLVO x HACK SPRINT
 
Business digitalization trends and challenges
Business digitalization trends and challengesBusiness digitalization trends and challenges
Business digitalization trends and challenges
 
Дмитро Терещенко, "How to secure your application with Secure SDLC"
Дмитро Терещенко, "How to secure your application with Secure SDLC"Дмитро Терещенко, "How to secure your application with Secure SDLC"
Дмитро Терещенко, "How to secure your application with Secure SDLC"
 
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
Яна Лиса, “Ефективні методи написання хороших мануальних тестових сценаріїв”
 
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
Тетяна Осетрова, “Модель зрілості розподіленної проектної команди”
 
Training solutions and content creation
Training solutions and content creationTraining solutions and content creation
Training solutions and content creation
 
False news - false truth: tips & tricks how to avoid them
False news - false truth: tips & tricks how to avoid themFalse news - false truth: tips & tricks how to avoid them
False news - false truth: tips & tricks how to avoid them
 
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
Анна Бойко, "Хороший контракт vs очікування клієнтів. Що вбереже вас, якщо вд...
 
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
Дмитрий Лапшин, "The importance of TEX and Internal Quality. How explain and ...
 

Recently uploaded

KuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
KuberTENes Birthday Bash Guadalajara - Introducción a Argo CDKuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
KuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
rodomar2
 
socradar-q1-2024-aviation-industry-report.pdf
socradar-q1-2024-aviation-industry-report.pdfsocradar-q1-2024-aviation-industry-report.pdf
socradar-q1-2024-aviation-industry-report.pdf
SOCRadar
 
Artificia Intellicence and XPath Extension Functions
Artificia Intellicence and XPath Extension FunctionsArtificia Intellicence and XPath Extension Functions
Artificia Intellicence and XPath Extension Functions
Octavian Nadolu
 
DDS-Security 1.2 - What's New? Stronger security for long-running systems
DDS-Security 1.2 - What's New? Stronger security for long-running systemsDDS-Security 1.2 - What's New? Stronger security for long-running systems
DDS-Security 1.2 - What's New? Stronger security for long-running systems
Gerardo Pardo-Castellote
 
Microservice Teams - How the cloud changes the way we work
Microservice Teams - How the cloud changes the way we workMicroservice Teams - How the cloud changes the way we work
Microservice Teams - How the cloud changes the way we work
Sven Peters
 
A Study of Variable-Role-based Feature Enrichment in Neural Models of Code
A Study of Variable-Role-based Feature Enrichment in Neural Models of CodeA Study of Variable-Role-based Feature Enrichment in Neural Models of Code
A Study of Variable-Role-based Feature Enrichment in Neural Models of Code
Aftab Hussain
 
Vitthal Shirke Java Microservices Resume.pdf
Vitthal Shirke Java Microservices Resume.pdfVitthal Shirke Java Microservices Resume.pdf
Vitthal Shirke Java Microservices Resume.pdf
Vitthal Shirke
 
Neo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
Neo4j - Product Vision and Knowledge Graphs - GraphSummit ParisNeo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
Neo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
Neo4j
 
What is Augmented Reality Image Tracking
What is Augmented Reality Image TrackingWhat is Augmented Reality Image Tracking
What is Augmented Reality Image Tracking
pavan998932
 
UI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
UI5con 2024 - Boost Your Development Experience with UI5 Tooling ExtensionsUI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
UI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
Peter Muessig
 
ALGIT - Assembly Line for Green IT - Numbers, Data, Facts
ALGIT - Assembly Line for Green IT - Numbers, Data, FactsALGIT - Assembly Line for Green IT - Numbers, Data, Facts
ALGIT - Assembly Line for Green IT - Numbers, Data, Facts
Green Software Development
 
Hand Rolled Applicative User Validation Code Kata
Hand Rolled Applicative User ValidationCode KataHand Rolled Applicative User ValidationCode Kata
Hand Rolled Applicative User Validation Code Kata
Philip Schwarz
 
E-commerce Application Development Company.pdf
E-commerce Application Development Company.pdfE-commerce Application Development Company.pdf
E-commerce Application Development Company.pdf
Hornet Dynamics
 
Oracle Database 19c New Features for DBAs and Developers.pptx
Oracle Database 19c New Features for DBAs and Developers.pptxOracle Database 19c New Features for DBAs and Developers.pptx
Oracle Database 19c New Features for DBAs and Developers.pptx
Remote DBA Services
 
Atelier - Innover avec l’IA Générative et les graphes de connaissances
Atelier - Innover avec l’IA Générative et les graphes de connaissancesAtelier - Innover avec l’IA Générative et les graphes de connaissances
Atelier - Innover avec l’IA Générative et les graphes de connaissances
Neo4j
 
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
mz5nrf0n
 
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOMLORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
lorraineandreiamcidl
 
Oracle 23c New Features For DBAs and Developers.pptx
Oracle 23c New Features For DBAs and Developers.pptxOracle 23c New Features For DBAs and Developers.pptx
Oracle 23c New Features For DBAs and Developers.pptx
Remote DBA Services
 
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptxLORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
lorraineandreiamcidl
 
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
Crescat
 

Recently uploaded (20)

KuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
KuberTENes Birthday Bash Guadalajara - Introducción a Argo CDKuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
KuberTENes Birthday Bash Guadalajara - Introducción a Argo CD
 
socradar-q1-2024-aviation-industry-report.pdf
socradar-q1-2024-aviation-industry-report.pdfsocradar-q1-2024-aviation-industry-report.pdf
socradar-q1-2024-aviation-industry-report.pdf
 
Artificia Intellicence and XPath Extension Functions
Artificia Intellicence and XPath Extension FunctionsArtificia Intellicence and XPath Extension Functions
Artificia Intellicence and XPath Extension Functions
 
DDS-Security 1.2 - What's New? Stronger security for long-running systems
DDS-Security 1.2 - What's New? Stronger security for long-running systemsDDS-Security 1.2 - What's New? Stronger security for long-running systems
DDS-Security 1.2 - What's New? Stronger security for long-running systems
 
Microservice Teams - How the cloud changes the way we work
Microservice Teams - How the cloud changes the way we workMicroservice Teams - How the cloud changes the way we work
Microservice Teams - How the cloud changes the way we work
 
A Study of Variable-Role-based Feature Enrichment in Neural Models of Code
A Study of Variable-Role-based Feature Enrichment in Neural Models of CodeA Study of Variable-Role-based Feature Enrichment in Neural Models of Code
A Study of Variable-Role-based Feature Enrichment in Neural Models of Code
 
Vitthal Shirke Java Microservices Resume.pdf
Vitthal Shirke Java Microservices Resume.pdfVitthal Shirke Java Microservices Resume.pdf
Vitthal Shirke Java Microservices Resume.pdf
 
Neo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
Neo4j - Product Vision and Knowledge Graphs - GraphSummit ParisNeo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
Neo4j - Product Vision and Knowledge Graphs - GraphSummit Paris
 
What is Augmented Reality Image Tracking
What is Augmented Reality Image TrackingWhat is Augmented Reality Image Tracking
What is Augmented Reality Image Tracking
 
UI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
UI5con 2024 - Boost Your Development Experience with UI5 Tooling ExtensionsUI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
UI5con 2024 - Boost Your Development Experience with UI5 Tooling Extensions
 
ALGIT - Assembly Line for Green IT - Numbers, Data, Facts
ALGIT - Assembly Line for Green IT - Numbers, Data, FactsALGIT - Assembly Line for Green IT - Numbers, Data, Facts
ALGIT - Assembly Line for Green IT - Numbers, Data, Facts
 
Hand Rolled Applicative User Validation Code Kata
Hand Rolled Applicative User ValidationCode KataHand Rolled Applicative User ValidationCode Kata
Hand Rolled Applicative User Validation Code Kata
 
E-commerce Application Development Company.pdf
E-commerce Application Development Company.pdfE-commerce Application Development Company.pdf
E-commerce Application Development Company.pdf
 
Oracle Database 19c New Features for DBAs and Developers.pptx
Oracle Database 19c New Features for DBAs and Developers.pptxOracle Database 19c New Features for DBAs and Developers.pptx
Oracle Database 19c New Features for DBAs and Developers.pptx
 
Atelier - Innover avec l’IA Générative et les graphes de connaissances
Atelier - Innover avec l’IA Générative et les graphes de connaissancesAtelier - Innover avec l’IA Générative et les graphes de connaissances
Atelier - Innover avec l’IA Générative et les graphes de connaissances
 
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
原版定制美国纽约州立大学奥尔巴尼分校毕业证学位证书原版一模一样
 
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOMLORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
LORRAINE ANDREI_LEQUIGAN_HOW TO USE ZOOM
 
Oracle 23c New Features For DBAs and Developers.pptx
Oracle 23c New Features For DBAs and Developers.pptxOracle 23c New Features For DBAs and Developers.pptx
Oracle 23c New Features For DBAs and Developers.pptx
 
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptxLORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
LORRAINE ANDREI_LEQUIGAN_HOW TO USE WHATSAPP.pptx
 
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...
 

Time Series, Vitalii Radchenko

  • 3. What is Econometrics? • is the application of statistical methods to economic data and is described as the branch of economics that aims to give empirical content to economic relations
  • 4. What is Econometrics? • is the application of statistical methods to economic data and is described as the branch of economics that aims to give empirical content to economic relations • Basic tools:
  • 5. What is Econometrics? • is the application of statistical methods to economic data and is described as the branch of economics that aims to give empirical content to economic relations • Basic tools: • linear regression models
  • 6. What is Econometrics? • is the application of statistical methods to economic data and is described as the branch of economics that aims to give empirical content to economic relations • Basic tools: • linear regression models • statistical theory
  • 8. The main goals • To find estimators that have desirable statistical properties:
  • 9. The main goals • To find estimators that have desirable statistical properties: • unbiasedness
  • 10. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency
  • 11. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency
  • 12. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for
  • 13. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories
  • 14. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories • forecasting macroeconomic indexes
  • 15. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories • forecasting macroeconomic indexes • predicting revenue
  • 16. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories • forecasting macroeconomic indexes • predicting revenue • estimating the impact of something
  • 17. The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories • forecasting macroeconomic indexes • predicting revenue • estimating the impact of something Interpretability and Statistical robustness
  • 18. Government The main goals • To find estimators that have desirable statistical properties: • unbiasedness • efficiency • consistency • Applied for • assessing economic theories • forecasting macroeconomic indexes • predicting revenue • estimating the impact of something Interpretability and Statistical robustness
  • 21. Basic approach Linear regression • Residuals assumptions:
  • 22. Basic approach Linear regression • Residuals assumptions: 1. Mean equals 0
  • 23. Basic approach Linear regression • Residuals assumptions: 1. Mean equals 0 2. Deviation is constant (homoscedasticity)
  • 24. Basic approach Linear regression • Residuals assumptions: 1. Mean equals 0 2. Deviation is constant (homoscedasticity) 3. Independent residuals (covariation equals 0)
  • 25. Basic approach Linear regression • Residuals assumptions: 1. Mean equals 0 2. Deviation is constant (homoscedasticity) 3. Independent residuals (covariation equals 0) 4. Independence of residuals and regressors
  • 26. Basic approach Linear regression • Residuals assumptions: 1. Mean equals 0 2. Deviation is constant (homoscedasticity) 3. Independent residuals (covariation equals 0) 4. Independence of residuals and regressors 5. Residuals should be normally distributed
  • 27. Gauss-Markov theorem If all five assumptions are satisfied for a simple linear regression, then the variance of the OLS estimates will be the smallest among all unbiased estimates
  • 28. Hypothesis estimation • If the assumption of the residuals normality is satisfied, then we test the hypotheses by comparing with the values of the Fisher distribution • If not satisfied – Chi-square
  • 30. Hypothesis • The adequacy of the model (H0: R^2 = 0)
  • 31. Hypothesis • The adequacy of the model (H0: R^2 = 0) • The significance of the correlation between the variables (H0: rxy = 0)
  • 32. Hypothesis • The adequacy of the model (H0: R^2 = 0) • The significance of the correlation between the variables (H0: rxy = 0) • The significance of the regression coefficients (H0: B = 0)
  • 33. Hypothesis • The adequacy of the model (H0: R^2 = 0) • The significance of the correlation between the variables (H0: rxy = 0) • The significance of the regression coefficients (H0: B = 0) • Multicollinearity (VIF, Farr-Glauber criterion)
  • 34. Hypothesis • The adequacy of the model (H0: R^2 = 0) • The significance of the correlation between the variables (H0: rxy = 0) • The significance of the regression coefficients (H0: B = 0) • Multicollinearity (VIF, Farr-Glauber criterion) • Check of the functional form (criterion RESET)
  • 36. Heteroskedasticity • Identification: Golffred-Quondt criterion, White criterion, Broyush-Pagan criterion, Glaser criterion
  • 37. Heteroskedasticity • Identification: Golffred-Quondt criterion, White criterion, Broyush-Pagan criterion, Glaser criterion • Second condition is not satisfied for the variance equality
  • 38. Heteroskedasticity • Identification: Golffred-Quondt criterion, White criterion, Broyush-Pagan criterion, Glaser criterion • Second condition is not satisfied for the variance equality • Weighted least squares
  • 39. Heteroskedasticity • Identification: Golffred-Quondt criterion, White criterion, Broyush-Pagan criterion, Glaser criterion • Second condition is not satisfied for the variance equality • Weighted least squares
  • 43–48. Autocorrelation • Correlated residuals • Identification: Durbin-Watson test, Breusch-Godfrey test • Remedy: the generalized least squares (GLS) method • Estimating the correlation coefficient: • Durbin-Watson estimate • Cochrane-Orcutt method • Hildreth-Lu method
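The identification step can be sketched with statsmodels as well (synthetic data with AR(1) errors; the 0.7 coefficient is illustrative):

```python
# Autocorrelation checks: Durbin-Watson statistic and Breusch-Godfrey test.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(2)
e = np.zeros(300)
for t in range(1, 300):                           # build autocorrelated residuals
    e[t] = 0.7 * e[t - 1] + rng.normal()
x = rng.normal(size=300)
y = 1 + 2 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
print("Durbin-Watson:", durbin_watson(res.resid))               # ~2 means no autocorrelation
print("Breusch-Godfrey p-value:", acorr_breusch_godfrey(res, nlags=1)[1])
```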
  • 51–55. Time Series Patterns • Trend: a trend exists when there is a long-term increase or decrease in the data. It does not have to be linear. Sometimes we will refer to a trend “changing direction” when it might go from an increasing trend to a decreasing trend. • Seasonal: a seasonal pattern exists when a series is influenced by seasonal factors (e.g., the quarter of the year, the month, or day of the week). Seasonality is always of a fixed and known period. • Cyclic: a cyclic pattern exists when data exhibit rises and falls that are not of fixed period. The duration of these fluctuations is usually of at least 2 years.
  • 57–59. Time series decomposition • additive model: yt = St + Tt + Et • multiplicative model: yt = St × Tt × Et (St – seasonal component, Tt – trend-cycle component, Et – remainder)
  • 60–61. Time Series Decomposition • What is the problem here? Seasonal component is not meaningful
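A minimal decomposition sketch with statsmodels, assuming a synthetic monthly series (the trend and seasonal shapes are made up):

```python
# Additive vs. multiplicative decomposition of a synthetic monthly series.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2015-01-01", periods=72, freq="MS")
trend = np.linspace(100, 200, 72)
season = 10 * np.sin(2 * np.pi * np.arange(72) / 12)
y = pd.Series(trend + season + np.random.default_rng(3).normal(size=72), index=idx)

add = seasonal_decompose(y, model="additive")          # yt = St + Tt + Et
mul = seasonal_decompose(y, model="multiplicative")    # yt = St * Tt * Et
print(add.seasonal.head())                             # repeating seasonal component
print(mul.trend.dropna().head())                       # smoothed trend-cycle component
```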
  • 64–76. Stationarity and differencing • Stationary data: • mean is constant • variance is constant • covariance is not a function of time • Tests: • Augmented Dickey-Fuller (null: a unit root is present) • KPSS (null: trend stationarity) • Note: the absence of a unit root is not by itself a proof of stationarity • Making data stationary: • log-transformation • differencing • log-transformation and differencing
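A minimal sketch of the checks and transforms above, assuming a synthetic positive random-walk series:

```python
# ADF and KPSS tests before and after a log-transformation plus differencing.
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(4)
y = np.exp(np.cumsum(rng.normal(0.01, 0.05, size=500)))    # non-stationary, strictly positive

print("ADF p-value (raw):", adfuller(y)[1])                 # large p -> unit root not rejected
print("KPSS p-value (raw):", kpss(y, regression="c")[1])    # small p -> stationarity rejected

y_diff = np.diff(np.log(y))                                 # log-transformation and differencing
print("ADF p-value (log-diff):", adfuller(y_diff)[1])
print("KPSS p-value (log-diff):", kpss(y_diff, regression="c")[1])
```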
  • 78–84. Autoregressive models • forecast the variable of interest using a linear combination of past values of the variable: yt = c + φ1·yt−1 + φ2·yt−2 + … + φp·yt−p + et • For an AR(1) model: • When φ1 = 0, yt is equivalent to white noise • When φ1 = 1 and c = 0, yt is equivalent to a random walk • When φ1 = 1 and c ≠ 0, yt is equivalent to a random walk with drift • When φ1 < 0, yt tends to oscillate between positive and negative values
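To make the AR(p) idea concrete, a minimal sketch that simulates an AR(1) process and recovers its parameters (φ1 = 0.6 and c = 2.0 are illustrative values, not from the talk):

```python
# Simulate an AR(1) process and fit it with statsmodels' AutoReg.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)
phi, c = 0.6, 2.0
y = np.zeros(500)
for t in range(1, 500):
    y[t] = c + phi * y[t - 1] + rng.normal()

res = AutoReg(y, lags=1).fit()
print(res.params)               # estimates of [c, phi1]
print(res.forecast(steps=5))    # forecasts built from past values only
```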
  • 85. Moving average models • Rather than use past values of the forecast variable in a regression, a moving average model uses past forecast errors in a regression-like model: yt = c + et + θ1·et−1 + θ2·et−2 + … + θq·et−q
  • 86–91. ARIMA • combining differencing with autoregression and a moving average model, we obtain a non-seasonal ARIMA model: y′t = c + φ1·y′t−1 + … + φp·y′t−p + θ1·et−1 + … + θq·et−q + et • Information criteria: • Akaike’s Information Criterion: AIC = −2·log(L) + 2(p + q + k + 1) • Bayesian Information Criterion: BIC = AIC + (log(T) − 2)(p + q + k + 1) • corrected Akaike’s Information Criterion: AICc = AIC + 2(p + q + k + 1)(p + q + k + 2) / (T − p − q − k − 2)
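A minimal sketch of fitting a few candidate non-seasonal ARIMA orders and comparing their information criteria (the orders and the synthetic series are illustrative, not recommendations):

```python
# Fit several ARIMA(p, d, q) models and compare AIC / BIC / AICc.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
y = np.cumsum(rng.normal(size=300))      # integrated series, so d = 1 is reasonable

for order in [(1, 1, 0), (0, 1, 1), (1, 1, 1)]:
    res = ARIMA(y, order=order).fit()
    print(order, "AIC:", round(res.aic, 1),
                 "BIC:", round(res.bic, 1),
                 "AICc:", round(res.aicc, 1))
```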
  • 93–98. Other econometrics models • SARIMA – ARIMA with a seasonal component • ARFIMA – ARIMA allowing non-integer values of the differencing parameter • VAR – vector autoregression (predicts multiple target variables and learns the dynamic relationships between them) • ARCH (GARCH) – assumes heteroskedasticity and models the mean and the variance separately • Hierarchical time series (forecast at both high and low levels of aggregation) • Reference: Forecasting: Principles and Practice (Rob J Hyndman)
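For the seasonal variant, a minimal SARIMA sketch via statsmodels' SARIMAX (the (p,d,q)(P,D,Q,s) orders and the synthetic series are assumptions for illustration):

```python
# Fit a SARIMA model on a synthetic series with trend and yearly (s=12) seasonality.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
t = np.arange(240)
y = 0.05 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(size=240)

res = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
print("AIC:", res.aic)
print(res.forecast(steps=12))    # forecast one full seasonal cycle ahead
```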
  • 101–102. Why are econometrics models good? Simple, statistically robust, interpretable
  • 105–108. Why are econometrics models bad? Simple, but: • mostly linear and cannot catch non-linear dependencies • optimizing and training the models takes a long time • accuracy is not very good • the same features could be generated manually and used with more complex models
  • 110–114. Neural Networks • RNNs are a hyped topic • Sequence-to-sequence modeling • Experiment with the number of lag features and with adding “external” data • Work well with multiple time series that have a long history • Usually work worse than linear or boosting models [diagram: a window of lagged values fed to an LSTM that outputs the forecast sequence; a second variant feeds external features through a Dense layer]
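A minimal Keras sketch of the idea in the diagram; the window length, horizon, layer sizes and feature counts are assumptions, not values from the talk:

```python
# An LSTM encodes a window of lagged values; external features are concatenated
# before a Dense layer that predicts the next `horizon` steps at once.
import numpy as np
from tensorflow.keras import layers, Model

n_lags, horizon, n_external = 24, 3, 5

seq_in = layers.Input(shape=(n_lags, 1), name="lags")
ext_in = layers.Input(shape=(n_external,), name="external")

encoded = layers.LSTM(32)(seq_in)                  # summary of the lag window
merged = layers.Concatenate()([encoded, ext_in])   # add "external" data
out = layers.Dense(horizon)(merged)                # multi-step forecast

model = Model([seq_in, ext_in], out)
model.compile(optimizer="adam", loss="mse")

# dummy arrays just to show the expected shapes
X_seq = np.random.randn(100, n_lags, 1)
X_ext = np.random.randn(100, n_external)
y = np.random.randn(100, horizon)
model.fit([X_seq, X_ext], y, epochs=1, verbose=0)
```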
  • 116–121. Stacking (Fold 1 | Fold 2 | Fold 3 | Fold 4, each split into Train and Test) • Step 1: train a linear regression and optimize its parameters on Fold 1 • Step 2: predict train and test on Fold 2; use the predictions as a new feature and apply boosting • Step 3: train a linear regression and optimize its parameters on Fold 2 • Step 4: predict train and test on Fold 3; use the predictions as a new feature and apply boosting • … • Last step: validate the final results on Fold 2, Fold 3 and Fold 4
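A minimal sketch of this scheme; the fold boundaries, the synthetic data and the choice of LightGBM as the boosting model are assumptions for illustration:

```python
# On each fold: a linear model is fit on the earlier data, and its predictions
# become an extra feature for a boosting model trained on the same fold.
import numpy as np
from sklearn.linear_model import LinearRegression
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = X @ rng.normal(size=5) + rng.normal(size=400)

folds = [(slice(0, 100), slice(100, 200)),    # (train, test) index ranges,
         (slice(0, 200), slice(200, 300)),    # expanding window over time
         (slice(0, 300), slice(300, 400))]

for train_idx, test_idx in folds:
    lin = LinearRegression().fit(X[train_idx], y[train_idx])
    X_tr = np.column_stack([X[train_idx], lin.predict(X[train_idx])])   # new feature
    X_te = np.column_stack([X[test_idx], lin.predict(X[test_idx])])
    boost = LGBMRegressor(n_estimators=100).fit(X_tr, y[train_idx])
    print("fold MSE:", np.mean((boost.predict(X_te) - y[test_idx]) ** 2))
```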
  • 123–128. Interpretation • Linear regression – the easiest way to interpret features • MARS (Earth) – a flexible regression method that automatically searches for interactions and non-linear relationships • parameters: number of interactions, regularization, smoothing, etc. • ELI5 – supports the LightGBM and XGBoost sklearn APIs • shows feature weights and explains individual predictions • since it is a linear approximation, the <BIAS> term is usually large (and can be misleading)
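A minimal ELI5 sketch on a LightGBM sklearn-API model (the synthetic data and feature names are illustrative; exact support may depend on the eli5 and lightgbm versions):

```python
# Global feature weights and a per-prediction explanation (including <BIAS>).
import numpy as np
import eli5
from lightgbm import LGBMRegressor

rng = np.random.default_rng(8)
X = rng.normal(size=(500, 3))
y = 3 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500)

model = LGBMRegressor(n_estimators=100).fit(X, y)

print(eli5.format_as_text(eli5.explain_weights(model)))           # feature weights
print(eli5.format_as_text(eli5.explain_prediction(model, X[0])))  # one prediction explained
```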
  • 130–137. Summary • Choose metric and validation approach based on the business task • Do basic EDA and start with log-transformation • Create a simple baseline • Generate default features • Try linear and boosting models (and RNNs) • Add more features • Don’t forget about ensembles and stacking :) • Check feature weights and prediction explanations
  • 138. Contacts • ODS-slack: @vradchenko • Email: radchenko.vitaliy.o@gmail.com • Facebook: https://www.facebook.com/vitaliyradchenko127