Data + Algorithms + Computing = Machine Learning
• Gradient descent is a widely used optimization algorithm that fits the parameters of a machine learning model by minimizing the cost function.
• Gradient descent updates the parameters iteratively during the learning process by calculating the gradient of the cost function with respect to the parameters.
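The iterative update described above can be sketched on a toy cost function; the cost J(w) = (w - 3)^2, the parameter name `w`, the learning rate, and the iteration count are all illustrative assumptions, not from the slides.

```python
def gradient(w):
    # Derivative of the toy cost J(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

w = 0.0              # initial parameter guess (assumed)
learning_rate = 0.1  # assumed step size
for _ in range(100):
    # Move the parameter against the gradient of the cost
    w -= learning_rate * gradient(w)
```

After enough iterations, w approaches 3, the minimizer of the cost.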
Gradient descent is often used as an optimization algorithm
to find the optimal values of the parameters in a Linear
regression model.
Simple Linear regression involves only one independent
variable (x) and one dependent variable (y).
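A minimal sketch of fitting such a model, y = w·x + b, with gradient descent on the mean squared error; the data points, learning rate, and iteration count below are made up for illustration.

```python
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated from y = 2x + 1

w, b = 0.0, 0.0   # parameters to learn
lr = 0.05         # assumed learning rate
n = len(xs)
for _ in range(2000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
    grad_w = (2.0 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = (2.0 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
    w -= lr * grad_w
    b -= lr * grad_b
```

The learned parameters converge toward w ≈ 2 and b ≈ 1, the values that generated the data.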
• What we actually do in Linear
regression is find the parameters that
best fit the straight line to the
observations by minimizing the errors
between the predicted outcomes and
the observations.
• In other words, the best model in Linear
regression is the linear equation model
with the best parameters.
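The "minimizing the errors" idea can be made concrete with a mean-squared-error helper; the data points and candidate parameters below are invented for illustration.

```python
def mse(w, b, xs, ys):
    # Mean squared error between the line w*x + b and the observations
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = [0.0, 1.0, 2.0]
ys = [1.0, 3.0, 5.0]               # generated from y = 2x + 1
err_good = mse(2.0, 1.0, xs, ys)   # the true parameters: zero error
err_bad = mse(1.0, 0.0, xs, ys)    # worse parameters: larger error
```

A lower error means the line fits the observations better, so the parameters (2.0, 1.0) are preferred here.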
• What we actually do in Linear regression is estimate the output
from the estimated parameters and the given inputs.
• Each time we try to fit the model to the observed data, we
choose/estimate new parameters.
• After that, we use the hypothesis to validate whether the chosen
parameters fit the observations better, by calculating the errors
between the hypothesis and the observations.
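The choose-then-validate loop above can be sketched as a simple search over candidate parameters; the hypothesis h(x) = w·x, the candidate values, and the data are illustrative assumptions.

```python
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # generated from y = 2x

def error(w):
    # Squared error between the hypothesis w*x and the observations
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys))

best_w, best_err = None, float("inf")
for w in [0.5, 1.0, 1.5, 2.0, 2.5]:   # candidate parameters (assumed)
    e = error(w)
    if e < best_err:                  # keep the hypothesis that fits best
        best_w, best_err = w, e
```

Gradient descent replaces this exhaustive trial with directed updates, but the validate-by-error step is the same.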
In Linear regression, we usually have one dependent
variable as a target output. The inputs, however, can consist
of one or more independent variables.
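With several independent variables, the hypothesis generalizes to a weighted sum of the inputs; the function name, weights, and input values below are hypothetical.

```python
def predict(weights, bias, features):
    # h(x) = w1*x1 + w2*x2 + ... + b
    return sum(w * x for w, x in zip(weights, features)) + bias

y_hat = predict([2.0, -1.0], 0.5, [3.0, 4.0])   # 2*3 - 1*4 + 0.5
```

The single-variable case is simply the special case with one weight.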
• In more detail, if the gradient of the cost
function is positive, increasing the value
of the parameters will increase the cost.
Therefore, to minimize the cost, the
parameters should be decreased.
• On the other hand, if the gradient of the
cost function is negative, decreasing the
value of the parameters will increase the
cost. In this case, the parameters should
be increased in order to minimize the
cost.
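Both cases reduce to the standard update w_new = w − learning_rate × gradient, which automatically moves the parameter opposite to the gradient's sign; the numbers below are illustrative.

```python
def step(w, grad, lr=0.1):
    # Gradient descent update: move opposite to the gradient's sign
    return w - lr * grad

w_after_pos = step(5.0, 4.0)    # positive gradient -> parameter decreases
w_after_neg = step(5.0, -4.0)   # negative gradient -> parameter increases
```

Subtracting the gradient handles both sign cases with a single rule.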