Linear regression is a machine learning algorithm that models a linear relationship between a dependent variable and one or more independent variables: simple linear regression uses a single independent variable, while multiple linear regression uses several. The goal is to find the "best fit" line that minimizes the error between predicted and actual values. This is typically done by choosing coefficients that minimize a cost function, such as the mean squared error, often via gradient descent. Model performance is evaluated using metrics like the R-squared value, where a value closer to 1 indicates a better fit. Key assumptions of linear regression include a linear relationship between the variables, little or no multicollinearity among the predictors, homoscedasticity (constant error variance), normally distributed error terms, and no autocorrelation of the errors.
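As a minimal sketch of these ideas, the snippet below fits a simple linear regression with gradient descent on a mean-squared-error cost and then evaluates the fit with R-squared. The data, learning rate, and iteration count are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative synthetic data: y = 3x + 2 plus Gaussian noise (assumed, not from the text)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)

# Gradient descent on the MSE cost J(w, b) = (1/n) * sum((w*x + b - y)^2)
w, b = 0.0, 0.0
lr = 0.01          # learning rate (assumed)
n = len(X)
for _ in range(5000):
    error = w * X + b - y              # residuals of current fit
    w -= lr * (2 / n) * np.dot(error, X)  # dJ/dw
    b -= lr * (2 / n) * error.sum()       # dJ/db

# R-squared: 1 - SS_residual / SS_total; closer to 1 means a better fit
pred = w * X + b
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"w={w:.2f}, b={b:.2f}, R^2={r_squared:.3f}")
```

With data this close to linear, the recovered coefficients land near the true slope and intercept and R-squared is close to 1; on real data, the assumptions listed above should be checked before trusting the fit.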