In a linear regression model, how can we define the cost function? And after defining the cost function, how do we minimize the error term?
Statistical programs, such as R, typically use least squares estimation. The cost function is implicitly the sum of squared residuals, and the optimal parameters are found deterministically with a closed-form solution (the normal equation). Because of this, you don't have to worry about choosing a loss function yourself.
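As a minimal sketch of this idea (the synthetic data, variable names, and dimensions below are illustrative assumptions, not part of the answer), the sum-of-squares cost and its closed-form minimizer look like this in numpy:

```python
import numpy as np

# Illustrative synthetic data: an intercept column plus one feature.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

# Cost function: sum of squared residuals, J(beta) = ||y - X beta||^2
def sse(beta, X, y):
    residuals = y - X @ beta
    return residuals @ residuals

# Closed-form minimizer (normal equation): beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print("estimated coefficients:", beta_hat)
print("residual sum of squares:", sse(beta_hat, X, y))
```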
If, instead, you want to train your linear regression with a gradient descent algorithm, you have to specify a loss function explicitly in order to run it. Classical loss functions for regression are Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE).
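Here is a small sketch of that approach using MSE as the loss; the learning rate, iteration count, and synthetic data are assumptions chosen just for illustration:

```python
import numpy as np

# Illustrative synthetic data for gradient descent on the MSE loss.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = 1.5 - 2.0 * X[:, 1] + rng.normal(scale=0.3, size=200)

# Loss function: Mean Squared Error
def mse(beta, X, y):
    return np.mean((y - X @ beta) ** 2)

beta = np.zeros(X.shape[1])  # start from all-zero coefficients
lr = 0.1                     # learning rate (assumed value)

for _ in range(1000):
    # Gradient of the MSE with respect to beta: -(2/n) X'(y - X beta)
    grad = -2.0 / len(y) * X.T @ (y - X @ beta)
    beta -= lr * grad        # step downhill along the gradient

print("estimated coefficients:", beta)
print("final MSE:", mse(beta, X, y))
```

With a convex loss like MSE and a suitably small learning rate, this converges to the same solution the closed-form least squares estimate gives.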
Leevo