
Suppose I'm training a linear regression model using K-fold cross-validation. I train K times, each time with a different train/test split. Each time I train, I get different parameters (feature coefficients, in the linear regression case), so I end up with K sets of parameters at the end of cross-validation. How do I arrive at the final parameters for my model?

If I'm also using cross-validation to tune hyperparameters, do I have to run another round of cross-validation after fixing the hyperparameters of my model?

NAS_2339

1 Answer


Usually, the aim of K-fold cross-validation is to check how a model performs (both on average and how much it varies across folds) given some hyper-parameter setting. We then pick the "best" set of hyper-parameters.

Afterwards, we fix the hyper-parameters and train the model on the full dataset to squeeze out all the juice.
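A minimal sketch of that workflow, assuming scikit-learn (the answer names no library, and the ridge model, data, and alpha grid here are all illustrative): `GridSearchCV` runs K-fold cross-validation over the hyper-parameter grid, and with `refit=True` (the default) it retrains the best configuration on the full dataset, which is exactly the "one final model" step.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

# Toy regression data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# 5-fold CV over a small alpha grid; refit=True (the default) retrains
# the winning model on the FULL dataset once the grid search is done.
search = GridSearchCV(Ridge(), {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X, y)

print(search.best_params_)           # the chosen hyper-parameter
print(search.best_estimator_.coef_)  # coefficients from the full-data refit
```

The K per-fold coefficient vectors are never averaged or combined; they exist only to score each hyper-parameter setting, and the final coefficients come from the single full-data refit.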

In the case where there are no hyper-parameters to tune, e.g. simple linear regression, cross-validation can still give you an estimate of how your model will perform. You then train a final model on all the data.
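For the no-hyper-parameter case, the same idea can be sketched like this (again assuming scikit-learn, with made-up toy data): cross-validation produces only a performance estimate, and the deployed coefficients come from one fit on everything.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Toy regression data (illustrative only)
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# 5-fold CV: an estimate of generalization performance, with its spread
scores = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2")
print(scores.mean(), scores.std())

# The final model is then trained on ALL the data
final_model = LinearRegression().fit(X, y)
print(final_model.coef_)
```

The five fold models are discarded; `scores` tells you how well this model class is likely to do, and `final_model` is the one you keep.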

lpounng