Key Concepts of Machine Learning | Day (2/45) | A2Z ML | Mohd Saqib
Read my previous blog if you haven't already — Prev

Day 2
Index:
- Variance vs Bias
- Overfitting and Underfitting
- Regularization
- Parameter vs Hyperparameter

Variance vs Bias
In machine learning, bias and variance are two sources of error in a model's predictions.
Bias refers to the error that is introduced by approximating a real-world problem, which may be extremely complex, by a simpler model. A model with high bias pays little attention to the training data and oversimplifies the problem, resulting in underfitting. A high bias model will have a high training error.
Variance, on the other hand, refers to the error introduced by the model’s sensitivity to small fluctuations in the training data. A model with high variance pays too much attention to the training data and fits it too closely, resulting in overfitting. A high variance model will have a high test error and a low training error.
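The bias/variance contrast is easy to see with polynomial regression. Below is a minimal sketch (the cubic `true_fn`, the noise level, and the polynomial degrees are all illustrative choices, not from the original post): a degree-1 fit underfits the data (high bias, high training error), while a very high-degree fit chases the noise (high variance, low training error but high test error).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: a cubic function plus Gaussian noise.
def true_fn(x):
    return x**3 - x

x_train = rng.uniform(-2, 2, 30)
y_train = true_fn(x_train) + rng.normal(0, 0.5, 30)
x_test = rng.uniform(-2, 2, 200)
y_test = true_fn(x_test) + rng.normal(0, 0.5, 200)

def fit_and_score(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coefs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# degree 1:  high bias  -> underfits (high train AND test error)
# degree 3:  matches the true function's complexity
# degree 15: high variance -> overfits (low train error, worse test error)
for d in (1, 3, 15):
    tr, te = fit_and_score(d)
    print(f"degree={d:2d}  train MSE={tr:.3f}  test MSE={te:.3f}")
```

Running this shows the pattern described above: training error only goes down as the degree grows, but test error is smallest near the true complexity and worsens once the model starts memorizing noise.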