## JIO Interview Question | Modelling

Question

Why is it important to have information about the bias-variance trade-off while modelling?

## Answers (3)

Bias-Variance Trade-Off: the most important concept in Machine Learning.

This trade-off occurs because of two errors: bias error and variance error.

Q1. What is Bias Error?

A1. Bias error is the error made by the model on the training data. Bias error is high when our model tends to under-fit the train data, so high bias means that our model is under-fitting. This is always calculated on the train data.

Q2. What is Variance Error?

A2. Variance error is the error made by the model on the test data. Variance error is high when our model tends to over-fit the train data and fails to generalize, so high variance means the model is over-fitting. This is always calculated on the test data.

So, if we plot training and testing error against model complexity, we can see that as model complexity increases the training error (bias error) keeps decreasing, while the testing error (variance error) first decreases and then starts to rise once the model begins to over-fit.

So while building the model, or before finalizing it for deployment, we should check both the bias and the variance error. The best model is the one that has both low bias and low variance error. But in reality it is hard to achieve both simultaneously, so we have to trade bias error for variance error or vice-versa. This is generally called the Bias-Variance Trade-Off.

Hence, the Bias-Variance Trade-Off is important because it helps us to generalize our model much better.
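The two errors described above can also be measured directly: if we retrain the same model on many fresh samples of the data, the squared gap between its average prediction and the truth is the (squared) bias, and the spread of its predictions is the variance. A minimal NumPy sketch of this idea (the sine target, polynomial degrees, sample sizes, and noise level are illustrative choices, not from the answer above):

```python
import numpy as np

rng = np.random.default_rng(2)

def true_f(x):
    return np.sin(x)

x0 = 1.0                 # fixed input where we inspect the predictions
stats = {}

for degree in (1, 9):    # a too-simple and a too-flexible model
    preds = []
    # Retrain on 200 independent noisy datasets drawn from the same process
    for _ in range(200):
        x = rng.uniform(0.0, np.pi, 25)
        y = true_f(x) + rng.normal(0.0, 0.3, 25)
        coeffs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coeffs, x0))
    preds = np.asarray(preds)
    bias_sq = (preds.mean() - true_f(x0)) ** 2   # systematic error at x0
    variance = preds.var()                       # sensitivity to the sample
    stats[degree] = (bias_sq, variance)
    print(f"degree={degree}  bias^2={bias_sq:.4f}  variance={variance:.4f}")
```

The simple model keeps making the same mistake (high bias, low variance), while the flexible one chases the noise in each sample (low bias, high variance).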

Bias-Variance Trade-Off: the most important concept in Machine Learning.

This trade-off occurs because of two errors: bias error and variance error.

Q1. What is Bias Error?

A1. Bias error is the error made by the model on the training data. Bias error is high when our model tends to under-fit the train data, so high bias means that our model is under-fitting. This is always calculated on the train data.

Q2. What is Variance Error?

A2. Variance error is the error made by the model on the test data. Variance error is high when our model tends to over-fit the train data and fails to generalize, so high variance means the model is over-fitting. This is always calculated on the test data.

So, if we plot training and testing error against model complexity, we can see that as model complexity increases the training error (bias error) keeps decreasing, while the testing error (variance error) first decreases and then starts to rise once the model begins to over-fit.

So while building the model, or before finalizing it for deployment, we should check both the bias and the variance error. The best model is the one that has both low bias and low variance error. But in reality, it is hard to achieve both simultaneously, so we have to trade bias error for variance error or vice-versa. This is generally called the Bias-Variance Trade-Off.

Hence, the Bias-Variance Trade-Off is important because it helps us to generalize our model much better.
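The chart this answer refers to can be reproduced numerically: sweep model complexity and watch training error fall monotonically while test error bottoms out at the right complexity. A small sketch using NumPy polynomial fits (the cubic target function, noise level, split, and degrees are hypothetical choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a cubic: y = x^3 - 2x + noise
x = rng.uniform(-2.0, 2.0, 60)
y = x**3 - 2.0 * x + rng.normal(0.0, 1.0, 60)

# Simple train/test split
x_train, y_train = x[:40], y[:40]
x_test, y_test = x[40:], y[40:]

def mse(coeffs, xs, ys):
    """Mean squared error of a fitted polynomial on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

results = {}
for degree in (1, 3, 12):          # under-fit, about right, over-fit
    coeffs = np.polyfit(x_train, y_train, degree)
    results[degree] = (mse(coeffs, x_train, y_train),
                       mse(coeffs, x_test, y_test))
    print(f"degree={degree}  train MSE={results[degree][0]:.3f}  "
          f"test MSE={results[degree][1]:.3f}")
```

Raising the degree always lowers the training (bias-side) error, but past the true complexity the test (variance-side) error stops improving, which is exactly the trade-off being described.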

Yash’s answer is perfect, but let me make it simpler for answering during an interview, where you would not be given much time to explain from scratch.

Bias and variance have to be checked to see whether our model is underfitting or overfitting. If the bias is high, we are underfitting, and if the variance is high, we are overfitting. For underfitting, we may have to do variable transformations or tune our model better; for overfitting, we would have to look at L1 or L2 regularisation.
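The L2 remedy mentioned above can be sketched in closed form: adding a penalty term λI to the normal equations shrinks the weights, trading a little extra bias for a drop in variance. A minimal NumPy illustration (the data shapes and the λ value are arbitrary choices for the sketch; in practice one would typically reach for sklearn's `Ridge` or `Lasso`):

```python
import numpy as np

rng = np.random.default_rng(1)

# Overfit-prone setup: 20 features but only 30 samples, and just 3 features matter
X = rng.normal(size=(30, 20))
true_w = np.zeros(20)
true_w[:3] = [2.0, -1.0, 0.5]
y = X @ true_w + rng.normal(0.0, 0.5, 30)

def ridge(X, y, lam):
    """Closed-form L2-regularised least squares: (X'X + lam*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_ols = ridge(X, y, 0.0)    # lam = 0 -> ordinary least squares (high variance)
w_l2 = ridge(X, y, 10.0)    # lam > 0 shrinks the weights, reducing variance

print(f"||w|| without penalty: {np.linalg.norm(w_ols):.3f}")
print(f"||w|| with L2 penalty: {np.linalg.norm(w_l2):.3f}")
```

The penalised solution always has a smaller weight norm than the unpenalised one, which is the mechanism by which regularisation tames an overfitting model.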