# Machine Learning Quiz 02: Ridge, Lasso

Let’s check your basic knowledge of regularization techniques — Ridge and Lasso Regression. Here are 10 multiple-choice questions, and there’s no time limit. Have fun!
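If you'd like to experiment while you answer, here is a minimal NumPy sketch of the closed-form Ridge solution. The function name `ridge_fit` and the synthetic data are illustrative assumptions, not part of the quiz — try varying `alpha` yourself and watch what happens to the coefficients:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form Ridge solution: beta = (X^T X + alpha * I)^(-1) X^T y.
    With alpha = 0 this reduces to the ordinary least squares solution."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Tiny synthetic example: y depends mostly on the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=50)

beta_small = ridge_fit(X, y, alpha=0.0)    # no penalty
beta_large = ridge_fit(X, y, alpha=100.0)  # strong penalty
print(beta_small)
print(beta_large)
```

Comparing the two printed coefficient vectors for different `alpha` values is a quick way to check your intuition for several of the questions below.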

*Question 1: For Ridge Regression, if the regularization parameter = 0, what does it mean?*

(A) Large coefficients are not penalized

(B) Overfitting problems are not accounted for

(C) The loss function is the same as the ordinary least squares loss function

(D) All of the above

*Question 2: For Ridge Regression, if the regularization parameter is very high, which options are true? (Select two)*

(A) Large coefficients are significantly penalized

(B) Can lead to a model that is too simple and ends up underfitting the data

(C) Large coefficients are not penalized

(D) Can lead to a model that is too simple and ends up overfitting the data

*Question 3: For Lasso Regression, if the regularization parameter = 0, what does it mean?*

(A) The loss function is the same as the ordinary least squares loss function

(B) Can be used to select important features of a dataset

(C) Shrinks the coefficients of less important features to exactly 0

(D) All of the above

*Question 4: For Lasso Regression, if the regularization parameter is very high, which options are true? (Select two)*

(A) Can be used to select important features of a dataset

(B) Shrinks the coefficients of less important features to exactly 0

(C) The loss function is the same as the ordinary least squares loss function

(D) The loss function is the same as the Ridge Regression loss function

*Question 5: What’s the penalty term for Ridge regression?*

(A) the square of the magnitude of the coefficients

(B) the square root of the magnitude of the coefficients

(C) the absolute sum of the coefficients

(D) the sum of the coefficients

*Question 6: What’s the penalty term for Lasso regression?*

(A) the square of the magnitude of the coefficients

(B) the square root of the magnitude of the coefficients

(C) the absolute sum of the coefficients

(D) the sum of the coefficients

*Question 7: Which one is true?*

(A) Lasso regression stands for Least Absolute Shrinkage and Selection Operator.

(B) The difference between Ridge and Lasso regression is that Lasso tends to shrink coefficients to absolute zero, whereas Ridge never sets the value of a coefficient to absolute zero

(C) Lasso can be used to select important features of a dataset

(D) All of the above

*Question 8: Which one is true?*

(A) Ridge regression decreases the complexity of a model but does not reduce the number of variables, since it never leads to a coefficient being zero but only minimizes it

(B) Ridge regression is not good for feature reduction

(C) As the regularization parameter increases, the values of the coefficients tend towards zero. This leads to both low variance (as some coefficients have a negligible effect on the prediction) and low bias (minimizing the coefficients reduces the dependency of the prediction on any particular variable)

(D) All of the above

*Question 9: What are the limitations of Lasso Regression? (Select two)*

(A) If the number of features (p) is greater than the number of observations (n), Lasso will pick at most n features as non-zero, even if all features are relevant

(B) If there are two or more highly collinear feature variables, then Lasso regression selects one of them randomly, which is not good for the interpretation of the data

(C) Lasso can be used to select important features of a dataset

(D) The difference between Ridge and Lasso regression is that Lasso tends to shrink coefficients to absolute zero, whereas Ridge never sets the value of a coefficient to absolute zero

*Question 10: Which one is true?*

(A) Ridge and Lasso regression are techniques to reduce model complexity and prevent the over-fitting that may result from simple linear regression

(B) Ridge regression shrinks the coefficients, which helps to reduce model complexity and multicollinearity

(C) Lasso regression not only helps in reducing over-fitting but can also help us with feature selection

(D) All of the above

The solutions will be published in the next quiz, **Machine Learning Quiz 03**. Happy learning! If you liked the questions and enjoyed taking the test, leave a clap for me. Feel free to discuss and share your thoughts on these questions in the comment section.
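While you wait for the solution key, one way to build intuition about the difference between the two penalties is the soft-thresholding operator, which appears inside coordinate-descent Lasso solvers. This is a sketch; `soft_threshold` is an illustrative name, not from the quiz:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding, the proximal operator of the L1 penalty.
    Entries with |z| <= t are set exactly to zero; larger entries
    are shrunk toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

z = np.array([-3.0, -0.5, 0.2, 1.5])
print(soft_threshold(z, 1.0))  # small entries become exactly 0
```

Note the contrast with the squared (L2) penalty, whose proximal operator only rescales coefficients and never sets them exactly to zero.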

The solutions to the previous quiz, **Machine Learning Quiz 01: Linear Regression**: 1(B), 2(B,C), 3(A), 4(A), 5(A), 6(C), 7(D), 8(D), 9(D), 10(D).