Question:
One way to tune your hyperparameters for any given machine learning algorithm is to perform a grid search over all possible combinations of values. If your hyperparameters can be any real number, you will need to limit the search to some finite set of candidate values for each hyperparameter. For efficiency, you will often want to tune one parameter at a time, keeping all others fixed, and then move on to the next one; compared to a full grid search, there are far fewer combinations to check, and this is what you'll be doing for the questions below. In main.py, uncomment Problem 8 to run the staff-provided tuning algorithm from utils.py. For the purposes of this assignment, please try the following values for T: [1, 5, 10, 15, 25, 50] and the following values for λ: [0.001, 0.01, 0.1, 1, 10]. For the Pegasos algorithm, first fix λ = 0.01 to tune T, and then use the best T to tune λ.
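To make the one-parameter-at-a-time procedure concrete, here is a minimal sketch of the idea. It is not the staff-provided routine from utils.py: the helper validation_accuracy below is a hypothetical stand-in for training Pegasos with a given (T, λ) pair and scoring it on the validation set, and the actual helpers in utils.py may use different names and signatures.

```python
import random

# Sketch of one-parameter-at-a-time tuning for Pegasos.
# validation_accuracy is a hypothetical stand-in: in the real assignment it
# would train Pegasos for T iterations with regularization parameter lam and
# return accuracy on the validation set.

T_values = [1, 5, 10, 15, 25, 50]
lambda_values = [0.001, 0.01, 0.1, 1, 10]

def validation_accuracy(T, lam):
    # Placeholder that returns a deterministic dummy score so the sketch runs
    # on its own; replace with the course's actual training/evaluation calls.
    return random.Random(hash((T, lam))).random()

# Step 1: fix lambda = 0.01 and pick the T with the best validation accuracy.
best_T = max(T_values, key=lambda T: validation_accuracy(T, 0.01))

# Step 2: keep that T fixed and pick the best lambda.
best_lambda = max(lambda_values, key=lambda lam: validation_accuracy(best_T, lam))

print("best T:", best_T, "best lambda:", best_lambda)
```

With the values listed above, this procedure evaluates only 6 + 5 = 11 (T, λ) pairs instead of the 6 × 5 = 30 pairs a full grid search would require.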
