Question: Experiment with non-linear classifiers

Problem 4 (40 points). For this problem, you will need to learn to use software libraries for at least two of the following non-linear classifier types:

- Boosted Decision Trees (i.e., boosting with decision trees as the weak learner)
- Random Forests
- Support Vector Machines with a Gaussian kernel

All of these are available in scikit-learn, although you may also use other external libraries (e.g., XGBoost for boosted decision trees and LibSVM for SVMs). You are welcome to implement the learning algorithms for these classifiers yourself, but this is neither required nor recommended.

Pick two different types of non-linear classifiers from the list above for classification of the Adult dataset. You can download the data as a9a from the LibSVM data repository. The a9a data set comes with two files: the training data file a9a, with 32,561 samples each having 123 features, and the test file a9a.t, with 16,281 samples. Note that the a9a data is in LibSVM format. In this format, each line takes the form

<label> <index1>:<value1> <index2>:<value2> ...
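Since the files are in LibSVM format, scikit-learn's load_svmlight_file can read them directly. Below is a minimal loading sketch; the local file names a9a and a9a.t are assumptions about where you saved the downloads, and n_features=123 is passed so the training and test matrices have matching widths even if the test file does not use every feature index.

```python
# Sketch: loading the a9a training and test sets (LibSVM format) with scikit-learn.
from sklearn.datasets import load_svmlight_file

# Assumed local file names; adjust the paths to wherever the downloads live.
# n_features=123 keeps the train/test shapes aligned in case a9a.t omits the
# highest-indexed feature.
X_train, y_train = load_svmlight_file("a9a", n_features=123)
X_test, y_test = load_svmlight_file("a9a.t", n_features=123)

print(X_train.shape)  # expected: (32561, 123), sparse CSR matrix
print(X_test.shape)   # expected: (16281, 123)
```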
Answer:

Parameters to be tuned for XGBoost:
1. n_estimators
2. max_depth
3. lambda
4. learning_rate
5. missing
6. objective

Parameters to be tuned for SVM:
1. kernel_type
2. gamma
3. C

Parameters to be tuned for Random Forests:
1. n_estimators
2. bootstrap
3. max_depth
4. min_impurity_decrease
5. min_samples_leaf
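The assignment only requires two of the three classifier types, so either pair of searches below can be dropped. This is a minimal sketch of how the listed parameters could be tuned with cross-validated grid search; the grid values, 3-fold CV, and accuracy scoring are illustrative assumptions rather than values prescribed by the problem. Note that lambda is exposed as reg_lambda in XGBoost's scikit-learn wrapper, and kernel_type corresponds to SVC's kernel argument.

```python
# Sketch: grid-searching the listed hyperparameters for all three classifier types.
from sklearn.datasets import load_svmlight_file
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

X_train, y_train = load_svmlight_file("a9a", n_features=123)
X_test, y_test = load_svmlight_file("a9a.t", n_features=123)

# a9a labels are -1/+1; XGBoost's binary:logistic objective expects 0/1.
y_train01 = ((y_train + 1) // 2).astype(int)
y_test01 = ((y_test + 1) // 2).astype(int)

# Boosted decision trees (XGBoost). Grid values are assumptions for illustration.
xgb_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 6],
    "reg_lambda": [1.0, 10.0],   # "lambda" is reg_lambda in the sklearn wrapper
    "learning_rate": [0.05, 0.1],
}
xgb_search = GridSearchCV(
    XGBClassifier(objective="binary:logistic", missing=float("nan")),
    xgb_grid, cv=3, n_jobs=-1)
xgb_search.fit(X_train, y_train01)
print("XGBoost best params:", xgb_search.best_params_)
print("XGBoost test accuracy:", xgb_search.score(X_test, y_test01))

# SVM with Gaussian (RBF) kernel. This search is slow on all 32,561 samples;
# tuning on a subsample first is a common shortcut.
svm_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}
svm_search = GridSearchCV(SVC(kernel="rbf"), svm_grid, cv=3, n_jobs=-1)
svm_search.fit(X_train, y_train)
print("SVM best params:", svm_search.best_params_)
print("SVM test accuracy:", svm_search.score(X_test, y_test))

# Random Forests.
rf_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 10],
    "min_samples_leaf": [1, 5],
    "min_impurity_decrease": [0.0, 1e-4],
    "bootstrap": [True, False],
}
rf_search = GridSearchCV(RandomForestClassifier(), rf_grid, cv=3, n_jobs=-1)
rf_search.fit(X_train, y_train)
print("Random Forest best params:", rf_search.best_params_)
print("Random Forest test accuracy:", rf_search.score(X_test, y_test))
```

GridSearchCV is used here only because the grids are small; with larger grids, RandomizedSearchCV over the same parameter names is a cheaper alternative.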