Part Three: Implement cross_validation [Graded]
Use grid_search to implement the cross_validation function, which takes in the training set xTr, yTr and a list of depth candidates depths, and performs k-fold cross validation on the training set.
We will use generate_kFold to generate the k training/validation splits and pass the indices for those splits into the cross_validation function. Therefore, for each (training_indices, validation_indices) element in indices, you need to perform grid search to find the training and validation loss for each depth on that fold. Finally, take the average training and validation loss across folds to get the "average" loss. Your implementation should return these loss vectors and the depth with the minimum average validation loss.
I've already written generate_kFold and grid_search, but I'm having trouble piecing them together in code for cross_validation. Here's what I have for the first two functions:
import numpy as np

def generate_kFold(n, k):
    """
    Generates (training_indices, validation_indices) pairs for k-fold validation.

    Input:
        n: number of training examples
        k: number of folds
    Output:
        kfold_indices: a list of length k. Each entry takes the form
        (training indices, validation indices)
    """
    assert k >= 2
    kfold_indices = []
    # YOUR CODE HERE
    # Split the n example indices into k roughly equal parts; fold i uses
    # part i for validation and the remaining parts for training.
    indices = np.arange(n)
    parts = np.array_split(indices, k)
    for i in range(k):
        validation_indices = parts[i]
        training_indices = np.concatenate(parts[:i] + parts[i+1:])
        kfold_indices.append((list(training_indices), list(validation_indices)))
    return kfold_indices
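As a quick sanity check on generate_kFold (this snippet is just illustrative, using n = 10 and k = 3), np.array_split puts the remainder into the first folds, so the validation folds have sizes 4, 3, 3:

splits = generate_kFold(10, 3)
for training_indices, validation_indices in splits:
    # prints 6 4, then 7 3, then 7 3
    print(len(training_indices), len(validation_indices))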
def grid_search(xTr, yTr, xVal, yVal, depths):
    """
    Calculates the training and validation loss for trees trained on (xTr, yTr)
    and validated on (xVal, yVal) for a number of depths.

    Input:
        xTr: n x d training data matrix
        yTr: n-dimensional vector of training labels
        xVal: m x d validation data matrix
        yVal: m-dimensional vector of validation labels
        depths: a list of length k of depths
    Output:
        best_depth, training_losses, validation_losses
        best_depth: the depth that yields the lowest validation loss
        training_losses: a list of length k; the i-th entry corresponds to the
            training loss of the tree of depth depths[i]
        validation_losses: a list of length k; the i-th entry corresponds to the
            validation loss of the tree of depth depths[i]
    """
    training_losses = []
    validation_losses = []
    best_depth = None
    # YOUR CODE HERE
    # Train one tree per candidate depth, record its losses, and keep
    # track of the depth with the lowest validation loss.
    best_loss = float('inf')
    for depth in depths:
        tree = RegressionTree(depth=depth)
        tree.fit(xTr, yTr)
        training_loss = square_loss(yTr, tree.predict(xTr))
        validation_loss = square_loss(yVal, tree.predict(xVal))
        training_losses.append(training_loss)
        validation_losses.append(validation_loss)
        if validation_loss < best_loss:
            best_loss = validation_loss
            best_depth = depth
    return best_depth, training_losses, validation_losses
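What remains is cross_validation itself. Below is a minimal sketch of how the two functions above can be pieced together. It assumes cross_validation takes the arguments (xTr, yTr, depths, indices), where indices is the list of (training_indices, validation_indices) pairs produced by generate_kFold, and it reuses grid_search on each fold; adjust the signature and return types to whatever your starter code specifies.

def cross_validation(xTr, yTr, depths, indices):
    """
    Performs k-fold cross validation over the candidate depths.
    (Sketch: argument order assumed, not taken from the starter code.)

    Input:
        xTr: n x d training data matrix
        yTr: n-dimensional vector of training labels
        depths: a list of depth candidates
        indices: k (training_indices, validation_indices) pairs,
                 e.g. the output of generate_kFold(len(xTr), k)
    Output:
        best_depth, training_losses, validation_losses
        best_depth: the depth with the lowest average validation loss
        training_losses: average training loss per depth, across folds
        validation_losses: average validation loss per depth, across folds
    """
    training_losses = []
    validation_losses = []

    # Collect one loss vector (one entry per depth) per fold.
    for training_indices, validation_indices in indices:
        xtrain, ytrain = xTr[training_indices], yTr[training_indices]
        xval, yval = xTr[validation_indices], yTr[validation_indices]
        # Reuse grid_search on this fold; its per-fold best_depth is ignored
        # because the final depth is chosen from the *average* validation loss.
        _, fold_train_losses, fold_val_losses = grid_search(xtrain, ytrain, xval, yval, depths)
        training_losses.append(fold_train_losses)
        validation_losses.append(fold_val_losses)

    # Average across folds -> one value per depth.
    training_losses = np.mean(training_losses, axis=0)
    validation_losses = np.mean(validation_losses, axis=0)
    best_depth = depths[np.argmin(validation_losses)]
    return best_depth, list(training_losses), list(validation_losses)

A typical call would then be indices = generate_kFold(len(xTr), 5) followed by best_depth, training_losses, validation_losses = cross_validation(xTr, yTr, [1, 2, 3, 4, 5], indices).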
