Question: How would I do this?

Question 1c: Weighted Metrics

In the lecture, we discussed weighted entropy as a loss function to help determine the best split. Recall that the weighted entropy is given by:

    L = (N_X * S(X) + N_Y * S(Y)) / (N_X + N_Y)

where N_X is the number of samples in the left node X and N_Y is the number of samples in the right node Y. This concept of a weighted average can be extended to other metrics such as Gini impurity and variance by simply changing the S (entropy) function to G (Gini impurity) or σ (variance).

First, implement the weighted_metric function. The left parameter is a list of labels or values in the left node X, and the right parameter is a list of labels or values in the right node Y. The metric parameter is a function which can be entropy, gini_impurity, or variance. For entropy and gini_impurity, you may assume that left and right contain discrete labels. For variance, you may assume that left and right contain continuous values.

Then, assign we_pos3_age_30 to the weighted entropy (in the Pos3 column) of a split that partitions nba_data into two groups: one group with players who are 30 years old or older and another group with players who are younger than 30 years old.

    def weighted_metric(left, right, metric):
        ...

    we_pos3_age_30 = ...
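A minimal sketch of one way to approach this, assuming nba_data is a pandas DataFrame with "Age" and "Pos3" columns (the "Age" column name is an assumption based on the description of the split) and that an entropy function was implemented earlier in the assignment; the stand-in entropy below exists only to make the sketch self-contained:

    import numpy as np

    def entropy(labels):
        # Stand-in for the entropy function assumed to exist earlier in the
        # assignment: -sum(p * log2(p)) over the class proportions.
        _, counts = np.unique(np.asarray(labels), return_counts=True)
        probs = counts / counts.sum()
        return -np.sum(probs * np.log2(probs))

    def weighted_metric(left, right, metric):
        # Weighted average of the metric over the two child nodes,
        # weighted by the number of samples in each node.
        n_left, n_right = len(left), len(right)
        return (n_left * metric(left) + n_right * metric(right)) / (n_left + n_right)

    # Split on age (>= 30 vs. < 30) and compute the weighted entropy
    # of the Pos3 column for that split.
    older = nba_data.loc[nba_data["Age"] >= 30, "Pos3"]
    younger = nba_data.loc[nba_data["Age"] < 30, "Pos3"]
    we_pos3_age_30 = weighted_metric(older, younger, entropy)

The same weighted_metric function works unchanged with gini_impurity or variance in place of entropy, since it only takes a weighted average of whatever metric function it is given.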

