Question: [Logistic Regression] (8 pts)

Figure 1: Data for Logistic Regression Question

Let the data distribution, as shown in Figure 1, represent a binary classification problem where we fit the model p(y = 1 | x; θ) = σ(θ₀ + θ₁x₁ + θ₂x₂). As seen in class, we do this by minimizing the negative log likelihood (which is the same as maximizing the likelihood), as shown below:

    L(θ) = −ℓ(θ; D_train)

where ℓ(θ; D_train) represents the log likelihood on the training set.
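As a concrete (unofficial) illustration of this setup, the following sketch fits the model by plain gradient descent on the negative log likelihood. The data set here is made up purely for demonstration and merely stands in for the distribution shown in Figure 1:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_log_likelihood(theta, X, y):
    # X has a leading column of ones, so theta[0] plays the role of theta_0.
    p = sigmoid(X @ theta)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit_logistic(X, y, lr=0.1, steps=2000):
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        grad = X.T @ (p - y)          # gradient of the negative log likelihood
        theta -= lr * grad / len(y)
    return theta

# Toy, linearly separable data (hypothetical; NOT the data from Figure 1).
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(100, 2))
y = (X_raw[:, 0] + X_raw[:, 1] > 0.5).astype(float)
X = np.hstack([np.ones((100, 1)), X_raw])   # prepend the bias column

nll0 = neg_log_likelihood(np.zeros(3), X, y)  # loss at theta = 0 is n*log(2)
theta_hat = fit_logistic(X, y)
acc = np.mean((sigmoid(X @ theta_hat) > 0.5) == y)
```

The decision boundary of the fitted model is the line θ₀ + θ₁x₁ + θ₂x₂ = 0, which is what the sub-questions below ask you to sketch.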
For the questions below, submit the answer to each question as a separate figure. We expect only an approximation of the figures if you submit hand-drawn solutions; also be careful about the clarity of your submitted figures.
(a) Show a decision boundary that could plausibly correspond to the weights ŵ_final after training the regressor. How many data points are wrongly classified on the training data?
(b) For this part, consider that a strong regularization is applied to the θ₀ parameter, and we minimize

    L(θ) = −ℓ(θ; D_train) + λθ₀²

Since we apply a strong regularization, assume that λ is very large, so θ₀ is pulled all the way down to 0, while all other parameters are unregularized. Show a decision boundary that could plausibly correspond to ŵ when θ₀ = 0, i.e., when the model reduces to p(y = 1 | x; θ) = σ(θ₁x₁ + θ₂x₂).
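For intuition (this sketch and its data are hypothetical, and it assumes the penalized parameter is the bias θ₀): adding a large λθ₀² term pulls the bias toward zero, which forces the decision boundary θ₁x₁ + θ₂x₂ = 0 to pass through the origin even when the data is best separated by an offset line.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_bias_penalty(X, y, lam, lr=0.05, steps=4000):
    # Minimizes -log-likelihood + lam * theta_0^2; only the bias is penalized.
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        grad = X.T @ (p - y) / len(y)
        grad[0] += 2 * lam * theta[0]   # extra gradient from lam * theta_0^2
        theta -= lr * grad
    return theta

# Hypothetical data whose true boundary is offset from the origin.
rng = np.random.default_rng(1)
X_raw = rng.normal(size=(200, 2))
y = (X_raw[:, 0] + X_raw[:, 1] > 1.0).astype(float)
X = np.hstack([np.ones((200, 1)), X_raw])

theta_free = fit_logistic_bias_penalty(X, y, lam=0.0)
theta_reg = fit_logistic_bias_penalty(X, y, lam=10.0)
# With a large lam, theta_reg[0] is pulled to ~0: the boundary goes through the origin.
```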
(c) Now, heavy regularization is performed only on the θ₁ parameter, i.e., we minimize

    L(θ) = −ℓ(θ; D_train) + λθ₁²

Show a decision boundary that could plausibly correspond to ŵ. How many data points are wrongly classified on the training data?
(d) Finally, heavy regularization is done only on the θ₂ parameter. Show a decision boundary that could plausibly correspond to ŵ. How many data points are wrongly classified on the training data?
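The same idea applies to parts (c) and (d): heavily penalizing a single weight drives it to zero, which removes that feature's influence on the boundary. With θ₁ ≈ 0 the boundary θ₀ + θ₂x₂ = 0 is a horizontal line (constant x₂); with θ₂ ≈ 0 it is a vertical line. A hypothetical sketch (synthetic data, generic single-parameter penalty):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_penalized(X, y, pen_idx, lam, lr=0.04, steps=4000):
    # Minimizes -log-likelihood + lam * theta[pen_idx]^2 (one parameter only).
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ theta)
        grad = X.T @ (p - y) / len(y)
        grad[pen_idx] += 2 * lam * theta[pen_idx]
        theta -= lr * grad
    return theta

rng = np.random.default_rng(2)
X_raw = rng.normal(size=(200, 2))
y = (X_raw[:, 0] + X_raw[:, 1] > 0).astype(float)
X = np.hstack([np.ones((200, 1)), X_raw])

# Penalize theta_1: boundary approx theta_0 + theta_2*x_2 = 0 (horizontal line).
theta_c = fit_penalized(X, y, pen_idx=1, lam=10.0)
# Penalize theta_2: boundary approx theta_0 + theta_1*x_1 = 0 (vertical line).
theta_d = fit_penalized(X, y, pen_idx=2, lam=10.0)
```

In each case the remaining unpenalized weights compensate as best they can, which is why the resulting axis-aligned boundary typically misclassifies more training points than the unconstrained fit in part (a).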