Question: 9. Gradient Descent [8 points]

For a homework assignment, you have implemented a logistic regression model y^(i) = σ(W x^(i) + b). You decide to train your model on a multi-class problem using gradient descent. However, before you can turn your assignment in, your friends stop by to give you some suggestions.

(a) [2 points] Pat sees your regressor implementation and says there's a much simpler way! Just leave out the sigmoids, and let f(x) = Wx + b. The derivatives are a lot less hassle and it runs faster. What's wrong with Pat's approach?

(b) [2 points] Chris comes in and says that your regressor is too complicated, but for a different reason. The sigmoids are confusing, and basically the same answer would be computed if we used step functions instead. What's wrong with Chris's approach?
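Chris's suggestion in part (b) can be probed numerically. The sketch below (function names are chosen here for illustration, not taken from the assignment) compares the derivative of the sigmoid with that of a hard step function: the sigmoid's gradient is nonzero everywhere, while the step function is flat on both sides of its jump, so gradient descent would receive no learning signal.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, as used in the regressor y = sigmoid(Wx + b)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    """Derivative of the sigmoid: s * (1 - s). Maximal at z = 0, never zero."""
    s = sigmoid(z)
    return s * (1.0 - s)

def step(z):
    """Hard threshold, Chris's proposed replacement for the sigmoid."""
    return (z >= 0).astype(float)

def step_grad(z):
    """The step function is piecewise constant, so its derivative is zero
    (almost) everywhere -- gradient descent cannot make progress."""
    return np.zeros_like(z)

z = np.array([-2.0, 0.0, 2.0])
print("sigmoid grad:", sigmoid_grad(z))  # nonzero at every input
print("step grad:   ", step_grad(z))     # all zeros: no weight updates
```

Running this makes the contrast concrete: the sigmoid contributes a usable gradient at every input, while the step function's gradient is identically zero, so the weights W and b would never change during training.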
