Question: (35 pts) Given data $\{(x_i, y_i) \in \mathbb{R}^d \times \{\pm 1\} : i \in [1, n]\}$, logistic regression in the linear case amounts to the following minimization problem:

$$\min_{w \in \mathbb{R}^d} \Big\{ F(w) := C \sum_{i=1}^{n} \log\big(1 + e^{-y_i \langle w, x_i \rangle}\big) + \frac{1}{2} \|w\|^2 \Big\},$$

where $C > 0$ is a trade-off parameter.

- (10 pts) Compute the gradient and Hessian of $F(w)$, i.e. $\nabla F(w)$ and $\nabla^2 F(w)$. Show that the Hessian matrix $\nabla^2 F(w)$ is positive semi-definite for any $w$ (which indicates that the objective function $F$ for logistic regression is convex).
- (5 pts) Show that the objective function $F(w)$ is strongly convex.
- (10 pts) Using the representer theorem, write down the formulation of logistic regression in the kernelized (nonlinear) case.
- (10 pts; this is a challenging part!) What optimization algorithms do you suggest for solving the minimization problem of logistic regression in the kernelized case? Code out the logistic regression in the kernelized case and run your code on the Tremor dataset.
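As a starting point for the first three parts, here is a sketch of the standard calculations, written directly from the definition of $F$ above. Here $\sigma(t) = 1/(1 + e^{-t})$ denotes the logistic sigmoid and $\sigma_i := \sigma(y_i \langle w, x_i \rangle)$; this notation is introduced here and is not part of the problem statement.

$$\nabla F(w) = -C \sum_{i=1}^{n} \sigma\big(-y_i \langle w, x_i \rangle\big)\, y_i x_i + w, \qquad \nabla^2 F(w) = C \sum_{i=1}^{n} \sigma_i (1 - \sigma_i)\, x_i x_i^\top + I,$$

using $y_i^2 = 1$. For any $v \in \mathbb{R}^d$,

$$v^\top \nabla^2 F(w)\, v = C \sum_{i=1}^{n} \sigma_i (1 - \sigma_i)\, \langle v, x_i \rangle^2 + \|v\|^2 \;\ge\; \|v\|^2,$$

since $\sigma_i (1 - \sigma_i) \ge 0$. Hence $\nabla^2 F(w) \succeq I \succeq 0$ for every $w$: the Hessian is positive semi-definite and $F$ is $1$-strongly convex. For the kernelized case, the representer theorem lets one write $w = \sum_{j=1}^{n} \alpha_j \phi(x_j)$ for a feature map $\phi$ with kernel $k(x, x') = \langle \phi(x), \phi(x') \rangle$, and with $K_{ij} = k(x_i, x_j)$ the problem becomes

$$\min_{\alpha \in \mathbb{R}^{n}} \Big\{ G(\alpha) := C \sum_{i=1}^{n} \log\big(1 + e^{-y_i (K\alpha)_i}\big) + \frac{1}{2}\, \alpha^\top K \alpha \Big\}.$$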
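For the last part, Newton's method or a quasi-Newton method such as L-BFGS is a natural suggestion, since $G$ is smooth and convex with an explicit gradient. Below is a minimal sketch of kernelized logistic regression using an RBF kernel and SciPy's L-BFGS-B solver. The function names (`rbf_kernel`, `fit_kernel_logreg`, `predict`), the kernel choice and parameters, and the toy data at the bottom are illustrative assumptions; the actual Tremor features and labels (and how to load them) are not specified in the problem, so they would need to be substituted in.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # numerically stable sigmoid

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X1[i] - X2[j]||^2)."""
    sq = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def fit_kernel_logreg(K, y, C=1.0):
    """Minimize G(alpha) = C * sum_i log(1 + exp(-y_i (K alpha)_i)) + 0.5 * alpha^T K alpha."""
    n = K.shape[0]

    def objective(alpha):
        m = y * (K @ alpha)                        # margins y_i (K alpha)_i
        value = C * np.sum(np.logaddexp(0.0, -m)) + 0.5 * alpha @ K @ alpha
        s = expit(-m)                              # sigma(-y_i (K alpha)_i)
        grad = K @ (alpha - C * (y * s))           # uses symmetry of K
        return value, grad

    res = minimize(objective, np.zeros(n), jac=True, method="L-BFGS-B")
    return res.x

def predict(K_test_train, alpha):
    """Sign of the kernel expansion f(x) = sum_j alpha_j k(x, x_j)."""
    return np.sign(K_test_train @ alpha)

if __name__ == "__main__":
    # Placeholder data: replace with the actual Tremor features X and labels y in {-1, +1}.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)  # toy labels, not linearly separable

    K = rbf_kernel(X, X, gamma=1.0)
    alpha = fit_kernel_logreg(K, y, C=1.0)
    print("training accuracy:", np.mean(predict(K, alpha) == y))
```

Plain gradient descent or Newton's method on $G$ would also work; L-BFGS is used here only because it avoids hand-tuning a step size.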
