Question: Problem 9. Let $g_n$ be an arbitrary (data-dependent) classifier. The leave-one-out error estimate is defined as
$$
\hat{R}_n^{(D)}(g_n) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ g_{n-1}(x_i, D_{n,i}) \neq Y_i \},
$$
where $D_{n,i} = ((x_1, Y_1), \dots, (x_{i-1}, Y_{i-1}), (x_{i+1}, Y_{i+1}), \dots, (x_n, Y_n))$. Show that the estimate is nearly unbiased in the sense that
$$
\mathbf{E}\, \hat{R}_n^{(D)}(g_n) = \mathbf{E}\, R(g_{n-1}).
$$
Use this to derive a bound for the expected risk of a perceptron classifier when the data are linearly separable (i.e., $L^* = 0$ and the Bayes classifier is linear). In particular, prove that if $g_n$ is the linear classifier obtained by running the perceptron algorithm, then
$$
\mathbf{E}\, R(g_{n-1}) \le \frac{1}{n}\, \mathbf{E}\!\left[ \left( \frac{R}{\rho} \right)^{2} \right],
$$
where $R = \max_{i \le n} \|x_i\|$ and $\rho$ is the margin.
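One way to see the unbiasedness claim is the following sketch, assuming the sample $D_n = ((x_1, Y_1), \dots, (x_n, Y_n))$ is i.i.d. (so in particular exchangeable):
$$
\begin{aligned}
\mathbf{E}\, \hat{R}_n^{(D)}(g_n)
  &= \frac{1}{n} \sum_{i=1}^{n} \mathbf{P}\{ g_{n-1}(x_i, D_{n,i}) \neq Y_i \} \\
  &= \mathbf{P}\{ g_{n-1}(x_n, D_{n-1}) \neq Y_n \} \qquad \text{(each term equals the last one by exchangeability)} \\
  &= \mathbf{E}\, R(g_{n-1}),
\end{aligned}
$$
since $(x_n, Y_n)$ is independent of $D_{n-1} = ((x_1, Y_1), \dots, (x_{n-1}, Y_{n-1}))$ and $R(g_{n-1}) = \mathbf{P}\{ g_{n-1}(X) \neq Y \mid D_{n-1} \}$. For the perceptron bound, this can be combined with Novikoff's convergence theorem, which bounds the number of mistakes the perceptron makes on a linearly separable sample of radius $R$ and margin $\rho$ by $(R/\rho)^2$.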
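As a small numerical illustration (a sketch, not part of the original problem), the code below runs a from-scratch perceptron on synthetic linearly separable data and computes the leave-one-out error estimate. The helper names `perceptron` and `loo_error` and the data-generation scheme are assumptions for this demo; the data are separated by the unit vector $u = (1, 0)$ with margin at least $\rho = 1$.

```python
import numpy as np

def perceptron(X, y, max_epochs=1000):
    """Run the perceptron algorithm; return final weights and mistake count."""
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary): update
                w += yi * xi
                mistakes += 1
                errors += 1
        if errors == 0:              # converged: a full pass with no mistakes
            break
    return w, mistakes

def loo_error(X, y):
    """Leave-one-out estimate: train on D_{n,i}, test on the held-out (x_i, y_i)."""
    n = len(y)
    wrong = 0
    for i in range(n):
        mask = np.arange(n) != i
        w, _ = perceptron(X[mask], y[mask])
        if y[i] * (w @ X[i]) <= 0:
            wrong += 1
    return wrong / n

rng = np.random.default_rng(0)
n = 40
# Linearly separable data: the label is the sign of the first coordinate,
# and points are pushed 1 unit away from the boundary, so the unit
# separator u = (1, 0) has margin rho >= 1.
X = rng.uniform(-3, 3, size=(n, 2))
X[:, 0] += np.where(X[:, 0] >= 0, 1.0, -1.0)
y = np.sign(X[:, 0])

R = np.max(np.linalg.norm(X, axis=1))   # sample radius
rho = 1.0                               # margin of u = (1, 0) (by construction)
print("LOO error estimate:", loo_error(X, y))
print("(R/rho)^2 / n =", (R / rho) ** 2 / n)
```

Novikoff's theorem guarantees the mistake count stays at or below $(R/\rho)^2$, so the training loop above terminates well before `max_epochs` on this data.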