Question: (a) Fix a single, arbitrary hypothesis h : X → Y produced by A and determine a lower bound on the number of examples, k, such that Pr[err(h) > ε] ≤ δ. The contrapositive view would be: with probability at least 1 − δ, it must be the case that err(h) ≤ ε. Make sure this makes sense.
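For orientation, here is one standard way such a bound is derived (a sketch, assuming the k examples are drawn i.i.d. from D; the exercise may intend a different route):

```latex
% A "bad" hypothesis, i.e. one with err(h) > \epsilon, classifies a single
% random example correctly with probability at most 1 - \epsilon, so over
% k i.i.d. examples:
\Pr[\text{$h$ correct on all $k$ examples}] \le (1 - \epsilon)^k \le e^{-\epsilon k}.
% Requiring e^{-\epsilon k} \le \delta and solving for k gives
k \ge \frac{1}{\epsilon} \ln\frac{1}{\delta}.
```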
(b) From part (a) we know that, as long as a block is at least of size k, then if that block is classified correctly by a fixed, arbitrary hypothesis h, we can effectively upper bound the probability of the bad event (i.e., A outputs h such that err(h) > ε) by δ. However, our bound must apply to every h that our algorithm B could output, for an arbitrary distribution D over examples. With this in mind, how large should m be so that we can bound all hypotheses that could be output? You may assume that algorithm B will know the mistake bound throughout the question.
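One common way to bound "every h that B could output" (a sketch, assuming the mistake bound M means a single run of A passes through at most M + 1 distinct hypotheses, so a union bound applies):

```latex
% Union bound over the at most M+1 hypotheses a run of A can produce:
\Pr[\text{some candidate } h \text{ with } err(h) > \epsilon
     \text{ survives a size-}k\text{ block}] \le (M + 1)\, e^{-\epsilon k}.
% Forcing this to be at most \delta gives
k \ge \frac{1}{\epsilon} \ln\frac{M + 1}{\delta},
\qquad m = k\,(M + 1),
% since at most M of the M+1 size-k blocks can contain a mistake,
% some block must be classified perfectly by the hypothesis active on it.
```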
(c) Put everything together and fully describe, with proof, a PAC learner that is able to output a hypothesis with true error at most ε with probability at least 1 − δ, given a mistake-bounded learner A. To do this, you should first describe your pseudocode for algorithm B, which will use A as a subroutine (no need for minute details or code; a broad outline suffices).
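As a concrete illustration of what such an algorithm B might look like, here is a minimal runnable sketch. It assumes the standard "longest survivor" conversion: feed A i.i.d. examples and output the first hypothesis that survives k consecutive examples without a mistake. `HalvingLearner` is a toy mistake-bounded learner invented here for demonstration; it is not part of the question.

```python
import math
import random

class HalvingLearner:
    """Toy mistake-bounded online learner: halving over the finite class of
    threshold functions h_t(x) = [x >= t], t in {0, ..., n}.
    Its mistake bound is M = ceil(log2(n + 1))."""
    def __init__(self, n):
        self.version_space = set(range(n + 1))  # candidate thresholds

    def predict(self, x):
        # majority vote of the hypotheses still in the version space
        votes = sum(1 for t in self.version_space if x >= t)
        return 1 if 2 * votes >= len(self.version_space) else 0

    def update(self, x, y):
        # on a mistake, keep only thresholds consistent with (x, y);
        # the majority was wrong, so the version space at least halves
        self.version_space = {t for t in self.version_space
                              if (1 if x >= t else 0) == y}

    def hypothesis(self):
        # freeze the current predictor as a standalone hypothesis
        vs = frozenset(self.version_space)
        def h(x, vs=vs):
            votes = sum(1 for t in vs if x >= t)
            return 1 if 2 * votes >= len(vs) else 0
        return h

def mistake_bound_to_pac(learner, mistake_bound, sample, epsilon, delta):
    """Algorithm B: run A on i.i.d. labeled examples drawn by sample();
    output the first hypothesis that survives k consecutive examples."""
    k = math.ceil((1 / epsilon) * math.log((mistake_bound + 1) / delta))
    streak = 0
    for _ in range(k * (mistake_bound + 1)):  # m = k(M+1) examples suffice
        x, y = sample()
        if learner.predict(x) == y:
            streak += 1
            if streak == k:
                return learner.hypothesis()
        else:
            learner.update(x, y)  # A improves on each mistake
            streak = 0
    # unreachable when mistake_bound is a true mistake bound: at most M
    # mistakes split the m examples into at most M+1 correct runs, and by
    # pigeonhole one of those runs must reach length k
    return learner.hypothesis()
```

The sample budget m = k(M + 1) makes termination deterministic: A can err at most M times, so among the M + 1 mistake-free runs some streak must reach length k, and the hypothesis active on that streak is returned.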