2.12 Bayesian bound. Let H be a countable hypothesis set of functions mapping X to {0, 1} and let p be a probability measure over H. This probability measure represents the prior probability over the hypothesis class, i.e., the probability that a particular hypothesis is selected by the learning algorithm. Use Hoeffding's inequality to show that for any δ > 0, with probability at least 1 − δ, the following inequality holds:

\forall h \in H, \quad R(h) \le \widehat{R}_S(h) + \sqrt{\frac{\log\frac{1}{p(h)} + \log\frac{1}{\delta}}{2m}}. \qquad (2.26)
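As a numerical sanity check of the complexity term in (2.26), the following sketch compares it with the standard inconsistent-case penalty for a finite hypothesis set. The sample values (|H| = 100, δ = 0.05, m = 1000) are hypothetical, chosen only for illustration; the point is that under the uniform prior p(h) = 1/|H|, the two terms coincide since log(1/p(h)) = log|H|.

```python
import math

def bayesian_penalty(p_h, delta, m):
    """Complexity term of the Bayesian bound (2.26):
    sqrt((log(1/p(h)) + log(1/delta)) / (2m))."""
    return math.sqrt((math.log(1.0 / p_h) + math.log(1.0 / delta)) / (2.0 * m))

def finite_class_penalty(H_size, delta, m):
    """Complexity term of the standard inconsistent-case bound
    for a finite hypothesis set: sqrt((log|H| + log(1/delta)) / (2m))."""
    return math.sqrt((math.log(H_size) + math.log(1.0 / delta)) / (2.0 * m))

# Hypothetical illustration values.
H_size, delta, m = 100, 0.05, 1000

# Uniform prior: p(h) = 1/|H| for every h, so log(1/p(h)) = log|H|
# and the Bayesian penalty reduces to the finite-class penalty.
uniform = bayesian_penalty(1.0 / H_size, delta, m)
standard = finite_class_penalty(H_size, delta, m)
print(math.isclose(uniform, standard))

# A non-uniform prior rewards hypotheses with large p(h):
# high-prior hypotheses get a smaller penalty than low-prior ones.
print(bayesian_penalty(0.5, delta, m) < bayesian_penalty(0.001, delta, m))
```

This makes the requested comparison concrete: the Bayesian bound generalizes the finite-class bound, replacing the uniform log|H| term with a per-hypothesis log(1/p(h)) term.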

Compare this result with the bound given in the inconsistent case for finite hypothesis sets (Hint: you could use δ' = p(h)δ as the confidence parameter in Hoeffding's inequality).
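Following the hint, one plausible sketch of the argument (an outline, not a full solution) is:

```latex
% For a fixed h \in H, Hoeffding's inequality gives, for any \epsilon > 0,
%   \Pr\bigl[\, R(h) - \widehat{R}_S(h) > \epsilon \,\bigr] \le e^{-2m\epsilon^2}.
% Choose the confidence parameter \delta' = p(h)\,\delta, i.e. set
% e^{-2m\epsilon^2} = p(h)\,\delta and solve for \epsilon:
\epsilon = \sqrt{\frac{\log\frac{1}{p(h)} + \log\frac{1}{\delta}}{2m}}.
% By the union bound over the countable set H, the total failure
% probability is at most
\sum_{h \in H} p(h)\,\delta = \delta,
% so with probability at least 1 - \delta, inequality (2.26) holds
% simultaneously for all h \in H.
```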
