2.12 Bayesian bound. Let $H$ be a countable hypothesis set of functions mapping $X$ to $\{0, 1\}$ and let $p$ be a probability measure over $H$. This probability measure represents the prior probability over the hypothesis class, i.e., the probability that a particular hypothesis is selected by the learning algorithm. Use Hoeffding's inequality to show that for any $\delta > 0$, with probability at least $1 - \delta$, the following inequality holds:
\[
\forall h \in H, \quad R(h) \leq \widehat{R}_S(h) + \sqrt{\frac{\log \frac{1}{p(h)} + \log \frac{1}{\delta}}{2m}}. \tag{2.26}
\]
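A sketch of how the bound can be obtained, assuming the standard one-sided Hoeffding inequality for a fixed hypothesis (the choice of per-hypothesis confidence $p(h)\,\delta$ follows the hint):

```latex
% One-sided Hoeffding bound for a fixed hypothesis h:
\Pr\bigl[R(h) - \widehat{R}_S(h) > \epsilon\bigr] \le e^{-2m\epsilon^2}.
% Set the right-hand side to p(h)\,\delta and solve for \epsilon:
\epsilon_h = \sqrt{\frac{\log\frac{1}{p(h)} + \log\frac{1}{\delta}}{2m}}.
% Union bound over the countable set H:
\Pr\Bigl[\exists\, h \in H:\; R(h) - \widehat{R}_S(h) > \epsilon_h\Bigr]
  \le \sum_{h \in H} p(h)\,\delta = \delta.
```

The union bound is valid because $H$ is countable and $\sum_{h \in H} p(h) = 1$, so the per-hypothesis failure probabilities sum to exactly $\delta$.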
Compare this result with the bound given in the inconsistent case for finite hypothesis sets (Hint: you could use $\delta' = p(h)\delta$ as the confidence parameter in Hoeffding's inequality).
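For the comparison asked for in the hint: with a uniform prior $p(h) = 1/|H|$ over a finite class, the term $\log\frac{1}{p(h)}$ becomes $\log|H|$ and inequality (2.26) reduces to the familiar finite-hypothesis-set bound for the inconsistent case. A small numeric sketch (the function name `bayesian_bound` and the sample values are illustrative, not from the text):

```python
import math

def bayesian_bound(m, p_h, delta):
    """Generalization-gap term from inequality (2.26):
    sqrt((log(1/p(h)) + log(1/delta)) / (2m))."""
    return math.sqrt((math.log(1 / p_h) + math.log(1 / delta)) / (2 * m))

# Uniform prior over a finite class of size |H| = 1000,
# m = 5000 samples, confidence parameter delta = 0.05.
H_size, m, delta = 1000, 5000, 0.05
uniform = bayesian_bound(m, 1 / H_size, delta)

# Finite-class bound for the inconsistent case:
# sqrt((log|H| + log(1/delta)) / (2m)) -- identical when p is uniform.
finite = math.sqrt((math.log(H_size) + math.log(1 / delta)) / (2 * m))

print(abs(uniform - finite) < 1e-12)  # the two bounds coincide
```

A non-uniform prior makes the guarantee tighter for hypotheses carrying more prior mass, at the price of a looser bound for low-prior hypotheses.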