
8.7 Second-order regret bound. Consider the randomized algorithm that differs from the RWM algorithm only by the weight update, i.e., $w_{t+1,i} \leftarrow (1 - (1-\beta)\, l_{t,i})\, w_{t,i}$, $t \in [T]$, which is applied to all $i \in [N]$, with $1/2 \le \beta < 1$. This algorithm can be used in a more general setting than RWM since the losses $l_{t,i}$ are only assumed to be in $[0, 1]$. The objective of this problem is to show that a similar upper bound holds for the regret.
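A minimal sketch of one round of this update (the function name and list-based interface are illustrative, not part of the exercise statement):

```python
# One round of the modified RWM update w_{t+1,i} <- (1 - (1-beta) * l_{t,i}) * w_{t,i}.
# With losses in [0, 1] and 1/2 <= beta < 1, every multiplicative factor
# lies in [1/2, 1], so weights stay positive.
def update_weights(w, losses, beta):
    """Apply the modified weight update to a list of expert weights."""
    assert 0.5 <= beta < 1.0
    return [wi * (1.0 - (1.0 - beta) * li) for wi, li in zip(w, losses)]

w = update_weights([1.0, 1.0, 1.0], [0.0, 0.5, 1.0], beta=0.5)
print(w)  # [1.0, 0.75, 0.5]
```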

(a) Use the same potential $W_t$ as for the RWM algorithm and derive a simple upper bound for $\log W_{T+1}$:

$\log W_{T+1} \le \log N - (1-\beta) L_T.$

(Hint: use the inequality $\log(1-x) \le -x$, valid for $x \in [0, 1/2]$.)
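A sketch of the argument for (a), writing $p_{t,i} = w_{t,i}/W_t$ for the distribution the algorithm plays and $L_T = \sum_{t=1}^T \mathbf{p}_t \cdot \mathbf{l}_t$ for its expected cumulative loss (notation assumed here, following the RWM analysis):

```latex
% The update makes the potential contract by the expected per-round loss:
\begin{align*}
W_{t+1} &= \sum_{i=1}^N \bigl(1 - (1-\beta)\, l_{t,i}\bigr)\, w_{t,i}
         = W_t \bigl(1 - (1-\beta)\, \mathbf{p}_t \cdot \mathbf{l}_t\bigr), \\
\log W_{T+1} &= \log W_1 + \sum_{t=1}^T \log\bigl(1 - (1-\beta)\, \mathbf{p}_t \cdot \mathbf{l}_t\bigr)
  \le \log N - (1-\beta) \sum_{t=1}^T \mathbf{p}_t \cdot \mathbf{l}_t
  = \log N - (1-\beta) L_T,
\end{align*}
% using W_1 = N and the hint, which applies since
% (1-\beta)\, \mathbf{p}_t \cdot \mathbf{l}_t \in [0, 1/2] when \beta \ge 1/2.
```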

(b) Prove the following lower bound for the potential, for all $i \in [N]$:

$\log W_{T+1} \ge -(1-\beta) L_{T,i} - (1-\beta)^2 \sum_{t=1}^T l_{t,i}^2.$

(Hint: use the inequality $\log(1-x) \ge -x - x^2$, which is valid for all $x \in [0, 1/2]$.)
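A sketch of the argument for (b), assuming the weights are initialized to $w_{1,i} = 1$ as in the standard RWM setup:

```latex
% The potential dominates any single weight, and each factor can be
% lower-bounded with the hint applied to x = (1-\beta) l_{t,i} \in [0, 1/2]:
\begin{align*}
\log W_{T+1} &\ge \log w_{T+1,i}
  = \sum_{t=1}^T \log\bigl(1 - (1-\beta)\, l_{t,i}\bigr) \\
 &\ge -\sum_{t=1}^T \Bigl( (1-\beta)\, l_{t,i} + (1-\beta)^2\, l_{t,i}^2 \Bigr)
  = -(1-\beta) L_{T,i} - (1-\beta)^2 \sum_{t=1}^T l_{t,i}^2 .
\end{align*}
```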

(c) Combine the upper and lower bounds to derive the following regret bound for the algorithm: $R_T \le 2\sqrt{T \log N}$.
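The bound in (c) can be checked numerically. The sketch below (all names and the random loss sequence are illustrative, not from the exercise) runs the modified algorithm with $1-\beta = \sqrt{\log N / T}$, the choice that balances the two terms $\frac{\log N}{1-\beta} + (1-\beta)T$ arising from combining (a) and (b), and measures the regret of the expected loss against the best expert:

```python
# Empirical check of R_T <= 2 * sqrt(T log N) for the modified RWM update.
# The algorithm's loss is measured as the expected loss p_t . l_t each round.
import math
import random

def modified_rwm_regret(T, N, seed=0):
    rng = random.Random(seed)
    # 1 - beta = sqrt(log N / T) balances the two regret terms (assumes T >= 4 log N).
    beta = max(0.5, 1.0 - math.sqrt(math.log(N) / T))
    w = [1.0] * N                 # initial weights w_{1,i} = 1
    alg_loss = 0.0                # cumulative expected loss L_T
    expert_loss = [0.0] * N       # cumulative losses L_{T,i}
    for _ in range(T):
        W = sum(w)
        p = [wi / W for wi in w]  # distribution played this round
        l = [rng.random() for _ in range(N)]  # arbitrary losses in [0, 1]
        alg_loss += sum(pi * li for pi, li in zip(p, l))
        for i in range(N):
            expert_loss[i] += l[i]
            w[i] *= 1.0 - (1.0 - beta) * l[i]  # modified weight update
    return alg_loss - min(expert_loss)

T, N = 1000, 10
regret = modified_rwm_regret(T, N)
bound = 2.0 * math.sqrt(T * math.log(N))
print(regret <= bound)
```

Since the proof's guarantee is deterministic in the loss sequence (it only uses $l_{t,i} \in [0,1]$), the comparison should hold for any seed, not just on average.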
