Question:

6.13 High-dimensional mapping. Let $\Phi \colon \mathcal{X} \to \mathbb{H}$ be a feature mapping such that the dimension $N$ of $\mathbb{H}$ is very large and let $K \colon \mathcal{X} \times \mathcal{X} \to \mathbb{R}$ be a PDS kernel defined by

$$K(x, x') = \mathop{\mathbb{E}}_{i \sim \mathcal{D}} \big[ [\Phi(x)]_i [\Phi(x')]_i \big], \tag{6.27}$$

where $[\Phi(x)]_i$ is the $i$th component of $\Phi(x)$ (and similarly for $\Phi(x')$) and where $\mathcal{D}$ is a distribution over the indices $i$. We shall assume that $|[\Phi(x)]_i| \leq r$ for all $x \in \mathcal{X}$ and $i \in [N]$. Suppose that the only method available to compute $K(x, x')$ involved the direct computation of the inner product (6.27), which would require $O(N)$ time. Alternatively, an approximation can be computed based on the random selection of a subset $I$ of the $N$ components of $\Phi(x)$ and $\Phi(x')$ according to $\mathcal{D}$, that is:

$$K'(x, x') = \frac{1}{n} \sum_{i \in I} [\Phi(x)]_i [\Phi(x')]_i, \tag{6.28}$$

where $|I| = n$.
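For intuition, here is a minimal numerical sketch of the two computations, assuming hypothetical arrays `phi_x` and `phi_xp` holding the $N$ components of $\Phi(x)$ and $\Phi(x')$ and a probability vector `D` over the indices (all names and data below are illustrative, not part of the exercise): the exact kernel (6.27) is an $O(N)$ weighted sum, while the approximation (6.28) averages $n$ component products at indices drawn i.i.d. from $\mathcal{D}$.

```python
import numpy as np

def exact_kernel(phi_x, phi_xp, D):
    """K(x, x') = E_{i~D}[[Phi(x)]_i [Phi(x')]_i], an O(N) computation."""
    return np.sum(D * phi_x * phi_xp)

def approx_kernel(phi_x, phi_xp, D, n, rng):
    """K'(x, x') = (1/n) * sum_{i in I} [Phi(x)]_i [Phi(x')]_i with I ~ D^n."""
    I = rng.choice(len(D), size=n, p=D)      # n indices sampled i.i.d. according to D
    return np.mean(phi_x[I] * phi_xp[I])

# Illustrative usage with synthetic data satisfying |[Phi(x)]_i| <= r for r = 1.
rng = np.random.default_rng(0)
N, n = 100_000, 500
D = np.full(N, 1.0 / N)                      # e.g., the uniform distribution over indices
phi_x = rng.uniform(-1.0, 1.0, N)
phi_xp = rng.uniform(-1.0, 1.0, N)
print(exact_kernel(phi_x, phi_xp, D), approx_kernel(phi_x, phi_xp, D, n, rng))
```

Each sampled term has expectation $K(x, x')$, so $K'(x, x')$ is an unbiased estimator; part (a) quantifies how quickly it concentrates as $n$ grows.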

(a) Fix $x$ and $x'$ in $\mathcal{X}$. Prove that

$$\Pr_{I \sim \mathcal{D}^n} \big[ |K(x, x') - K'(x, x')| > \epsilon \big] \leq 2 e^{-n\epsilon^2 / (2r^4)}. \tag{6.29}$$

(Hint: use McDiarmid's inequality.)
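One way the hint can be applied, written only as a sketch under the assumptions already stated in the exercise: regard $K'(x, x')$ as a function of the $n$ i.i.d. indices, verify that it is unbiased, bound its coordinate-wise variations, and invoke McDiarmid's inequality.

$$
\begin{aligned}
f(i_1, \dots, i_n) &= \frac{1}{n} \sum_{k=1}^{n} [\Phi(x)]_{i_k} [\Phi(x')]_{i_k} = K'(x, x'), \qquad i_1, \dots, i_n \overset{\text{i.i.d.}}{\sim} \mathcal{D}, \\
\mathop{\mathbb{E}}_{I \sim \mathcal{D}^n}[f] &= \mathop{\mathbb{E}}_{i \sim \mathcal{D}} \big[ [\Phi(x)]_i [\Phi(x')]_i \big] = K(x, x'), \\
\sup_{i_k,\, i_k'} \big| f(\dots, i_k, \dots) - f(\dots, i_k', \dots) \big| &\leq \frac{2r^2}{n} =: c_k, \qquad \text{since } \big| [\Phi(x)]_i [\Phi(x')]_i \big| \leq r^2, \\
\Pr \big[ |f - \mathbb{E}[f]| > \epsilon \big] &\leq 2 \exp\!\Big( \frac{-2\epsilon^2}{\sum_{k=1}^{n} c_k^2} \Big) = 2 \exp\!\Big( \frac{-2\epsilon^2 n^2}{4 n r^4} \Big) = 2 e^{-n\epsilon^2 / (2r^4)}.
\end{aligned}
$$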

(b) Let $\mathbf{K}$ and $\mathbf{K}'$ be the kernel matrices associated to $K$ and $K'$. Show that for any $\epsilon, \delta > 0$, for $n > \frac{2r^4}{\epsilon^2} \log \frac{m(m+1)}{\delta}$, with probability at least $1 - \delta$, $|K'_{ij} - K_{ij}| \leq \epsilon$ for all $i, j \in [m]$.
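Again only a sketch: part (b) follows by combining (6.29) with a union bound over the $m(m+1)/2$ distinct entries $K_{ij} = K(x_i, x_j)$ of the kernel matrix.

$$
\begin{aligned}
\Pr \Big[ \exists\, i \leq j \in [m] : |K'_{ij} - K_{ij}| > \epsilon \Big]
&\leq \frac{m(m+1)}{2} \cdot 2 e^{-n\epsilon^2 / (2r^4)}
= m(m+1)\, e^{-n\epsilon^2 / (2r^4)}, \\
m(m+1)\, e^{-n\epsilon^2 / (2r^4)} \leq \delta
&\iff n \geq \frac{2r^4}{\epsilon^2} \log \frac{m(m+1)}{\delta},
\end{aligned}
$$

so the stated condition on $n$ ensures that $|K'_{ij} - K_{ij}| \leq \epsilon$ holds simultaneously for all $i, j \in [m]$ with probability at least $1 - \delta$.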
