2 Linear Example when $\Phi$ is trivial

Suppose we are given the following positively labeled data points in $\Re^2$:
$$\left\{ \begin{pmatrix} 3 \\ 1 \end{pmatrix}, \begin{pmatrix} 3 \\ -1 \end{pmatrix}, \begin{pmatrix} 6 \\ 1 \end{pmatrix}, \begin{pmatrix} 6 \\ -1 \end{pmatrix} \right\}$$
and the following negatively labeled data points in $\Re^2$ (see Figure 1):
$$\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 0 \\ -1 \end{pmatrix} \right\}$$
Figure 1: Sample data points in $\Re^2$. Blue diamonds are positive examples and red squares are negative examples.
We would like to discover a simple SVM that accurately discriminates the two classes. Since the data is linearly separable, we can use a linear SVM, that is, one whose mapping function $\Phi$ is the identity function. By inspection, it should be obvious that there are three support vectors (see Figure 2):
$$s_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad s_2 = \begin{pmatrix} 3 \\ 1 \end{pmatrix}, \quad s_3 = \begin{pmatrix} 3 \\ -1 \end{pmatrix}$$
In what follows we will use vectors augmented with a 1 as a bias input, and for clarity we will differentiate these with an overtilde. So if $s_1 = (1, 0)$, then $\tilde{s}_1 = (1, 0, 1)$. Figure 3 shows the SVM architecture, and our task is to find values for the $\alpha_i$ such that
$$\alpha_1 \Phi(\tilde{s}_1) \cdot \Phi(\tilde{s}_1) + \alpha_2 \Phi(\tilde{s}_2) \cdot \Phi(\tilde{s}_1) + \alpha_3 \Phi(\tilde{s}_3) \cdot \Phi(\tilde{s}_1) = -1$$
$$\alpha_1 \Phi(\tilde{s}_1) \cdot \Phi(\tilde{s}_2) + \alpha_2 \Phi(\tilde{s}_2) \cdot \Phi(\tilde{s}_2) + \alpha_3 \Phi(\tilde{s}_3) \cdot \Phi(\tilde{s}_2) = +1$$
$$\alpha_1 \Phi(\tilde{s}_1) \cdot \Phi(\tilde{s}_3) + \alpha_2 \Phi(\tilde{s}_2) \cdot \Phi(\tilde{s}_3) + \alpha_3 \Phi(\tilde{s}_3) \cdot \Phi(\tilde{s}_3) = +1$$
Since for now we have let $\Phi = I$, this reduces to
$$\alpha_1\, \tilde{s}_1 \cdot \tilde{s}_1 + \alpha_2\, \tilde{s}_2 \cdot \tilde{s}_1 + \alpha_3\, \tilde{s}_3 \cdot \tilde{s}_1 = -1$$
$$\alpha_1\, \tilde{s}_1 \cdot \tilde{s}_2 + \alpha_2\, \tilde{s}_2 \cdot \tilde{s}_2 + \alpha_3\, \tilde{s}_3 \cdot \tilde{s}_2 = +1$$
$$\alpha_1\, \tilde{s}_1 \cdot \tilde{s}_3 + \alpha_2\, \tilde{s}_2 \cdot \tilde{s}_3 + \alpha_3\, \tilde{s}_3 \cdot \tilde{s}_3 = +1$$
Now, computing the dot products results in
Figure 2: The three support vectors are marked as yellow circles.
Figure 3: The SVM architecture.
$$2\alpha_1 + 4\alpha_2 + 4\alpha_3 = -1$$
$$4\alpha_1 + 11\alpha_2 + 9\alpha_3 = +1$$
$$4\alpha_1 + 9\alpha_2 + 11\alpha_3 = +1$$
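The dot products of the augmented support vectors can be checked numerically. A minimal sketch, assuming NumPy and the augmented support vectors $\tilde{s}_i$ from this example:

```python
import numpy as np

# Augmented support vectors: each s_i with a 1 appended as the bias input
s_tilde = np.array([
    [1.0,  0.0, 1.0],   # s1 tilde
    [3.0,  1.0, 1.0],   # s2 tilde
    [3.0, -1.0, 1.0],   # s3 tilde
])

# Gram matrix: entry (i, j) is the dot product s_i . s_j
G = s_tilde @ s_tilde.T
print(G)
```

The printed matrix contains exactly the coefficients of the linear system above: 2, 4, 4 in the first row, and 11 and 9 on the remaining diagonal and off-diagonal entries.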
A little algebra reveals that the solution to this system of equations is $\alpha_1 = -3.5$, $\alpha_2 = 0.75$, and $\alpha_3 = 0.75$.
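Rather than doing the algebra by hand, the same $3 \times 3$ system can be solved with a linear solver. A quick sketch, assuming NumPy:

```python
import numpy as np

# Coefficient matrix (the Gram matrix of the augmented support vectors)
# and the right-hand side: -1 for the negative support vector, +1 for the positives
G = np.array([[2.0,  4.0,  4.0],
              [4.0, 11.0,  9.0],
              [4.0,  9.0, 11.0]])
y = np.array([-1.0, 1.0, 1.0])

alpha = np.linalg.solve(G, y)
print(alpha)  # -> [-3.5   0.75  0.75]
```

This confirms the values quoted above.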
Now we can look at how these $\alpha$ values relate to the discriminating hyperplane; in other words, now that we have the $\alpha_i$, how do we find the hyperplane that discriminates the positive from the negative examples? It turns out that
$$\tilde{w} = \sum_i \alpha_i \tilde{s}_i = -3.5 \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} + 0.75 \begin{pmatrix} 3 \\ 1 \\ 1 \end{pmatrix} + 0.75 \begin{pmatrix} 3 \\ -1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \\ -2 \end{pmatrix}$$
Finally, remembering that our vectors are augmented with a bias, we can equate the last entry in $\tilde{w}$ with the hyperplane offset $b$ and write the separating hyperplane equation $y = w \cdot x + b$ with $w = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $b = -2$. Plotting the line gives the expected decision surface (see Figure 4).
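The whole computation, from the $\alpha_i$ to the separating hyperplane, can be put together and sanity-checked against the data points of this example. A minimal sketch, assuming NumPy:

```python
import numpy as np

alpha = np.array([-3.5, 0.75, 0.75])
s_tilde = np.array([[1.0,  0.0, 1.0],
                    [3.0,  1.0, 1.0],
                    [3.0, -1.0, 1.0]])

# w tilde = sum_i alpha_i * s_i tilde; the last entry is the offset b
w_tilde = alpha @ s_tilde
w, b = w_tilde[:2], w_tilde[2]
print(w, b)  # -> [1. 0.] -2.0

# The decision function sign(w . x + b) should separate the two classes
positives = np.array([[3, 1], [3, -1], [6, 1], [6, -1]])
negatives = np.array([[1, 0], [0, 1], [0, -1]])
assert np.all(positives @ w + b > 0)
assert np.all(negatives @ w + b < 0)
```

With $w = (1, 0)$ and $b = -2$, the decision surface is the vertical line $x_1 = 2$, which indeed lies between the two classes.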