Question:


Consider a sequence of 2-dimensional data points x^(1), x^(2), ..., x^(n) and their corresponding labels y^(1), y^(2), ..., y^(n). Recall that the perceptron algorithm updates the parameters whenever y^(i) h(x^(i); θ, b) ≤ 0, where h(x^(i); θ, b) = sign(θ · x^(i) + b). Assume that the points are linearly separable, and that both θ and b are initialized to zero. Let α_i denote the number of times x^(i) is misclassified during training.

(a) (1 pt) Derive the final decision boundary for the perceptron in terms of α_i, x^(i), and y^(i).

(b) (1 pt) Show that the shortest signed distance from the boundary to the origin is equal to b/||θ||.

(c) (2 pts) The following table shows a dataset and the number of times each point is misclassified when one applies the perceptron algorithm (with offset). Assuming θ and b are initialized to zero, what are the values of θ and b after training?

(d) (1 pt) Given a set of linearly separable points, does the order in which the points are presented to the algorithm affect whether or not the algorithm will converge? In general, could the order affect the total number of mistakes made?
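The identity asked for in part (a) — that the final parameters satisfy θ = Σ_i α_i y^(i) x^(i) and b = Σ_i α_i y^(i), since every update adds y^(i) x^(i) to θ and y^(i) to b — can be checked numerically. Below is a minimal sketch of the perceptron with offset; the 2-D dataset here is an invented linearly separable example for illustration, not the table referenced in part (c):

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Perceptron with offset: update whenever y_i * (theta @ x_i + b) <= 0.
    Also records alpha_i, the number of times each point triggers an update."""
    theta = np.zeros(X.shape[1])
    b = 0.0
    alpha = np.zeros(len(X), dtype=int)
    for _ in range(epochs):
        mistakes = 0
        for i, (x_i, y_i) in enumerate(zip(X, y)):
            if y_i * (theta @ x_i + b) <= 0:  # misclassified (or on boundary)
                theta += y_i * x_i
                b += y_i
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # converged: a full pass with no updates
            break
    return theta, b, alpha

# Invented linearly separable toy data (NOT the dataset from part (c)).
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
theta, b, alpha = perceptron(X, y)

# Part (a): since theta and b start at zero, the trained parameters are
#   theta = sum_i alpha_i * y^(i) * x^(i),   b = sum_i alpha_i * y^(i)
assert np.allclose(theta, (alpha * y) @ X)
assert b == (alpha * y).sum()
```

This also makes part (d) easy to probe empirically: reordering the rows of `X` (and `y` accordingly) never prevents convergence on separable data, but it can change `alpha` and hence the total number of mistakes.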
