Question:

Previous code:

```python
# Some settings.
learning_rate = 0.0001
iterations = 10000
losses = []

# Gradient descent algorithm for linear SVM classifier.
# Step 1. Initialize the parameters W, b.
W = np.zeros(2)
b = 0
C = 1000

for i in range(iterations):
    # Step 2. Compute the partial derivatives.
    grad_W, grad_b = grad_L_W_b(X_train, Y_train, W, b, C)
    # Step 3. Update the parameters.
    W = W - learning_rate * grad_W
    b = b - learning_rate * grad_b
    # Track the training losses.
    losses.append(L_W_b(X_train, Y_train, W, b, C))
```
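The helpers `L_W_b` and `grad_L_W_b` are defined earlier in the notebook and not shown in this excerpt. A minimal sketch of what they might look like, assuming the standard soft-margin objective L(W, b) = ½‖W‖² + C·Σᵢ max(0, 1 − yᵢ(W·xᵢ + b)) with its subgradient:

```python
import numpy as np

def L_W_b(X, Y, W, b, C):
    # Soft-margin SVM objective: 0.5*||W||^2 plus C times the sum of
    # hinge losses max(0, 1 - y_i * (W . x_i + b)).
    margins = Y * (X @ W + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    return 0.5 * W @ W + C * hinge.sum()

def grad_L_W_b(X, Y, W, b, C):
    # Subgradient of the objective: only points that violate the margin
    # (y_i * (W . x_i + b) < 1) contribute to the hinge term.
    margins = Y * (X @ W + b)
    mask = margins < 1.0
    grad_W = W - C * (Y[mask][:, None] * X[mask]).sum(axis=0)
    grad_b = -C * Y[mask].sum()
    return grad_W, grad_b
```

With these definitions, one gradient step on a margin-violating point reduces the objective, which is what the loop above relies on.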
Visualize the results
Please complete the following code to visualize the decision boundary of the linear SVM model. You may use the vis function defined above.
Also, please plot the training loss curve with respect to the number of iterations.
You should insert your code only in the `...` parts.
Points: 4
In [ ]:
```python
# Show decision boundary, training error and test error.
print('Decision boundary: {:.3f}x0+{:.3f}x1+{:.3f}=0'.format(W[0], W[1], b))
...
print('Training error: {}'.format(calc_error(X_train, Y_train, W, b)))
...
print('Test error: {}'.format(calc_error(X_test, Y_test, W, b)))
```
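`calc_error` is also defined earlier and not shown here. A plausible sketch, assuming it returns the misclassification rate of the linear classifier sign(W·x + b), together with a helper for the requested loss curve (the `plot_loss_curve` name is illustrative, not from the original notebook):

```python
import numpy as np

def calc_error(X, Y, W, b):
    # Classification error: the fraction of points whose predicted sign
    # disagrees with the true label (points exactly on the boundary,
    # where sign(.) = 0, count as errors).
    preds = np.sign(X @ W + b)
    return float(np.mean(preds != Y))

def plot_loss_curve(losses):
    # Training loss versus iteration number (requires matplotlib).
    import matplotlib.pyplot as plt
    plt.plot(range(len(losses)), losses)
    plt.xlabel('Iteration')
    plt.ylabel('Training loss')
    plt.title('Gradient descent on the SVM objective')
    plt.show()
```

The decision boundary itself is the line W[0]*x0 + W[1]*x1 + b = 0, so given two x0 endpoints one can solve x1 = -(W[0]*x0 + b)/W[1] and draw it over the scatter plot.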