### Task 2.2 - Backward propagation
In this task, you will start with random weights for `w0` and `w1` and iteratively perform forward passes and backward propagation to converge on a solution.

Submit your values of `w0`, `w1`, and your loss value on Coursemology. Your loss value should be less than the threshold stated in the assignment.
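A minimal, self-contained sketch of the forward/backward cycle that PyTorch's autograd performs (the toy function and step size here are purely illustrative, not part of the assignment):

```python
import torch

# Toy function f(w) = (3*w - 6)**2; its derivative at w = 1 is
# 2 * (3*1 - 6) * 3 = -18, so autograd should report w.grad = -18.
w = torch.tensor(1.0, requires_grad=True)

loss = (3 * w - 6) ** 2   # forward pass
loss.backward()           # backward propagation fills w.grad
assert w.grad.item() == -18.0

with torch.no_grad():
    w -= 0.1 * w.grad     # one gradient-descent step: 1.0 - 0.1 * (-18)
    w.grad.zero_()        # reset so gradients do not accumulate
```

The `torch.no_grad()` block and `grad.zero_()` call mirror what the assignment's update step must do: modify the weights without recording the update in the autograd graph, then clear the accumulated gradients before the next backward pass.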
```python
import torch
import matplotlib.pyplot as plt

# forward_pass, x, and y are defined in earlier tasks.
# The ... placeholders mark values stripped from the original source.

torch.manual_seed(...)  # Set seed to some fixed value

w0 = torch.randn(..., requires_grad=True)
w1 = torch.randn(..., requires_grad=True)

learning_rate = ...  # value stripped

print('iter', 'loss', sep='\t')
for t in range(...):
    # Forward pass: compute predicted y
    y_pred = forward_pass(x, w0, w1, torch.relu)
    loss = torch.mean(torch.square(y - y_pred))
    loss.backward()
    if t % ... == 0:
        print(t, loss.item(), sep='\t')
    with torch.no_grad():
        # Update weights and then reset the gradients to zero
        raise NotImplementedError

print('w0', w0)
print('w1', w1)

y_pred = forward_pass(x, w0, w1, torch.relu)
plt.plot(x, y, linestyle='solid', label='x')
plt.plot(x, y_pred.detach().numpy(), linestyle='dashed', label='perceptron')
plt.axis('equal')
plt.title('Fit NN on abs function')
plt.legend()
plt.show()
```
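The missing body of the `torch.no_grad()` block is an ordinary gradient-descent update. A self-contained sketch of the whole loop is below; the `forward_pass` signature, hidden size, seed, learning rate, and iteration count are all assumptions chosen for illustration, since the assignment's actual values are not visible in the source:

```python
import torch

# Assumed two-layer perceptron matching the forward_pass(x, w0, w1,
# activation) call pattern used in the assignment (an assumption).
def forward_pass(x, w0, w1, activation):
    # x: (N, 1), w0: (1, H), w1: (H, 1)
    return activation(x @ w0) @ w1

torch.manual_seed(42)                        # illustrative seed
w0 = torch.randn(1, 8, requires_grad=True)   # illustrative shapes
w1 = torch.randn(8, 1, requires_grad=True)
learning_rate = 1e-2                         # illustrative value

x = torch.linspace(-1, 1, 100).reshape(-1, 1)
y = torch.abs(x)

for t in range(2000):
    y_pred = forward_pass(x, w0, w1, torch.relu)
    loss = torch.mean(torch.square(y - y_pred))
    loss.backward()
    with torch.no_grad():
        # Gradient-descent update, then reset gradients to zero so
        # they do not accumulate across iterations
        w0 -= learning_rate * w0.grad
        w1 -= learning_rate * w1.grad
        w0.grad.zero_()
        w1.grad.zero_()

print(loss.item())  # final training MSE
```

Note that `|x|` is exactly representable by a bias-free ReLU layer (`relu(x) + relu(-x)`), so a small hidden size suffices; the loss should fall steadily as the loop runs.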
```python
# Task 2.2: Submit the values of w0, w1, and the loss value after fitting
# Note: An acceptable loss value should be less than the stated threshold
# You should try adjusting the random seed, learning rate, or
# number of iterations to improve your model.
w0 = ...  # to be computed
w1 = ...  # to be computed
loss = ...  # to be computed
```
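One way to fill these variables with plain Python values is a `.tolist()` round-trip, which the `torch.tensor(w0)` call in the checker then reverses; the exact expected format is an assumption:

```python
import torch

# Hypothetical conversion of a trained tensor into plain Python lists;
# torch.tensor(w0) in the checker reconstructs an identical tensor
# because float32 -> Python float -> float32 round-trips exactly.
w0_trained = torch.randn(1, 8)      # stand-in for the trained weight
w0 = w0_trained.detach().tolist()   # nested list of Python floats
assert torch.equal(torch.tensor(w0), w0_trained)
```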
```python
w0 = torch.tensor(w0)
w1 = torch.tensor(w1)
x = torch.linspace(...).reshape(...)
y = torch.abs(x)

# IMPORTANT: Your forward pass above has to be correctly implemented
y_pred = forward_pass(x, w0, w1, torch.relu)
computed_mse_loss = torch.mean(torch.square(y - y_pred)).item()
assert loss < ...  # threshold value stripped
assert isclose(computed_mse_loss, loss, atol=..., rtol=...)
```
