
Incorporate L2-norm regularization into the loss function:

L = \frac{1}{2}\left((y_1 - t_1)^2 + (y_2 - t_2)^2\right) + \lambda \sum_{i,j} w_{ij}^2

Assume a neural network with one hidden layer, two hidden nodes h_1, h_2, and two
output nodes y_1, y_2. The activation function is sigmoid. Derive the modified weight
update rule for the weights connecting the input layer to the hidden layer using gradient
descent. Assume \lambda > 0 is the regularization parameter.
