**Part A**: Suppose each of the weights is initialized to $W^k = 1.0$ and each bias is initialized to $b^k = -0.5$. Use forward propagation to find the activities and activations associated with each hidden and output neuron for the training example $(x, y) = (0.5, 0)$. Show your work. Answer the Peer Review question about this section.

**Part B**: Use back-propagation to compute the weight and bias derivatives $\partial \ell / \partial W^k$ and $\partial \ell / \partial b^k$ for $k = 1, 2, 3$. Show all work. Answer the Peer Review question about this section.

**Part C**: Implement the following activation functions:
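A minimal sketch of Parts A and B. The question does not state the network architecture or loss, so this assumes a chain of three single-neuron sigmoid layers ($k = 1, 2, 3$) and a squared-error loss $\ell = \tfrac{1}{2}(a^3 - y)^2$; with $W^k = 1.0$, $b^k = -0.5$, and $x = 0.5$, every pre-activation is $z^k = 0$ and every activity is $a^k = 0.5$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumptions (not stated in the question): three single-neuron sigmoid
# layers chained together, and squared-error loss l = 0.5 * (a3 - y)**2.
W = [1.0, 1.0, 1.0]      # W^k = 1.0 for k = 1, 2, 3
b = [-0.5, -0.5, -0.5]   # b^k = -0.5 for k = 1, 2, 3
x, y = 0.5, 0.0

# Part A: forward pass, recording (activation z^k, activity a^k) per layer.
a, acts = x, []
for k in range(3):
    z = W[k] * a + b[k]  # z^k = W^k * a^(k-1) + b^k
    a = sigmoid(z)       # a^k = sigmoid(z^k)
    acts.append((z, a))

# Part B: backward pass. delta holds dl/dz^k; sigmoid'(z) = a * (1 - a).
dW, db = [0.0] * 3, [0.0] * 3
delta = (acts[2][1] - y) * acts[2][1] * (1 - acts[2][1])  # dl/dz^3
for k in reversed(range(3)):
    a_prev = x if k == 0 else acts[k - 1][1]
    dW[k] = delta * a_prev  # dl/dW^k = dl/dz^k * a^(k-1)
    db[k] = delta           # dl/db^k = dl/dz^k
    if k > 0:
        a_k = acts[k - 1][1]
        delta = delta * W[k] * a_k * (1 - a_k)  # propagate to layer k-1
```

Under these assumptions each $z^k = 0.0$ and $a^k = 0.5$ (a fixed point of this configuration), and the gradients shrink by a factor of $0.25$ per layer as they propagate back through the sigmoid derivatives.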
Formulas for activation functions
* ReLU: $f(x) = \max(0, x)$
* Sigmoid: $f(x) = \frac{1}{1 + e^{-x}}$
* Softmax: $f(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}$
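A minimal NumPy sketch of the three activations for Part C. The softmax subtracts `max(x)` before exponentiating, a standard numerical-stability trick that does not change the result:

```python
import numpy as np

def relu(x):
    # ReLU: elementwise max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Softmax: e^{x_i} / sum_j e^{x_j}; shifting by max(x) avoids overflow
    # without changing the output (the shift cancels in the ratio).
    e = np.exp(x - np.max(x))
    return e / e.sum()
```

Each function accepts a scalar or a NumPy array; softmax normalizes over the whole input vector, so its outputs are non-negative and sum to 1.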
