Question:
Forward Mode Differentiation: The backpropagation algorithm needs to compute node-to-node derivatives of output nodes with respect to all other nodes, and therefore computing gradients in the backward direction makes sense. Consequently, the pseudocode on page 228 propagates gradients in the backward direction. However, consider the case where we want to compute the node-to-node derivatives of all nodes with respect to the source (input) nodes s_1 ... s_k. In other words, we want to compute ∂x/∂s_i for each non-input node variable x and each input node s_i in the network. Propose a variation of the pseudocode on page 228 that computes node-to-node gradients in the forward direction.
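
One possible approach, shown as a minimal Python sketch rather than the book's page-228 pseudocode (which is not reproduced here): process the nodes in topological order and apply the chain rule in the forward direction, so that ∂x/∂s_i for a node x is the sum over its direct predecessors y of (∂x/∂y)·(∂y/∂s_i), with the base case ∂s_j/∂s_i = 1 if i = j and 0 otherwise. The function name, the `nodes`/`predecessors`/`sources` arguments, and the `local_derivative(x, y)` callback are assumptions about how the computational graph is represented, not the book's notation.

```python
from collections import defaultdict

def forward_mode_node_derivatives(nodes, predecessors, local_derivative, sources):
    """
    Compute D[x][s] = d(x)/d(s) for every node x and every source node s.

    Assumed (hypothetical) interface:
      - nodes: all node identifiers in topological order
        (sources first, every node after its predecessors).
      - predecessors[x]: list of direct predecessors y of node x.
      - local_derivative(x, y): partial derivative of x with respect to
        its direct predecessor y (the edge-level derivative also used by
        backpropagation).
      - sources: the input nodes s_1 ... s_k.
    """
    source_set = set(sources)
    D = defaultdict(dict)

    # Base case: each source differentiates to 1 with respect to itself
    # and to 0 with respect to every other source.
    for s in sources:
        for t in sources:
            D[s][t] = 1.0 if s == t else 0.0

    # Forward sweep: by the multivariate chain rule,
    #   d(x)/d(s) = sum over predecessors y of x of  d(x)/d(y) * d(y)/d(s).
    # Topological order guarantees every d(y)/d(s) is already available.
    for x in nodes:
        if x in source_set:
            continue
        for s in sources:
            D[x][s] = sum(local_derivative(x, y) * D[y][s]
                          for y in predecessors[x])
    return D
```

This is the mirror image of the backward pseudocode: instead of one backward sweep per output node, forward mode accumulates one derivative table per source node s_i, which is why it is attractive when the number of inputs is small relative to the number of outputs.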
