
Question: Which statement about a simple recurrent (RNN) layer is incorrect? Select one:

a. The previous activation is fed back into the layer input
b. A simple RNN layer has two weight tensors
c. A simple RNN layer has a sigmoid activation
d. A simple RNN layer suffers from vanishing gradients during backpropagation
Step by Step Solution

There are 3 steps involved:

Step 1: Statements a, b, and d are correct. A simple RNN feeds the previous activation h(t-1) back into the layer input (a); it has two weight tensors, one applied to the input and one applied to the recurrent state, plus a bias (b); and repeated multiplication by the recurrent weights during backpropagation through time causes vanishing gradients (d).

Step 2: Statement c is the incorrect one: the standard activation of a simple RNN layer (for example, the default in Keras's SimpleRNN) is tanh, not sigmoid.

Step 3: Answer: c.
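The properties in the statements can be sketched with a minimal NumPy simple-RNN cell (the layer sizes and variable names here are illustrative assumptions, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3

# Two weight tensors (statement b): input-to-hidden W and
# hidden-to-hidden (recurrent) U, plus a bias vector.
W = rng.standard_normal((n_in, n_hidden))
U = rng.standard_normal((n_hidden, n_hidden))
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    # The previous activation h_prev is fed back into the input (statement a).
    # The activation is tanh, not sigmoid (which is why statement c is wrong).
    return np.tanh(x_t @ W + h_prev @ U + b)

# Run a few timesteps, reusing the same weights each step.
h = np.zeros(n_hidden)
for t in range(5):
    h = rnn_step(rng.standard_normal(n_in), h)
print(h.shape)  # (3,)
```

Repeatedly multiplying gradients by U (and by tanh derivatives, which are at most 1) across many timesteps is what shrinks them toward zero, which is the vanishing-gradient problem in statement d.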
