Question: True/False. As an activation function in deep learning, the ReLU function can cause the vanishing gradient problem, whereas the tanh function cannot. Consider the derivatives of the functions: the derivative of ReLU is 0 (for x < 0) or 1 (for x > 0), and the derivative of tanh(x) is 1 − tanh²(x), where tanh(x) lies between −1 and 1.

Step by Step Solution

There are 3 steps involved.

Step 1: Examine the derivative of tanh. Since d/dx tanh(x) = 1 − tanh²(x) and tanh(x) approaches ±1 as |x| grows, the derivative approaches 0 whenever the unit saturates. Backpropagation multiplies these per-layer factors together, so a chain of saturated tanh units drives the gradient toward zero. Hence tanh can cause the vanishing gradient problem.

Step 2: Examine the derivative of ReLU. For x > 0 the derivative is exactly 1, so gradients pass through active units without shrinking; for x < 0 the derivative is 0, which can deactivate individual units (the "dying ReLU" issue) but does not progressively attenuate gradients across many layers the way a saturating function does.

Step 3: Conclude. The statement is False. It is tanh, the saturating activation, that is a classic cause of vanishing gradients, while ReLU was adopted largely because it mitigates the problem.
