Question 6(1.5 points)
Mark all that is true about activation functions.
The derivative of softmax with z as input is maximum at z=0.5, while its value is maximum at z=0.25.
Leaky ReLU can be used to mitigate the dying ReLU problem.
tanh, sigmoid, and ReLU all squash the input to a value in the positive quadrant.
Derivative of tanh(z) is maximum at z=0 and its value is 1.
Step by Step Solution

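The original answer steps are not shown, so the following is a sketch of the reasoning rather than the site's official key. tanh'(z) = 1 - tanh²(z) is maximized at z = 0 with value 1, so the tanh statement is true. Leaky ReLU keeps a small nonzero slope for negative inputs, so gradients never vanish entirely and the dying ReLU problem is mitigated; that statement is also true. The first option attributes to softmax the numbers that actually belong to the sigmoid (derivative maximum 0.25, attained at z = 0 where the sigmoid's value is 0.5), and it is stated incorrectly even then. The "positive quadrant" claim fails because tanh outputs values in (-1, 1) and ReLU is unbounded above. A quick numeric check with NumPy (assuming a standard Leaky ReLU slope of 0.01, which is an illustrative choice, not from the question) confirms these facts:

```python
import numpy as np

z = np.linspace(-5.0, 5.0, 2001)  # grid that includes z = 0 exactly

# tanh'(z) = 1 - tanh(z)^2: maximized at z = 0, where it equals 1.
dtanh = 1.0 - np.tanh(z) ** 2
print(z[np.argmax(dtanh)], dtanh.max())

# sigmoid'(z) = s(z) * (1 - s(z)): maximum value 0.25 at z = 0,
# where the sigmoid itself equals 0.5 -- the quiz option swaps these
# numbers and attributes them to softmax.
s = 1.0 / (1.0 + np.exp(-z))
dsig = s * (1.0 - s)
print(z[np.argmax(dsig)], dsig.max())

# tanh is negative for negative inputs, so the three functions do not
# all squash into the positive quadrant.
print(np.tanh(-2.0))

# Leaky ReLU gradient (illustrative slope alpha = 0.01): never zero,
# so units cannot "die" the way plain ReLU units can.
alpha = 0.01
leaky_grad = np.where(z < 0, alpha, 1.0)
print(leaky_grad.min())
```

Under this reading, only the Leaky ReLU and tanh statements should be marked true.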
