Q-1.

Consider the case of the XOR function in which the two points {(0, 0),(1, 1)} belong to one class, and the other two points {(1, 0),(0, 1)} belong to the other class. Show how you can use the ReLU activation function to separate the two classes in a manner similar to the example in the chapter.

Partial credit will be given for answers to the following parts: the number of units needed in the hidden layer; the input-to-hidden weights of each hidden unit; the representation of the classes by the hidden layer; and the hidden-to-output weights and activation of the classifier.
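One possible construction (a sketch, not necessarily the book's exact weights): use two hidden units h1 = ReLU(x1 − x2) and h2 = ReLU(x2 − x1). Both points of the first class map to (0, 0), while the second class maps to (1, 0) and (0, 1), so the output h1 + h2 separates the classes linearly:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# One valid choice of hidden weights (assumption, not the book's official answer):
# h1 = ReLU(x1 - x2), h2 = ReLU(x2 - x1)
W1 = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
# Hidden-to-output weights: y = h1 + h2
w2 = np.array([1.0, 1.0])

X = np.array([[0, 0], [1, 1], [1, 0], [0, 1]], dtype=float)
h = relu(X @ W1.T)   # class {(0,0),(1,1)} -> (0,0); class {(1,0),(0,1)} -> (1,0)/(0,1)
y = h @ w2
print(y)             # -> [0. 0. 1. 1.]
```

The classes, not separable in the input space, become separable after the ReLU transform: any threshold between 0 and 1 on y classifies all four points correctly.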

Q-2.

Consider a data set in which the two points {(−1, −1), (1, 1)} belong to one class, and the other two points {(1, −1), (−1, 1)} belong to the other class. Start with perceptron parameter values at (0, 0), and work out a few stochastic gradient-descent updates with α = 1. While performing the stochastic gradient-descent updates, cycle through the training points in any order.

(a) Does the algorithm converge in the sense that the change in objective function becomes extremely small over time?

(b) Explain why the situation in (a) occurs.
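A runnable sketch of these updates (assuming the data set is the XOR-like {(−1, −1), (1, 1)} vs. {(1, −1), (−1, 1)}, learning rate α = 1, and the standard perceptron update w ← w + α·y·x on a misclassified point):

```python
import numpy as np

# XOR-like data (minus signs assumed from the book's exercise)
X = np.array([(-1, -1), (1, 1), (1, -1), (-1, 1)], dtype=float)
y = np.array([1, 1, -1, -1], dtype=float)

w = np.zeros(2)            # start at (0, 0)
alpha = 1.0
history = []
for epoch in range(3):     # cycle through the training points in the given order
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:          # misclassified -> perceptron update
            w = w + alpha * yi * xi
        history.append(w.copy())
# w returns to (0, 0) after every full pass: the updates cancel each other out
print(history[:4])
```

Tracing one pass: w goes (0,0) → (−1,−1) → (0,0) → (−1,1) → (0,0), and the same cycle repeats forever. The objective never settles because the data is not linearly separable, which is the point of part (b).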

Use this book: Neural Networks and Deep Learning by Charu C. Aggarwal; go to Chapter 1 and look for Exercises 1 and 4.

If you have any problem, please feel free to comment. Need help! Thank you in advance.

[Figure: First-layer transform for XOR. The four input points (0,0), (1,0), (0,1), (1,1) are not linearly separable; after the first-layer transform they map to points that are linearly separable. The network diagram shows an input layer (x1, x2), a hidden layer, and one output, with weights +1 and −1.]
