Question:

Consider again the Python implementation of polynomial regression in Section 9.5.1, where stochastic gradient descent was used for training.

Using the polynomial regression data set, implement and run the following four alternative training methods:

(a) the steepest-descent Algorithm 9.4.1;

(b) the Levenberg-Marquardt Algorithm B.3.3, in conjunction with Algorithm 9.4.2 for computing the Jacobian matrix;

(c) the limited-memory BFGS Algorithm 9.4.4;

(d) the Adam Algorithm 9.4.5, which uses past gradient values to determine the next search direction.

For each training algorithm, using trial and error, tune any algorithmic parameters so that the network training is as fast as possible. Comment on the relative advantages and disadvantages of each training/optimization method. For example, comment on which optimization method makes rapid initial progress, but gets trapped in a suboptimal solution, and which method is slower, but more consistent in finding good optima.
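As a reference point for the four methods below, here is a minimal full sketch of the stochastic-gradient-descent baseline that the exercise refers to. The data set from Section 9.5.1 is not reproduced here, so a synthetic noisy cubic on [0, 1] stands in for it; the data-generating coefficients, learning rate, and batch size are all assumptions, not the book's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the Section 9.5.1 polynomial data set
# (assumption: a noisy cubic on [0, 1]; the book's actual data differ).
n = 100
u = rng.uniform(0.0, 1.0, n)
y = 10 - 140 * u + 400 * u**2 - 250 * u**3 + rng.normal(0.0, 5.0, n)
X = np.vander(u, 4, increasing=True)        # features 1, u, u^2, u^3

def loss(beta):
    r = X @ beta - y
    return r @ r / (2 * n)                  # half mean squared error

beta = np.zeros(4)
lr, batch = 0.2, 10                         # tuned by trial and error
for epoch in range(2000):
    idx = rng.permutation(n)
    for start in range(0, n, batch):
        b = idx[start:start + batch]
        g = X[b].T @ (X[b] @ beta - y[b]) / len(b)   # minibatch gradient
        beta -= lr * g

print(round(float(loss(beta)), 2))
```

With a constant step size, SGD settles into a noise floor around the optimum rather than converging exactly; a decaying step size would remove that floor at the cost of slower progress.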

Step by Step Solution

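(a) Steepest descent. One possible sketch of Algorithm 9.4.1 on a synthetic stand-in data set (the data, line-search constants, and iteration budget are assumptions; this is not the book's code). Each step moves along the negative gradient, with a backtracking (Armijo) line search choosing the step length:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the polynomial regression data (an assumption).
n = 100
u = rng.uniform(0.0, 1.0, n)
y = 10 - 140 * u + 400 * u**2 - 250 * u**3 + rng.normal(0.0, 5.0, n)
X = np.vander(u, 4, increasing=True)

def loss(beta):
    r = X @ beta - y
    return r @ r / (2 * n)

def grad(beta):
    return X.T @ (X @ beta - y) / n

beta = np.zeros(4)
for _ in range(20000):
    g = grad(beta)
    # Backtracking (Armijo) line search along the steepest-descent
    # direction -g; the constants 1e-4 and 0.5 are conventional defaults.
    t, f0 = 1.0, loss(beta)
    while loss(beta - t * g) > f0 - 1e-4 * t * (g @ g) and t > 1e-12:
        t *= 0.5
    beta -= t * g

print(round(float(loss(beta)), 2))
```

On this ill-conditioned polynomial design, steepest descent typically makes rapid initial progress and then crawls: successive gradients nearly alternate in direction, so many thousands of iterations are needed to polish the solution.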

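(b) Levenberg-Marquardt. A sketch in the spirit of Algorithm B.3.3 on the same synthetic stand-in data (the damping schedule and data are assumptions). Because polynomial regression is linear in its parameters, the Jacobian of the residual vector is simply the constant design matrix, which here stands in for the output of Algorithm 9.4.2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the polynomial regression data (an assumption).
n = 100
u = rng.uniform(0.0, 1.0, n)
y = 10 - 140 * u + 400 * u**2 - 250 * u**3 + rng.normal(0.0, 5.0, n)
X = np.vander(u, 4, increasing=True)

def residuals(beta):
    return X @ beta - y

def jacobian(beta):
    # The model is linear in beta, so the Jacobian of the residual
    # vector is the constant design matrix.
    return X

beta, mu = np.zeros(4), 1.0
for _ in range(50):
    r, J = residuals(beta), jacobian(beta)
    step = np.linalg.solve(J.T @ J + mu * np.eye(4), J.T @ r)
    cand = beta - step
    if residuals(cand) @ residuals(cand) < r @ r:
        beta, mu = cand, mu / 10     # accept; trust the Gauss-Newton model
    else:
        mu *= 10                     # reject; damp toward a gradient step

print(np.round(beta, 2))
```

For a model that is linear in its parameters, the damped step degenerates to (regularized) ordinary least squares, so Levenberg-Marquardt converges in a handful of iterations; its cost per iteration grows with the parameter dimension because of the linear solve.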
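(c) Limited-memory BFGS. A sketch of L-BFGS via the standard two-loop recursion, in the spirit of Algorithm 9.4.4 (the data, memory size m, and line-search constants are assumptions; a library routine such as scipy.optimize.minimize with method="L-BFGS-B" would serve equally well):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the polynomial regression data (an assumption).
n = 100
u = rng.uniform(0.0, 1.0, n)
y = 10 - 140 * u + 400 * u**2 - 250 * u**3 + rng.normal(0.0, 5.0, n)
X = np.vander(u, 4, increasing=True)

def loss(beta):
    r = X @ beta - y
    return r @ r / (2 * n)

def grad(beta):
    return X.T @ (X @ beta - y) / n

m = 10                                   # memory size, tuned by hand
s_hist, y_hist = [], []
beta, g = np.zeros(4), grad(np.zeros(4))
for _ in range(200):
    # Two-loop recursion: apply the implicit inverse Hessian to g.
    q, alphas = g.copy(), []
    for s, yv in zip(reversed(s_hist), reversed(y_hist)):
        a = (s @ q) / (yv @ s)
        q -= a * yv
        alphas.append(a)
    if y_hist:                           # initial scaling H0 = gamma * I
        q *= (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1])
    for (s, yv), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        q += (a - (yv @ q) / (yv @ s)) * s
    d = -q
    # Backtracking (Armijo) line search.
    t, f0 = 1.0, loss(beta)
    while loss(beta + t * d) > f0 + 1e-4 * t * (g @ d) and t > 1e-12:
        t *= 0.5
    beta_new = beta + t * d
    g_new = grad(beta_new)
    s, yv = beta_new - beta, g_new - g
    if s @ yv > 1e-12:                   # curvature condition keeps H PD
        s_hist.append(s); y_hist.append(yv)
        if len(s_hist) > m:
            s_hist.pop(0); y_hist.pop(0)
    beta, g = beta_new, g_new

print(round(float(loss(beta)), 2))
```

By approximating curvature from recent gradient differences, L-BFGS corrects the zig-zagging of steepest descent at only O(m) extra memory; on this smooth problem it typically needs far fewer iterations than steepest descent.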
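(d) Adam. A sketch following the standard Adam update (exponentially weighted first and second gradient moments with bias correction), in the spirit of Algorithm 9.4.5; the step size and decay rates below were picked by trial and error on the synthetic stand-in data and are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the polynomial regression data (an assumption).
n = 100
u = rng.uniform(0.0, 1.0, n)
y = 10 - 140 * u + 400 * u**2 - 250 * u**3 + rng.normal(0.0, 5.0, n)
X = np.vander(u, 4, increasing=True)

def loss(beta):
    r = X @ beta - y
    return r @ r / (2 * n)

beta = np.zeros(4)
m1, m2 = np.zeros(4), np.zeros(4)
lr, b1, b2, eps = 0.3, 0.9, 0.999, 1e-8     # tuned by trial and error
for t in range(1, 20001):
    g = X.T @ (X @ beta - y) / n            # full-batch gradient
    m1 = b1 * m1 + (1 - b1) * g             # first-moment (mean) estimate
    m2 = b2 * m2 + (1 - b2) * g**2          # second-moment estimate
    mhat = m1 / (1 - b1**t)                 # bias corrections for the
    vhat = m2 / (1 - b2**t)                 # zero initialization
    beta -= lr * mhat / (np.sqrt(vhat) + eps)

print(round(float(loss(beta)), 2))
```

In runs of these sketches one typically observes the trade-offs the exercise asks about: Levenberg-Marquardt and L-BFGS exploit curvature and converge in few iterations on this smooth low-dimensional problem, while steepest descent and Adam are cheap per step and make fast initial progress but need many more iterations, with Adam's per-coordinate scaling making it less sensitive to the tuned step size than plain steepest descent.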
