Question: In this experiment we study the linear adaptive prediction of a signal x[n] governed by the following recursion: x[n] = 0.8x[n−1] − 0.1x[n−2] + 0.1v[n], where v[n] is drawn from a discrete-time white-noise process of zero mean and unit variance. (A process generated in this manner is referred to as an autoregressive process of order two.) Specifically, the adaptive prediction is performed using the normalized LMS algorithm defined by the equations shown below, where p is the prediction order and μ̃ is the normalized step-size parameter. The important point to note here is that μ̃ is dimensionless, and stability of the algorithm is assured by choosing it in accordance with the formula 0 < μ̃ < 2. The initial conditions are w_k[0] = 0 for all k. The learning curve of the algorithm is defined as a plot of the mean-square error versus the number of iterations n for specified parameter values; it is obtained by averaging the plot of e²[n] versus n over a large number of different realizations of the algorithm.
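A minimal sketch of how one realization of x[n] might be generated is given below. The Gaussian draw for v[n] and the zero initial conditions of the recursion are assumptions; the problem only specifies a zero-mean, unit-variance white-noise process.

```python
import numpy as np

def generate_ar2(n_samples, rng):
    """One realization of x[n] = 0.8*x[n-1] - 0.1*x[n-2] + 0.1*v[n]."""
    v = rng.standard_normal(n_samples)      # white noise; Gaussian draw is an assumption
    x = np.zeros(n_samples)
    for n in range(n_samples):
        x_1 = x[n - 1] if n >= 1 else 0.0   # zero initial conditions assumed
        x_2 = x[n - 2] if n >= 2 else 0.0
        x[n] = 0.8 * x_1 - 0.1 * x_2 + 0.1 * v[n]
    return x
```

For example, generate_ar2(5000, np.random.default_rng(0)) would produce one 5000-sample realization of the process.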
(a) Plot the learning curves for the adaptive prediction of x[n] for a fixed prediction order p = 5 and three different values of the normalized step-size parameter: μ̃ = 0.0075, 0.05, and 0.5.
(b) What observations can you make from the learning curves of part (a)?
$$\hat{x}[n] = \sum_{k=1}^{p} w_k[n]\,x[n-k]$$

$$e[n] = x[n] - \hat{x}[n]$$

$$w_k[n+1] = w_k[n] + \frac{\tilde{\mu}}{\sum_{i=1}^{p} x^2[n-i]}\,x[n-k]\,e[n], \qquad k = 1, 2, \ldots, p$$
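The sketch below implements the normalized LMS recursion above and averages the squared prediction error to form the learning curves asked for in part (a). It reuses generate_ar2 from the earlier sketch; the small regularization constant eps in the denominator and the ensemble size of 100 realizations are assumptions not fixed by the problem statement.

```python
import numpy as np
import matplotlib.pyplot as plt

def nlms_predict(x, p, mu_tilde, eps=1e-8):
    """One-step NLMS prediction of x[n]; returns the squared error e^2[n]."""
    n_samples = len(x)
    w = np.zeros(p)                        # initial conditions w_k[0] = 0 for all k
    e2 = np.zeros(n_samples)               # errors before n = p are left at zero
    for n in range(p, n_samples):          # start once p past samples are available
        u = x[n - p:n][::-1]               # tap vector [x[n-1], ..., x[n-p]]
        x_hat = w @ u                      # predicted sample
        e = x[n] - x_hat                   # prediction error e[n]
        w = w + (mu_tilde / (eps + u @ u)) * u * e   # normalized LMS update
        e2[n] = e ** 2
    return e2

# Part (a): averaged learning curves for p = 5 and three step sizes
rng = np.random.default_rng(0)
n_samples, n_runs, p = 5000, 100, 5        # ensemble of 100 realizations is assumed
for mu in (0.0075, 0.05, 0.5):
    mse = np.zeros(n_samples)
    for _ in range(n_runs):
        x = generate_ar2(n_samples, rng)   # AR(2) generator from the sketch above
        mse += nlms_predict(x, p, mu)
    plt.semilogy(mse / n_runs, label=f"mu_tilde = {mu}")
plt.xlabel("iteration n")
plt.ylabel("mean-square error")
plt.legend()
plt.show()
```

Plotting the mean-square error on a logarithmic scale makes the trade-off visible: the smaller step sizes converge more slowly but settle to a lower steady-state error, while the largest step size converges quickly at the cost of a higher residual error.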
