
CHAPTER 9
Serial Correlation

9.1 Time Series

Virtually every equation in the text so far has been cross-sectional in nature, but that's going to change dramatically in this chapter. As a result, it's probably worthwhile to talk about some of the characteristics of time-series equations. Time-series data involve a single entity (like a person, corporation, or state) over multiple points in time. Such a time-series approach allows researchers to investigate analytical issues that can't be examined very easily with a cross-sectional regression. For example, macroeconomic models and supply-and-demand models are best studied using time-series, not cross-sectional, data.

The notation for a time-series study is different from that for a cross-sectional one. Our familiar cross-sectional notation (for one time period and N different entities) is:

    Yi = β0 + β1X1i + β2X2i + β3X3i + ei

where i goes from 1 to N.

A time-series regression has one entity and T different time periods, however, so we'll switch to this notation:

    Yt = β0 + β1X1t + β2X2t + β3X3t + et

where t goes from 1 to T. Thus:

    Y1 = β0 + β1X11 + β2X21 + β3X31 + e1   refers to observations from the first time period
    Y2 = β0 + β1X12 + β2X22 + β3X32 + e2   refers to observations from the second time period
    ⋮
    YT = β0 + β1X1T + β2X2T + β3X3T + eT   refers to observations from the Tth time period

What's so tough about that, you say? All we've done is change from i to t and change from N to T. Well, it turns out that time-series studies have some characteristics that make them more difficult to deal with than cross-sections:

1. The order of observations in a time series is fixed. With a cross-sectional data set, you can enter the observations in any order you want, but with time-series data, you must keep the observations in chronological order.

2. Time-series samples tend to be much smaller than cross-sectional ones. Most time-series populations have many fewer potential observations than do cross-sectional ones, and these smaller data sets make statistical inference more difficult. In addition, it's much harder to generate a time-series observation than a cross-sectional one. After all, it takes a year to get one more observation in an annual time series!

3. The theory underlying time-series analysis can be quite complex. In part because of the problems mentioned above, time-series econometrics includes a number of complex topics that require advanced estimation techniques. We'll tackle these topics in Chapters 12, 14, and 15.

4. The stochastic error term in a time-series equation is often affected by events that took place in a previous time period. This is serial correlation, the topic of our chapter, so let's get started!
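To make the time-series notation concrete, here is a minimal Python sketch (not from the text) of estimating an equation of the form Yt = β0 + β1X1t + β2X2t + et with OLS. The data, variable names, and coefficient values are invented for illustration; the only substantive point is that the observations are indexed by time and kept in chronological order.

    # A minimal sketch, using invented annual data, of estimating
    # Y_t = beta0 + beta1*X1_t + beta2*X2_t + e_t with OLS.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    T = 30                                    # T annual observations
    years = np.arange(1990, 1990 + T)         # kept in chronological order

    X = pd.DataFrame({"X1": rng.normal(size=T).cumsum(),
                      "X2": rng.normal(size=T).cumsum()}, index=years)
    y = 1.0 + 0.5 * X["X1"] - 0.3 * X["X2"] + rng.normal(scale=0.5, size=T)

    results = sm.OLS(y, sm.add_constant(X)).fit()
    print(results.params)                     # estimated beta0, beta1, beta2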
9.2 Pure versus Impure Serial Correlation

Pure Serial Correlation

Pure serial correlation occurs when Classical Assumption IV, which assumes uncorrelated observations of the error term, is violated in a correctly specified equation. If there is correlation between observations of the error term, then the error term is said to be serially correlated. When econometricians use the term serial correlation without any modifier, they are referring to pure serial correlation.

The most commonly assumed kind of serial correlation is first-order serial correlation, in which the current value of the error term is a function of the previous value of the error term:

    et = ρet-1 + ut    (9.1)

where:  e = the error term of the equation in question
        ρ = the first-order autocorrelation coefficient
        u = a classical (not serially correlated) error term

The functional form in Equation 9.1 is called a first-order Markov scheme. The new symbol, ρ (rho, pronounced "row"), called the first-order autocorrelation coefficient, measures the functional relationship between the value of an observation of the error term and the value of the previous observation of the error term.

The magnitude of ρ indicates the strength of the serial correlation in an equation. If ρ is zero, then there is no serial correlation (because e would equal u, a classical error term). As ρ approaches 1 in absolute value, the value of the previous observation of the error term becomes more important in determining the current value of et, and a high degree of serial correlation exists. For ρ to be greater than 1 in absolute value is unreasonable because it implies that the error term has a tendency to continually increase in absolute value over time ("explode"). As a result of this, we can state that:

    -1 < ρ < +1    (9.2)

The sign of ρ indicates the nature of the serial correlation in an equation. A positive value for ρ implies that the error term tends to have the same sign from one time period to the next; this is called positive serial correlation. Such a tendency means that if et happens by chance to take on a large value in one time period, subsequent observations would tend to retain a portion of this original large value and would have the same sign as the original. For example, in time-series models, the effects of a large external shock to an economy (like an earthquake) in one period may linger for several time periods. The error term will tend to be positive for a number of observations, then negative for several more, and then back positive again.

Figure 9.1 shows two different examples of positive serial correlation. The error term observations plotted in Figure 9.1 are arranged in chronological order, with the first observation being the first period for which data are available, the second being the second, and so on. To see the difference between error terms with and without positive serial correlation, compare the patterns in Figure 9.1 with the depiction of no serial correlation (ρ = 0) in Figure 9.2.

A negative value of ρ implies that the error term has a tendency to switch signs from negative to positive and back again in consecutive observations; this is called negative serial correlation. It implies that there is some sort of cycle (like a pendulum) behind the drawing of stochastic disturbances. Figure 9.3 shows two different examples of negative serial correlation. For instance, negative serial correlation might exist in the error term of an equation that is in first differences because changes in a variable often follow a cyclical pattern. In most time-series applications, however, negative pure serial correlation is much less likely than positive pure serial correlation. As a result, most econometricians analyzing pure serial correlation concern themselves primarily with positive serial correlation.
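The following Python sketch (not from the text) simulates the first-order process in Equation 9.1 for a positive, a zero, and a negative value of ρ and reports the sample correlation between et and et-1. The sample size and the particular values of ρ are arbitrary choices made purely for illustration.

    # A minimal sketch of the first-order Markov scheme e_t = rho*e_{t-1} + u_t,
    # where u_t is a classical (not serially correlated) error term.
    import numpy as np

    def simulate_ar1_errors(rho, T=200, seed=0):
        """Generate T observations of e_t = rho*e_{t-1} + u_t."""
        rng = np.random.default_rng(seed)
        u = rng.normal(size=T)                # classical error term
        e = np.zeros(T)
        for t in range(1, T):
            e[t] = rho * e[t - 1] + u[t]
        return e

    for rho in (0.8, 0.0, -0.8):              # positive, none, negative serial correlation
        e = simulate_ar1_errors(rho)
        lag1_corr = np.corrcoef(e[1:], e[:-1])[0, 1]
        print(f"rho = {rho:+.1f}  sample corr(e_t, e_t-1) = {lag1_corr:+.2f}")

With ρ = +0.8 consecutive errors tend to share the same sign, with ρ = -0.8 they tend to alternate, and with ρ = 0 the errors behave like the classical error term of Figure 9.2.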
Serial correlation can take on many forms other than first-order serial correlation. For example, in a quarterly model, the current quarter's error term observation may be functionally related to the observation of the error term from the same quarter in the previous year. This is called seasonally based serial correlation:

    et = ρet-4 + ut

Figure 9.1  Positive Serial Correlation. With positive first-order serial correlation, the current observation of the error term tends to have the same sign as the previous observation of the error term. An example of positive serial correlation would be external shocks to an economy that take more than one time period to completely work through the system.

Similarly, it is possible that the error term in an equation might be a function of more than one previous observation of the error term:

    et = ρ1et-1 + ρ2et-2 + ut

Such a formulation is called second-order serial correlation.

Figure 9.2  No Serial Correlation. With no serial correlation, different observations of the error term are completely uncorrelated with each other. Such error terms would conform to Classical Assumption IV.

Impure Serial Correlation

By impure serial correlation we mean serial correlation that is caused by a specification error such as an omitted variable or an incorrect functional form. While pure serial correlation is caused by the underlying distribution of the error term of the true specification of an equation (which cannot be changed by the researcher), impure serial correlation is caused by a specification error that often can be corrected.

How is it possible for a specification error to cause serial correlation? Recall that the error term can be thought of as the effect of omitted variables, nonlinearities, measurement errors, and pure stochastic disturbances on the dependent variable. This means, for example, that if we omit a relevant variable or use the wrong functional form, then the portion of that omitted effect that cannot be represented by the included explanatory variables must be absorbed by the error term. The error term for an incorrectly specified equation thus includes a portion of the effect of any omitted variables and/or a portion of the effect of the difference between the proper functional form and the one chosen by the researcher. This new error term might be serially correlated even if the true one is not. If this is the case, the serial correlation has been caused by the researcher's choice of a specification and not by the pure error term associated with the correct specification.

As you'll see in Section 9.5, the proper remedy for serial correlation depends on whether the serial correlation is likely to be pure or impure. Not surprisingly, the best remedy for impure serial correlation is to attempt to find the omitted variable (or at least a good proxy) or the correct functional form for the equation. Both the bias and the impure serial correlation will disappear if the specification error is corrected. As a result, most econometricians try to make sure they have the best specification possible before they spend too much time worrying about pure serial correlation.

Figure 9.3  Negative Serial Correlation. With negative first-order serial correlation, the current observation of the error term tends to have the opposite sign from the previous observation of the error term. In most time-series applications, negative serial correlation is much less likely than positive serial correlation.

To see how an omitted variable can cause the error term to be serially correlated, suppose that the true equation is:

    Yt = β0 + β1X1t + β2X2t + et    (9.3)

where et is a classical error term. As shown in Section 6.1, if X2 is accidentally omitted from the equation (or if data for X2 are unavailable), then:

    Yt = β0 + β1X1t + e*t    where e*t = β2X2t + et    (9.4)

Thus, the error term in the omitted variable case is not the classical error term e. Instead, it's also a function of one of the independent variables, X2. As a result, the new error term, e*, can be serially correlated even if the true error term e is not. In particular, the new error term e* will tend to exhibit detectable serial correlation when:

1. X2 itself is serially correlated (this is quite likely in a time series) and
2. the size of e is small¹ compared to the size of β2X2.

These tendencies hold even if there are a number of included and/or omitted variables. Therefore:

    e*t = ρe*t-1 + ut    (9.5)

¹ If typical values of e are significantly larger in absolute size than β2X2, then even a serially correlated omitted variable (X2) will not change e* very much. In addition, recall that the omitted variable, X2, will cause bias in the estimate of β1, depending on the correlation between the two Xs. If β̂1 is biased because of the omission of X2, then a portion of the β2X2 effect must have been absorbed by β̂1 and will not end up in the residuals. As a result, tests for serial correlation based on those residuals may give incorrect readings. Such residuals may leave misleading clues as to possible specification errors.
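Here is a minimal Python sketch (not from the text) of the omitted-variable mechanism behind Equations 9.3 through 9.5: X2 follows a serially correlated path, and when it is left out of the estimated equation, the residuals of the misspecified regression inherit its serial correlation. All coefficient values and sample sizes are invented for illustration.

    # A minimal sketch of impure serial correlation from an omitted variable:
    # the true model is Y_t = b0 + b1*X1_t + b2*X2_t + e_t with a classical e_t,
    # but X2 (itself serially correlated) is left out of the estimated equation.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    T = 200
    X1 = rng.normal(size=T)

    X2 = np.zeros(T)                           # X2 is highly serially correlated,
    for t in range(1, T):                      # as is common in time series
        X2[t] = 0.9 * X2[t - 1] + rng.normal()

    e = rng.normal(scale=0.5, size=T)          # classical (uncorrelated) error term
    Y = 2.0 + 1.0 * X1 + 3.0 * X2 + e

    resid = sm.OLS(Y, sm.add_constant(X1)).fit().resid   # X2 omitted
    print(np.corrcoef(resid[1:], resid[:-1])[0, 1])      # residuals now serially correlated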
Another common kind of impure serial correlation is caused by an incorrect functional form. Here, the choice of the wrong functional form can cause the error term to be serially correlated. Let's suppose that the true equation is polynomial in nature:

    Yt = β0 + β1X1t + β2X1t² + et    (9.6)

but that instead a linear regression is run:

    Yt = β0 + β1X1t + e*t    (9.7)

The new error term e* is now a function of the true error term e and of the differences between the linear and the polynomial functional forms. As can be seen in Figure 9.4, these differences often follow fairly autoregressive patterns. That is, positive differences tend to be followed by positive differences, and negative differences tend to be followed by negative differences. As a result, using a linear functional form when a nonlinear one is appropriate will usually result in positive impure serial correlation.

Figure 9.4  Incorrect Functional Form as a Source of Impure Serial Correlation. The use of an incorrect functional form tends to group positive and negative residuals together, causing positive impure serial correlation.
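The same diagnosis works for the functional-form case. The Python sketch below (not from the text) generates data from a quadratic relationship, fits a straight line instead, and shows that the residuals of the misspecified equation display strong positive first-order correlation; the data and coefficient values are invented for illustration.

    # A minimal sketch of impure serial correlation from an incorrect functional form:
    # the true model is quadratic in X1, but a linear regression is run instead.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    T = 100
    X1 = np.linspace(0, 10, T)                  # a smoothly trending regressor over time
    e = rng.normal(scale=1.0, size=T)           # classical error term
    Y = 5.0 + 2.0 * X1 + 0.8 * X1**2 + e        # true equation is polynomial (Eq. 9.6)

    resid = sm.OLS(Y, sm.add_constant(X1)).fit().resid   # linear form (Eq. 9.7) is estimated
    print(np.corrcoef(resid[1:], resid[:-1])[0, 1])      # strongly positive: impure serial correlation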
9.3 The Consequences of Serial Correlation

The consequences of serial correlation are quite different in nature from the consequences of the problems discussed so far in this text. Omitted variables, irrelevant variables, and multicollinearity all have fairly recognizable external symptoms. Each problem changes the estimated coefficients and standard errors in a particular way, and an examination of these changes (and the underlying theory) often provides enough information for the problem to be detected. As we shall see, serial correlation is more likely to have internal symptoms; it affects the estimated equation in a way that is not easily observable from an examination of just the results themselves.

The existence of serial correlation in the error term of an equation violates Classical Assumption IV, and the estimation of the equation with OLS has at least three consequences:

1. Pure serial correlation does not cause bias in the coefficient estimates.
2. Serial correlation causes OLS to no longer be the minimum variance estimator (of all the linear unbiased estimators).
3. Serial correlation causes the OLS estimates of the SE(β̂)s to be biased, leading to unreliable hypothesis testing.

1. Pure serial correlation does not cause bias in the coefficient estimates. If the error term is serially correlated, one of the assumptions of the Gauss-Markov Theorem is violated, but this violation does not cause the coefficient estimates to be biased. If the serial correlation is impure, however, bias may be introduced by the use of an incorrect specification.

This lack of bias does not necessarily mean that the OLS estimates of the coefficients of a serially correlated equation will be close to the true coefficient values. A single estimate observed in practice can come from a wide range of possible values. In addition, the standard errors of these estimates will typically be increased by the serial correlation. This increase will raise the probability that a β̂ will differ significantly from the true value. What "unbiased" means in this case is that the distribution of the β̂s is still centered around the true β.

2. Serial correlation causes OLS to no longer be the minimum variance estimator (of all the linear unbiased estimators). Although the violation of Classical Assumption IV causes no bias, it does affect the other main conclusion of the Gauss-Markov Theorem, that of minimum variance. In particular,