Abstract
When a straight line is fitted to time series data, generalized least squares (GLS) estimators of the trend slope and intercept are attractive as they are unbiased and of minimum variance. However, computing GLS estimators is laborious, as their form depends on the autocovariances of the regression errors. On the other hand, ordinary least squares (OLS) estimators are easy to compute and do not involve the error autocovariance structure. It has been known for 50 years that OLS and GLS estimators have the same asymptotic variance when the errors are second-order stationary. Hence, little precision is gained by using GLS estimators in stationary error settings. This article revisits this classical issue, deriving explicit expressions for the GLS estimators and their variances when the regression errors are drawn from an autoregressive process. These expressions are used to show that OLS methods are even more efficient than previously thought. Specifically, we show that the difference between the OLS and GLS estimator variances converges to zero at a rate one polynomial degree faster than the estimator variances themselves. We also refine Grenander's (1954) variance ratio. An example is presented where our new rates cannot be improved upon. Simulations show that the results change little when the autoregressive parameters are estimated.
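The OLS/GLS comparison the abstract describes is straightforward to reproduce numerically. The sketch below is not from the paper: it assumes AR(1) regression errors with a known autoregressive parameter `phi` and computes the GLS fit via the standard Prais-Winsten decorrelation, then estimates both slope variances by Monte Carlo; all function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_errors(n, phi, sigma=1.0):
    """Simulate a stationary AR(1) error series e_t = phi*e_{t-1} + w_t."""
    e = np.empty(n)
    e[0] = rng.normal(scale=sigma / np.sqrt(1 - phi**2))  # stationary start
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal(scale=sigma)
    return e

def ols_slope(y, t):
    """OLS slope of a straight-line fit; ignores error autocovariances."""
    X = np.column_stack([np.ones_like(t), t])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def gls_slope(y, t, phi):
    """GLS slope under AR(1) errors with known phi, via the
    Prais-Winsten transformation: decorrelate, then run OLS."""
    X = np.column_stack([np.ones_like(t), t])
    c = np.sqrt(1 - phi**2)
    ys = np.concatenate([[c * y[0]], y[1:] - phi * y[:-1]])
    Xs = np.vstack([c * X[0], X[1:] - phi * X[:-1]])
    return np.linalg.lstsq(Xs, ys, rcond=None)[0][1]

# Monte Carlo comparison of the two slope estimators.
n, phi, reps = 200, 0.6, 5000
t = np.arange(1, n + 1, dtype=float)
ols, gls = np.empty(reps), np.empty(reps)
for r in range(reps):
    y = 1.0 + 0.05 * t + ar1_errors(n, phi)
    ols[r] = ols_slope(y, t)
    gls[r] = gls_slope(y, t, phi)

print("Var(OLS slope):", ols.var())
print("Var(GLS slope):", gls.var())
print("difference    :", ols.var() - gls.var())
```

Under these assumptions the two sample variances come out nearly identical, with their difference much smaller than either variance, consistent with the abstract's claim that the variance difference vanishes one polynomial degree faster than the variances themselves.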
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Pages (from-to) | 312-324 |
| Number of pages | 13 |
| Journal | Journal of Time Series Analysis |
| Volume | 33 |
| Issue number | 2 |
| DOIs | |
| State | Published - Mar 2012 |
| Externally published | Yes |
Keywords
- Asymptotic variance
- Autoregression
- Convergence rate
- Efficiency
- Simple linear regression
- Time series
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Applied Mathematics