
2011 | Book

The Foundations of Modern Time Series Analysis


About this book

This book develops the analysis of time series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed book in 1970, showing how these methods laid the foundations for the modern techniques of time series analysis in use today.

Table of Contents

Frontmatter
1. Prolegomenon: A Personal Perspective and an Explanation of the Structure of the Book
Abstract
1.1 My interest in time series analysis began around 1977, soon after I had been appointed to a lectureship in econometrics in the School of Economic Studies at the University of Leeds. I had earlier been subjected to a rather haphazard training in econometrics and statistics, both as an undergraduate at Essex and as a postgraduate at Warwick, so that when I entered academia I was basically self-taught in these subjects. This was an undoubted advantage in that my enthusiasm for them remained undiminished, but it was accompanied by a major drawback: I was simply unacquainted with large areas of econometric and statistical theory. As an example of this haphazard background, in my final undergraduate year in 1973 I attended a course on the construction of continuous time economic models given by Peter Phillips, now the extremely distinguished Sterling Professor of Econometrics and Statistics at Yale but then in his first academic appointment, while my econometrics course consisted of being taught the yet-to-be-examined thesis of a temporary lecturer who subsequently left academia, never to return, at the end of that academic year!
Terence C. Mills
2. Yule and Hooker and the Concepts of Correlation and Trend
Abstract
2.1 The foundations of modern time series analysis began to be laid in the late nineteenth century and were made possible by the invention of regression and the related concept of the correlation coefficient. By the final years of the century the method of correlation had made its impact felt primarily in biology, through the work of Francis Galton on heredity (Galton, 1888, 1890) and of Karl Pearson on evolution (Pearson, 1896; Pearson and Filon, 1898).1 Correlation had also been used by Edgeworth (1893, 1894) to investigate social phenomena and by G. Udny Yule in the field of economic statistics, particularly to examine the relationship between welfare and poverty (Yule, 1895, 1896).2 This led Yule (1897a, 1897b) to provide a full development of the theory of correlation which, unusually from a modern perspective but, as we shall see, importantly for time series analysis, was based on the related idea of a regression between two variables X and Y.3 It also did not rely on the assumption that the two variables were jointly normally distributed, an assumption central to the formal development of correlation in Edgeworth (1892) and Pearson (1896). This was an important generalization, for Yule was quick to appreciate that much of the data appearing in the biological and social sciences were anything but normally distributed, typically being highly skewed.
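The link between correlation and regression that Yule exploited can be illustrated with a minimal Python sketch (an illustration, not material from the book): the correlation coefficient is the geometric mean of the two regression slopes, and no normality assumption is needed; the data here are deliberately skewed.

```python
import numpy as np

# Illustrative data: y linearly related to x; neither variable is normal
rng = np.random.default_rng(42)
x = rng.exponential(size=500)               # deliberately skewed, echoing Yule's point
y = 0.6 * x + rng.exponential(size=500)

# Least-squares regression slopes in each direction
b_yx = np.cov(y, x, ddof=1)[0, 1] / np.var(x, ddof=1)   # slope of y on x
b_xy = np.cov(x, y, ddof=1)[0, 1] / np.var(y, ddof=1)   # slope of x on y

# The correlation coefficient is the (signed) geometric mean of the two slopes
r = np.corrcoef(x, y)[0, 1]
print(r, np.sign(b_yx) * np.sqrt(b_yx * b_xy))          # the two numbers agree
```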
Terence C. Mills
3. Schuster, Beveridge and Periodogram Analysis
Abstract
3.1 Around the time that Hooker and Yule were developing correlation and detrending techniques for economic time series, the physicist Sir Arthur Schuster was investigating periodicities in series such as earthquake frequency and sunspot numbers using a technique that came to be known as periodogram analysis (see Schuster, 1897, 1898, 1906).1 Periodogram analysis is based on the technique of harmonic analysis and the use of Fourier series, which we outline using the classic approach taken in Whittaker and Robinson (1924) and Davis (1941).2
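As a rough illustration of the technique (not code from the book), the following Python sketch computes a Schuster-style periodogram by measuring the squared amplitude of the harmonic component at each trial period; the function name, series length and noise level are invented for the example.

```python
import numpy as np

def periodogram(x, periods):
    """Schuster-style periodogram: squared amplitude of the harmonic
    component fitted at each trial period."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    t = np.arange(n)
    intensities = []
    for p in periods:
        omega = 2 * np.pi / p
        a = (2.0 / n) * np.sum(x * np.cos(omega * t))
        b = (2.0 / n) * np.sum(x * np.sin(omega * t))
        intensities.append(a**2 + b**2)     # a peak flags periodicity near p
    return np.array(intensities)

# A noisy sine of period 11 should produce a clear peak near p = 11
rng = np.random.default_rng(1)
t = np.arange(300)
x = np.sin(2 * np.pi * t / 11) + 0.5 * rng.normal(size=300)
periods = np.arange(2, 40)
print(periods[np.argmax(periodogram(x, periods))])   # prints a value near 11
```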
Terence C. Mills
4. Detrending and the Variate Differencing Method: Student, Pearson and Their Critics
Abstract
4.1 The differencing approach to detrending time series proposed by Hooker (1905) and Cave-Brown-Cave (1905) (§2.9) was reconsidered some years later by ‘Student’ (1914) in rather more formal fashion.1 Student began by assuming that \(y_t\) and \(x_t\) were randomly distributed in time and space, by which he meant that, in modern terminology, \(E(y_t y_{t-i})\), \(E(x_t x_{t-i})\) and \(E(y_t x_{t-i})\), \(i \neq 0\), were all zero if it was assumed that both variables had zero mean. If the correlation between \(y_t\) and \(x_t\) was denoted \(r_{yx} = E(y_t x_t)/\sigma_y \sigma_x\), where \(\sigma_y^2 = E(y_t^2)\) and \(\sigma_x^2 = E(x_t^2)\), Student showed that the correlation between the \(d\)th differences of \(x\) and \(y\) took this same value.
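Student's invariance result is easy to verify by simulation. The numpy sketch below (an illustration under the stated assumptions, not from the book) generates two series that are correlated at lag zero but serially uncorrelated, and confirms that differencing any number of times leaves the correlation essentially unchanged.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000                                  # large n so sample correlations settle
# x_t and y_t 'random in time': correlated at lag 0, uncorrelated otherwise
common = rng.normal(size=n)
x = common + rng.normal(size=n)
y = common + rng.normal(size=n)

r = np.corrcoef(x, y)[0, 1]
for d in range(1, 4):
    dx, dy = np.diff(x, n=d), np.diff(y, n=d)
    print(d, np.corrcoef(dx, dy)[0, 1])      # each ≈ r: Student's invariance result
print("levels:", r)
```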
Terence C. Mills
5. Nonsense Correlations, Random Shocks and Induced Cycles: Yule, Slutzky and Working
Abstract
5.1 By the mid-1920s the methodological advances discussed in the previous two chapters, namely periodogram analysis and the variate differencing method, appeared to be running out of steam, with few new applications appearing and increasing concern about the underlying assumptions of the techniques. At this point, three papers appeared in quick succession which transformed the subject and laid the foundations for modern approaches to the analysis of time series. Two of the papers, by Yule (1926) and Slutzky (1927), went a long way to establishing the basis for the theoretical analysis of stationary time series and, because of the enduring importance of their contributions, are consequently subjected to detailed scrutiny in this chapter, along with a subsequent and closely related paper by Working (1934). The third paper, also by Yule (Yule, 1927), attacked periodic time series in a new way and, in turn, provided the foundations for analysing oscillatory time series: this is our focus in Chapter 6.
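Slutzky's central finding, that repeated moving summation of pure noise generates smooth, cycle-like oscillations, can be reproduced in a few lines. The sketch below is purely illustrative; the window length and number of iterations are arbitrary choices.

```python
import numpy as np

# Slutzky's 'summation of random causes': repeatedly taking moving sums of
# pure noise produces smooth, cycle-like oscillations
rng = np.random.default_rng(3)
e = rng.normal(size=1000)

def moving_sum(x, k):
    return np.convolve(x, np.ones(k), mode="valid")

s = e
for _ in range(5):                           # iterate a 10-term moving sum
    s = moving_sum(s, 10)

# The filtered series is far smoother than the noise that generated it:
# adjacent values are now strongly correlated, mimicking a business cycle
print(np.corrcoef(s[:-1], s[1:])[0, 1])      # close to 1
```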
Terence C. Mills
6. Periodicities in Sunspots and Air Pressure: Yule, Walker and the Modelling of Superposed Fluctuations and Disturbances
Abstract
6.1 At the same time as he was analysing the nonsense correlation problem, Yule was also turning his attention back to harmonic motion and, in particular, to how harmonic motion responds to external shocks. This attention led to yet another seminal paper in the foundations of time series analysis: Yule (1927). Yule’s starting point was to take a simple harmonic function of time and to superpose upon it a sequence of random errors. If these errors were small, ‘the only effect is to make the graph somewhat irregular, leaving the suggestion of periodicity still quite clear to the eye’ (ibid., page 267), and an example of this situation is shown in Figure 6.1(a). If the errors were increased in size, as in Figure 6.1(b), ‘the graph becomes more irregular, the suggestion of periodicity more obscure, and we have only sufficiently to increase the “errors” to mask completely any appearance of periodicity’ (ibid., page 267). Nevertheless, no matter how large the errors, periodogram analysis would still be applicable and should, given a sufficient number of observations, continue to provide a close approximation to both the period and the amplitude of the underlying harmonic function. Yule referred to this setup as one of superposed fluctuations — ‘fluctuations which do not in any way disturb the steady course of the underlying periodic function’ (ibid., page 268).
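The superposed-fluctuations setup is straightforward to simulate. The sketch below (with invented amplitudes and period, in the spirit of Figures 6.1(a) and 6.1(b) rather than reproducing them) adds small and then large random errors to the same harmonic function.

```python
import numpy as np

# Yule's 'superposed fluctuations': a pure harmonic plus random errors
rng = np.random.default_rng(5)
t = np.arange(300)
harmonic = np.sin(2 * np.pi * t / 25)

small = harmonic + 0.2 * rng.normal(size=t.size)   # periodicity clear to the eye
large = harmonic + 2.0 * rng.normal(size=t.size)   # periodicity masked by the errors

# Even with large errors the harmonic is recoverable in principle:
# the correlation with the true cycle stays positive, and periodogram
# analysis would still locate the period given enough observations
print(np.corrcoef(small, harmonic)[0, 1], np.corrcoef(large, harmonic)[0, 1])
```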
Terence C. Mills
7. The Formal Modelling of Stationary Time Series: Wold and the Russians
Abstract
7.1 Slutzky’s modelling of the ‘summation of random causes’, introduced in §§5.11–5.16, and the ‘ordinary regression equations’ (6.10) and (6.13) of Yule and Walker were to become the basic models of time series analysis. One of the reasons why they have been such enduring features, apart from their obvious usefulness, may be because of their renaming as moving averages and linear autoregressions, respectively, by Herman Wold (1938, page 2), as these are terms that convey their structure with great clarity and effectiveness.1
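The two model classes Wold named are easily written down. The sketch below (with illustrative coefficients, not taken from the book) generates a moving average and a linear autoregression from the same sequence of shocks.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500
e = rng.normal(size=n)

# Moving average, e.g. MA(2): x_t = e_t + 0.5 e_{t-1} + 0.3 e_{t-2}
ma = e.copy()
ma[1:] += 0.5 * e[:-1]
ma[2:] += 0.3 * e[:-2]

# Linear autoregression, e.g. AR(2): y_t = 1.1 y_{t-1} - 0.5 y_{t-2} + e_t
# (coefficients chosen to give stationary, pseudo-cyclical behaviour)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 1.1 * y[t - 1] - 0.5 * y[t - 2] + e[t]

print(np.corrcoef(y[:-1], y[1:])[0, 1])   # strong lag-1 dependence from the AR dynamics
```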
Terence C. Mills
8. Generalizations and Extensions of Stationary Autoregressive Models: From Kendall to Box and Jenkins
Abstract
8.1 After being introduced by Yule and Walker and having its theoretical foundations established by Wold, the autoregressive model was further developed in a trio of papers written during the Second World War by Maurice Kendall (1943, 1944, 1945a).1
Terence C. Mills
9. Statistical Inference, Estimation and Model Building for Stationary Time Series
Abstract
9.1 As we saw in §8.3, Kendall (1945a) expressed frustration at the lack of a sampling theory related to serial correlations when attempting to interpret the correlograms obtained from his experimental series.
The significance of the correlogram is … difficult to discuss in theoretical terms. … (O)ur real problem is to test the significance of a set of values which are, in general, correlated. It is quite possible for a part of the correlogram to be below the significance level and yet to exhibit oscillations which are themselves significant of autoregressive effects. At the present time our judgments of the reality of oscillations in the correlogram must remain on the intuitive plane. (ibid., page 103)
In his discussion of the paper from which this quote is taken, Bartlett actually took Kendall to task for not attempting any form of inference: ‘it might have been useful, and probably not too intractable mathematically, to have evaluated at least the approximate theoretical standard errors for the autocorrelations’ (ibid., page 136). This rebuke may have been a marker for a major development in the sampling theory of serial correlations that was to be published within a year of the appearance of Kendall’s paper.
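Bartlett's approximate standard errors are simple to apply in practice. The sketch below (a simplified illustration assuming a white-noise null, not the book's treatment) compares sample autocorrelations with the resulting two-standard-error band.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.sum(x * x)
    return np.array([np.sum(x[k:] * x[:-k]) / c0 for k in range(1, max_lag + 1)])

rng = np.random.default_rng(9)
x = rng.normal(size=400)                    # white noise: true autocorrelations are zero

r = sample_acf(x, 10)
se = 1 / np.sqrt(len(x))                    # Bartlett's large-sample standard error
print(np.abs(r) < 2 * se)                   # roughly 95% of lags fall inside the band
```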
Terence C. Mills
10. Dealing with Nonstationarity: Detrending, Smoothing and Differencing
Abstract
10.1 As we discussed in §§2.6–2.9, Hooker (1901b, 1905) was the first to be concerned with the problems of dealing with time series containing trends, proposing both differencing and the use of moving averages to ‘detrend’ the data prior to statistical analysis.1 Beveridge (1921, 1922) later used a variation on the moving average to eliminate a secular trend from his wheat prices before subjecting them to periodogram analysis (§§3.8–3.9). The variate differencing approach examined in detail in Chapter 4 explored the link between successive differencing and fitting polynomials in time to a series, with Persons (1917) explicitly considering the decomposition of an observed time series into various unobserved components, one of which was the secular trend (§4.11). Indeed, the identification and removal of the trend component became a preoccupation of many analysts of time series data for much of the twentieth century, even though it was conceded that the very definition of a trend posed considerable conceptual problems: as Kendall (1941, page 43) remarked,
(t)he concept of ‘trend’, like that of time itself, is one of those ideas which are generally understood but difficult to define with exactitude. A movement which has the evolutionary appearance of a trend over a period of thirty or forty years may in reality be one phase of an oscillatory movement of greater extent. A good deal depends on the length of the series under consideration whether we regard any particular tendency in the series as a trend, or a long-term movement, or an oscillation, or short-term movement. But in any case we require of a trend curve that it shall exhibit only the general direction of the time-series, and in practice this amounts to saying that it must be representable, at least locally, by a smooth non-periodic function such as a polynomial or a logistic curve.
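Both of the detrending devices Hooker proposed take only a few lines of numpy; the sketch below (with an invented trend, seasonal component and window length) removes a linear trend by a centred moving average and by first differencing.

```python
import numpy as np

rng = np.random.default_rng(13)
t = np.arange(200)
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=t.size)

# A centred moving average as a local estimate of the trend
k = 12
trend = np.convolve(series, np.ones(k) / k, mode="valid")
detrended_ma = series[k // 2 : k // 2 + trend.size] - trend

# First differencing removes the linear trend directly
detrended_diff = np.diff(series)

print(detrended_ma.mean(), detrended_diff.mean())   # ≈ 0 and ≈ 0.05 (the slope)
```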
Terence C. Mills
11. Forecasting Nonstationary Time Series
Abstract
11.1 Forecasting time series, particularly economic ones, has had a long, and often chequered, history. Attempts to find temporal patterns in economic data that might enable predictions to be made about future events stretch all the way back to a London cloth merchant, John Graunt, who in 1662 published several ingenious comparisons using bills of mortality.1 For example, in an attempt to make trade and government ‘more certain and regular’, Graunt searched for seasonal and other periodic patterns in mortality, conditioned the data on the plague, and determined the temporal pattern of ‘sickliness’ that would enable him to predict ‘by what spaces, and intervals we may hereafter expect such times again’, as quoted in Klein (1997, page 55), who provides an authoritative account of these early attempts at statistical analysis using economic data.
Terence C. Mills
12. Modelling Dynamic Relationships Between Time Series
Abstract
12.1 As the theory of testing the significance of autocorrelation coefficients was being developed (see §§9.1–9.7), so the related theory of testing the significance of the correlation between two time series was being investigated in tandem.
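As a rough illustration of the object being tested (not the chapter's formal treatment), the sketch below computes sample cross-correlations between two simulated series at several lags and flags those exceeding a crude two-over-root-n benchmark; the lag structure and noise level are invented for the example.

```python
import numpy as np

def cross_correlation(x, y, lag):
    """Sample correlation between x_t and y_{t-lag}."""
    if lag > 0:
        x, y = x[lag:], y[:-lag]
    elif lag < 0:
        x, y = x[:lag], y[-lag:]
    return np.corrcoef(x, y)[0, 1]

rng = np.random.default_rng(17)
n = 500
x = rng.normal(size=n)
y = np.roll(x, 3) + rng.normal(size=n)      # y depends on x three periods back

for lag in range(-5, 6):
    r = cross_correlation(x, y, lag)
    flag = "*" if abs(r) > 2 / np.sqrt(n) else " "   # crude significance benchmark
    print(f"lag {lag:+d}: r = {r:+.2f} {flag}")      # only lag -3 is flagged
```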
Terence C. Mills
13. Spectral Analysis of Time Series: The Periodogram Revisited and Reclaimed
Abstract
13.1 The periodograms of the sunspot numbers and Beveridge’s wheat price index calculated in Chapter 3 are rather ‘jumpy’ and show numerous peaks, particularly in the latter, which led to much disquiet from commentators and discussants when they were first published and a rather unconvincing explanation by Beveridge (§3.8). Subsequently, periodogram analysis lost much of its appeal, but it was only in the late 1940s that a convincing explanation was offered for this erratic behaviour of the periodogram.
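In modern terms, that explanation is that raw periodogram ordinates do not become less variable as the sample grows, whereas averaging neighbouring ordinates yields a stable estimate. The sketch below (a Daniell-type smoother with an arbitrary window length, offered as an illustration rather than the chapter's derivation) shows the effect for white noise, whose true spectrum is flat.

```python
import numpy as np

rng = np.random.default_rng(19)
x = rng.normal(size=1024)                   # white noise: the true spectrum is flat

# Raw periodogram via the FFT
fx = np.fft.rfft(x - x.mean())
pgram = (np.abs(fx) ** 2) / len(x)

# The raw ordinates stay erratic however long the series is; averaging
# neighbouring ordinates trades resolution for a stable estimate
m = 11
smoothed = np.convolve(pgram, np.ones(m) / m, mode="valid")

print(pgram.std(), smoothed.std())          # smoothing sharply reduces the scatter
```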
Terence C. Mills
14. Tackling Seasonal Patterns in Time Series
Abstract
14.1 Seasonal patterns in economic and meteorological time series were first investigated in the middle of the nineteenth century, with Gilbart (1854), Babbage (1856) and Jevons (1866) all uncovering seasonal fluctuations in currency data, but a definition of seasonality, albeit an informal one, had to wait a further half-century until Persons (1919, page 18): ‘(b)y seasonal movement is meant a consistent variation from one month to the next. Are the items for certain months of the year systematically or regularly different from the items for other months? If so, there is a seasonal variation.’
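Persons' informal definition translates directly into a computation: average each calendar month across years and compare with the overall mean. The sketch below does exactly this on simulated monthly data (the seasonal pattern and noise level are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(23)
years, months = 20, 12
true_pattern = np.sin(2 * np.pi * np.arange(months) / months)

# Monthly data: a fixed seasonal pattern plus noise
data = np.tile(true_pattern, years) + 0.5 * rng.normal(size=years * months)

# Are certain months systematically different? Average each calendar
# month across years and compare with the overall mean
monthly_means = data.reshape(years, months).mean(axis=0)
seasonal_indices = monthly_means - data.mean()
print(np.round(seasonal_indices, 2))        # recovers the sinusoidal pattern
```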
Terence C. Mills
15. Emerging Themes
Abstract
15.1 This chapter discusses four research themes that began to emerge during the late 1950s and 1960s but whose real importance, like many aspects of this latter decade, only became apparent from the late 1970s onwards. These themes are: (i) inference in nonstationary autoregressive models; (ii) the use of model selection criteria; (iii) the Kalman filter, state space formulations and recursive estimation of time series models; and (iv) the specification and modelling of nonlinear time series processes.
Terence C. Mills
16. The Scene is Set
Abstract
16.1 The publication of Box and Jenkins’ book in 1970 represented a watershed in the development of time series analysis, for it provided a systematic framework for identifying, estimating and checking a range of models that have had a great impact on the practical modelling of time series, particularly for forecasting. This synthesis also provided the impetus for major theoretical developments which, when allied with rapidly increasing computing power and enhanced computational algorithms, opened up many new areas for empirical analysis.
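A minimal modern rendition of that identify-estimate-check cycle might look like the sketch below, assuming the statsmodels library is available; the data, model orders and thresholds are invented for illustration and do not come from the book.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(29)
n = 400
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + e[t] + 0.4 * e[t - 1]   # an ARMA(1,1) process

# Estimate: fit the candidate model by maximum likelihood
model = ARIMA(y, order=(1, 0, 1)).fit()
print(model.params)

# Check: residuals of an adequate model should look like white noise
resid = model.resid
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(abs(r1) < 2 / np.sqrt(n))                     # lag-1 autocorrelation in band
```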
Terence C. Mills
Backmatter
Metadata
Title
The Foundations of Modern Time Series Analysis
Author
Terence C. Mills
Copyright Year
2011
Publisher
Palgrave Macmillan UK
Electronic ISBN
978-0-230-30502-1
Print ISBN
978-1-349-33135-2
DOI
https://doi.org/10.1057/9780230305021
