
  • Letter

Cluster-weighted modelling for time-series analysis

Abstract

The need to characterize and forecast time series recurs throughout the sciences, but the complexity of the real world is poorly described by the traditional techniques of linear time-series analysis. Although newer methods can provide remarkable insights into particular domains, they still make restrictive assumptions about the data, the analyst, or the application [1]. Here we show that signals that are nonlinear, non-stationary, non-gaussian, and discontinuous can be described by expanding the probabilistic dependence of the future on the past around local models of their relationship. The predictors derived from this general framework have the form of the global combinations of local functions that are used in statistics [2,3,4], machine learning [5,6,7,8,9,10] and studies of nonlinear dynamics [11,12]. Our method offers forecasts of errors in prediction and model estimation, provides a transparent architecture with meaningful parameters, and has straightforward implementations for offline and online applications. We demonstrate our approach by applying it to data obtained from a pseudo-random dynamical system, from a fluctuating laser, and from a bowed violin.
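The framework sketched in the abstract corresponds to cluster-weighted modelling: the joint density of past (input) and future (output) values is expanded as a sum of clusters, each carrying a local model of the input–output relationship, and the forecast blends the local predictions with the clusters' posterior weights in input space. The code below is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes spherical Gaussian input clusters, local linear output models, a plain EM fit, and a synthetic toy series, and every name in it (fit_cwm, predict_cwm, K, and so on) is hypothetical.

```python
# Minimal cluster-weighted modelling sketch (illustrative assumptions:
# spherical Gaussian input clusters, local linear output models, plain EM).
import numpy as np

rng = np.random.default_rng(0)

def gauss(x, mu, var):
    """Spherical Gaussian density N(x; mu, var*I); x is (N, D), mu is (D,)."""
    d = x.shape[1]
    sq = np.sum((x - mu) ** 2, axis=1)
    return np.exp(-0.5 * sq / var) / (2 * np.pi * var) ** (d / 2)

def fit_cwm(x, y, K=4, iters=100):
    """EM fit of mixture weights pi, input means mu and variances var,
    local linear models (coef, intercept) and output noise var_y per cluster."""
    n, d = x.shape
    pi = np.full(K, 1.0 / K)
    mu = x[rng.choice(n, K, replace=False)]       # initialise means on data points
    var = np.full(K, x.var())
    coef = np.zeros((K, d))
    intercept = np.full(K, y.mean())
    var_y = np.full(K, y.var())
    for _ in range(iters):
        # E-step: responsibility of cluster k for point n, from the joint density p(x, y)
        lik = np.stack([pi[k] * gauss(x, mu[k], var[k])
                        * np.exp(-0.5 * (y - x @ coef[k] - intercept[k]) ** 2 / var_y[k])
                        / np.sqrt(2 * np.pi * var_y[k]) for k in range(K)], axis=1)
        lik = np.maximum(lik, 1e-300)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted updates of every cluster parameter
        nk = r.sum(axis=0) + 1e-8
        pi = nk / n
        for k in range(K):
            w = r[:, k]
            mu[k] = w @ x / nk[k]
            var[k] = w @ np.sum((x - mu[k]) ** 2, axis=1) / (d * nk[k]) + 1e-8
            X1 = np.hstack([x, np.ones((n, 1))])  # weighted least squares for the local line
            A = X1.T * w @ X1 + 1e-8 * np.eye(d + 1)
            b = X1.T * w @ y
            beta = np.linalg.solve(A, b)
            coef[k], intercept[k] = beta[:d], beta[d]
            var_y[k] = w @ (y - X1 @ beta) ** 2 / nk[k] + 1e-8
    return pi, mu, var, coef, intercept, var_y

def predict_cwm(x, pi, mu, var, coef, intercept, var_y):
    """Expected output: local linear predictions blended by input-space posteriors."""
    px = np.stack([pi[k] * gauss(x, mu[k], var[k]) for k in range(len(pi))], axis=1)
    px = np.maximum(px, 1e-300)
    w = px / px.sum(axis=1, keepdims=True)
    local = np.stack([x @ coef[k] + intercept[k] for k in range(len(pi))], axis=1)
    return np.sum(w * local, axis=1)

# Toy usage: embed a scalar series into lag vectors and predict one step ahead.
t = np.arange(2000)
s = np.sin(0.07 * t) + 0.5 * np.sign(np.sin(0.013 * t)) + 0.05 * rng.standard_normal(t.size)
lags = 3
X = np.stack([s[i:i - lags] for i in range(lags)], axis=1)   # past values
Y = s[lags:]                                                 # next value
params = fit_cwm(X[:1500], Y[:1500], K=6)
pred = predict_cwm(X[1500:], *params)
print("test RMS error:", np.sqrt(np.mean((pred - Y[1500:]) ** 2)))
```

Because each cluster in this sketch also carries an output variance (var_y here), the same mixture can be read out as an error bar on each forecast, which is one way to realize the abstract's claim that the framework "offers forecasts of errors in prediction".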


Figure 1: Forecasting a linear feedback shift register.
Figure 2: Modelling of a laser fluctuating near the gain threshold.
Figure 3: Clustering stationary and non-stationary data.
Figure 4: Modelling of a bowed violin.


References

  1. Weigend, A. S. & Gershenfeld, N. A. (eds) Time Series Prediction: Forecasting the Future and Understanding the Past (Santa Fe Inst. Studies in the Sciences of Complexity, Addison-Wesley, Reading, MA, 1993).

  2. Cleveland, W. S. & Devlin, S. J. Regression analysis by local fitting. J. Am. Stat. Assoc. 83, 596–610 (1988).

  3. Wand, M. P. & Jones, M. C. Kernel Smoothing (Chapman & Hall, London, 1995).

  4. Fan, J. & Gijbels, I. Local Polynomial Modelling and Its Applications (Chapman & Hall, London, 1996).

  5. Fan, J. & Gijbels, I. Data-driven bandwidth selection in local polynomial fitting: variable bandwidth and spatial adaptation. J. R. Stat. Soc. B 57, 371–394 (1995).

  6. Jordan, M. I. & Jacobs, R. A. Hierarchical mixtures of experts and the EM algorithm. Neural Comput. 6, 181–214 (1994).

  7. Weigend, A. S., Mangeas, M. & Srivastava, A. N. Nonlinear gated experts for time series: discovering regimes and avoiding overfitting. Int. J. Neural Syst. 6, 373–399 (1995).

  8. Ghahramani, Z. & Jordan, M. I. in Advances in Neural Information Processing Systems Vol. 6, 120–127 (MIT Press, Cambridge, MA, 1995).

  9. Lei, X., Jordan, M. I. & Hinton, G. E. in Advances in Neural Information Processing Systems Vol. 7, 633–640 (MIT Press, Cambridge, MA, 1995).

  10. Waterhouse, S., MacKay, D. & Robinson, T. in Advances in Neural Information Processing Systems Vol. 8, 351–357 (MIT Press, Cambridge, MA, 1996).

  11. Farmer, J. D. & Sidorowich, J. J. Predicting chaotic time series. Phys. Rev. Lett. 59, 845–848 (1987).

  12. Sauer, T. in Time Series Prediction: Forecasting the Future and Understanding the Past (eds Weigend, A. S. & Gershenfeld, N. A.) 175–193 (Santa Fe Inst. Studies in the Sciences of Complexity, Addison-Wesley, Reading, MA, 1993).

  13. Gershenfeld, N. A. The Nature of Mathematical Modeling (Cambridge Univ. Press, 1999).

  14. Gershenfeld, N. A. & Grinstein, G. Entrainment and communication with dissipative pseudorandom dynamics. Phys. Rev. Lett. 74, 5024–5027 (1995).

  15. Dempster, A. P., Laird, N. M. & Rubin, D. B. Maximum likelihood from incomplete data via the EM algorithm. J. R. Stat. Soc. B 39, 1–38 (1977).

  16. Wahba, G. & Wold, S. A completely automatic French curve: fitting spline functions by cross validation. Commun. Stat. 4, 1–17 (1975).

  17. Richardson, S. & Green, P. J. On Bayesian analysis of mixtures with an unknown number of components. J. R. Stat. Soc. B 59, 731–792 (1997).

  18. Hübner, U., Weiss, C.-O., Abraham, N. & Tang, D. in Time Series Prediction: Forecasting the Future and Understanding the Past (eds Weigend, A. S. & Gershenfeld, N. A.) 73–105 (Santa Fe Inst. Studies in the Sciences of Complexity, Addison-Wesley, Reading, MA, 1993).

  19. Casdagli, M. in Nonlinear Modeling and Forecasting (eds Casdagli, M. & Eubank, S.) 265–281 (Santa Fe Inst. Studies in the Sciences of Complexity, Addison-Wesley, Redwood City, CA, 1992).

  20. Stark, J., Broomhead, D. S., Davies, M. E. & Huke, J. Takens embedding theorems for forced and stochastic systems. Nonlinear Anal. 30, 5303–5314 (1997).

  21. Buntine, W. L. Operations for learning with graphical models. J. Artif. Intell. Res. 2, 159–225 (1994).

  22. Smyth, P., Heckerman, D. & Jordan, M. I. Probabilistic independence networks for hidden Markov probability models. Neural Comput. 9, 227–269 (1997).


Acknowledgements

We thank C. Douglas, C. Cooper, R. Shioda and E. Boyden for help with the collection and analysis of the violin data. This work was supported by the Things That Think consortium of the MIT Media Laboratory.

About this article

Cite this article

Gershenfeld, N., Schoner, B. & Metois, E. Cluster-weighted modelling for time-series analysis. Nature 397, 329–332 (1999). https://doi.org/10.1038/16873

