
2018 | OriginalPaper | Chapter

3. Industrial Time Series Prediction

Authors: Jun Zhao, Wei Wang, Chunyang Sheng

Published in: Data-Driven Prediction for Industrial Processes and Their Applications

Publisher: Springer International Publishing


Abstract

Time series prediction is an important means of forecasting the variables involved in industrial processes; it typically identifies the latent rules hidden in the time series data of those variables by means of auto-regression. This chapter first introduces the phase space reconstruction technique, which constructs the training dataset for modeling. A series of data-driven machine learning methods for time series prediction is then presented: several well-known artificial neural network (ANN) models are introduced, and a dual estimation-based echo state network (ESN) model is proposed that simultaneously estimates the uncertainties of the output weights and the internal states of noisy industrial time series, using a nonlinear Kalman filter and a linear one. Kernel-based methods, including the Gaussian process (GP) model and the support vector machine (SVM) model, are also presented. In particular, an improved GP-based ESN model is proposed for time series prediction, in which the ESN output weights are modeled by a GP, avoiding the ill-conditioning associated with the generic ESN. A number of case studies on industrial energy systems are provided to validate the performance of these methods.
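Two of the ingredients named in the abstract, phase space (delay-coordinate) reconstruction of a training set and an ESN with a trained linear readout, can be sketched compactly. The following is a minimal illustrative sketch, not the chapter's implementation: the synthetic series, the embedding dimension and delay, the reservoir size, and the ridge regularization used to stabilize the readout (a simple stand-in for the GP-based readout discussed in the chapter) are all assumed example choices.

```python
# Minimal sketch: delay embedding + echo state network with a ridge readout.
import numpy as np

def delay_embed(series, dim, tau):
    # Phase space reconstruction: each row stacks dim samples spaced tau apart,
    # i.e. [x(t), x(t+tau), ..., x(t+(dim-1)*tau)].
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

rng = np.random.default_rng(0)
series = np.sin(0.05 * np.arange(2000)) + 0.05 * rng.standard_normal(2000)  # toy data

dim, tau = 4, 2                                  # embedding dimension and delay (example values)
X = delay_embed(series[:-1], dim, tau)           # inputs: reconstructed state vectors
y = series[(dim - 1) * tau + 1:]                 # targets: one-step-ahead values

# Echo state network: fixed random reservoir, only the linear readout is trained.
n_res, rho, leak, ridge = 200, 0.9, 0.3, 1e-6
W_in = rng.uniform(-0.5, 0.5, (n_res, dim))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))        # rescale spectral radius below 1

states = np.zeros((len(X), n_res))
s = np.zeros(n_res)
for t, u in enumerate(X):
    s = (1 - leak) * s + leak * np.tanh(W_in @ u + W @ s)
    states[t] = s

# Ridge-regularized least squares for the readout weights; the regularization term
# is one simple way to counter the ill-conditioning of the plain pseudo-inverse
# solution (the chapter's GP-based readout addresses the same issue differently).
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ y)
pred = states @ W_out
print("training RMSE: %.4f" % np.sqrt(np.mean((pred - y) ** 2)))
```

In practice the embedding dimension and delay would be chosen with the methods discussed in the chapter (e.g., false nearest neighbors and mutual information) rather than fixed by hand as above.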


Metadata
Title
Industrial Time Series Prediction
Authors
Jun Zhao
Wei Wang
Chunyang Sheng
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-94051-9_3