2022 | Original Paper | Book Chapter

Boosted Embeddings for Time-Series Forecasting

Authors: Sankeerth Rao Karingula, Nandini Ramanan, Rasool Tahmasbi, Mehrnaz Amjadi, Deokwoo Jung, Ricky Si, Charanraj Thimmisetty, Luisa F. Polania, Marjorie Sayer, Jake Taylor, Claudionor Nunes Coelho Jr

Published in: Machine Learning, Optimization, and Data Science

Publisher: Springer International Publishing


Abstract

Time-series forecasting is a fundamental task arising in diverse data-driven applications. Classical autoregressive methods such as ARIMA have long been used to build forecasting models, and deep learning-based methods such as DeepAR, NeuralProphet, and Seq2Seq have recently been explored for the time-series forecasting problem. In this paper, we propose a novel time-series forecasting model, DeepGB. We formulate and implement a variant of gradient boosting in which the weak learners are deep neural networks whose weights are found incrementally, in a greedy manner, over boosting iterations. In particular, we develop a new embedding architecture that improves the performance of many deep learning models on time-series data through this gradient boosting variant. We demonstrate that our model outperforms comparable state-of-the-art methods on real-world sensor data and public data sets.
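The core idea the abstract describes can be sketched concretely: stage-wise boosting where each weak learner is a small neural network fit to the current residual. The numpy sketch below is an illustration of that general scheme on toy lag features, under assumed hyperparameters (`hidden`, `shrinkage`, epoch counts); it is not the authors' DeepGB implementation, which additionally uses a learned embedding architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_weak_learner(X, r, hidden=8, lr=0.05, epochs=200):
    """Fit a tiny one-hidden-layer tanh network to residuals r by gradient descent."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)              # hidden activations
        pred = (H @ W2 + b2).ravel()
        g = 2.0 * (pred - r) / n              # d(MSE)/d(pred)
        # backpropagate through the two layers
        gW2 = H.T @ g[:, None]; gb2 = g.sum(keepdims=True)
        gH = g[:, None] @ W2.T * (1.0 - H ** 2)
        gW1 = X.T @ gH; gb1 = gH.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

def boost(X, y, n_rounds=5, shrinkage=0.5):
    """Greedy stage-wise boosting: each round fits a new network to the residual."""
    pred = np.zeros(len(y))
    learners = []
    for _ in range(n_rounds):
        resid = y - pred                      # current residual is the new target
        f = fit_weak_learner(X, resid)
        learners.append(f)
        pred = pred + shrinkage * f(X)        # shrunken additive update
    return learners, pred

# Toy time series: predict the next value of a noisy sine from 3 lag features.
t = np.arange(300)
series = np.sin(2 * np.pi * t / 50) + 0.05 * rng.normal(size=t.size)
X = np.stack([series[i - 3:i] for i in range(3, len(series))])
y = series[3:]
_, fitted = boost(X, y)
mse = float(np.mean((fitted - y) ** 2))
```

Each round only needs to model what the accumulated ensemble still gets wrong, which is what makes the greedy, incremental weight-finding described in the abstract a boosting procedure rather than a single large network.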


Metadata
Copyright year: 2022
DOI: https://doi.org/10.1007/978-3-030-95470-3_1