DOI: 10.1145/3358960.3379123 (ICPE conference proceedings, short paper)

An Automated Forecasting Framework based on Method Recommendation for Seasonal Time Series

Published: 20 April 2020 Publication History

Abstract

Due to the fast-paced and changing demands of their users, computing systems require autonomic resource management. To enable proactive and accurate decisions about changes that incur a particular overhead, reliable forecasts are needed. Choosing the best-performing forecasting method for a given time series scenario is therefore a crucial task. Moreover, according to the "No-Free-Lunch Theorem", no single forecasting method performs best on all types of time series. To this end, we propose an automated approach that (i) extracts characteristics from a given time series, (ii) selects the best-suited machine learning method via recommendation, and (iii) performs the forecast. Our approach offers the benefit of not relying on a single method and its possibly inaccurate forecasts. In an extensive evaluation, our approach achieves the best forecasting accuracy among the compared methods.
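The three-step pipeline sketched in the abstract can be illustrated roughly as follows. This is a minimal, self-contained sketch, not the paper's implementation: the feature set, the rule-based recommender, and the two candidate forecasters (seasonal naive and drift) are all illustrative placeholders standing in for the paper's learned recommendation model and its pool of machine learning methods.

```python
def extract_features(series, period):
    """Step (i): compute a few simple time-series characteristics.

    Here only the length and a crude seasonality indicator, based on
    the correlation of the series with itself shifted by one period.
    """
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    lagged = list(zip(series[period:], series[:-period]))
    cov = sum((a - mean) * (b - mean) for a, b in lagged) / len(lagged)
    return {"length": n, "seasonal_strength": cov / var if var else 0.0}


def recommend_method(features):
    """Step (ii): toy rule standing in for the learned recommender."""
    if features["seasonal_strength"] > 0.5:
        return seasonal_naive
    return drift_forecast


def seasonal_naive(series, period, horizon):
    """Candidate forecaster: repeat the last observed season."""
    last_season = series[-period:]
    return [last_season[i % period] for i in range(horizon)]


def drift_forecast(series, period, horizon):
    """Candidate forecaster: extrapolate the average overall slope."""
    slope = (series[-1] - series[0]) / (len(series) - 1)
    return [series[-1] + slope * (i + 1) for i in range(horizon)]


# Step (iii): run the pipeline end to end on a strongly seasonal series.
series = [10, 20, 30, 40] * 6          # period 4, perfectly repeating
features = extract_features(series, period=4)
method = recommend_method(features)
forecast = method(series, period=4, horizon=4)
```

On this perfectly periodic example the seasonality indicator is high, so the recommender picks the seasonal forecaster and the forecast repeats the last season.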


Published In

ICPE '20: Proceedings of the ACM/SPEC International Conference on Performance Engineering
April 2020
319 pages
ISBN:9781450369916
DOI:10.1145/3358960
Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. comparative studies
  2. feature engineering
  3. forecasting
  4. machine learning
  5. recommendation

Qualifiers

  • Short-paper

Conference

ICPE '20

Acceptance Rates

ICPE '20 paper acceptance rate: 15 of 62 submissions (24%).
Overall acceptance rate: 252 of 851 submissions (30%).
