DOI: 10.1145/3030207.3030225

IRIS: Iterative and Intelligent Experiment Selection

Published: 17 April 2017

Abstract

Benchmarking is a widely used technique to quantify the performance of software systems. However, the design and implementation of a benchmarking study can face several challenges. In particular, the time required to perform a benchmarking study can quickly spiral out of control, owing to the number of distinct variables to systematically examine. In this paper, we propose IRIS, an IteRative and Intelligent Experiment Selection methodology, to maximize the information gain while minimizing the duration of the benchmarking process. IRIS selects the region in which to place the next experiment point based on the variability of both the dependent, i.e., response, and independent variables in that region. It aims to identify a performance function that minimizes the response variable prediction error for a constant and limited experimentation budget. We evaluate IRIS on a wide selection of experimental, simulated and synthetic systems with one, two and three independent variables. For a limited experimentation budget, the results show that IRIS reduces the performance function prediction error by up to 4.3 times compared to equal-distance experiment point selection. Moreover, we show that the error reduction can further improve through system-specific parameter tuning. Analysis of the error distributions obtained with IRIS reveals that the technique is particularly effective in regions where the response variable is sensitive to changes in the independent variables.
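The abstract describes the core loop of IRIS only at a high level: spend the experimentation budget where the response varies most, instead of spacing experiments evenly. The following is a minimal one-dimensional sketch of that idea, not the authors' algorithm; the scoring rule (absolute response change times interval width) and the midpoint-refinement step are assumptions introduced here for illustration.

```python
def iris_sketch(f, lo, hi, budget, n_init=4):
    """Sketch of iterative experiment point selection in one dimension.

    f      -- black-box response function (one experiment per call)
    budget -- total number of experiments allowed
    """
    # Seed with equally spaced points, as an equal-distance design would.
    xs = [lo + (hi - lo) * i / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    for _ in range(budget - n_init):
        # Score each interval by |change in response| * interval width:
        # wide regions where the response is sensitive to the independent
        # variable get refined first (a stand-in for IRIS's variability
        # criterion, not the paper's exact formula).
        scores = [abs(ys[i + 1] - ys[i]) * (xs[i + 1] - xs[i])
                  for i in range(len(xs) - 1)]
        i = max(range(len(scores)), key=scores.__getitem__)
        x_new = (xs[i] + xs[i + 1]) / 2.0  # refine at the region's midpoint
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return xs, ys
```

With a step-shaped response on [0, 1], the later samples cluster around the step rather than being spread evenly, which is the behavior the abstract attributes to IRIS in sensitive regions.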


Published In

ICPE '17: Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering
April 2017
450 pages
ISBN: 9781450344043
DOI: 10.1145/3030207

Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. controlled experimentation
    2. performance benchmarking
    3. system performance


Acceptance Rates

ICPE '17 paper acceptance rate: 27 of 83 submissions (33%)
Overall acceptance rate: 252 of 851 submissions (30%)
