DOI: 10.1145/3184407.3184417

A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments

Published: 30 March 2018

Abstract

Software performance testing is an important activity for ensuring quality in continuous software development environments. Current performance testing approaches are mostly based on scripting languages and frameworks with which users implement, in a procedural way, the performance tests they want to issue against the system under test. However, existing solutions lack support for explicitly declaring performance test goals and intents. Thus, while it is possible to express how to execute a performance test, its purpose and applicability context remain only implicitly described. In this work, we propose a declarative domain-specific language (DSL) for software performance testing and a model-driven framework that can be programmed using this language to drive the end-to-end process of executing performance tests. Users of the DSL and the framework specify their performance intents in a goal-oriented language, in which both standard (e.g., load tests) and more advanced (e.g., stability boundary detection and configuration tests) performance tests can be specified starting from templates. The DSL and the framework have been designed to be integrated into a continuous software development process and are validated through extensive use cases that illustrate the expressiveness of the goal-oriented language and the control it enables over end-to-end performance test execution to determine how to reach the declared intent.
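
This page does not reproduce the paper's DSL syntax, so the following is a minimal, purely hypothetical sketch (in Python, not the authors' actual language) of what a declarative, goal-oriented performance test intent could look like: the tester states the goal and workload parameters, while a framework (not shown) derives the concrete execution plan. All names and fields below are illustrative assumptions, not the paper's API.

# Illustrative sketch only: NOT the paper's DSL. It separates *what* to test
# (the declared intent) from *how* to run it (left to a hypothetical framework).
from dataclasses import dataclass, field

@dataclass
class LoadTestIntent:
    """Declares what to test; an execution framework decides how to run it."""
    system_under_test: str                 # hypothetical: base URL of the deployed service
    goal: str                              # hypothetical: human-readable performance goal
    concurrent_users: int = 100            # target load level
    ramp_up_seconds: int = 60              # time to reach the target load
    steady_state_seconds: int = 300        # measurement window at target load
    tags: list[str] = field(default_factory=list)

# Example declaration; endpoint and tags are placeholders.
intent = LoadTestIntent(
    system_under_test="http://sut.example.org/api",
    goal="95th-percentile response time <= 250 ms",
    tags=["continuous-integration", "release-gate"],
)
print(intent)

Running this only prints the declared intent; the point of the sketch is that nothing in the declaration says how load is generated or measured, which is the separation between intent and execution that the abstract describes.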

Published In

ICPE '18: Proceedings of the 2018 ACM/SPEC International Conference on Performance Engineering
March 2018, 328 pages
ISBN: 9781450350952
DOI: 10.1145/3184407

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. continuous software performance testing
  2. declarative performance tests
  3. goal-driven performance tests

Qualifiers

  • Research-article

Funding Sources

  • Swiss National Science Foundation

Conference

ICPE '18

Acceptance Rates

Overall Acceptance Rate 252 of 851 submissions, 30%
