DOI: 10.1145/3338906.3341181

Eagle: a team practices audit framework for agile software development

Published: 12 August 2019

Abstract

Agile/XP (Extreme Programming) software teams are expected to follow a number of specific practices in each iteration, such as estimating the effort ("points") required to complete user stories, properly using branches and pull requests to coordinate merging multiple contributors' code, holding frequent "standups" to keep all team members in sync, and conducting retrospectives to identify areas of improvement for future iterations. We combine two observations in developing a methodology and tools to help teams monitor their performance on these practices. On the one hand, many Agile practices are increasingly supported by web-based tools whose "data exhaust" can provide insight into how closely teams are following the practices. On the other hand, some of the practices can be expressed in terms similar to those developed for expressing service level objectives (SLOs) in software as a service; for example, a typical SLO for an interactive web site might be "over any 5-minute window, 99% of requests to the main page must be delivered within 200 ms", and, analogously, a potential Team Practice (TP) for an Agile/XP team might be "over any 2-week iteration, 75% of stories should be '1-point' stories". Following this similarity, we adapt a system originally developed for monitoring and visualizing service level agreement (SLA) compliance to monitor selected TPs for Agile/XP software teams. Specifically, the system consumes and analyzes the data exhaust from widely used tools such as GitHub and Pivotal Tracker and provides teams and coaches a "dashboard" summarizing the teams' adherence to various practices. As a qualitative initial investigation of its usefulness, we deployed it to twenty student teams in a four-sprint software engineering project course. We find improved adherence to team practices and more positive student self-evaluations of their team practices when using the tool, compared to previous experiences using an Agile/XP methodology. The demo video is located at https://youtu.be/A4xwJMEQh9c and a landing page with a live demo at https://isa-group.github.io/2019-05-eagle-demo/.
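Eagle's own rule format is not shown on this page; purely as a minimal, hypothetical sketch (the function names and the story fields "estimate" and "accepted_at" are illustrative assumptions, not Eagle's engine or Pivotal Tracker's exact schema), an SLO-style team practice such as "over any 2-week iteration, 75% of stories should be '1-point' stories" could be checked against exported story data roughly as follows:

    # Illustrative sketch only: evaluating an SLO-style team practice
    # against story data exported from an issue tracker. Field names
    # ("estimate", "accepted_at") are assumptions for this example.

    from datetime import datetime, timedelta

    def one_point_ratio(stories, iteration_start, iteration_length_days=14):
        """Fraction of stories accepted in the iteration window that are 1-point stories."""
        end = iteration_start + timedelta(days=iteration_length_days)
        in_iteration = [
            s for s in stories
            if s.get("accepted_at") and iteration_start <= s["accepted_at"] < end
        ]
        if not in_iteration:
            return None  # no accepted stories in this window
        one_point = sum(1 for s in in_iteration if s.get("estimate") == 1)
        return one_point / len(in_iteration)

    def meets_practice(stories, iteration_start, threshold=0.75):
        """TP analogue of an SLO: at least 75% of the iteration's stories are 1-point."""
        ratio = one_point_ratio(stories, iteration_start)
        return ratio is not None and ratio >= threshold

    if __name__ == "__main__":
        start = datetime(2019, 4, 1)
        sample = [
            {"estimate": 1, "accepted_at": datetime(2019, 4, 3)},
            {"estimate": 1, "accepted_at": datetime(2019, 4, 8)},
            {"estimate": 1, "accepted_at": datetime(2019, 4, 10)},
            {"estimate": 3, "accepted_at": datetime(2019, 4, 12)},
        ]
        print(one_point_ratio(sample, start))   # 0.75
        print(meets_practice(sample, start))    # True

In the same spirit, other practices mentioned in the abstract (pull-request usage, standup frequency) could be phrased as thresholds over a time window and evaluated from the corresponding tool's data exhaust.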


Published In

ESEC/FSE 2019: Proceedings of the 2019 27th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering
August 2019
1264 pages
ISBN:9781450355728
DOI:10.1145/3338906

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. agile
  2. team dashboard
  3. team practice
  4. team practice agreement

Qualifiers

  • Research-article

Funding Sources

  • Ministerio de Ciencia e Innovación

Conference

ESEC/FSE '19

Acceptance Rates

Overall Acceptance Rate 112 of 543 submissions, 21%
