DOI: 10.1145/1159733.1159742
Article

Evaluating guidelines for empirical software engineering studies

Published: 21 September 2006

Abstract

Background. Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. To address this problem, Andreas Jedlitschka and Dietmar Pfahl have produced reporting guidelines for controlled experiments in software engineering. They pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted; if guidelines are flawed, they will cause more problems than they solve.

Aim. The aim of this paper is to present the method we used to evaluate the guidelines and to report the results of our evaluation exercise. We suggest that our evaluation process may be of more general use if reporting guidelines for other types of empirical study are developed.

Method. We used perspective-based inspections to perform a theoretical evaluation of the guidelines. A separate inspection was performed for each perspective. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer and Author. Apart from the Author perspective, the inspections were based on a set of questions derived by brainstorming. The inspection using the Author perspective reviewed each section of the guidelines sequentially.

Results. The question-based perspective inspections detected 42 issues where the guidelines would benefit from amendment or clarification, and 8 defects.

Conclusions. Reporting guidelines need to specify what information goes into which section and to avoid excessive duplication. Software engineering researchers need to be cautious about adopting reporting guidelines that differ from those used by other disciplines. The current guidelines need to be revised, and the revised guidelines need to be subjected to further theoretical and empirical validation. Perspective-based inspection is a useful validation method, but the practitioner/consultant perspective presents difficulties.

References

[1] Zeiad Abdelnabi, Giovanni Cantone, Marcus Ciolkowski, and Dieter Rombach. Comparing Code Reading Techniques Applied to Object-Oriented Software Frameworks with Regard to Effectiveness and Defect Detection Rate. Proceedings ISESE 04, 2004.
[2] Silvia Abrahao, Geert Poels, and Oscar Pastor. Assessing the Reproducibility and Accuracy of Functional Size Measurement Methods through Experimentation. Proceedings ISESE 04, 2004.
[3] Tore Dybå, Vigdis By Kampenes, and Dag I.K. Sjøberg. A systematic review of statistical power in software engineering experiments. Information and Software Technology, in press.
[4] Peter Harris. Designing and Reporting Experiments in Psychology. 2nd Edition, Open University Press, 2002.
[5] James Hartley. Current findings from research on structured abstracts. J. Med. Libr. Assoc., 92(3), July 2004, pp. 368--371.
[6] Andreas Jedlitschka and Dietmar Pfahl. Reporting Guidelines for Controlled Experiments in Software Engineering. IESE Report IESE-035.5/E, 2005.
[7] Barbara Kitchenham, Shari Lawrence Pfleeger, Lesley Pickard, Peter Jones, David Hoaglin, Khaled El Emam, and Jarrett Rosenberg. Preliminary Guidelines for Empirical Research in Software Engineering. IEEE Transactions on Software Engineering, 28(8), August 2002, pp. 721--734.
[8] David Moher, Kenneth F. Schulz, and Douglas Altman. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomized trials. The Lancet, 357, April 14, 2001, pp. 1191--1194.
[9] Lesley M. Pickard, Barbara A. Kitchenham, and Peter Jones. Combining Empirical Results in Software Engineering. Information and Software Technology, 40(14), 1998, pp. 811--821.
[10] Patrick J. Schroeder, Pankaj Bolaki, and Vijayram Gopu. Comparing the Fault Detection Effectiveness of N-way and Random Test Suites. Proceedings ISESE 04, 2004.
[11] Dag I.K. Sjøberg, Jo E. Hannay, Ove Hansen, Vigdis By Kampenes, Amela Karahasanovic, Nils-Kristian Liborg, and Anette C. Rekdal. A Survey of Controlled Experiments in Software Engineering. IEEE Transactions on Software Engineering, 31(9), September 2005, pp. 733--753.
[12] Jan Verelst. The Influence of the Level of Abstraction on the Evolvability of Conceptual Models of Information Systems. Proceedings ISESE 04, 2004.
[13] Claes Wohlin, Håkan Petersson, and Aybüke Aurum. Combining data from reading experiments in Software Inspections. In Juristo, N. and Moreno, A. (eds.), Lecture Notes on Empirical Software Engineering, World Scientific Publishing, October 2003.
[14] Claes Wohlin, Per Runeson, Martin Höst, Björn Regnell, and Anders Wesslén. Experimentation in Software Engineering: An Introduction. Kluwer Academic Publishers, 2000.

Published In

ISESE '06: Proceedings of the 2006 ACM/IEEE International Symposium on Empirical Software Engineering
September 2006, 388 pages
ISBN: 1595932186
DOI: 10.1145/1159733

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. controlled experiments
  2. guidelines
  3. perspective-based inspection
  4. software engineering

