DOI: 10.1145/3193965.3193971

Short paper

Towards an experiment line on software inspection with human computation

Published: 28 May 2018

Abstract

Software inspection is an important approach to finding defects in Software Engineering (SE) artifacts. While traditional software inspection with pen-and-paper materials has been researched extensively, modern SE introduces new environments, methods, and tools for the cooperation of software engineers. Technologies such as Human Computation (HC) provide tool support for distributed and tool-mediated work processes. However, there is little empirical experience on how to leverage HC for software inspection. In this vision paper, we present the context for a research program on this topic and introduce the preliminary concept of a theory-based experiment line, which facilitates designing families of experiments that fit together to answer larger questions than individual experiments can. We present an example feature model for an experiment line on software inspection with HC and discuss its expected benefits for the research program, including the coordination of research, the reuse of designs and materials, and facilities for aggregating results.
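The experiment-line idea sketched in the abstract — a feature model whose variation points span a family of inspection experiments — can be illustrated with a minimal sketch. All feature names below (artifact, reading_technique, work_setting, and their alternatives) are hypothetical illustrations, not taken from the paper's actual model; the sketch only shows how a feature model constrains which concrete experiment designs belong to the family.

```python
# Hypothetical sketch of a feature model for an experiment line on
# software inspection with Human Computation (HC). Feature names are
# illustrative assumptions, not the paper's actual model.

# Mandatory variation points: every experiment in the line must bind these.
MANDATORY = {"artifact", "reading_technique", "defect_report"}

# Alternative groups: a bound variation point must pick exactly one option.
ALTERNATIVES = {
    "artifact": {"requirements_doc", "uml_model", "source_code"},
    "reading_technique": {"checklist", "perspective_based", "ad_hoc"},
    "work_setting": {"pen_and_paper", "hc_platform"},  # HC-mediated or classic
}

def valid_experiment(config: dict) -> bool:
    """Check that one concrete experiment design satisfies the feature model."""
    # Every mandatory variation point must be bound...
    if not MANDATORY <= config.keys():
        return False
    # ...and each bound point must choose an allowed alternative.
    for point, choice in config.items():
        if point in ALTERNATIVES and choice not in ALTERNATIVES[point]:
            return False
    return True

# One member of the experiment family: model inspection on an HC platform.
exp = {
    "artifact": "uml_model",
    "reading_technique": "checklist",
    "defect_report": "web_form",
    "work_setting": "hc_platform",
}
print(valid_experiment(exp))  # True
```

Designs that share bindings (e.g. the same artifact and reading technique but different work settings) differ in a controlled way, which is what makes results from the family comparable and aggregatable.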


Cited By

  • (2020) "Empirical Software Engineering Experimentation with Human Computation", in Contemporary Empirical Methods in Software Engineering, pp. 173--215, DOI: 10.1007/978-3-030-32489-6_7, 28 Aug 2020.
  • (2019) "A Preliminary Comparison of Using Variability Modeling Approaches to Represent Experiment Families", in Proceedings of the 23rd International Conference on Evaluation and Assessment in Software Engineering, pp. 333--338, DOI: 10.1145/3319008.3319356, 15 Apr 2019.
  • (2019) "Towards Modeling Variability of Products, Processes and Resources in Cyber-Physical Production Systems Engineering", in Proceedings of the 23rd International Systems and Software Product Line Conference - Volume B, pp. 49--56, DOI: 10.1145/3307630.3342411, 9 Sep 2019.

Published In

CESI '18: Proceedings of the 6th International Workshop on Conducting Empirical Studies in Industry
May 2018, 45 pages
ISBN: 9781450357364
DOI: 10.1145/3193965
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. collaborative and social computing
  2. empirical software engineering
  3. empirical studies
  4. experiment lines
  5. experimentation
  6. human computation
  7. software defect analysis
  8. software inspection
  9. software verification

Qualifiers

  • Short paper

Conference

ICSE '18

