DOI: 10.1145/2884781.2884840

Code review quality: how developers see it

Published: 14 May 2016

Abstract

In a large, long-lived project, an effective code review process is key to ensuring the long-term quality of the code base. In this work, we study the code review practices of a large open-source project and investigate how developers themselves perceive code review quality. We present a qualitative study summarizing the results of a survey of 88 Mozilla core developers. The results provide insight into how developers define review quality, which factors contribute to how they evaluate submitted code, and which challenges they face when performing review tasks. We found that review quality is primarily associated with the thoroughness of the feedback, the reviewer's familiarity with the code, and the perceived quality of the code itself. We also found that, while many different factors are perceived to contribute to review quality, reviewers often find it difficult to keep their technical skills up to date, manage personal priorities, and mitigate context switching.


Published In

ICSE '16: Proceedings of the 38th International Conference on Software Engineering
May 2016
1235 pages
ISBN: 9781450339001
DOI: 10.1145/2884781
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. code review
  2. developer perception
  3. review quality
  4. survey

Qualifiers

  • Research-article

Conference

ICSE '16

Acceptance Rates

Overall Acceptance Rate 276 of 1,856 submissions, 15%
