DOI: 10.1145/3105726.3106180
ICER Conference Proceedings
Research Article · Public Access

Quantifying Incremental Development Practices and Their Relationship to Procrastination

Published: 14 August 2017

Abstract

We present quantitative analyses of character-level program edit and execution data collected in a junior-level data structures and algorithms course. The goal of this research is to determine whether proposed measures of student behaviors during program development, such as incremental development and procrastination, are significantly related to the correctness of final solutions, the time when work is completed, or the total time spent working on a solution. We analyze a dataset of 6.3 million fine-grained events collected from each student's local Eclipse environment, including the edits made and events such as running the program or executing software tests. We examine four primary metrics proposed in previous work, along with variants and refinements that may be more effective. We quantify behaviors such as working early and often, the frequency of program and test executions, and the incremental writing of software tests. Students with an earlier mean time of edits were more likely to submit their projects earlier and to earn higher scores for correctness. Similarly, an earlier median time of edits to software tests was also associated with higher correctness scores. Contrary to expectations, no significant relationships were found with incremental test writing or with incremental checking of work using either interactive program launches or runs of software tests. A preliminary prediction model with 69% accuracy suggests that the underlying metrics may support early prediction of student success on projects. Such metrics can also be used to give targeted feedback that helps students improve their development practices.
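As an illustration of the "mean time of edits" family of metrics described above, the sketch below computes how early, on average, a student's edits fall relative to the project deadline. The function name and sample data are hypothetical; this is not the paper's actual DevEventTracker implementation, only a minimal sketch assuming timestamped edit events are available.

```python
from datetime import datetime, timedelta

def mean_edit_lead_time(edit_times, deadline):
    """Mean time of edits, expressed as hours before the deadline.

    edit_times: list of datetime stamps, one per recorded edit event.
    deadline:   the project due date/time.
    Larger values indicate that work, on average, happened earlier.
    """
    total_seconds = sum((deadline - t).total_seconds() for t in edit_times)
    return total_seconds / len(edit_times) / 3600.0

# Hypothetical session: a burst of early work plus a few last-minute edits.
deadline = datetime(2017, 3, 1, 23, 59)
edits = [deadline - timedelta(hours=h) for h in (72, 48, 47, 2, 1)]
print(mean_edit_lead_time(edits, deadline))  # 34.0 (hours before deadline)
```

A median variant (as used for the test-edit metric in the abstract) would simply replace the mean with `statistics.median` over the same per-edit lead times.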





Published In

ICER '17: Proceedings of the 2017 ACM Conference on International Computing Education Research
August 2017
316 pages
ISBN:9781450349680
DOI:10.1145/3105726
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. IDE
  2. deveventtracker
  3. eclipse
  4. educational data mining
  5. incremental development
  6. metrics
  7. plugin
  8. software development process

Qualifiers

  • Research-article

Conference

ICER '17: International Computing Education Research Conference
August 18 - 20, 2017
Tacoma, Washington, USA

Acceptance Rates

ICER '17 Paper Acceptance Rate 29 of 180 submissions, 16%;
Overall Acceptance Rate 189 of 803 submissions, 24%

Article Metrics

  • Downloads (last 12 months): 121
  • Downloads (last 6 weeks): 24
Reflects downloads up to 16 Jan 2025

Cited By
  • (2024) Insights from the Field: Exploring Students' Perspectives on Bad Unit Testing Practices. Proceedings of the 2024 on Innovation and Technology in Computer Science Education V. 1, 101-107. DOI: 10.1145/3649217.3653643. Online publication date: 3-Jul-2024.
  • (2024) Incremental Development and CS1 Student Outcomes And Behaviors. Proceedings of the 26th Australasian Computing Education Conference, 87-93. DOI: 10.1145/3636243.3636253. Online publication date: 29-Jan-2024.
  • (2024) Do Behavioral Factors Influence the Extent to which Students Engage with Formative Practice Opportunities? Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, 18-24. DOI: 10.1145/3626252.3630833. Online publication date: 7-Mar-2024.
  • (2024) The Temporal Dynamics of Procrastination and its Impact on Academic Performance: The Case of a Task-oriented Programming Course. Proceedings of the 39th ACM/SIGAPP Symposium on Applied Computing, 48-55. DOI: 10.1145/3605098.3636072. Online publication date: 8-Apr-2024.
  • (2023) Personalized Agent-Based Procrastination Suppression System. Proceedings of the 35th Australian Computer-Human Interaction Conference, 657-668. DOI: 10.1145/3638380.3638449. Online publication date: 2-Dec-2023.
  • (2023) A Model of How Students Engineer Test Cases With Feedback. ACM Transactions on Computing Education 24(1), 1-31. DOI: 10.1145/3628604. Online publication date: 20-Oct-2023.
  • (2023) The Impact of a Remote Live-Coding Pedagogy on Student Programming Processes, Grades, and Lecture Questions Asked. Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 1, 533-539. DOI: 10.1145/3587102.3588846. Online publication date: 29-Jun-2023.
  • (2023) Understanding and Measuring Incremental Development in CS1. Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1, 722-728. DOI: 10.1145/3545945.3569880. Online publication date: 2-Mar-2023.
  • (2023) Accurate Estimation of Time-on-Task While Programming. Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1, 708-714. DOI: 10.1145/3545945.3569804. Online publication date: 2-Mar-2023.
  • (2023) Do the Test Smells Assertion Roulette and Eager Test Impact Students' Troubleshooting and Debugging Capabilities? Proceedings of the 45th International Conference on Software Engineering: Software Engineering Education and Training, 29-39. DOI: 10.1109/ICSE-SEET58685.2023.00009. Online publication date: 17-May-2023.
