DOI: 10.1145/3209635.3209652

Visualizing Code Patterns in Novice Programmers

Published: 04 May 2018

Abstract

Many researchers have investigated the difficulties faced by novice programmers, but their approaches have so far focused primarily on identifying and correcting common syntax errors, or on locating topics of difficulty in the CS1 curriculum. Meanwhile, the poor coding practices that students adopt have gone mostly unaddressed. While these practices do not necessarily produce erroneous code, they can indicate areas of difficulty and lead to poorly structured programs. To address these issues, our project examines coding habits and common errors in CS1 exercises gathered from 77 first-year students. The data was collected in real time so that we can later reconstruct a student's thought process while solving the programming exercises. To assist our analysis, we built a code visualizer that animates the programming process dynamically while simultaneously summarizing error metrics. Our ultimate goal is to use the code visualizer to help an instructor or a student identify poor programming practices during the coding process. With the error metrics gathered, an instructor can inspect potential improvements in an individual student's coding behavior at a given point in time or over time, and identify bad coding habits common to populations of students.
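
As a rough illustration of the kind of snapshot replay the abstract describes, the following Python sketch (not the authors' visualizer; the snapshot format, the example data, and the metrics reported are assumptions made for illustration) walks a hypothetical list of timestamped source-code snapshots and summarizes how much code changed between consecutive snapshots.

    # Illustrative sketch only -- not the tool described in the paper. It assumes a
    # hypothetical stream of (timestamp, full source text) snapshots captured while
    # a student edits an exercise, and replays them while printing simple per-step
    # edit metrics (lines added/removed, time elapsed since the previous snapshot).
    import difflib
    from datetime import datetime

    # Hypothetical snapshot stream for a single student and exercise.
    snapshots = [
        ("2018-01-15T10:00:05", "def total(xs):\n    pass\n"),
        ("2018-01-15T10:01:12", "def total(xs):\n    s = 0\n    for x in xs\n        s += x\n"),
        ("2018-01-15T10:02:40", "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"),
    ]

    def replay(snapshots):
        """Walk the snapshots in order and report how the code changed at each step."""
        prev_src, prev_time = "", None
        for stamp, src in snapshots:
            now = datetime.fromisoformat(stamp)
            diff = difflib.unified_diff(prev_src.splitlines(), src.splitlines(), lineterm="")
            added = removed = 0
            for line in diff:
                if line.startswith("+") and not line.startswith("+++"):
                    added += 1
                elif line.startswith("-") and not line.startswith("---"):
                    removed += 1
            gap = (now - prev_time).total_seconds() if prev_time else 0.0
            print(f"{stamp}: +{added} -{removed} lines, {gap:.0f}s since previous snapshot")
            prev_src, prev_time = src, now

    replay(snapshots)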

Published In

WCCCE '18: Proceedings of the 23rd Western Canadian Conference on Computing Education
May 2018
86 pages
ISBN: 9781450358057
DOI: 10.1145/3209635
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 04 May 2018

Author Tags

  1. CS1
  2. Source code snapshot analysis
  3. code metrics
  4. code patterns
  5. error diagnosis
  6. learning analytics
  7. programming behavior
  8. programming session trace analysis
  9. self-regulation

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

WCCCE '18

Acceptance Rates

WCCCE '18 Paper Acceptance Rate: 19 of 29 submissions, 66%
Overall Acceptance Rate: 78 of 117 submissions, 67%
