DOI: 10.1145/3502717.3532168

Extended abstract

Steps Learners Take when Solving Programming Tasks, and How Learning Environments (Should) Respond to Them

Published: 07 July 2022

Abstract

Every year, millions of students learn how to write programs. Learning activities for beginners almost always include programming tasks that require a student to write a program that solves a particular problem. When learning how to solve such a task, many students need feedback on their previous actions and hints on how to proceed. In the case of programming, the feedback should take into account the steps a student has taken towards implementing a solution, and the hints should help a student complete or improve a possibly partial solution. Only a limited number of learning environments for programming give feedback and hints on the intermediate steps students take towards a solution, and little is known about the quality of the feedback provided. To determine the quality of the feedback such tools give, and to help develop them further, we create and curate data sets that show what kinds of steps students take when solving programming exercises for beginners, and what kind of feedback and hints should be provided. This working group aims to 1) select or create several data sets with steps students take to solve programming tasks, 2) introduce a method to annotate students' steps in these data sets, 3) attach feedback and hints to these steps, 4) set up a method to utilize these data sets in various learning environments for programming, and 5) analyse the quality of hints and feedback in these learning environments.
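To make concrete what a curated data set of student steps with attached hints might look like, here is a minimal sketch in Python. The `Step` record, its field names, and the hint text are illustrative assumptions loosely in the spirit of event-based formats such as ProgSnap2; they are not taken from any specification or from the data sets the working group plans to use.

```python
from dataclasses import dataclass
from difflib import unified_diff
from typing import Optional

# Hypothetical record for one student step (illustrative assumption,
# not a published format): a full code snapshot per step, with an
# optional hint annotation attached by the annotation method.
@dataclass
class Step:
    student_id: str
    seconds: int                 # time since the student started the task
    code: str                    # full program snapshot after this step
    hint: Optional[str] = None   # feedback/hint attached during annotation

def diff_steps(before: Step, after: Step) -> list[str]:
    """Line-level diff between two consecutive snapshots of one student."""
    return list(unified_diff(before.code.splitlines(),
                             after.code.splitlines(),
                             lineterm=""))

# Two consecutive snapshots: the second fixes a bug and carries a hint.
s1 = Step("s01", 10, "def square(x):\n    return x")
s2 = Step("s01", 42, "def square(x):\n    return x * x",
          hint="Multiply x by itself to square it.")

for line in diff_steps(s1, s2):
    print(line)
```

Representing each step as a full snapshot (rather than a keystroke delta) keeps the diffing logic trivial and lets feedback be attached to the state a student was actually in.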

    Published In

ITiCSE '22: Proceedings of the 27th ACM Conference on Innovation and Technology in Computer Science Education, Vol. 2
July 2022
686 pages
ISBN: 9781450392006
DOI: 10.1145/3502717
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. automated
    2. learning programming
    3. tutoring systems



    Acceptance Rates

    Overall Acceptance Rate 552 of 1,613 submissions, 34%
