Benchmarking Introductory Programming Exams: How and Why

Published: 11 July 2016
DOI: 10.1145/2899415.2899473

Abstract

Ten selected questions have been included in 13 introductory programming exams at seven institutions in five countries. The students' results on these questions, and on the exams as a whole, lead to the development of a benchmark against which the exams in other introductory programming courses can be assessed. We illustrate some potential benefits of comparing exam performance against this benchmark, and show other uses to which it can be put, for example to assess the size and the overall difficulty of an exam. We invite others to apply the benchmark to their own courses and to share the results with us.
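
The full procedure is in the paper itself, but the comparison the abstract describes, scoring a course's results on the shared questions against the pooled results from the 13 exams, can be sketched. The Python sketch below is illustrative only, not the authors' published method: the benchmark values, question identifiers, and function names are all hypothetical, and it assumes the benchmark is expressed as a mean normalized score (0 to 1) per shared question.

    # Illustrative sketch only -- not the authors' published method.
    # Assumes the benchmark is published as mean normalized scores
    # (0.0 to 1.0) per shared question, pooled across the 13 exams,
    # and that a course reports its own mean scores on the same ten
    # questions. All identifiers and values here are hypothetical.

    BENCHMARK = {
        "Q1": 0.82, "Q2": 0.74, "Q3": 0.69, "Q4": 0.61, "Q5": 0.58,
        "Q6": 0.55, "Q7": 0.49, "Q8": 0.44, "Q9": 0.38, "Q10": 0.31,
    }

    def compare_to_benchmark(local_means: dict[str, float]) -> dict[str, float]:
        """Signed difference (local minus pooled) per shared question.

        Positive values suggest the local cohort did better than the
        pooled cohorts on that question; negative values, worse.
        """
        return {q: local_means[q] - pooled
                for q, pooled in BENCHMARK.items() if q in local_means}

    if __name__ == "__main__":
        local = {"Q1": 0.90, "Q2": 0.70, "Q3": 0.75, "Q4": 0.50, "Q5": 0.60,
                 "Q6": 0.52, "Q7": 0.55, "Q8": 0.40, "Q9": 0.35, "Q10": 0.33}
        deltas = compare_to_benchmark(local)
        # Sort Q1..Q10 numerically rather than lexicographically.
        for q, d in sorted(deltas.items(), key=lambda kv: int(kv[0][1:])):
            print(f"{q}: {d:+.2f}")
        print(f"mean difference vs benchmark: "
              f"{sum(deltas.values()) / len(deltas):+.2f}")

Run against a course's own per-question means, the signed differences show where a cohort sits above or below the pooled benchmark question by question, and their average gives a single rough indicator of relative exam performance, one of the uses the abstract alludes to.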


    Published In

    ITiCSE '16: Proceedings of the 2016 ACM Conference on Innovation and Technology in Computer Science Education
    July 2016, 394 pages
    ISBN: 9781450342315
    DOI: 10.1145/2899415

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. benchmarking
    2. examination
    3. introductory programming

    Qualifiers

    • Research-article

    Conference

    ITiCSE '16

    Acceptance Rates

    ITiCSE '16: 56 of 147 submissions accepted (38%)
    Overall: 552 of 1,613 submissions accepted (34%)

