Extended Abstract
DOI: 10.1145/3605468.3609778

Developing a Computer Science Content Knowledge Test for 10th Grade Students

Published: 27 September 2023

Abstract

Computer Science competencies are becoming increasingly important in today's digitised society. To foster these competencies in students, several countries have introduced school subjects with new educational plans based on Computer Science frameworks, such as the K-12 CS Framework and the Informatics Reference Framework for Schools. These frameworks cover, in one way or another, four content areas: data and coding (including data structures and their applications), algorithms (involving variables, loops, and software projects), computers and networks (emphasising the role of computers in a network and data transmission), and information society and data security (covering topics such as asymmetric encryption and personal data protection). Various tests have been developed in Computer Science Education for different age groups, primarily at university level for introductory Computer Science courses, and some for upper secondary school level and below, particularly for Computational Thinking. Against this background, the objective of this study is to develop a set of items that measures content knowledge in the four aforementioned areas. Specifically, the study focuses on the educational plan of the Computer Science subject IMP, which is taught from the 8th to the 10th grade in secondary schools in Baden-Württemberg, Germany. In total, 155 items were evaluated by experts, resulting in a test of 68 items. This poster abstract presents the ongoing progress of the test development.
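To make the "algorithms" content area (variables and loops) concrete, a typical code-tracing task at this level asks students to predict the output of a short program. The snippet below is a purely illustrative sketch of that item type; it is not taken from the instrument described in the abstract.

```python
# Hypothetical example of a code-tracing task in the "algorithms"
# content area (variables and loops); NOT an actual test item.
def trace_sum(n):
    """Sum the integers 1..n with a simple counting loop."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

# A student would be asked: what does this print?
print(trace_sum(5))  # prints 5 + 4 + 3 + 2 + 1 = 15
```

An item of this kind probes whether students can track how a variable changes across loop iterations, independent of any particular programming language taught in class.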

References

[1] Carlo Bellettini, Violetta Lonati, Dario Malchiodi, Mattia Monga, Anna Morpurgo, and Mauro Torelli. 2015. How Challenging Are Bebras Tasks? An IRT Analysis Based on the Performance of Italian Students. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/2729094.2742603

[2] Ryan Bockmon and Chris Bourke. 2023. Validation of the Placement Skill Inventory: A CS0/CS1 Placement Exam. In SIGCSE 2023 - Proceedings of the 54th ACM Technical Symposium on Computer Science Education, Vol. 1. 39–45. https://doi.org/10.1145/3545945.3569738

[3] Michael E. Caspersen, Ira Diethelm, Judith Gal-Ezer, Andrew McGettrick, Enrico Nardelli, Don Passey, Branislav Rovan, and Mary Webb. 2022. Informatics Reference Framework for School. https://www.informaticsforall.org/wp-content/uploads/2022/03/Informatics-Reference-Framework-for-School-release-February-2022.pdf

[4] K-12 Computer Science Framework Steering Committee. 2016. K-12 Computer Science Framework. Technical Report. New York, NY, USA.

[5] Stuart W. Elliott. 2017. Computers and the Future of Skill Demand. https://doi.org/10.1787/9789264284395-en

[6] Rina P. Y. Lai. 2021. Beyond Programming: A Computer-Based Assessment of Computational Thinking Competency. ACM Trans. Comput. Educ. 22, 2, Article 14 (Nov 2021), 27 pages. https://doi.org/10.1145/3486598

[7] Mariana Lilley and Andrew Pyper. 2009. The Application of the Flexilevel Approach for the Assessment of Computer Science Undergraduates. In Human-Computer Interaction. Interacting in Various Application Domains, Julie A. Jacko (Ed.). Springer Berlin Heidelberg, Berlin, Heidelberg, 140–148.

[8] Ministerium für Kultus, Jugend und Sport Baden-Württemberg. 2018. Bildungsplan zum Profilfach Informatik, Mathematik, Physik (IMP). http://bildungsplaene-bw.de/site/bildungsplan/get/documents/lsbw/export-pdf/depot-pdf/ALLG/BP2016BW_ALLG_GYM_IMP.pdf

[9] Jonas Neugebauer, Peter Hubwieser, Johannes Magenheim, Laura Ohrndorf, Niclas Schaper, and Sigrid Schubert. 2014. Measuring Student Competences in German Upper Secondary Computer Science Education. In Informatics in Schools. Teaching and Learning Perspectives, Yasemin Gülbahar and Erinç Karataş (Eds.). Springer International Publishing, Cham, 100–111.

[10] Miranda C. Parker, Mark Guzdial, and Allison Elliott Tew. 2021. Uses, Revisions, and the Future of Validated Assessments in Computing Education: A Case Study of the FCS1 and SCS1. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3446871.3469744

[11] Markeya S. Peteranetz and Anthony D. Albano. 2020. Development and Evaluation of the Nebraska Assessment of Computing Knowledge. Frontiers in Computer Science 2 (2020). https://doi.org/10.3389/fcomp.2020.00011

[12] Arif Rachmatullah, Bita Akram, Danielle Boulden, Bradford Mott, Kristy Boyer, James Lester, and Eric Wiebe. 2020. Development and Validation of the Middle Grades Computer Science Concept Inventory (MG-CSCI) Assessment. https://www.ejmste.com/article/development-and-validation-of-the-middle-grades-computer-science-concept-inventory-mg-csci-7798

[13] Marcos Román-González, Juan-Carlos Pérez-González, and Carmen Jiménez-Fernández. 2017. Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test. Computers in Human Behavior 72 (2017), 678–691. https://doi.org/10.1016/j.chb.2016.08.047

[14] Jucelio S. Santos, Wilkerson L. Andrade, João Brunet, and Monilly Ramos Araujo Melo. 2020. A Systematic Literature Review of Methodology of Learning Evaluation Based on Item Response Theory in the Context of Programming Teaching. In 2020 IEEE Frontiers in Education Conference (FIE). 1–9. https://doi.org/10.1109/FIE44824.2020.9274068

[15] Allison Elliott Tew and Mark Guzdial. 2011. The FCS1: A Language Independent Assessment of CS1 Knowledge. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/1953163.1953200

[16] Linda Werner, Jill Denner, Shannon Campe, and Damon Chizuru Kawamoto. 2012. The Fairy Performance Assessment: Measuring Computational Thinking in Middle School. In Proceedings of the 43rd ACM Technical Symposium on Computer Science Education (Raleigh, North Carolina, USA) (SIGCSE '12). Association for Computing Machinery, New York, NY, USA, 215–220. https://doi.org/10.1145/2157136.2157200

[17] Eric Wiebe, Jennifer London, Osman Aksit, Bradford W. Mott, Kristy Elizabeth Boyer, and James C. Lester. 2019. Development of a Lean Computational Thinking Abilities Assessment for Middle Grades Students. In Proceedings of the 50th ACM Technical Symposium on Computer Science Education (Minneapolis, MN, USA) (SIGCSE '19). Association for Computing Machinery, New York, NY, USA, 456–461. https://doi.org/10.1145/3287324.3287390


    Published In

    WiPSCE '23: Proceedings of the 18th WiPSCE Conference on Primary and Secondary Computing Education Research
    September 2023
    173 pages
    ISBN:9798400708510
    DOI:10.1145/3605468
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Competency
    2. Computer Science
    3. Content Knowledge
    4. Expert Rating

    Qualifiers

    • Extended-abstract
    • Research
    • Refereed limited

    Funding Sources

    • Ministry of Science, Research and Arts Baden-Württemberg

    Conference

    WiPSCE '23

    Acceptance Rates

    Overall Acceptance Rate 104 of 279 submissions, 37%
