DOI: 10.1145/3545945.3569734
Research article · Open access

The Programming Exercise Markup Language: Towards Reducing the Effort Needed to Use Automated Grading Tools

Published: 03 March 2023

Abstract

Automated programming assignment grading tools have become integral to CS courses at introductory as well as advanced levels. However, such tools have their own custom approaches to setting up assignments and describing how solutions should be tested, requiring instructors to make a significant learning investment to begin using a new tool. In addition, differences between tools mean that this initial investment must be repeated when switching tools or adding a new one. Worse still, tool-specific strategies further reduce the ability of educators to share and reuse their assignments. This paper describes early experiences with PEML, the Programming Exercise Markup Language, which provides an easy-to-use, instructor-friendly approach for writing programming assignments. Unlike tool-oriented data interchange formats, PEML is designed as a human-friendly authoring format, developed to be intuitive and expressive without posing a technological or notational barrier to instructors. We describe the design and implementation of PEML, both as a programming library and as a public-access web microservice that provides full parsing and rendering capabilities for easy integration into any tools or scripting libraries. We also describe experiences using PEML to describe a full range of programming assignments, laboratory exercises, and small coding questions of varying complexity, demonstrating the practicality of the notation. The aim is to develop PEML as a community resource that reduces the barriers to entry for automated assignment tools while widening the scope of programming assignment sharing and reuse across courses and institutions.
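To convey the flavor of the "human-friendly authoring format" the abstract describes, the sketch below parses a minimal `key: value` exercise description in Python. This is not the official PEML grammar or parser; the field names, the continuation-line rule, and the example exercise are all hypothetical simplifications, shown only to illustrate the plain-text, low-notation authoring style PEML aims for.

```python
# Hypothetical sketch of a PEML-like plain-text exercise description.
# Real PEML syntax and keys may differ; see the PEML project for the
# actual specification.
EXAMPLE = """\
exercise_id: edu.example.cs1.palindromes
title: Palindrome Checker
license.id: cc-sa-4.0
instructions: Write a function that reports whether a string
  reads the same forwards and backwards.
"""

def parse_flat(text):
    """Parse top-level `key: value` pairs; an indented line continues
    the previous value (a deliberate simplification of real parsing)."""
    result = {}
    current_key = None
    for line in text.splitlines():
        if line[:1].isspace() and current_key is not None:
            # Continuation line: append to the previous value.
            result[current_key] += " " + line.strip()
        elif ":" in line:
            key, _, value = line.partition(":")
            current_key = key.strip()
            result[current_key] = value.strip()
    return result

fields = parse_flat(EXAMPLE)
print(fields["title"])       # Palindrome Checker
print(fields["license.id"])  # cc-sa-4.0
```

A format in this style stays readable in any text editor and needs no tool-specific schema knowledge up front, which is the usability property the paper argues tool-oriented interchange formats like XML or JSON lack.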



Published In

SIGCSE 2023: Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1
March 2023
1481 pages
ISBN:9781450394314
DOI:10.1145/3545945
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. automated grading
  2. interchange format
  3. markup language
  4. notation
  5. programming assignment
  6. web service

Funding Sources

  • National Science Foundation

Conference

SIGCSE 2023

Acceptance Rates

Overall acceptance rate: 1,595 of 4,542 submissions (35%)
