DOI: 10.1145/2538862.2544269

ACCE: automatic coding composition evaluator (abstract only)

Published: 05 March 2014

Abstract

Coding style is important to teach beginning programmers so that bad habits do not become permanent. At the university level this is often done manually, because automated Python static analyzers cannot accurately grade against a given rubric. However, even manual analysis of coding style encounters problems: we have seen considerable inconsistency among our graders. We introduce ACCE (Automated Coding Composition Evaluator), a module that automates grading of program composition. Given certain constraints, ACCE assesses the composition of a program through static analysis, feature extraction, supervised learning, and clustering (unsupervised learning), automating the subjective process of grading on style and identifying common mistakes. Further, we create visual representations of the clusters to let readers and students see where a submission falls and what the overall trends are. We have applied this tool to CS61A, a CS1-level course at UC Berkeley that is experiencing rapid growth in student enrollment, in an attempt to expedite this labor-intensive process and reduce inconsistencies among human graders.
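The abstract does not specify which style features ACCE extracts or which clustering algorithm it uses. As a rough, hypothetical sketch of the general pipeline it describes (static analysis of Python submissions, feature extraction, then unsupervised clustering), the following assumes Python's built-in ast module and scikit-learn's KMeans; the features shown (function count, average function length, nesting depth) are illustrative stand-ins, not ACCE's actual rubric.

import ast
from sklearn.cluster import KMeans

# Hypothetical style features; the paper does not list ACCE's actual feature set.
def extract_features(source):
    tree = ast.parse(source)
    funcs = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    num_funcs = len(funcs)
    # Average function length in top-level statements, a crude proxy for decomposition.
    avg_len = (sum(len(f.body) for f in funcs) / num_funcs) if funcs else 0.0

    # Maximum nesting depth of control-flow statements.
    def depth(node, d=0):
        nested = d + 1 if isinstance(node, (ast.If, ast.For, ast.While)) else d
        return max([nested] + [depth(c, nested) for c in ast.iter_child_nodes(node)])

    return [num_funcs, avg_len, depth(tree)]

# Cluster submissions by their style features (the number of clusters here is arbitrary).
def cluster_submissions(sources, k=3):
    features = [extract_features(s) for s in sources]
    model = KMeans(n_clusters=k, n_init=10, random_state=0)
    return model.fit_predict(features)

In ACCE's setting, cluster assignments like these could then be mapped to rubric feedback and visualized (the authors use Gephi, per the paper's index terms) so that a reader or student can see which group of submissions a given program resembles.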

Published In

SIGCSE '14: Proceedings of the 45th ACM technical symposium on Computer science education
March 2014
800 pages
ISBN:9781450326056
DOI:10.1145/2538862
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. CS1
  2. assessment
  3. autograding
  4. clustering
  5. composition
  6. evaluation
  7. gephi
  8. grading
  9. style
  10. unsupervised learning
  11. visualization

Qualifiers

  • Abstract

Conference

SIGCSE '14

Acceptance Rates

SIGCSE '14 Paper Acceptance Rate: 108 of 274 submissions, 39%
Overall Acceptance Rate: 1,595 of 4,542 submissions, 35%
