DOI: 10.1145/2538862.2538900

CrowdGrader: a tool for crowdsourcing the evaluation of homework assignments

Published: 05 March 2014

Abstract

CrowdGrader is a system that lets students submit and collaboratively review and grade homework. We describe the techniques and ideas used in CrowdGrader, and report on the experience of using it in disciplines ranging from Computer Science to Economics, Writing, and Technology. In CrowdGrader, students receive an overall crowd-grade that reflects both the quality of their homework and the quality of their work as reviewers. This creates an incentive for students to provide accurate grades and helpful reviews of other students' work. Instructors can use the crowd-grades as final grades, or fine-tune them as they see fit. Our results on seven classes show that students participate actively in the grading and write reviews that are generally helpful to the submissions' authors. The results also show that the grades computed by CrowdGrader are sufficiently precise to be used as the homework component of class grades. Students report that the main benefits of using CrowdGrader are the quality of the reviews they receive and the ability to learn from reviewing their peers' work. Instructors can leverage peer learning in their classes and easily handle homework evaluation even in large ones.
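To illustrate the idea of a crowd-grade that rewards both submission quality and reviewing accuracy, the toy computation below blends a consensus grade for a student's own submission with a bonus for how closely the student's grades of others match consensus. This is a minimal sketch under stated assumptions: the median-based consensus, the accuracy measure, the review_weight parameter, and the crowd_grades function are hypothetical simplifications for illustration, not the aggregation algorithm CrowdGrader actually implements.

    from statistics import median

    def crowd_grades(peer_grades, reviews, review_weight=0.25, max_grade=10.0):
        """Toy crowd-grade computation (illustrative only).

        peer_grades: {submission_id: [grade, ...]}  grades each submission received
        reviews:     {student_id: [(submission_id, grade), ...]}  grades each student gave
        Assumes, for simplicity, that each student authored the submission
        whose id equals their own id.
        """
        # Consensus grade per submission: here, simply the median of its peer grades.
        consensus = {sub: median(grades) for sub, grades in peer_grades.items()}

        result = {}
        for student, given in reviews.items():
            # Review accuracy: 1.0 when the student's grades match consensus exactly,
            # decreasing linearly with the average absolute deviation.
            avg_err = sum(abs(g - consensus[sub]) for sub, g in given) / len(given)
            accuracy = max(0.0, 1.0 - avg_err / max_grade)

            # Crowd-grade: blend of the student's own submission grade and a
            # review-accuracy bonus, weighted by review_weight.
            own = consensus.get(student, 0.0)
            result[student] = (1 - review_weight) * own + review_weight * accuracy * max_grade
        return result

    # Tiny example: three students, each reviewing the other two (0-10 scale).
    peer_grades = {"ana": [8, 9], "bob": [6, 7], "cho": [9, 10]}
    reviews = {
        "ana": [("bob", 6), ("cho", 10)],
        "bob": [("ana", 8), ("cho", 9)],
        "cho": [("ana", 9), ("bob", 7)],
    }
    print(crowd_grades(peer_grades, reviews))

Running the example prints one crowd-grade per student; the paper itself describes a more sophisticated aggregation procedure for combining peer grades and review quality.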



    Published In

    SIGCSE '14: Proceedings of the 45th ACM Technical Symposium on Computer Science Education
    March 2014
    800 pages
    ISBN: 9781450326056
    DOI: 10.1145/2538862
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. crowdsourcing
    2. grading
    3. peer evaluation

    Qualifiers

    • Research-article

    Conference

    SIGCSE '14

    Acceptance Rates

    SIGCSE '14 Paper Acceptance Rate: 108 of 274 submissions, 39%
    Overall Acceptance Rate: 1,595 of 4,542 submissions, 35%


