DOI: 10.1145/3131785.3131791

CritiqueKit: A Mixed-Initiative, Real-Time Interface For Improving Feedback

Published: 20 October 2017

Abstract

We present CritiqueKit, a mixed-initiative machine-learning system that helps students give better feedback to peers by reusing prior feedback, generalizing it so that it applies beyond its original context, and retraining the system in real time on what reviewers find useful. CritiqueKit exploits the fact that novices often make similar errors, which leads reviewers to reuse the same feedback across many different submissions. It draws on all prior feedback and classifies new feedback as the reviewer types it. CritiqueKit continually adds new comments to the feedback corpus, guiding reviewers to improve their feedback, and thus the entire corpus, over time.
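The loop the abstract describes (reuse prior comments, classify feedback as the reviewer types) can be illustrated with a minimal Python sketch. `FeedbackCorpus`, `suggest`, and `is_actionable` are hypothetical names and heuristics chosen for illustration; they are not CritiqueKit's actual model or API:

```python
from difflib import SequenceMatcher

class FeedbackCorpus:
    """Toy store of prior feedback comments (illustrative, not the paper's code)."""

    def __init__(self):
        self.comments = []

    def add(self, comment):
        """Add a finished comment to the reusable corpus."""
        self.comments.append(comment)

    def suggest(self, partial, threshold=0.4):
        """Return prior comments similar to what the reviewer is typing,
        best match first, so they can be reused instead of retyped."""
        scored = [(SequenceMatcher(None, partial.lower(), c.lower()).ratio(), c)
                  for c in self.comments]
        return [c for score, c in sorted(scored, key=lambda s: -s[0])
                if score >= threshold]

def is_actionable(comment):
    """Naive stand-in for the real-time classifier: flag comments that
    suggest a concrete change (substring cues, so 'padding' also matches 'add')."""
    cues = ("try", "consider", "add", "remove", "instead", "could")
    return any(cue in comment.lower() for cue in cues)
```

In CritiqueKit itself the classification is learned and updated in real time rather than keyword-based, and accepted or edited suggestions feed back into the corpus; this sketch only shows the shape of that interaction.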

Supplementary Material

suppl.mov (uistde112.mp4)
Supplemental video


    Published In

    UIST '17 Adjunct: Adjunct Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology
    October 2017
    217 pages
    ISBN:9781450354196
    DOI:10.1145/3131785
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. critique
    2. educational technology
    3. feedback
    4. real-time classification

    Qualifiers

    • Demonstration

    Conference

    UIST '17

    Acceptance Rates

    UIST '17 Adjunct paper acceptance rate: 73 of 324 submissions (23%)
    Overall acceptance rate: 355 of 1,733 submissions (20%)
