DOI: 10.1145/3059454.3078701

Enhancing the Usage of Crowd Feedback for Iterative Design

Published: 22 June 2017

Abstract

Online crowd platforms (e.g., social networks, online communities, task markets) enable designers to gain insights from large audiences quickly and affordably. However, designers have little guidance on how to allocate their social capital, time, and financial resources to acquire feedback that meets their needs. Moreover, feedback received online can be ambiguous and contradictory, making it difficult to interpret and act on. These limitations hinder the utility of crowd feedback and make designers hesitant to act on the feedback they receive. The goal of my dissertation is to 1) formulate a framework that suggests which crowd genres to solicit feedback from according to individual needs, 2) develop lightweight activities that promote deeper interpretation of large volumes of feedback, and 3) design and deploy an experimental platform that collects long-term user data and reduces the burden of conducting online studies of design feedback.



    Published In

    C&C '17: Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition
    June 2017
    584 pages
    ISBN:9781450344036
    DOI:10.1145/3059454
    General Chairs: David A. Shamma and Jude Yew
    Program Chair: Brian Bailey
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. creativity
    2. crowdsourcing
    3. feedback
    4. iterative design
    5. reflection

    Qualifiers

    • Abstract

    Conference

    C&C '17
    Sponsor:
    C&C '17: Creativity and Cognition
    June 27 - 30, 2017
    Singapore, Singapore

    Acceptance Rates

    C&C '17 Paper Acceptance Rate: 27 of 94 submissions, 29%
    Overall Acceptance Rate: 108 of 371 submissions, 29%
