Article
DOI: 10.1145/1142405.1142439

What do usability evaluators do in practice?: an explorative study of think-aloud testing

Published: 26 June 2006

Abstract

Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. We report an explorative study of 14 think-aloud sessions, the audio recordings of which were examined in detail. The study shows that immediate analysis of observations made in the think-aloud sessions is done only sporadically, if at all. When testing, evaluators seem to seek confirmation of problems that they are already aware of. During testing, evaluators often ask users about their expectations and about hypothetical situations, rather than about experienced problems. In addition, evaluators learn much about the usability of the tested system but little about its utility. The study shows how practical realities rarely discussed in the literature on usability evaluation influence sessions. We discuss implications for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.

Published In

DIS '06: Proceedings of the 6th Conference on Designing Interactive Systems
June 2006, 384 pages
ISBN: 1595933670
DOI: 10.1145/1142405

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. industrial software development
  2. think aloud testing
  3. usability evaluation
  4. user-centered design

Conference

DIS06: Designing Interactive Systems 2006
June 26-28, 2006
University Park, PA, USA

Acceptance Rates

Overall Acceptance Rate 1,158 of 4,684 submissions, 25%
