DOI: 10.1145/1085313.1085348

Article

Usability over time

Published: 21 September 2005 Publication History

Abstract

Testing of usability could perhaps be more accurately described as testing of learnability. We know more about the problems of novice users than about the problems of experienced users. To understand how these problems differ, and how usability problems change as users progress from novice to experienced, we conducted a longitudinal study of usability among middle-school teachers creating Web sites. The study looked at both the use of documentation and the use of the underlying software, tracking the causes and extent of user frustration over eight weeks. We validated a categorization scheme for frustration episodes. We found that over the eight weeks the level of frustration dropped, the distribution of causes of frustration changed, and the users' responses to frustration episodes changed. These results suggest that the sorts of errors most prominently featured in conventional usability testing are likely of little consequence over longer periods of time.

Published In

SIGDOC '05: Proceedings of the 23rd Annual International Conference on Design of Communication: Documenting & Designing for Pervasive Information
September 2005
176 pages
ISBN: 1595931759
DOI: 10.1145/1085313
Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. training
  2. usability

Conference

SIGDOC05: ACM 23rd Annual International Conference on Documentation
September 21-23, 2005
Coventry, United Kingdom

Acceptance Rates

Overall Acceptance Rate 355 of 582 submissions, 61%
