DOI: 10.1145/1753326.1753677

API usability peer reviews: a method for evaluating the usability of application programming interfaces

Published: 10 April 2010

Abstract

We describe a usability inspection method to evaluate Application Programming Interfaces (APIs). We found the method useful as it identified usability defects in Microsoft's .NET Framework, of which 59% were new and 21% were fixed. Based on a comparison of usability defects identified between API usability peer reviews and API usability tests, API usability tests were found to expose design issues related to actually using an API whereas API usability peer reviews were found to expose the design rationale of an API. We reflect on the efficiency and productivity of each method: each API usability test is equivalent to 16 API usability peer reviews with the former having a 2.5x productivity advantage. We discuss how API usability peer reviews can be used in conjunction with API usability tests to increase usability coverage on APIs.



    Published In

    CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2010
    2690 pages
    ISBN:9781605589299
    DOI:10.1145/1753326

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. API usability
    2. software bugs
    3. usability breakdowns
    4. usability evaluation method (UEM)
    5. usability inspection

    Qualifiers

    • Research-article

    Conference

    CHI '10

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

