DOI: 10.1145/3424953.3426633

Accessibility of mobile applications: evaluation by users with visual impairment and by automated tools

Published: 23 December 2020

Abstract

Providing accessible mobile applications to people with visual disabilities demands appropriate evaluation techniques and tools to identify problems during the design of such systems. Automated accessibility evaluation tools are important to support evaluation tasks and to make evaluators more productive when performing repetitive analyses. However, automated tools alone cannot find all of the problems that users encounter in accessibility evaluations of mobile applications. Despite previous investigations of how well automated tools cover accessibility problems on websites, little is known about the relationship between the problems those tools report and the problems faced by users with visual impairments in mobile applications. This paper presents a study comparing the issues reported by the automated tools MATE (Mobile Accessibility Testing) and Accessibility Scanner with a set of 415 instances of accessibility problems identified in a previous user study involving six blind and five partially sighted users on four mobile applications. The results showed that 36 types of problems were encountered only by users, three types were encountered both by users and by the tools, and 11 types were encountered only by the automated tools. The results show which kinds of relevant problems automated tools can identify, supporting their early detection. The study also contributes to determining the types of problems that are only encountered in evaluations with users, reinforcing the importance of involving users in accessibility evaluation and characterizing the problems in mobile applications that can go unnoticed if automated tools are used alone.
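
As an illustration of the kind of machine-detectable issue such tools report (for example, an image button without a content description that a screen reader cannot announce), the sketch below enables the Accessibility Test Framework checks inside an Android Espresso UI test. This is only a minimal sketch under an assumed project setup; it is not the MATE or Accessibility Scanner tooling evaluated in the paper, and MainActivity and R.id.submit_button are hypothetical names used purely for illustration.

    // Kotlin instrumented-test sketch (hypothetical app classes; not the paper's tooling).
    import androidx.test.espresso.Espresso.onView
    import androidx.test.espresso.accessibility.AccessibilityChecks
    import androidx.test.espresso.action.ViewActions.click
    import androidx.test.espresso.matcher.ViewMatchers.withId
    import androidx.test.ext.junit.rules.ActivityScenarioRule
    import androidx.test.ext.junit.runners.AndroidJUnit4
    import org.junit.BeforeClass
    import org.junit.Rule
    import org.junit.Test
    import org.junit.runner.RunWith

    @RunWith(AndroidJUnit4::class)
    class SubmitButtonAccessibilitySketch {

        // Launches a hypothetical MainActivity before each test.
        @get:Rule
        val activityRule = ActivityScenarioRule(MainActivity::class.java)

        companion object {
            @BeforeClass
            @JvmStatic
            fun enableAccessibilityChecks() {
                // Validate every view hierarchy touched by an Espresso action and report
                // issues such as missing content descriptions, low text contrast, and
                // undersized touch targets -- the kind of problem automated tools can flag.
                AccessibilityChecks.enable().setRunChecksFromRootView(true)
            }
        }

        @Test
        fun clickingSubmitRunsAccessibilityChecks() {
            // The click itself is incidental; the enabled checks run against the view
            // hierarchy and fail the test when a detectable accessibility issue is found.
            onView(withId(R.id.submit_button)).perform(click())
        }
    }

Checks of this kind catch structural issues automatically, but, as the study's results indicate, problems such as confusing navigation flows or unclear labels still require evaluation with users.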


      Published In

      IHC '20: Proceedings of the 19th Brazilian Symposium on Human Factors in Computing Systems
      October 2020
      519 pages
      ISBN:9781450381727
      DOI:10.1145/3424953

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. automated tests
      2. mobile accessibility
      3. user evaluation

      Qualifiers

      • Research-article

      Funding Sources

      • Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
      • Fundação de Amparo à Pesquisa do Estado de São Paulo

      Conference

      IHC '20

      Acceptance Rates

      IHC '20 paper acceptance rate: 60 of 155 submissions (39%)
      Overall acceptance rate: 331 of 973 submissions (34%)

      Article Metrics

      • Downloads (last 12 months): 184
      • Downloads (last 6 weeks): 15
      Reflects downloads up to 28 Dec 2024

      Cited By

      • (2024) Help-Seeking Situations Related to Visual Interactions on Mobile Platforms and Recommended Designs for Blind and Visually Impaired Users. Journal of Imaging 10(8), 205. DOI: 10.3390/jimaging10080205. Online publication date: 22-Aug-2024.
      • (2024) Investigating Accessibility at the Brazilian Symposium on Human Factors in Computing Systems (IHC). Proceedings of the XXIII Brazilian Symposium on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3702038.3702098. Online publication date: 7-Oct-2024.
      • (2024) Color Contrast Compliance. Proceedings of the XXIII Brazilian Symposium on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3702038.3702065. Online publication date: 7-Oct-2024.
      • (2024) Nothing About Us Without Us: Reflections on the Protagonism of a Person with Low Vision in Human-Computer Interaction. Proceedings of the XXIII Brazilian Symposium on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3702038.3702048. Online publication date: 7-Oct-2024.
      • (2024) Accessibility Inspections Using the Web Content Accessibility Guidelines by Novice Evaluators: an Experience Report. Proceedings of the XXIII Brazilian Symposium on Human Factors in Computing Systems, 1-10. DOI: 10.1145/3702038.3702040. Online publication date: 7-Oct-2024.
      • (2024) A Universal Web Accessibility Feedback Form: A Participatory Design Study. Proceedings of the 21st International Web for All Conference, 106-117. DOI: 10.1145/3677846.3677853. Online publication date: 13-May-2024.
      • (2024) Towards Automated Accessibility Report Generation for Mobile Apps. ACM Transactions on Computer-Human Interaction 31(4), 1-44. DOI: 10.1145/3674967. Online publication date: 19-Sep-2024.
      • (2024) Assessing Accessibility Levels in Mobile Applications Developed from Figma Templates. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments, 316-321. DOI: 10.1145/3652037.3652075. Online publication date: 26-Jun-2024.
      • (2024) Exploring Mobile Device Accessibility: Challenges, Insights, and Recommendations for Evaluation Methodologies. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642526. Online publication date: 11-May-2024.
      • (2024) MotorEase: Automated Detection of Motor Impairment Accessibility Issues in Mobile App UIs. Proceedings of the IEEE/ACM 46th International Conference on Software Engineering, 1-13. DOI: 10.1145/3597503.3639167. Online publication date: 20-May-2024.