Research Article · Public Access
DOI: 10.1145/3411764.3445570

Who Is Included in Human Perceptions of AI?: Trust and Perceived Fairness around Healthcare AI and Cultural Mistrust

Published: 07 May 2021

Abstract

Emerging research suggests that people trust algorithmic decisions less than human decisions. However, different populations, particularly in marginalized communities, may have different levels of trust in human decision-makers. Do people who mistrust human decision-makers perceive human decisions to be more trustworthy and fairer than algorithmic decisions? Or do they trust algorithmic decisions as much as or more than human decisions? We examine the role of mistrust in human systems in people’s perceptions of algorithmic decisions. We focus on healthcare Artificial Intelligence (AI), group-based medical mistrust, and Black people in the United States. We conducted a between-subjects online experiment to examine people’s perceptions of skin cancer screening decisions made by an AI versus a human physician depending on their medical mistrust, and we conducted interviews to understand how to cultivate trust in healthcare AI. Our findings highlight that research around human experiences of AI should consider critical differences in social groups.


            Published In

            CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
            May 2021
            10862 pages
ISBN: 9781450380966
DOI: 10.1145/3411764


            Publisher

            Association for Computing Machinery

            New York, NY, United States



            Author Tags

            1. Black Perspectives
            2. Fairness
            3. Group-Based Medical Mistrust Scale (GBMMS)
            4. Healthcare AI
            5. Perceptions of Algorithmic Decisions
            6. Trust

            Qualifiers

            • Research-article
            • Research
            • Refereed limited


            Conference

            CHI '21

            Acceptance Rates

            Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

