For years, racism mandated that Black people and other people of color in the United States use back doors to enter restaurants, movie theaters, and other public places. While these practices have ended, digital back doors may once again make them and others second-class citizens when it comes to health.
Digital back doors are technological processes and tools used in health care, such as racially biased algorithms, infrastructural limitations, and dirty data. These unwittingly exacerbate existing health inequities, which the World Health Organization defines as “systematic differences in the health status of different population groups.”
How are digital back doors created?
Their root cause is human: some health information technology (health IT) developers and clinicians build and apply technology without fully or explicitly considering equity in health care.
Almost everyone today must navigate a wide set of interactions with health information and health care that are mediated through computers, mobile applications, wearable devices, telehealth and telemedicine — collectively known as digital health. Companies using technology to deliver health services and products aim to help people on these journeys by designing digital front doors.
Here’s how Mutaz Shegewi, research director of InterSystems, a provider of data solutions for health care systems and other organizations, describes these portals: “The digital front door gravitates health care toward a more consumer-friendly, patient-centric paradigm,” and is “powered by digital touchpoints that drive better access, engagement and experiences across the service continuum.”
Digital front doors are extending health care beyond brick-and-mortar buildings by using technology people have already incorporated into their lives.
What worries me about the digital front door concept is that it takes a health care consumerism approach in designing experiences for patients. Rather than viewing health care as a right, the digital front door approaches it more as a commodity.
Despite the seemingly democratizing appeal of digital front doors in health care, many people of color interact with health through digital back doors. Like the racist physical back door etiquette that existed for much of the 20th century, the digital back door creates an inequitable path to health care.
In the tech industry, the term “back door” generally refers to alternative, often covert and nefarious, access to computer systems that circumvents security mechanisms. In health care, the digital back door likewise circumvents the improved health outcomes and wellness that health technology often promises.
Through my work as director of COVID Black, an organization that uses data and technology to advocate for health equity, I have identified three key components of the digital back door in health care that lead communities of color down a path of health inequity: internet access, artificial intelligence, and electronic health record interoperability.
Telehealth and internet access
The necessary and swift transition to telehealth during the pandemic, which replaced most in-person medical visits, made clear to health care providers what activists and social reformers have long known: not all Americans have broadband access to the internet, or any access at all. This inequity is due, in part, to digital redlining, which the National Digital Inclusion Alliance defines as “discrimination by internet service providers in the deployment, maintenance, or upgrade of infrastructure or delivery of services.” Poor internet access is also due to a persistent digital divide, the gap between people in the United States who have ready access to computers and the internet and those who do not. This divide stems partly from the expense of broadband and partly from gaps in digital proficiency, that is, individuals’ ability to use and engage effectively with digital technology.
The Pew Research Center and others have reported that people of color tend to have less access than white people to broadband service, a home computer, and internet-enabled devices, which limits their access to telehealth. A survey conducted by the Office of the Assistant Secretary for Planning and Evaluation, which is part of the Department of Health and Human Services, indicates that while people of color were more likely to use telehealth than white people, they were less likely to use video-enabled telehealth services than audio appointments during the pandemic.
Video-enabled appointments offer opportunities for a partial physical exam, an assessment of nonverbal communication, and occasions for clinicians to evaluate a patient’s home environment for safety. Researchers suspect that disparity in broadband access, along with other factors such as digital proficiency and access to a mobile device or computers with a camera, may manifest in low rates of video-enabled health visits among communities of color and function as a digital back door into reduced health care services.
Artificial intelligence

Artificial intelligence is becoming increasingly central to health care: it facilitates diagnosis and treatment recommendations; improves the organization, storage, and communication of health information; and supports patient engagement and monitoring through machine learning that predicts no-shows and cancellations for medical appointments and sends reminders to patients to take essential medications.
However, algorithmic bias in AI, the harmful skewing of predictions produced by the well-defined instructions that perform computations on health data, also ushers communities of color through a digital back door to health care.
Health IT developers build and train algorithms on datasets to predict and solve health care problems. If these data lack diversity or are biased or otherwise flawed, the algorithm can misdiagnose patients or favor white patients over Black patients for extra medical care.
Because much of health IT is developed in a black box, in which the intricacies of its inner workings are opaque, it is often difficult to pinpoint the exact source, beyond problems with training data, of algorithmic oppression, or discrimination by computer code. What is known is that racial bias in AI is also a problem of design. For example, a widely cited study showed that an algorithm designed to assign risk scores based on the total health care costs an individual accrued in a year made Black patients, who were sicker than white patients, less likely to be identified for personalized care. In this case, health IT developers lacked an understanding of how structural racism creates a system in which Black patients may pay less overall for health care even as they experience poorer health outcomes. Poor data and faulty AI design function as a digital back door that compromises the health of patients of color.
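The cost-as-proxy flaw described above can be made concrete with a minimal, invented simulation. Everything here is an assumption for illustration: two groups with identical underlying illness, where one group accrues lower costs because of unequal access to care. A "risk score" trained on cost then under-selects that group for extra care, as the studied algorithm effectively did.

```python
import random

random.seed(0)

def simulate_patient(group):
    """Return (true need, annual cost) for one hypothetical patient."""
    need = random.uniform(0, 1)            # true severity of illness
    # Assumed structural effect: group B accrues ~40% less cost for the
    # same level of need (less access, fewer billed services).
    access = 1.0 if group == "A" else 0.6
    cost = need * access * 10_000          # cost is the proxy label
    return need, cost

patients = [("A", *simulate_patient("A")) for _ in range(1000)] + \
           [("B", *simulate_patient("B")) for _ in range(1000)]

# A risk score trained on cost amounts to ranking patients by cost;
# the top 10% are flagged for personalized care.
cutoff = sorted(p[2] for p in patients)[int(0.9 * len(patients))]
selected = [p for p in patients if p[2] >= cutoff]

share_b = sum(1 for p in selected if p[0] == "B") / len(selected)
print(f"Group B share of high-risk selections: {share_b:.0%}")
```

Even though both groups have identical true need, group B is nearly absent from the selections, because the proxy label encodes unequal spending rather than unequal sickness.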
Electronic health record interoperability
Technology has transformed health records, providing clinicians with more methods for documenting encounters with their patients. Electronic health records have proven to be efficient containers of health information, but they also expose racial bias in some clinicians’ perceptions of their patients.
University of Chicago researchers used machine learning tools to demonstrate that Black patients are more than twice as likely as white patients to have at least one negative descriptor in their EHR. Although negative descriptors are not automatically racially stigmatizing, they can have an adverse impact by following Black patients into other health care settings and influencing the care they receive from other clinicians.
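A simplified sketch of how such descriptors can be surfaced in notes: the function below scans free text against a word list. The descriptor list and sample note are invented for illustration; the actual study applied natural-language-processing methods to real EHR text.

```python
# Hypothetical list of stigmatizing terms; not the study's actual lexicon.
NEGATIVE_DESCRIPTORS = {"noncompliant", "agitated", "combative",
                        "exaggerate", "refused", "defensive"}

def flag_negative_descriptors(note: str) -> set:
    """Return the negative descriptors found in a clinical note."""
    words = {w.strip(".,;:").lower() for w in note.split()}
    return NEGATIVE_DESCRIPTORS & words

note = "Patient was noncompliant with medication and refused follow-up."
print(flag_negative_descriptors(note))
```

Because EHR interoperability copies these notes into other systems, a descriptor flagged here would travel with the patient to every subsequent clinician.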
The electronic exchange of health information, called EHR interoperability, facilitates sharing patient information between different EHR systems and health care providers. In most cases, EHR interoperability improves the ease with which clinicians deliver health care. However, clinicians’ bias, as reflected in a disproportionate number of negative descriptors about Black patients, can turn EHR interoperability into a digital back door that exacerbates existing health inequities and creates new ones.
If health tech is truly committed to accessible digital front doors to health care, it must first acknowledge the digital back doors evident in digital redlining, algorithmic bias, stigmatizing language in EHRs, and other forms of racism that are enabled through how health IT developers and clinicians use and apply digital technologies in health care. By jeopardizing the health of patients of color, digital back doors are a feature of health inequity.
To make good on its promise of innovative, end-to-end digital health platforms, the health care industry must start by closing digital back doors and creating equitable digital front doors to truly transform the state of the nation’s health.
Kim Gallon is the director of COVID Black, and a Just Tech Fellow with the Social Science Research Council.