Yves here. This post uses Black patients as a case study in how doctors' biased reactions, once written into medical notes, can harm later care, particularly when the doctor depicts the patient as non-compliant.

This is hardly the only example of physician prejudice with the potential to lead to patient concerns being ignored and care otherwise undermined. Bias against women is well documented: women are often seen as exaggerating the severity of their symptoms, which can lead to conditions being missed at early stages, when they are easier to treat. IM Doc has reported that his former students who wind up working in the Deep South have told him of a widespread dismissiveness towards female patients.

By Sara Novak, a science journalist based in South Carolina. She’s a contributing writer to Discover Magazine. Her work also appears in Scientific American, Popular Science, Sierra Magazine, Astronomy Magazine, and many more. Originally published at Undark

In the mid-1990s, when Somnath Saha was a medical resident at the University of California, San Francisco School of Medicine, he came across a cluster of studies showing that Black people with cardiovascular disease were treated less aggressively than White people. The findings were “appalling” to the young physician, who describes himself as a “Brown kid from suburban St. Louis, Missouri.”

Saha had experienced racism growing up, but was surprised to see such clear signs of inequity within the field of medicine. “There was an injustice happening in my own backyard,” he said.

Indeed, bias against Black patients can be difficult to pin down because many doctors either don’t recognize their biases or won’t admit to them. Saha, now a professor of medicine at Johns Hopkins University, likens implicit bias — unconscious judgments that can affect behavior — to “an invisible force.”

While numerous studies have found evidence of racial discrimination in medicine through patient reports, less is known about how implicit bias shows up in medical records, and how stigmatizing language in patient notes can affect the care that Black patients receive.

That’s part of the reason why, about seven years ago, Saha began poring through medical records. For him, they offered a window into doctors’ feelings about their patients.

As part of his latest research, Saha’s team examined the records of nearly 19,000 patients, paying particular attention to negative descriptions that may influence a clinician’s decision-making. The data, which was recently presented at the 2023 American Association for the Advancement of Science annual meeting, isn’t yet published, but it suggests what researchers have long speculated: Doctors are more likely to use negative language when describing a Black patient than they are a White patient. The notes provide, at times, a surprisingly candid view of how patients are perceived by doctors, and how their race may affect treatment.

The study adds to a concerning body of literature that explores how racial bias manifests in health care. Researchers like Saha are interested in how such prejudice leaves a paper trail, which can then reinforce negative stereotypes. Because medical notes get passed between physicians, Saha’s research suggests they can affect the health of Black patients down the line.

“The medical record is like a rap sheet, it stays with you,” Saha said, adding that “these things that we say about patients get eternalized.”


Research has long shown that Black patients experience worse health outcomes compared to White patients, in part due to biased medical care. Black women, for example, are three times more likely to die from pregnancy-related complications compared to White women. And Black patients often report feeling like physicians don’t listen to their needs or don’t believe their concerns.

Studies appear to back that up. Last year, researchers at the University of Washington found that non-Hispanic White children who went to the emergency room for migraines were more likely to receive pain medications compared to children of color — even though the two groups reported similar pain scores. Other studies echo similar results for adults as well.

Michael Sun, a resident physician at the University of Chicago, knew about such health disparities, but by his own admission he was naive about the biases in medical records. At the time, Sun had “no experience in the medical record, in documentation, or in physician language and culture,” he said.

But in his first year of medical school, his professor shared the story of a longtime patient whom she had referred to an outside specialist. In Sun’s recollection, the professor regarded her patient in kind terms, having worked with her for some time to treat a chronic illness. But when she got the specialist’s notes back, she was confused by the description of her patient, who was characterized with terms like “really difficult,” “non-compliant,” and “uninterested in their health.” This was not the patient she remembered.

“This, as a first-year medical student, really shocked me because I had taken at face value that any words used in notes were true, were valid, or rightfully used,” said Sun. “I realized all the ways that bias, untold stories, and unknown context may change the way that we view our patients.”

Like Saha, Sun became interested in how bias influenced the relationship between doctor and patient, and how these interactions were memorialized in the medical record. In a study published last year, he and his colleagues looked at more than 40,000 medical notes from 18,459 patients. Researchers first manually combed through the notes, then used this information to teach a machine learning algorithm to interpret the connotations of words. Compared to White patients, Black patients were about 2.5 times more likely to be described negatively, with terms like “challenging,” “angry,” and “noncompliant.”
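The article doesn’t reproduce the study’s code, but the general shape of the approach it describes (hand-labeled note excerpts used to train a text classifier, which then scores unseen notes) can be sketched in a few lines. The example sentences, labels, features, and model below are illustrative assumptions, not the study’s actual pipeline.

```python
# Minimal sketch of the general approach described above, not the study's code.
# The sentences, labels, features, and model choice are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Manually reviewed note excerpts, labeled 1 (negative/stigmatizing) or 0 (neutral)
train_sentences = [
    "patient was angry and noncompliant with medication",
    "patient reports chest pain, tolerating medication well",
    "patient challenging, refused to answer questions",
    "patient agreeable to follow-up in two weeks",
]
train_labels = [1, 0, 1, 0]

# Word and bigram frequencies feed a simple linear classifier
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_sentences, train_labels)

# The trained model can then flag language in notes no one has hand-reviewed
new_sentences = ["patient described as difficult and uninterested in their health"]
print(model.predict(new_sentences))        # predicted label for each excerpt
print(model.predict_proba(new_sentences))  # class probabilities
```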

Saha has used similar methodology — and found similar results — in his own research. For the study presented at the AAAS meeting, his team first read through more than 100,000 medical notes to identify language they considered disparaging, drawing on a list of words and phrases from prior research. They then used machine learning to find those terms in medical notes, taking care to ensure context was considered. For example, if the word “aggressive” was used to describe a treatment plan, it was excluded from their analysis. But if “aggressive” was used to describe the patient, it was included.
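That context check can be approximated with simple rules. The sketch below is a hypothetical illustration of the idea, not the team’s actual tool: a flagged word counts only when it does not appear to modify a clinical noun such as “treatment.”

```python
# Hypothetical sketch of context-sensitive term matching; the word list and
# context rules are illustrative assumptions, not the study's actual tool.
import re

STIGMATIZING_TERMS = ["aggressive", "belligerent", "combative", "noncompliant"]
CLINICAL_CONTEXTS = ["treatment", "therapy", "management", "plan"]

def flag_stigmatizing_terms(note: str) -> list[str]:
    """Return flagged terms, skipping those that describe a clinical noun."""
    flagged = []
    for term in STIGMATIZING_TERMS:
        for match in re.finditer(rf"\b{term}\b", note, flags=re.IGNORECASE):
            # Look at the next couple of words: "aggressive treatment"
            # describes the plan, not the patient, so it is excluded.
            following = note[match.end():].split()[:2]
            if any(w.lower().strip(".,") in CLINICAL_CONTEXTS for w in following):
                continue
            flagged.append(term)
    return flagged

print(flag_stigmatizing_terms("Recommended aggressive treatment for the tumor."))
# -> []
print(flag_stigmatizing_terms("Patient was aggressive and belligerent with staff."))
# -> ['aggressive', 'belligerent']
```

A real system would need fuller linguistic context than a two-word lookahead, but the exclusion logic is the same in spirit.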

Saha pointed to three categories of stigmatizing language that were the most pronounced: expressing doubt or disbelief in what the patient said, such as reporting they “claimed” to experience pain; insinuating that the patient was confrontational, using words like “belligerent” or “combative”; and suggesting a patient was not cooperating with a doctor’s orders by saying they “refused” medical advice.

“We’ve known for some time that in health care we sometimes use language that can be confusing or even insulting,” Matthew Wynia, director of the Center for Bioethics and Humanities at the University of Colorado, wrote in an email to Undark. But he noted that research such as Saha’s has drawn attention to a previously overlooked issue. Describing a patient as “non-compliant” with medications, he said, “makes it sound like the patient is intentionally refusing to follow advice when, in fact, there are many reasons why people might not be able to follow our advice and intentional refusal isn’t even a very common one.”

Saha noted that if a patient isn’t taking their medication, it’s important for doctors to document that, so the next physician doesn’t overprescribe. But the concern, he said, is whether doctors are using these terms appropriately and for the right reasons, given the implications they carry for patients.

If a doctor portrays their patient negatively, Saha said, it can “trigger the next clinician to read them and formulate a potentially negative opinion about that patient” before they’ve even had a chance to interact.

Still, stigmatizing language is only one small piece of the puzzle. What also matters, Saha said, is how those words can affect care. In prior work, Saha has shown how implicit and, in some cases, explicit bias affects the treatment recommendations a patient receives.

In a 2018 study, Saha, along with his wife, Mary Catherine Beach — also a professor at Johns Hopkins University — combed through reports of patients with sickle cell anemia. Their team focused on that particular population since sickle cell patients are some of the most stigmatized in the health care system: Most patients are Black and many require regular doses of opioids for pain management.

In the notes, they found numerous examples of details that were irrelevant to patients’ health concerns: phrases like “girlfriend requests bus token,” “cursing at nurse,” “girlfriend on bed with shoes on,” and “narcotic dependent.”

Saha and Beach wanted to see how these remarks might influence a physician’s treatment recommendations, so they drew on vignettes they had found in the medical records of sickle cell patients. They showed medical students and residents either a vignette that described a patient negatively or one edited to use neutral language, then asked what dose of pain medication they would hypothetically recommend. Beach said the purpose was to see how what she called “dog whistles about social class or race or something that would make the person seem less educated” would impact treatment recommendations.

The study found that medical notes with stigmatizing language were associated with “less aggressive management of the patient’s pain.” Doctors who read the chart notes containing stigmatizing language recommended lower doses of pain medication, even in cases where the patient’s reported pain was a 10 out of 10.

“The fact that we were able to show that this bias transmits to the next doctor has been the thing that I think motivates doctors to take it seriously,” said Beach.


Pain management has become a focal point for researchers because many of the most glaring racial tropes about patient care have revolved around pain. In 2016, a study conducted at the University of Virginia found that half of the 418 medical students and residents surveyed endorsed false beliefs about Black patients, such as that “Blacks’ nerve endings are less sensitive than whites” and “Blacks’ skin is thicker than whites.” What’s more, those who endorsed these false beliefs also rated Black patients’ pain as lower than White patients’.

Antoinette M. Schoenthaler, a professor of population health and medicine at New York University’s Grossman School of Medicine and associate director of research at the school’s Institute for Excellence in Health Equity, said that disparities in pain management are pervasive across the medical profession, seeping into treatment not only for sickle cell anemia but also prenatal care. As a result, she said, Black patients across the board are often fearful of attending appointments.

“Patients of color go into an appointment with feelings of heightened anxiety because they’re expecting mistreatment,” said Schoenthaler. “We’ve seen minoritized patients have higher blood pressure in the context of a clinical visit because of these expectations of anxiety and fear, and disappointment.”

Disparities in health care between Black and White patients are a complex problem, one that can’t be solved by addressing medical records alone. But for researchers like Saha, Beach, and Sun, the records can offer a roadmap to where differences in care begin. The words a clinician uses set the path for how a patient may be treated in the future.

One way to combat implicit bias, Saha suggested, is to use an algorithm that identifies stigmatizing language to “give hospital departments or clinicians report cards on how much of this language that they’re using.” By benchmarking clinicians against the average, each would know whether they’re using stigmatizing language at an above-average rate. This is something he is considering for future research.
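The report-card idea reduces to a simple aggregation: count how often each clinician’s notes are flagged by such an algorithm, then compare each rate to the overall average. The sketch below is a hypothetical illustration with made-up data, not anything Saha’s team has built.

```python
# Hypothetical sketch of the "report card" idea: per-clinician rate of flagged
# notes compared to the overall average. The data here is made up.
from collections import defaultdict

# (clinician_id, note_was_flagged) pairs, e.g. produced by a language classifier
flagged_notes = [
    ("dr_a", True), ("dr_a", False), ("dr_a", True),
    ("dr_b", False), ("dr_b", False), ("dr_b", True),
]

counts = defaultdict(lambda: [0, 0])  # clinician -> [flagged, total]
for clinician, flagged in flagged_notes:
    counts[clinician][0] += int(flagged)
    counts[clinician][1] += 1

overall_rate = sum(f for f, _ in counts.values()) / sum(t for _, t in counts.values())

for clinician, (flagged, total) in counts.items():
    rate = flagged / total
    status = "above average" if rate > overall_rate else "at or below average"
    print(f"{clinician}: {rate:.0%} of notes flagged ({status}; overall {overall_rate:.0%})")
```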

When clinicians are made aware of their biases — when the unconscious becomes conscious — Saha told Undark that he’s optimistic they’ll work to change them: “We’re using language that we’ve used forever without realizing the potential impact that it has on patient care.”

