Is ChatGPT Health the new WebMD?

For a few days around Christmas, Holly Jespersen, a 50-year-old living in New York, did not feel great. It felt like a cold was coming on, but she wasn’t sure if she should go to the doctor or not.

“When you go to urgent care, you pay a $75 copay, and they’re like, ‘it’s viral and there’s nothing we can do for you,’” Jespersen said. So she turned to ChatGPT, the AI-powered chatbot from OpenAI, and asked: “Should I go to the doctor or not?” The answer was no.

A few days later, she took a three-hour nap, an unusual occurrence for someone who described herself as “not a napper.” Sinus pressure, headaches, fatigue and a high fever followed. She finally went to urgent care, having used ChatGPT to help her decide when it was time to see a doctor. At the medical facility, she was tested for strep, COVID-19 and influenza. The tests revealed she had influenza A, a type of flu virus.

Jespersen is far from the only one turning to ChatGPT to make health-related decisions. According to OpenAI, more than 40 million people ask ChatGPT a health-related question every day, and some 230 million do so each week. From a business perspective, then, it came as little surprise to many when the company announced on Jan. 7 that it would be rolling out ChatGPT Health over the coming weeks.

In the announcement, the company stated that consumers will be able to upload their medical records and data from wellness apps to make the service more useful and customized. ChatGPT Health is meant “to support, not replace,” medical care, and it is not intended for “diagnosis or treatment,” the company said. Instead, it will help users navigate “everyday questions” and “understand patterns over time” so that people feel prepared and informed.

This may sound familiar to those who recall the rollout of the health care information website WebMD, which also promised to empower and inform people about their bodies. It quickly became the most popular source of health information in the United States and paved the way for a new digital era, giving people a chance to self-diagnose or check their symptoms online. But it also helped introduce terms like “cyberchondria,” in which a person engages in excessive online searching for health information, leading to increased anxiety, unfounded distress and misinterpretation of symptoms.

As ChatGPT Health enters the chat, so to speak, many are wondering whether it will be a helpful tool in America’s broken health care system or whether it will exacerbate long-standing problems with how the digital age has changed access to health information.

“I have often said that ChatGPT is similar to the WebMD Symptom Checker,” family physician Dr. Alexa Mieses Malchuk told Salon. “While ChatGPT is more interactive, neither resource is without pitfalls.”

A study recently published in NPJ Digital Medicine found that the technology behind such chatbots, known as large language models, is built to prioritize being helpful over being accurate when it comes to medical information. Indeed, one major concern about ChatGPT Health is that it won’t provide accurate health information.

“It’s great that anyone can access information, but on the flipside, if you don’t have medical training, you don’t know how to sift through that information and figure out what’s important,” Malchuk said. “ChatGPT Health will integrate with certain personal health data and make recommendations based on codes and algorithms.”

However, Malchuk emphasized, it cannot replace “the experience of a medical professional” who is better equipped to understand “the nuance” of a person’s situation.

The stakes are lower when a person is trying to figure out if they have a cold or the flu. But when it comes to more serious health issues, like cancer, the consequences can be big. A 2023 study found that when ChatGPT was asked to generate treatment plans for various cancers, its plans contained numerous errors, including some “difficult even for experts to detect.”

More recently, the American Society of Clinical Oncology noted that newer research found ChatGPT’s answers to questions about the symptoms, prevention and screening of colon cancer specifically to be “highly accurate and reliable” when judged by a review panel of expert oncologists. This suggests that LLMs could be useful for answering patients’ questions, but not for creating diagnoses or treatment plans.

Malchuk emphasized to Salon that prompts — the commands or requests sent to chatbots — matter, too.

“ChatGPT is also limited based on the prompts that you provide,” Malchuk said. “Again, without any medical knowledge, you might not be asking the right questions of ChatGPT.”

In 2013, researcher Thomas A. Fergus published a study in the journal Cyberpsychology, Behavior and Social Networking that turned attention to the impact of limitless online health information on anxiety. Specifically, Fergus suspected that searching for symptoms online could exacerbate health anxiety. The results of his 512-person study confirmed this: people with higher levels of “intolerance of uncertainty” were “especially likely” to experience health anxiety as a result of Internet searches.

Elizabeth Sadock, a clinical psychologist in Virginia, said that anxiety preys on uncertainty. But part of the process in overcoming anxiety, especially health anxiety, is sitting with the uncertainty. That can be difficult when people rely on a machine for constant reassurance.

“I think it is set up in a way that makes it very hard to resist, always available at any time of day, affirming, never tiring of questions,” Sadock told Salon. “And reassurance seeking is a way to feed that anxiety and keep us on the hook, always looking for more reassurance.”

ChatGPT, Sadock said, can “keep feeding that reassurance-seeking behavior.”

Some of her patients with health anxiety would use WebMD “excessively,” she said, which only fed their anxiety.

“And so, their homework would be to limit their use, as to not reinforce their anxiety,” she said. “And now, limiting ChatGPT can be part of the treatment plan as well, again, assuming it is interfering with their life and functioning.”

Then there are concerns over security and privacy. Bradley Malin, the Accenture Professor of Biomedical Informatics at Vanderbilt University, told Salon he isn’t so much concerned about how secure the data is, since he believes OpenAI has made a good-faith effort to secure data thus far, as about what the consequences would be, and for whom, in the event of a breach.

“I wouldn’t say it’s even a security issue, it’s the fact that it’s not regulated in the same way that a health care system would be regulated with respect to a medical record system,” Malin said. “It’s unclear to me how the protections that OpenAI is putting in place relate to what HIPAA actually requires.”

Malin said it’s positive that OpenAI won’t use medical records to train its LLMs, and that it’s separating health information from other information. But for medical record systems, HIPAA requires that specific security measures be applied.

“It does make you question if a patient understands that their rights might actually change when they’re shifting from looking at their medical record in a secured environment and then you allow that data to flow to another application that’s outside the control of that organization,” Malin said. “As well as outside of the protections that HIPAA affords the patient.”

Dr. Neal Kumar, a board-certified dermatologist, said ChatGPT Health is not about replacing doctors but about giving people “another layer of support.” From a dermatologist’s perspective, ChatGPT Health can be a helpful educational tool, one that can help people clarify basic medical terminology.

“For example, ChatGPT can help educate patients about topics such as nutrition for hair health, or types of sunscreen before an appointment,” Kumar said. “ChatGPT is not a certified or FDA-approved medical device capable of providing medical diagnoses; accurate dermatologic evaluation still requires interpretation from a licensed clinician.”

When asked if ChatGPT Health will be the new WebMD, Kumar said “not quite.”

“WebMD provides curated, medically-reviewed content, whereas ChatGPT Health generates responses using AI language models, which can occasionally be inaccurate or misleading,” Kumar said.
