"The groundwork of all happiness is health." - Leigh Hunt

Commercial Instagram accounts publish incorrect health information

November 9, 2022 – Nearly one in four popular Instagram posts about hepatitis B contained misinformation, and these posts were far more likely than accurate posts to come from for-profit accounts or accounts selling a product or service.

The findings come from research presented at the 2022 Annual Scientific Meeting of the American College of Gastroenterology in Charlotte, NC.

“Users who spread misinformation about hepatitis B also have a greater reach, with a higher number of followers and more engagement in the form of likes, than users who do not spread misinformation,” said Zachary C. Warner, MD, MPH, an internal medicine resident at the University of Arizona. “It's possible that patients with chronic conditions – conditions for which there are no easy treatment options – are vulnerable to online health misinformation and profit-seeking users.”

As misinformation and skepticism about evidence-based medicine grow online, Warner says patients are turning to social media and other user-generated websites for health information and support.

“While these sites are useful because they provide access to social support and information that patients would otherwise lack, medical information on social media is not regulated,” he warned.

Unverified Information

Although the effects of exposure to online misinformation have not been well researched, negative consequences are possible.

“Using unproven cures and symptom treatments can increase patients' risk of poorer health outcomes and financial hardship,” Warner said. “Unproven cures and symptom treatments are unlikely to be covered by health insurance, potentially leaving patients with significant out-of-pocket costs.”

Warner and his team narrowed their Instagram search to a single month, December 2021. They searched for all publicly available posts that mentioned hepatitis B. After removing duplicates from the top 55 posts for each search term, they coded the remaining 103 posts using a validated misinformation assessment tool. The tool's variables included engagement, such as likes and comments; user characteristics, such as number of followers; and claims containing misinformation, as assessed by health care professionals.

The researchers then analyzed the relationship between profitability and misinformation in the posts. Nearly a quarter of the posts (23%) contained misinformation about hepatitis B or its treatment. These posts also had higher average engagement, with a mean of 1,599 likes, compared with posts containing accurate information about hepatitis B, which averaged 970 likes. Accounts that posted misinformation also followed more accounts on average (1,127) than accounts with accurate posts about hepatitis B (889). But the accounts that posted misinformation had, on average, about a third as many followers (22,920) as the accounts that posted accurate information (70,442).

“We believe it is wise to maintain a high level of skepticism toward information that promises results that are 'too good to be true,' that uses anecdotal evidence, or that is experimental,” Warner said. “We recommend the CRAAP test, which helps individuals evaluate sources of health information.”

Does this pass the CRAAP test?

  • C: Currency, the timeliness of the information
  • R: Relevance to your needs
  • A: Authority of the source
  • A: Accuracy of the content, and
  • P: Purpose of the source

In their study, the researchers found that just under a third (30%) of hepatitis B posts referenced a conspiracy theory, and a similar proportion (29%) came from for-profit accounts. Just over a third (34%) of posts came from accounts selling a product or service on Instagram.

Overall, more than three times as many posts containing misinformation came from for-profit accounts (47%) as posts containing accurate information (14%). A similar pattern held for accounts selling a product or service: 43% of misinformation posts came from such accounts, compared with 13% of accurate posts.

These results didn’t surprise David Gorski, MD, PhD, professor of surgery at Wayne State University School of Medicine.

“Although health misinformation is often ideological and belief-driven, it is almost always driven by the profit motive of doctors who sell treatments based on that misinformation,” he said.

“In other words, most quacks believe in the quackery they are selling, and believers are far more effective salespeople than scammers who know what they are selling is quackery,” Gorski said.

“We strongly urge patients to do their best to understand the possible motivations behind the individuals or organizations that create the health information visible online, particularly on social media sites,” Warner said. He also advised doctors and health organizations to have open conversations about misinformation, both online and in person.

“Persistent believers in misinformation are almost always unreachable and unteachable, and it is largely a waste of time to try to change their minds,” Gorski said. “People who are undecided and unsure, however, are reachable. We should focus our educational efforts on them, not on those peddling the quackery.”