By Alan Mozes HealthDay Reporter
MONDAY, Oct. 4, 2021 (HealthDay News)
In a health emergency, social media giants like Facebook can be both quagmires of misinformation and sources of social support and reliable guidance, a small, new study suggests.
Researchers surveyed 32 Facebook users weekly for eight weeks. All were asked about their online experiences during March and April 2020, when COVID-triggered lockdowns unfolded.
The Facebook users — almost all white U.S. women, average age 43 — reported the site initially served as an invaluable communication resource, disseminating urgently needed and accurate information with neighborly good will.
At first, users also said they looked to their Facebook community for behavioral role models to learn how best to implement advice promoted by the U.S. Centers for Disease Control and Prevention.
But that “kumbaya town hall period,” as study lead author Jude Mikal called it, lasted only a couple of weeks.
“At the very beginning, people kind of took over Facebook,” said Mikal, of the University of Minnesota. “They co-opted it, in the spirit of an emergency, and used it to share really important social, emotional, resource and informational support. It was amazing. A kind of post-disaster pop-up support structure, with a huge flare of community involvement and unity. I’ve been studying social media for about 15 years, and this is really the first time I’ve ever seen this.”
The neighborliness wasn’t long-lived, however. “It all really went into the toilet,” lamented Mikal, vice chair of the research committee of the university’s Division of Health Policy and Management.
Survey responses revealed that by week three, “a very politicized form of engagement, coupled with the questioning of science” took hold, Mikal explained.
So what happened?
Mikal suggested that as the new normal set in, scarce information coupled with boredom caused users to default to old habits, “using social media in the way they had been using it all along.” That meant a rise in the sharing of unreliable and/or misleading information, increased political bickering, and growing frustration and distrust.
Things only went downhill from there.
By weeks six through eight of the study, much of the unity of purpose and trust that had characterized the early weeks had morphed into suspicion, distrust and an increasingly critical take on the advice and behavior of others.
“This was the worst cycle,” said Mikal. “Essentially it was a period of ‘community policing,’” during which users started to actively and publicly referee how pandemic-safe or unsafe they judged others to be.
So what does this all mean for public health? Perhaps a missed opportunity.
“I do think that there were mechanisms or strategies that the CDC could have employed that might have helped prolong that first moment and momentum when people were looking to connect in the service of community,” Mikal said.
For example, “the CDC said, mask up, wash hands,” he noted. “It was really broad advice. So big that implementation was left to your average social media user. And that led to some people being careless, some being overly careful, and many sharing misinformation.”
Mikal said that by closely monitoring these Facebook users over just eight weeks, his team could see clearly where that misinformation was coming from.
“So why couldn’t the CDC do the same thing? And then jump in and produce videos that might help to clarify things and offer good guidance, and maybe by so doing stem the tide of bad information,” he said.
For now, many health experts warn people not to use social media as a source of medical information.
Public health experts should always be the go-to during a public health emergency, said Melissa Hunt, associate director of clinical training in the University of Pennsylvania psychology department.
“Trust the experts on those issues, not a random post your cousin happened to see and share,” stressed Hunt, who was not involved in the study.
“People should not use social media for news or medical guidance,” she cautioned. “Facebook algorithms promote high ‘engagement’ posts, which basically means that the more outrageous or alarming it is, the more likely you are to see it in your feed. This is not a good way to learn the truth about a pandemic, or vaccine safety, or anything else.”
Findings from the new study were published recently in Computers in Human Behavior Reports.
SOURCES: Jude P. Mikal, PhD, vice chair, research committee, Division of Health Policy and Management, University of Minnesota, Minneapolis; Melissa G. Hunt, PhD, associate director of clinical training, Department of Psychology, University of Pennsylvania, Philadelphia; Computers in Human Behavior Reports, Aug. 21, 2021, online
Copyright © 2021 HealthDay. All rights reserved.