Facebook is conducting a vast behind-the-scenes study of doubts expressed by U.S. users about vaccines, a major project that attempts to probe and teach software to identify the medical attitudes of millions of Americans, according to documents obtained by The Washington Post.
The research is a large-scale attempt to understand the spread of ideas that contribute to vaccine hesitancy – the delaying or refusal of a vaccine despite its availability – on social media, a primary source of health information for millions of people. It shows how the company is probing ever more nuanced realms of speech, and it illustrates how fraught the weighing of free speech against potential harm has become for technology companies during a public health crisis.
While Facebook has banned false and misleading statements about coronavirus vaccines since December, a huge realm of expression about vaccines sits in a gray area. One example could be comments by someone expressing concern about side effects that are more severe than expected. Those comments could be important both for fostering meaningful conversation and for surfacing unknown information to health authorities – but at the same time they may contribute to vaccine hesitancy by playing upon people’s fears.
The research explores how to address that tension by studying these types of comments, which are tagged “VH” by the company’s software algorithms, as well as the nature of the communities that spread them, according to the documents. Its early findings suggest that a large amount of content that does not break the rules may be causing harm in some communities, where it has an echo chamber effect.
The company’s data scientists divided the company’s U.S. users, groups, and pages into 638 different population segments to explore which types of groups hold vaccine-hesitant beliefs. The document did not specify how Facebook defined a segment or grouped different communities, but it noted that a segment could comprise at least 3 million people.
Some of the early findings are notable: 10 out of the 638 population segments contained 50% of all vaccine hesitancy content on the platform. And in the population segment with the most vaccine hesitancy, 111 users contributed half of the vaccine-hesitant content.
Facebook could use the findings to inform discussions of its policies for addressing problematic content or to direct more authoritative information to the specific groups, but the company was still developing its solution, said spokeswoman Dani Lever.
According to the documents, the research effort also discovered early evidence of significant overlap between communities that are skeptical of vaccines and those affiliated with QAnon, a sprawling set of baseless claims that has radicalized its followers and been associated with violent crimes.
Facebook has partnered with more than 60 health experts around the globe, Lever said in an emailed statement, and it routinely studies a wide variety of content to inform its policies.
“Public health experts have made it clear that tackling vaccine hesitancy is a top priority in the COVID response, which is why we’ve launched a global campaign that has already connected 2 billion people to reliable information from health experts and removed false claims about COVID and vaccines,” she said. “This ongoing work will help to inform our efforts.”
Nearly 30% of Americans – and half of Republican men – say they do not intend to get one of the three federally approved vaccines, according to a March poll by PBS NewsHour, Marist and NPR. An Associated Press/NORC study from late January found that the top reasons for concern over the vaccinations were fear of side effects, distrust of vaccines, and desire to wait and possibly get it later.
Covid-19-related misinformation has flooded the company’s platforms, including false narratives that the disease, which is caused by the novel coronavirus, is less deadly than the flu, that it is somehow associated with a population-control plot by the philanthropist Bill Gates and that vaccines are associated with the Antichrist. Facebook’s content decisions, its potentially anticompetitive behavior, and its use by extremist groups to foment violence have drawn the attention of regulators, leading to Congressional hearings and major antitrust charges by the Justice Department.
Facebook, which owns the WhatsApp messenger and Instagram, collects reams of data on its more than 3.3 billion users worldwide and has a broad reach on those users’ devices. Public health experts say that puts the company in a unique position to examine attitudes toward vaccines, testing, and other behaviors, and to push information to people.
But the company’s history of misusing people’s data gives it a steep hill to climb when it comes to proving that Facebook’s research efforts serve the public. The company allowed a political research firm to exploit the profiles of tens of millions of Americans, resulting in the Cambridge Analytica privacy scandal, and at one time manipulated people’s emotions for an internal study.
Since April, the social network has partnered publicly with Carnegie Mellon University researchers to conduct the COVID-19 Symptom Survey, a daily survey of Facebook users that asks people about their symptoms, testing, mental health, attitudes about masks, and intent to get vaccinated. A related project has used smartphone data to track whether people are following social distancing orders and lockdowns. At least 16 million people have been surveyed, making it one of the largest public health data collection efforts during the pandemic, the researchers have said.
Facebook has also banned a wide range of baseless or misleading claims about vaccines and about the virus – removing more than 12 million pieces of content – and connects people to authoritative information with labels on posts and with a banner atop the Facebook site, according to the company.
Facebook’s research into vaccine hesitancy will force the company to walk a fine line if it decides to police such content further, since much of it consists of expressions of concern and doubt rather than outright misinformation.
“Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful,” wrote Kang-Xing Jin, Facebook’s head of health, in an opinion article last week in the San Francisco Chronicle. “It’s hard to draw the line on posts that contain people’s personal experiences with vaccines.”
But according to the documents, Facebook worries about the content that does not break its rules outright. “While research is very early, we’re concerned that harm from non-violating content may be substantial,” the documents said.
That risk of harm seems to be disproportionately affecting a few communities, Facebook’s engineers found.
The results from Facebook’s early research track with findings from disinformation researchers who have noted that a small minority of people, particularly influencers and people who post frequently or use underhanded tactics to spread their message, can have an outsize impact on the conversation and act as superspreaders of misleading information.
The researchers noted that while a small percentage of vaccine-hesitant comments could be countered when posted in communities with a diverse range of opinions, the concentration of such comments in small groups suggests that those groups are becoming echo chambers where people simply reinforce one another’s preexisting ideas.
In segments that were affiliated with QAnon, the study found widespread vaccine skepticism. “It’s possible QAnon is causally connected to vaccine hesitancy,” the document said. In QAnon communities, skepticism of vaccines was connected to a distrust of elites and institutions.
Last year, external researchers found that QAnon groups on Facebook were influential in fueling the spread of a misinformation-filled documentary called “Plandemic” on social platforms.
The internal Facebook study found that comments that could contribute to vaccine hesitancy overlap with QAnon but extend well beyond it, into many other types of communities. While QAnon groups appeared to be more focused on distrust of authority as a reason for doubting the vaccine, other groups had different ways of expressing their doubts and worries. This finding suggests that public health experts will need to develop nuanced messages when trying to address vaccine hesitancy in the population.