In Person | Renee DiResta and a small band of others interested in disinformation became advisers to Congress after the 2016 election.
SAN FRANCISCO — Before the sun came up on Oct. 31, Renee DiResta sat in bed in her pajamas and logged into a virtual war room.
For years, DiResta had battled disinformation campaigns, cataloging data on how malicious actors spread fake narratives online. That morning, wearing headphones so she wouldn’t wake up her two sleeping children, DiResta watched on her laptop screen as lawyers representing Facebook, Google and Twitter spoke at congressional hearings that focused on the role social media played in a Russian disinformation campaign before the 2016 election.
DiResta knew the lines of questioning inside and out. Along with a handful of people with a similarly obsessive interest in mapping data across social media, she had helped prepare congressional staff members before the hearings. That morning, they gathered in a dedicated channel on the Slack messaging app to watch and listen for answers to questions they had been asking for years.
Education: B.S. in computer science and political science, Stony Brook University, New York
Job: Co-founder and director of marketing at Haven, a shipping-technology company
Spare time: Advising members of Congress and others on disinformation campaigns using social media
“We were monitoring closely to see when the companies gave misleading or partial answers so that we could follow up,” said DiResta, 36, who became immersed in disinformation campaigns in her spare time outside her job as a founder and head of marketing at Haven, a shipping-technology company.
How a small group of self-made experts came to advise Congress on disinformation campaigns is a testament to just how long tech companies have failed to find a solution to the problem. For years, the informal group, about a dozen people, has meticulously logged data and published reports on how easy it was to manipulate social-media platforms.
In 2016, they monitored thousands of automated Twitter accounts, or bots, that suddenly started spreading salacious stories about the Clinton family. They watched as multiple Facebook pages, appearing out of nowhere, organized to simultaneously create anti-immigrant events. Nearly all those watching were hobbyists, logging countless hours outside day jobs.
“When I put it all together and started mapping it out, I saw how big the scale of it was,” said Jonathan Albright, who met DiResta through Twitter. Albright published a widely read report that mapped, for the first time, connections between conservative sites putting out fake news. He did the research as a “second job” outside his position as research director at the Tow Center for Digital Journalism at Columbia University.
Senate and House staff members, who knew of DiResta’s expertise through her public reports and previous work advising the Obama administration on disinformation campaigns, had reached out to her and others to help them prepare for the hearings.
Rachel Cohen, a spokeswoman for Sen. Mark Warner, D-Va., said in a statement that researchers like DiResta had shown real insight into the platforms, “in many cases, despite efforts by some of the platforms to undermine their research.” Warner is on the Senate Intelligence Committee.
One crucial line of questioning — on how much influence Russian-bought ads and content had on users — was the result of work by DiResta and others with a Facebook-owned tool. “Facebook has the tools to monitor how far this content is spreading,” DiResta said. “The numbers they were originally providing were trying to minimize it.”
A graduate of Stony Brook University in New York, DiResta wrote her college thesis on propaganda in the 2004 Russian elections. She then spent seven years on Wall Street as a trader, watching the slow introduction of automation into the market. She recalled the initial fear of overreliance on algorithms, as there were “bad actors who could come in and manipulate the system into making bad trades.”
“I look at that now and I see a lot of parallels to today, especially for the need for nuance in technological transformations,” DiResta said. “Just like technology is never leaving Wall Street, social-media companies are not leaving our society.”
DiResta moved to San Francisco in 2011 for a job with O’Reilly Alpha Tech Ventures, a venture-capital firm. But it was not until the birth of her first child a few years later that DiResta started to examine the dark side of social media.
“When my son was born, I began looking into vaccines. I found myself wondering about the clustering effects where the anti-vaccine movement was concentrated,” DiResta recalled. “I was thinking, ‘What on earth is going on here? Why is this movement gaining so much momentum here?’ ”
She started tracking posts made by anti-vaccine accounts on Facebook and mapping the data. What she discovered, she said, was that Facebook’s platform was tailor-made for a small group of vocal people to amplify their voices, especially if their views veered toward the conspiratorial.
“It was this great case study in peer-to-peer misinformation,” DiResta said. Through one account she created to monitor anti-vaccine groups on Facebook, she quickly realized she was being pushed toward other anti-vaccine accounts, creating an echo chamber in which it appeared that viewpoints like “vaccines cause autism” were the majority.
Soon, her Facebook account began promoting other conspiratorial content to her, from people who claim the Earth is flat to those who believe that “chemtrails,” the trails left in the sky by planes, are chemical agents being sprayed on an unsuspecting public.
“So by Facebook suggesting all these accounts, they were essentially creating this vortex in which conspiratorial ideas can just breed and multiply,” DiResta said.
Her published findings on the anti-vaccine movement brought her to the attention of the Obama administration, which reached out to her in 2015, when officials were examining radical Islamist groups’ use of online disinformation campaigns.
She recalled a meeting with various tech companies at the White House in February 2016 where chief executives, policy leaders and administration officials were told that U.S.-made social-media platforms were key to the dissemination of propaganda by ISIS.
It was during that time that she first met Jonathan Morgan, a fellow social-media-disinformation researcher who had published papers on how the Islamic State spreads its propaganda online.
“We kept saying this was not a one-off. This was a toolbox anyone can use,” DiResta said. “We told the tech companies that they had created a mass way to reach Americans.”
A year and a half later, they hope everyone is finally listening. “I think we are at this real moment, where as a society we are asking how much responsibility these companies have toward ensuring that their platforms aren’t being gamed, and that we, as their users, aren’t being pushed toward disinformation,” DiResta said.