After years of largely disregarding warnings from privacy researchers and developers about what companies like Facebook were doing, the general public suddenly seems to care about what those experts are saying.


PALO ALTO, Calif. — Doc Searls met with a group of his fellow internet privacy experts one recent afternoon here at the Computer History Museum. On a whiteboard were the words “OUTRAGE” and “MAKE HAY” — capitalized, underlined and surrounded by lines jutting in all directions like a cartoon “BOOM!”

For the first time in years, their field of expertise was front and center. Facebook had just become embroiled in a controversy over how the political data firm Cambridge Analytica had harvested the information of up to 87 million users of the social network.

Seated in a wide circle of folding chairs, members of the group excitedly discussed what they could do next.

“A lot of geeks in the world are looking at Facebook as a redwood that’s starting to fall,” said Searls, whose given name is David and who created ProjectVRM, a program at Harvard University’s Berkman Klein Center for Internet & Society that seeks to empower internet users to protect personal privacy. “They’re saying, ‘OK, it’s barn-raising time.’ ”


The scandal swirling around Facebook and Cambridge Analytica has begun to usher in a new era for this once-ignored community of privacy researchers and developers. After years of largely disregarding their warnings about exactly what companies like Facebook were doing — that is, collecting enormous amounts of information on their users and making it available to third parties with little to no oversight — the general public suddenly seemed to care about what they were saying.

The outcry over data privacy has been so strong that it pushed Mark Zuckerberg, Facebook’s chief executive, into testifying on Capitol Hill last week over the company’s failures to protect users’ information. Protesters rallied outside the Capitol during his testimony. Someone even arrived at one of the hearings dressed as a Russian troll.

In their personal lives, privacy experts are now fielding a spike in calls from their relatives asking them for advice about protecting their personal data. Engineers are discussing new privacy projects with them. Even teenagers are paying attention to what they have to say.

For many of the developers, this is the right time to push ahead with testing more privacy solutions, including more advanced advertising blockers, peer-to-peer browsers that decentralize the internet, new encryption techniques, and data unions that let users pool their data and sell it themselves. Others want to treat tech giants more as information fiduciaries, which have a legal responsibility to protect user data.

And for the first time, many privacy experts think internet users will be more willing to put up with a little more inconvenience in return for a lot more privacy.

“This is the first blink of awakening of the world to a danger that’s been present for a long time, which is that we are exposed,” Searls said. “Cambridge Analytica is old, old news to privacy folks. They’re just the tip of the crapberg.”

John Scott-Railton, who researches digital rights and privacy at the Citizen Lab at the University of Toronto, said he recently thought back to all the PowerPoint presentations and papers he had given and seen that cautioned about how third parties might access and abuse user data.

“It didn’t stick until now,” he said. “Now it’s changed, or at least people nod along when we talk about it.”

Neema Singh Guliani, legislative counsel at the American Civil Liberties Union, recalled years of efforts by the ACLU to get Facebook to monitor how third parties were using data. Yet few paid attention at the time, even though the group specifically called out Facebook’s quizzes in 2009. (Cambridge Analytica used a third-party quiz app from an independent researcher to harvest Facebook users’ data.)

The social network has said it will investigate many third-party apps that have had access to large amounts of Facebook users’ information. Nonetheless, the ACLU is pushing for users to have tighter control over what Facebook apps can do, and it argues that Facebook ought to audit its developers.

The organization also believes that more privacy protections should be enshrined in law.

“We’re having the conversation now that we should have had over a decade ago,” Singh Guliani said.

One of the reasons it has always been hard to get consumers interested in security and privacy is that the harms were vague and hard to understand. With Facebook and Cambridge Analytica, the harms are identifiable and frightening, said Ashkan Soltani, an independent researcher specializing in privacy and a former chief technologist of the Federal Trade Commission.

“Much like a car accident, the harms on social media are low-probability events with extremely variable outcomes,” he said. “‘So what if my boss saw me doing a keg stand?’ But all of a sudden the ‘so what if’ becomes more serious — ‘I get denied insurance or my information is used by a nation-state actor to manipulate me.’ ”

Cambridge Analytica’s work, which included using Facebook data to build psychological profiles of voters, tapped into an anxiety many Americans already had over the outcome of the 2016 presidential election.

“This one stuck because it was Trump, and we’re looking for someone to blame,” said Bruce Schneier, a cryptographer who runs the Schneier on Security blog and wrote “Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.” “If Hellmann’s mayonnaise did this, we’d be impressed.”

Privacy experts said this shift in public opinion was what they had been waiting for, because it is the only way to bring about change. Facebook will not willingly change its policies without pressure from shareholders or regulators, they added.

For Rohit Ghai, president of the cybersecurity firm RSA, whose SecurID technology is an industry standard for companies protecting access to their internal systems, the change is evident even inside his home in San Jose, California.

He previously tried to talk to his 13-year-old daughter about data privacy and social media — even providing examples of how much the tech companies know about people and what they can do with that information. She shrugged him off.

Then the Cambridge Analytica revelations happened. For once, Ghai said, his teenager came to talk to him.

“She just asked me about Mark Zuckerberg,” he said. “That’s a sign.”