WASHINGTON – Facebook whistleblower Frances Haugen on Tuesday told lawmakers that the company systematically and repeatedly prioritized profits over the safety of its users, painting a detailed picture of an organization where hunger to grow governed decisions, with little concern for the impact on society.

Her Senate committee testimony – based on her experience working for the company’s civic integrity division and thousands of documents she took with her before leaving in May – sought to highlight what she called a structure of incentivization, created by Facebook’s leadership and implemented throughout the company. By directing resources away from important safety programs and encouraging platform tweaks to fuel growth, these performance metrics dictated operations, Haugen said, a design that encouraged political divisions, mental health harms and even violence.

She pointed to Facebook chief executive Mark Zuckerberg as the enforcer of this system, arguing that he controls the most important decisions made at the company.

“Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good, our common good,” she said.

The hearing signaled the start of a new crisis for Facebook in Washington, galvanizing lawmakers from both parties around regulatory efforts to tamp down what they say is a wide-ranging set of societal ills prompted by the social media giant. Repeatedly, senators compared the company to Big Tobacco, a purveyor of products that are addictive and profitable but ultimately harmful. The tobacco industry was eventually reined in by landmark regulation, an action lawmakers promised to replicate.

“I think the time has come for action, and I think you are the catalyst for that action,” Sen. Amy Klobuchar, D-Minn., told Haugen during the hearing.


The Senate panel hearing was Haugen’s first public appearance after she revealed herself Sunday evening as the source of thousands of pages of internal company research leaked to the Securities and Exchange Commission and the Wall Street Journal. Throughout, Haugen gave a detailed account of the ways she said Facebook employees are incentivized to turn a blind eye to problems its services cause, coupling her own experiences at the company with data from the internal Facebook documents she took with her.

Already revelations from the documents have intensified concerns on Capitol Hill about Facebook’s influence, particularly on children’s and teens’ mental health, a topic expected to be the key focus of the hearing. But Haugen and lawmakers covered huge swaths of ground, touching on national security risks, the spread of misinformation and the deadly mob attack on the Capitol on Jan. 6.

As she was speaking, Facebook’s representatives tweeted and emailed talking points rebutting Haugen’s testimony. One of their main points: Haugen didn’t work on many of the issues, including teenagers and child safety, that were covered in the documents that she downloaded.

After weeks of silence on Haugen’s revelations, Facebook CEO Mark Zuckerberg pushed back on her testimony in an email to employees on Tuesday, writing that it’s “just not true” that the company prioritizes profit over safety.

“We care deeply about issues like safety, well-being and mental health,” he wrote in the email, which he also posted on Facebook. “It’s difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted.”

He said he was particularly concerned about the questions being raised about Facebook’s impact on kids, and he committed to doing more research on the issue and making research publicly available.


Haugen brought specific examples from her time at the company. Because of understaffing, she said, there was "implicit discouragement" from building rigorous systems to detect problematic content. She saw this play out in her work on counterespionage, where her team could handle only about a third of the cases it was made aware of. If Facebook had invested in building a detector, the team knew, it would have surfaced many more cases.

Sen. Marsha Blackburn, R-Tenn., raised concerns about Facebook executive Antigone Davis's testimony to Congress last week that Facebook had removed 600,000 accounts belonging to children younger than 13 in a three-month period. Haugen said that there are probably far more children on the platform and that the company has ways to determine users' ages.

“Facebook could do substantially more to detect more of those children, and they should have to publish for Congress those processes . . . they could be much more effective than probably what they’re doing,” Haugen said.

She also said she provided documents to Congress that show Zuckerberg knew the company could have intervened to prevent the spread of hate speech and misinformation in at-risk countries, but he did not, because it would have negatively affected “meaningful social interaction,” a key metric Facebook uses to measure communications between family and friends. Haugen said the company tied the metric to employees’ bonuses and chose not to make changes that could cost Facebook money.

“People stay or leave the company based on what they get paid, and if you hurt MSI, a bunch of people aren’t going to get their bonuses,” Haugen said.

During congressional testimony and on other occasions, Zuckerberg has insisted that the company does not reward employees based on how much time people spend on the platform. But on Monday, Facebook representatives declined to answer whether rewards or other incentives for employees were tied to the performance of the algorithm.


Engagement-based ranking, which Facebook uses to prioritize posts in people's feeds that are more likely to elicit reactions – and therefore clicks – also concerned Haugen. Its downside is the recommendation of divisive or harmful content. She told lawmakers that Facebook's algorithms have exposed teens to more anorexia content, encouraged rifts within families and fueled ethnic violence in Ethiopia.

Facebook could shift to posts being ranked in other ways, such as chronologically, and still remain profitable, she said. That argument resonated with lawmakers.

“This company could be vastly profitable but so much safer,” Sen. Richard Blumenthal, D-Conn., said during a news conference after the hearing. “They’ve just chosen the greediest path. . . . If they took a little bit less money but emphasized safety in the way they apply algorithms, they could help save lives.”

Facebook has long said that it made the change to its news feed algorithm to prioritize what types of interaction people found most valuable, and the shift was made in the context of a bigger prioritization of friends, family and groups over news publishers. News outlets, including The Washington Post, reported at the time that engagement among younger U.S. users was slowing, and Facebook was interested in finding ways to boost time spent on the platform.

Today, the company says its news feed algorithm takes in more than 10,000 data points when deciding what content to show people, and some of those data points include promoting trustworthy publishers and demoting those that are known to publish clickbait. But engagement – or the amount of likes, clicks, views and shares that a post generates – is still one of the most heavily weighted parts of the algorithm, executives have said.

The wide-ranging revelations marked a stark departure from dozens of other Capitol Hill technology hearings, in which lawmakers grilled company executives who largely gave non-answers – a contrast with the open, frank account Haugen delivered on Tuesday.


Haugen’s call for regulation of Facebook was expansive and ambitious. She asked lawmakers to “break out of previous regulatory frames.” She warned lawmakers that some of the most widely debated proposals, including privacy protections or tweaks to Section 230, a decades-old law that protects companies from lawsuits over what users post, would be insufficient.

She proposed changing Section 230 to make Facebook responsible for “the consequences of their intentional ranking decisions.”

Gaining such expansive oversight of Facebook’s closed design will be an uphill battle for lawmakers, who have publicly called for regulation of social media for years but have been unable to pass bipartisan proposals that would force greater transparency. Tuesday’s hearing crystallized that while lawmakers are collectively outraged over Haugen’s allegations, there is little consensus about exactly what kind of legislation they might advance. During the hearing, lawmakers name-dropped individual bills they have introduced, including proposals that would address children’s online privacy and increase transparency around tech companies’ algorithms.

Lawmakers from both parties have become increasingly motivated to pass stronger competition laws, and some bills would probably result in a breakup of Facebook’s existing business. But Haugen warned lawmakers against breaking up the company, arguing that it needs all the profitability it can get to properly police itself, especially in areas such as Africa where Facebook-owned services are the primary way people access the Internet.

“If you split Facebook and Instagram apart, it’s likely that most advertising dollars will go to Instagram and Facebook will continue to be this Frankenstein that is endangering lives around the world,” she said. “Only now there won’t be money to fund it.”

The revelations amount to perhaps the most significant crisis in Facebook’s history, one in a slew of such scandals over the years, but Zuckerberg did not publicly comment on them until Tuesday night. Blumenthal said the Facebook chief needs to testify before the committee about Haugen’s revelations.


“No apology, no admissions, no acknowledgment, nothing to see here, we are going to deflect it and go sailing,” said Blumenthal, referencing the CEO’s frequent social media photos of his activities on the water.

This will probably not be Haugen’s last appearance on Capitol Hill. She told lawmakers that Facebook’s consistent understaffing of its counterespionage teams is a national security concern, and that she is discussing it with members of Congress. Rep. Adam Schiff, D-Calif., a member of the select committee to investigate the Jan. 6 riot, said that committee would need to hear from her about Facebook’s role in the violence at the Capitol.

Blumenthal also called for both the Securities and Exchange Commission and the Federal Trade Commission to investigate Haugen’s revelations. Haugen’s lawyers have made at least eight complaints to the SEC, which has broad oversight of financial markets and the power to bring charges against companies that mislead investors.

Blumenthal said he hopes Haugen’s decision to speak out will inspire other tech workers who wish to expose similar concerns.

“I think there are other truth-tellers in the tech world who want to go forward, and I think you are leading by example,” he said. Haugen is “showing them that there’s a path to make this industry more responsible and more caring about kids and about the nature of our public discourse.”