Facebook on Thursday defied public calls to adopt significant new limits on political advertising ahead of the 2020 presidential election, opting instead to introduce minor changes that it said would give users a measure of control over the ads they see.

The company’s new rules continue to permit politicians to make false claims in their posts – including ones they pay Facebook to promote – and they preserve the powerful yet controversial tools that have long helped Democrats and Republicans deliver messaging to narrowly segmented audiences on the social networking site.

Pressure to rethink its approach to political ads came from a wide array of federal regulators, digital experts and privacy advocates, as well as some of Facebook’s own employees. They argued that the company’s policies coarsened American political debate and exposed users to serious risks, including viral disinformation, which malicious actors could pay to promote on the site.

But Facebook ultimately sided with President Donald Trump’s reelection campaign and other political strategists – Democrats and Republicans – who’d fought fiercely behind the scenes to keep the digital tools that have helped them find new supporters, solicit donations and mobilize voters on Election Day.

The decision sparked a series of rebukes, however: Ellen Weintraub, who serves on the Federal Election Commission, sharply criticized the tech giant’s approach as “weak” and motivated by a desire to boost its profits.

Top 2020 Democratic candidates including Sen. Elizabeth Warren of Massachusetts also pilloried the company, expressing concern that Facebook had essentially paved the way for Trump to lie to social-networking users with impunity. And on Capitol Hill, Sen. Ron Wyden, D-Ore., accused Facebook of trying to “fool people with fig leaves instead of taking real action.”

Under its new policies, Facebook said it would give users a choice to see fewer ads about political candidates and social issues, using a tool under development that it plans to roll out in the summer. Users can also choose to stop seeing ads from particular campaigns and other entities, including businesses, that target them using custom lists of data, such as their email addresses. And the company announced it would provide more information in its public archive about the total number of people targeted in an ad campaign.

In a blog post announcing the changes, Rob Leathern, Facebook’s director of product management for ads, wrote that the company is “not deaf” to criticism about its rules around political ads. But he maintained that the changes would “increase the level of transparency it provides for people and [give] them more control over the ads they see.”

Computer scientists who have studied Facebook’s ad tools cast doubt on that conclusion.

“Of course giving users choice is a step in the right direction, but an opt-out option probably won’t make much of a difference because most users stick to the default,” said Piotr Sapiezynski, a research scientist at Northeastern University.

Thursday’s announcement marked the latest instance in which Facebook – under fire for a wide range of its business practices – sought to wait out a controversy before ultimately offering modest changes that do not satisfy skeptical government officials. On Tuesday, for example, the company issued a new policy on manipulated videos, known as deepfakes, that still allows an infamous clip altered to depict House Speaker Nancy Pelosi, D-Calif., as drunk to remain on the site. The move came months after the clip went viral, sparking bipartisan scorn.

With its affirmation that it will not curtail targeting, Facebook set itself apart from Google and Twitter, each of which introduced major changes last year in response to a prolonged outcry over the capacity to narrowly tailor messages to voters on social media. Trump’s reelection campaign helped catalyze the changes after his team ran false ads about 2020 Democratic presidential candidate Joe Biden. Facebook and Google refused to take down the ads about the former vice president, drawing broad criticism.

Twitter’s approach was the bluntest: banning all advertisements about candidates, elections and political issues such as abortion and immigration. The ability to reach voters online should be “earned, not bought,” the company’s chief executive, Jack Dorsey, said in announcing the move, which drew sharp rebukes from the Trump camp.

Google opted to preserve political advertising, including certain targeting capabilities, but the company limited some of the most precise tools for reaching specific users, prompting bipartisan backlash from political outfits that have grown accustomed to such powerful technologies. In doing so, it also preserved its own policy against fact-checking political ads on its services, including YouTube.

Facebook, however, said it sought to take a different approach: “While Twitter has chosen to block political ads and Google has chosen to limit the targeting of political ads, we are choosing to expand transparency and give more controls to people when it comes to political ads,” Leathern wrote.

Generally, Facebook’s advertising tools allow tailoring messages to lists of individual voters or to small groups based on characteristics such as age, education, ZIP code, income, relationship status, interests or political leanings. Most powerfully, Facebook also allows the creation of “custom audiences” based on lists of individuals who, for example, donate to a cause or visit a web page that Facebook tracks. The result can be a torrent of different messages to different voters, with Trump’s campaign in 2016 using tens of thousands of different ads each day.

Internal discussions about limiting political microtargeting began in 2017 as Facebook reeled from revelations about how Russian operatives used the platform to manipulate U.S. voters in the presidential election a year earlier. The idea did not immediately catch on, but interest surged last year as critics sought reforms at Facebook ahead of the 2020 presidential election.

In considering changes, Facebook chief executive Mark Zuckerberg remained steadfast in his view that his company should not serve as an “arbiter of truth,” vetting what politicians can say. Instead, Facebook had considered a wide menu of other reforms – including limiting the size of an audience that could be targeted with an ad and labeling paid political media to indicate that it had not been fact-checked, The Washington Post first reported.

But the Trump campaign asked Facebook this fall not to restrict advertising opportunities. Gary Coby, the campaign’s digital director, argued that reining in targeting would be “dangerous” and a “huge blow to speech.”

On Thursday, Trump campaign spokesman Tim Murtaugh described Facebook’s changes as “much better” than what Google and Twitter had done, saying the lack of limits on Facebook “encourages more Americans to be involved in the process.”

He insisted “our ads are always accurate” despite evidence to the contrary. The president’s campaign ran online ads in 2019 that included inaccuracies about Biden and his ties to Ukraine, according to fact-checkers, a move that triggered the debate over falsehoods on Facebook in the first place. Additional falsehoods promoted by the campaign have touched on issues such as immigration and the investigation of Russia’s influence on the 2016 presidential election.

Democrats also warned Facebook about instituting major changes: They said they relied on targeting tools to raise money and mobilize supporters – critical in matching Trump’s enormous audience on Twitter.

Many in the party, however, urged Facebook to fact-check political figures, fearing their foes might pay to spread falsehoods on the social networking site. On Thursday, Priorities USA, a leading Democratic super PAC, said the changes came up short.

“These changes read to us mostly as a cover for not making the change that is most vital: ensuring politicians are not allowed to use Facebook as a tool to lie to and manipulate voters,” said Madeline Kriger, the group’s integrated media director, who oversees in-house digital ads.

The Democratic Party’s 2020 contenders echoed that view: Warren, who once intentionally ran a Facebook ad with a falsehood to call attention to the issue, said the company should be held accountable so “democracy isn’t held hostage to their desire to make money.” A spokesman for Biden, Bill Russo, said the changes were “window dressing around their decision to allow paid misinformation.”

Facebook employees in October had similarly called on the company to adopt sweeping changes to the social media giant’s ad rules. Researchers, meanwhile, warned about the dangers of ads focused on narrow communities of users.

“Microtargeting is what’s driving privacy abuses because it’s furthering the desire to grab as much information about people as possible from all possible sources,” said Aleksandra Korolova, a computer scientist at the University of Southern California. “It doesn’t serve the individual’s interest.”