Facebook has been called on the carpet for how it has failed to protect the personal data of its users. But lost in the drama of congressional hearings is an understanding of the extent to which Facebook meticulously scrutinizes the minutiae of those users’ online lives.
Facebook’s tracking stretches far beyond the company’s well-known targeted advertisements. And details that people often readily volunteer — age, employer, relationship status, likes and location — are just the start.
The social-media giant also tracks users on other sites and apps. It also collects so-called biometric facial data without users’ explicit “opt-in” consent, and helps video-game companies target “high-value players” who are likely to spend on in-app purchases.
The sifting of users gets into personal — even confidential — matters. The company has allowed marketers to target users who may have an interest in various illnesses, like the 110,000 Facebook users who were listed under the category “diagnosis with HIV or AIDS,” the 51,000 people listed under erectile dysfunction, and 460,000 users listed under “binge-eating disorder awareness,” according to 2015 data submitted as an exhibit in a lawsuit. Facebook says it has removed those “targeting options” and does not create targeted ad audiences involving users’ medical conditions.
“Facebook can learn almost anything about you by using artificial intelligence to analyze your behavior,” said Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation (EFF), a digital-rights nonprofit. “That knowledge turns out to be perfect both for advertising and propaganda. Will Facebook ever prevent itself from learning people’s political views, or other sensitive facts about them?”
On Wednesday, Facebook’s chief executive, Mark Zuckerberg, testified on Capitol Hill for a second day regarding how his company conducts its business and how it has failed to protect the privacy of its users.
The hearings were spurred by revelations that Cambridge Analytica, a voter-profiling company, had inappropriately harvested the detailed personal information of up to 87 million Facebook users and that foreign agents have repeatedly used the social-media platform to spread misinformation. Facebook executives have promised they’re working to prevent similar missteps from happening again.
Consumer data mining is the engine that fuels advertising-supported free online services. If Facebook is being singled out for the practice, it is partly because it is the market leader.
“There are common parts of people’s experience on the internet,” Matt Steinfeld, a Facebook spokesman, said in a statement. “But, of course, we can do more to help people understand how Facebook works and the choices they have.”
Still, privacy advocates want lawmakers and regulators in the U.S. to have a more pointed discussion about the stockpiling of personal data that remains the core of Facebook’s $40.6 billion annual business.
While a series of actions by European judges and regulators is intended to limit some of the powerful targeting mechanisms that Facebook uses, federal officials in the United States have done little to constrain them, to the consternation of U.S. privacy advocates.
Many other companies, including news organizations like The New York Times, mine information about users for marketing purposes. But privacy advocates say Facebook continues to test the boundaries of what is permissible. Some fault the Federal Trade Commission (FTC) for failing to enforce a 2011 agreement that barred Facebook from deceptive privacy practices.
“Congress needs to begin to ask questions like, ‘Why did the FTC allow this to happen?’ ” said Marc Rotenberg, executive director of the Electronic Privacy Information Center, a nonprofit group in Washington, D.C.
An FTC spokeswoman said the agency could not comment on the case.
Facebook requires outside sites that use its tracking technologies to clearly notify users, and it allows Facebook users to opt out of seeing ads based on their use of those apps and websites.
Facebook notes that when users sign up for an account, they must agree to the company’s data policy. It plainly states that its data collection “includes information about the websites and apps you visit, your use of our services on those websites and apps, your use of our services, as well as information the developer or publisher of the app or website provides to you or us.”
In Europe, however, some regulators contend that Facebook has not obtained users’ active and informed consent to track them on other sites and apps. Their general concern, they said, is that many of Facebook’s 2.1 billion users have no idea how much data Facebook could collect about them and how Facebook could use it to influence their behavior. And there is a growing unease that tech giants are unfairly manipulating users.
“Facebook provides a network where the users, while getting free services most of them consider useful, are subject to a multitude of nontransparent analyses, profiling, and other mostly obscure algorithmical processing,” said Johannes Caspar, the data-protection commissioner for Hamburg, Germany.
Last Friday, the Italian Competition Authority said it was investigating Facebook for exercising “undue influence” by requiring users to let the company automatically collect all kinds of data about them both on its platform and off.
“Every single action, every single relationship is carefully monitored,” said Giovanni Buttarelli, the European data-protection supervisor who oversees an independent European Union (EU) authority that advises on privacy-related laws and policies. “People are being treated like laboratory animals.”
Regulators have won some victories. In 2012, Facebook agreed to stop using face-recognition technology in the EU after Caspar accused it of violating German and European privacy regulations.
Outside the EU, Facebook employs facial-recognition technology for a name-tagging feature that can automatically suggest names for the people in users’ photos.
With facial recognition, brick-and-mortar stores can scan shoppers’ faces looking for known shoplifters. But civil-liberties experts warn that the technology could threaten the ability of Americans to remain anonymous online, on the street and at political protests.
A dozen consumer and privacy groups in the United States have accused Facebook of deceptively rolling out expanded uses of the technology without clearly explaining it to users or obtaining their explicit “opt-in” consent.
Last Friday, the groups filed a complaint with the FTC saying that the expansion violated the terms of the 2011 agreement. Facebook sent notices alerting users of its new face-recognition uses and said it provides a page where they can turn the feature off.
Facebook has other powerful techniques with implications users may not fully understand.
One is a marketing service, “Look-alike Audiences,” which goes beyond the familiar Facebook programs allowing advertisers to directly target people by their ages or likes. The feature allows marketers to examine their existing customers or voters for certain propensities — like big spenders — and have Facebook find other users with similar tendencies.
Murka, a social casino-game developer, used Facebook’s look-alike audience feature to target “high-value players” who were “most likely to make in-app purchases,” according to Facebook marketing material.
Some marketers worry that political campaigns or unscrupulous companies could use the same technique to identify the characteristics of, say, people prone to rash decisions, and then have Facebook find an ever-larger pool of similar users.
Facebook’s ad policies bar potentially predatory ad-targeting practices.
Advertisers are able to target ads to users using the look-alike service, but they do not receive personal data about those Facebook users.