The scrutiny of Facebook’s collection and use of consumer data in recent years has prompted the tech giant to repeatedly defend its efforts around transparency and privacy.
But about three-fourths of Facebook users were unaware that the company lists their personal traits and interests for advertisers on its site, according to a study published by the Pew Research Center on Wednesday. Half of the users who looked at the Facebook page with that data — known as their “Ad Preferences” — said they were not comfortable with the company’s compiling that information. Pew conducted a nationally representative survey of 963 American adults with Facebook accounts between Sept. 4 and Oct. 1 of last year.
While consumers have learned more in recent years about how they are targeted for online ads, the study suggests that many still do not know how much of their behavior is tracked, where it is compiled or even that Facebook has a page that lists all of that information. Pew focused on Facebook, which also owns Instagram and WhatsApp, because it “plays an incredibly important role in the media ecosystem of the world,” said Lee Rainie, Pew’s director of internet and technology research.
“Privacy matters to Americans — it’s a classic American value — yet when they’re online and doing other things, they act as if their personal information is O.K. to harvest and analyze,” Mr. Rainie said in an interview. “One of the theories on this inconsistency is that Americans don’t really know what’s going on. The fact that 74 percent of Facebook users didn’t know that these lists were maintained on them cuts to the heart of that question of where Americans are, or are not, with these systems.”
About 88 percent of the users had listings on their Ad Preferences page. The page says that it allows users to “learn what influences the ads you see and take control over your ad experience.”
“Pew’s findings underscore the importance of transparency and control across the entire ad industry, and the need for more consumer education around the controls we place at people’s fingertips,” Joe Osborne, a Facebook spokesman, said in a statement. “This year we’re doing more to make our settings easier to use and hosting more in-person events on ads and privacy.”
Targeted advertising is the core of Facebook’s business, which brings in more than $40 billion in revenue each year. Through all the clicking, posting and article sharing, and activity elsewhere online, Facebook builds up an ad profile for each of its users. That includes information as basic as their age and location, as well as their hobbies, political leanings, family type and more. Advertisers use that information to direct tailored messages to users.
But questions around how that data can be misused to manipulate people — and how much they know about its collection in the first place — have put tech companies like Facebook on the defensive. Tech companies have responded by promoting tools that they say offer transparency around their business practices, including “Ad Preferences” and a similar product from Google called “Ad Settings.” In December, Facebook created a temporary kiosk in Bryant Park in Manhattan to provide consumers with information about privacy and ad targeting.
Pew’s survey also took a closer look at two of Facebook’s more controversial user labels, which are determined by algorithms: political leanings and “multicultural affinities.” (Facebook decides whether a user has an “affinity” for a minority group like African-American or Asian-American, which can then be used to target ads.)
Half of the survey’s respondents were assigned a political label, while one-fifth said that they were given a multicultural affinity. Twenty-seven percent of those with a political classification said that the label was “not very or not at all accurate.” With the multicultural affinities, 37 percent said that they “did not have a strong affinity or interest” in the group that they were assigned.
“One of the debates we’ve seen a lot is how do we judge the performance of algorithms?” Mr. Rainie said. “One line of thought in the technology community, and particularly the critics’ community, is it ought to be 100 percent — if you’re going to judge the way the world works, you ought to be pretty accurate. The counterargument is that the test for an algorithm is: Does it do a better job than human beings at figuring out the way the world works?”
Originally published in The New York Times.