Zuckerberg Thinks You Don’t Trust Facebook Because You Don’t ‘Understand’ It

January 25, 2019

I think we can all safely agree that last year was not great for Facebook. User trust plummeted to record lows as the company faced scandal after scandal. At this point, there are countless reasons users can and even should be wary of Facebook—and yet.

On Thursday, the Wall Street Journal published a 1,000-word screed by Mark Zuckerberg about the company’s data-collection practices titled “The Facts About Facebook.”

In it, Zuckerberg makes noise about the company being about “people,” and insists, as he has for most of his company’s 15-year history, that we should trust it. Zuckerberg appears to think users have little faith in the company’s ability to responsibly or ethically handle their data primarily because they don’t understand its targeted advertising practices, about which he writes: “This model can feel opaque, and we’re all distrustful of systems we don’t understand.” He continues:

Sometimes this means people assume we do things that we don’t do. For example, we don’t sell people’s data, even though it’s often reported that we do. In fact, selling people’s information to advertisers would be counter to our business interests, because it would reduce the unique value of our service to advertisers. We have a strong incentive to protect people’s information from being accessed by anyone else.

So, sure. Let’s start with the ads.

Earlier this month, a Pew Research Center survey found that users do indeed remain largely in the dark about how Facebook tracks their information in order to feed them relevant ads (and off of which it makes heaping piles of money). Of the nearly 1,000 U.S. adults polled for the survey, some 74 percent of those who use Facebook said they had no idea about the site’s “ad preferences” section where activity-based “interests” appear.

Fifty-one percent of users said they were “not very or not at all comfortable” with Facebook amassing this information about them.

This data shows that the company has a lot of work to do when it comes to transparency. But additional data indicates that, in fact, the more we learn about how Facebook works, the less we trust it.

Annual surveys from the Ponemon Institute show that user trust in the social media giant dropped significantly in the wake of the Cambridge Analytica scandal, when it was learned that Facebook had previously known the firm obtained the personal data of tens of millions of Facebook users and had mostly done nothing about it.

Reporting on the survey in April, the Financial Times said that user trust in Facebook had actually been on the rise before the scandal, but that users’ confidence in the company’s ability to protect their information fell from nearly 80 percent in 2017 to 27 percent last year. That was toward the beginning of the year; then the rest of it happened.
