Whatever Mark Zuckerberg says about human community or his legacy, his company is acting in its own interests—and against the public good.
Facebook’s crushing blow to independent media arrived last fall in Slovakia, Cambodia, Guatemala, and three other nations.
The social giant pulled stories by those countries’ independent publishers out of users’ News Feeds, hiding them in a new, hard-to-find stream. Those publishers reported losing as much as 80 percent of their audience during the experiment.
Facebook doesn’t care. At least, it usually seems that way.
Despite angry pushback in the six countries affected by Facebook’s algorithmic tinkering, the company is now going ahead with similar changes to its News Feed globally. These changes will likely de-prioritize stories from professional publishers, and instead favor dispatches published by a user’s friends and family. Many American news organizations will see the sharp traffic declines their brethren in other nations experienced last year—unless they pay Facebook to include their stories in readers’ feeds.
At the heart of this change is Facebook’s attempt to be seen not as a news publisher, but as a neutral platform for interactions between friends. Facing sharp criticism for its role in spreading misinformation, and possibly in tipping elections in the United States and the United Kingdom, Facebook is anxious to limit its exposure by limiting its role. The company has courted that neutral-platform image for years.
This rebalancing means different things for the company’s many stakeholders. For publishers, it means they’re almost certainly going to be punished for their reliance on a platform that’s never been a wholly reliable partner. Facebook didn’t consult publishers in Slovakia before running its experiment there, because publishers matter less than other stakeholders in this next incarnation of Facebook. But more broadly, Facebook doesn’t talk to you because Facebook already knows what you want.
Facebook collects information on a person’s every interaction with the site (and on many other actions online), so Facebook knows a great deal about what we pay attention to. People say they’re interested in a broad range of news from across the political spectrum, but Facebook knows they really want angry, outraged articles that confirm their political prejudices.
Publishers in Slovakia and in the United States may warn of damage to democracy if Facebook readers receive less news, but Facebook knows people will be perfectly happy (and perfectly engaged) with more posts from friends and family instead.
For Facebook, our revealed preferences—discovered by analyzing our behavior—speak volumes. The words we say, on the other hand, are often best ignored. (Keep this in mind when taking Facebook’s two-question survey on what media brands you trust.)
Tristan Harris, a fierce and persuasive critic of the ad-supported internet, recently offered me an analogy to explain a problem with revealed preferences. I pledge to go to the gym more in 2018, but every morning when I wake up, my partner presents me with a plate of donuts and urges me to stay in bed and eat them. My revealed preferences show that I’m more interested in eating donuts than in exercising. But it’s pretty perverse that my partner is working to give me what I really crave, ignoring what I’ve clearly stated I aspire to.
Facebook’s upcoming News Feed change won’t eliminate fake news; at least, it didn’t in Slovakia. People share sensational or shocking stories, while more reliable news tends not to go viral. When people choose to subscribe to reliable news sources, they’re asking to go to the gym. With these News Feed changes, Facebook is throwing out their gym shoes and subscribing them to a donut-delivery service.

Why do 2 billion people put up with a service that patronizingly reminds them that it’s designed for their well-being while it studiously ignores their stated preferences? Many people feel like they don’t have a choice. Facebook is the only social network, for example, where I’m still connected to some of my friends, especially those I knew in childhood and high school.