Facebook is NOT Harmless Fun

March 19, 2018

The revelation that Cambridge Analytica exploited the data of 50 million Facebook profiles to target American voters is indeed frightening. But Cambridge Analytica shouldn’t act as a diversion from the real bad guy in this story: Facebook. It is mystifying that Mark Zuckerberg is invited to give lectures at Harvard without being treated with due scepticism, even as his company regulates the flow of information to billions of people, encourages certain purchasing habits and opinions, and monitors their interactions.

We have now reached the point where an unaccountable private corporation is holding detailed data on over a quarter of the world’s population. Zuckerberg and his company have been avoiding responsibility for some time. Governments everywhere need to get serious in how they deal with Facebook.

After trolls were sent to jail for sending threatening messages to the activist Caroline Criado-Perez and MP Stella Creasy, a debate ensued over whether the likes of Facebook and Twitter should be classified as platforms or publishers. Facebook is treated as if it is simply a conduit for information, meaning it is not liable for the content its users share – in the same way that BT can’t be sued when people make threatening phone calls.

In 2014 Iain MacKenzie, a spokesperson for Facebook, said, “Every piece of content on Facebook has an associated ‘report’ option that escalates it to our user operations team for review. Additionally, individuals can block anyone who is harassing them, ensuring they will be unable to interact further. Facebook tackles malicious behaviour through a combination of social mechanisms and technological solutions appropriate for a mass-scale online community.”

But the company is evasive about the number of moderators it employs, how they work, and how decisions are made. It has started taking a firmer line on far-right content – recently removing Britain First pages from the site – but it is still resisting many legislative attempts to regulate its content. The content users see is decided by an algorithm that can change without any consultation, whether with governments or with the businesses that rely on Facebook for revenue – meaning that some of those businesses can be quickly wiped off the map. In February 2018 the website Digiday reported on LittleThings, a four-year-old site that shut down overnight after Facebook decided to prioritise user posts over publisher content. A hundred jobs were lost.

Facebook wasn’t the only contributor to LittleThings’ demise, but those working at the website said there was nowhere else to go after the algorithm change. And this isn’t the only example: in 2013 an algorithm change halved the traffic of viral content website Upworthy – something from which the website has never recovered.
