Facebook has come under fire in recent months for selling everybody’s data to everyone in sight. All of a sudden we all woke up to the fact that this enormous corporation was coining in millions from our data and leaving us to pick up the pieces. Our lawmakers’ attempts to make some kind of sense of what Facebook has been doing weren’t very encouraging – they often seemed not to understand the basic technology underlying Facebook, or its basic business model.
But at least the wider world was trying to get to grips with Facebook’s power and influence.
Fixing what isn’t broken…
Facebook’s response was kind of predictable. They want to keep doing business, and anything that slows down the ad buys or puts users off the platform is bad for business. So they have to improve their reputation, without losing their income stream.
The current plan is to achieve that by making the user the focus of the solution.
Facebook’s always argued: we’re not a publisher, we’re a platform. So it’s platform users that have to be made to behave better. The problem with fake news isn’t that we sold ad targeting data to shady corporations, or foreign powers, or… well, basically anyone. We can deal with it at the user level.
Hence the reputation score.
Facebook wants to rate its users by watching their behavior and awarding a complex score based on their interactions with content and with other users.
It’s not like they don’t have form – this is similar to the newsfeed algorithm they’ve spent years developing, only this time for people rather than content.
In fact, this is only news to us. To Facebook, it’s business as usual, scaled up; they’ve been using trustworthiness scoring behind the scenes for over a year.
Facebook denies that this is what’s really happening: ‘The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading,’ a spokesperson told the BBC.
Rather, said the spokesperson, ‘What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system.’
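Facebook hasn’t published how that process actually works, but the basic idea is easy to illustrate. Here’s a minimal sketch, assuming (purely hypothetically) that each user’s ‘this is false’ reports get weighted by how often their past flags were upheld by fact-checkers – every name and number below is invented, not Facebook’s real system:

```python
from dataclasses import dataclass


@dataclass
class FlagHistory:
    """Hypothetical track record of one user's 'false news' reports."""
    confirmed: int = 0  # flags later upheld by fact-checkers
    rejected: int = 0   # flags fact-checkers found baseless


def trust_score(history: FlagHistory, prior: float = 0.5, prior_weight: int = 10) -> float:
    """Smoothed ratio of confirmed flags to total flags.

    New users start near the prior (0.5) and drift toward their
    observed accuracy as they build up a flagging history.
    """
    total = history.confirmed + history.rejected
    return (history.confirmed + prior * prior_weight) / (total + prior_weight)


def flag_weight(history: FlagHistory) -> float:
    """Weight a user's report by their track record, so people who
    indiscriminately flag stories they merely dislike can't bury them."""
    return trust_score(history)


# A user whose flags are usually upheld counts for more than one
# who flags everything they disagree with.
careful = FlagHistory(confirmed=18, rejected=2)
spammer = FlagHistory(confirmed=1, rejected=29)
print(f"careful flagger weight: {flag_weight(careful):.2f}")        # ~0.77
print(f"indiscriminate flagger weight: {flag_weight(spammer):.2f}")  # ~0.15
```

Something in this spirit would match the stated goal – discounting bad-faith flags – while still being, in effect, a per-user trustworthiness score. Which is exactly why the denial rings a little hollow.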
…and breaking something else
Facebook says that rather than being a new system applied to everyone, this will be used only in exceptional and unusual circumstances.
But its use is probably going to be – in fact, already is – pretty widespread.
The Facebook product manager in charge of fighting fake news, Tessa Lyons, says it’s ‘not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher.’
Which means it’s not going to be uncommon for this system to be used to cut down the fake news that shows up on newsfeeds, spoils Facebook’s reputation, and thus threatens the twin pillars of its revenue stream: loads of users, and loads of ads.
The trouble is, this might not be a solution to Facebook’s fake news problem. It’s going to be hard to really address that without violating the most vital part of FB’s internal economy: easy access to huge tranches of user data.
And the new system has creepy, frightening echoes. In China, a reputational system is already in use – by the government. Citizens’ entire lives are scored – indebtedness, job performance, and every other aspect of their interactions with their landlords, the state and their employers (and in China, of course, there’s often little difference between those last two). And the Chinese government already uses social media companies to track dissidents.
Facebook already has a history of siding with authoritarians, and while it’s been accused of political bias from both sides – with conservative news reportedly removed from newsfeeds by staff – it’s in the company’s interests to keep its users atomized and unaware of how the company profits from their data.
All of which means the new Facebook system risks adding another layer of surveillance to a platform whose big problem is already that it’s basically spyware. That sounds a little like putting out a fire with gasoline.
So the concern about Facebook’s new system is twofold – it won’t solve the old problem, and it might create a new one.