Facebook Is Silently Scoring People On Whether Or Not They’re Trustworthy

As part of an ongoing fight against the fact that half the news on the internet is made up either by weird Russian disinformation farms or by insane right-wing American conspiracy theory vloggers, Facebook has silently been assigning users a score measuring their trustworthiness, the Washington Post reports.

In an interview, Tessa Lyons, the product manager in charge of fighting ~fake news~, said that the metric, which is hidden from users, is part of how the company determines whether something flagged as untrue is actually untrue.


Facebook relies heavily on users flagging content, but, as Lyons notes, people have a tendency to mark things as fake because they don’t agree with them or because they are “intentionally trying to target a particular publisher”.

Content flagged as false is forwarded to a third-party fact-checking service, and there is utility in Facebook knowing whether someone has a history of marking news as fake that turns out to be actually true, for the sake of efficiency:

One of the signals we use is how people interact with articles. For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.

WaPo says the score is on a scale between 0 and 1, although it doesn’t make clear whether that means it goes to a few decimal places or whether it’s a binary and users are just either ‘trustworthy’ or ‘untrustworthy’.
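For what it’s worth, a 0-to-1 score like that is usually a weight rather than a yes/no label. As a rough, purely hypothetical sketch of the behaviour Lyons describes above (none of the names, numbers, or update rules here come from Facebook), it could look something like this:

```python
# Purely illustrative: a toy version of the kind of 0-to-1 reliability score
# described in the Washington Post report. Nothing here reflects Facebook's
# actual system; the function names and numbers are invented for this example.

def update_trust(score, flag_confirmed, learning_rate=0.1):
    """Nudge a user's score toward 1 when a fact-checker confirms their flag,
    and toward 0 when the article they flagged turns out to be true."""
    target = 1.0 if flag_confirmed else 0.0
    return score + learning_rate * (target - score)

def flag_weight(score):
    """Use the score to weight how much a user's future flags count."""
    return score  # a flag from a 0.9 user counts ~3x one from a 0.3 user

# Example: a user whose flags mostly check out drifts upward over time.
score = 0.5
for confirmed in [True, True, False, True]:
    score = update_trust(score, confirmed)
print(round(score, 3))  # ~0.582 after four flags, three of them confirmed
```

The point of a setup like this is exactly what Lyons says: flags from people with a track record of being right get taken more seriously, and serial indiscriminate flaggers get quietly tuned out.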

Facebook hasn’t provided any insight into who has the score, how it’s calculated, or how much weight it carries, out of concern that people might start gaming that system in addition to gaming the fake news flag, because that’s kinda how the internet is now.
