Facebook users, by and large, are not very good at differentiating between what's fact and what's false. Many users will eagerly share both reliable news and the fake stuff without any hesitation. It happens because users either want the falsehoods to be received as true or simply can't tell the difference. Rampant media illiteracy is the root cause of the fake news handwringing we've been dealing with since before the election, and will be fretting over until the end of time (or the end of Facebook, whichever comes first). Today, Facebook honcho Mark Zuckerberg said he is setting out to fix this fundamental problem of digital media illiteracy: by putting more power in the hands of the illiterate.
In a new Facebook post today, Zuckerberg said he "asked our product teams to make sure we prioritize news that is trustworthy, informative, and local." Why this has only become a priority in the company's 14th year of existence is left unsaid. Zuckerberg admitted that "there's too much sensationalism, misinformation and polarization in the world today," and that his website "enables people to spread information faster than ever before." As with the rest of Silicon Valley, Facebook is obsessed with the appearance of machine-like objectivity, and so Zuckerberg said figuring out which outlets deliberately package viral-ready falsehoods and which do not is a head-scratcher (spoiler: it isn't):
The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you, the community, and have your feedback determine the ranking.
So, rather than relying on the subjectivity and biases of a team of outside experts, Facebook will rely on the subjectivity and biases of two billion people around the world. Specifically, Facebook said it will decide which media outlets are prioritized at least in part by just asking people which outlets they like:
As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly. (We eliminate from the sample those who aren't familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)
Facebook is either unaware of, or more likely unwilling to deal with, the fact that people have rabid, tribalistic loyalties to certain outlets. Someone who enjoys sharing, say, Daily Caller or InfoWars articles is of course going to say that these are trustworthy outlets. Otherwise, they're admitting that they voluntarily consume and spread information that isn't trustworthy, and we all think too highly of ourselves for that. According to a Facebook spokesperson, the surveys are meant to make sure "people can have more from their favorite sources and more from trusted sources." Isn't part of the Facebook information disaster that so many people count things like RedStateEagleMilitiaZoneDeepStateNews (or what have you) among their "favorite sources"? Should we be asking these people what's trustworthy and what isn't? Should they be deciding what will appear on your feed, or even their own, as reliable news?
Similarly, no one who posts five MSNBC articles every day is going to even consider giving Fox News a vote of trustworthiness. In fact, partisan news consumers will relish an opportunity to boost their side and downvote the bad guys, a cherished internet pastime. Rather than fix the enormous, world-spanning information morass they've created, Facebook is punting responsibility to its users (and, of course, the almighty Algorithm).
Details on how exactly these surveys will function are scant for now, though the Facebook spokesperson told me the new changes "are not intended to directly impact any specific groups of publishers based on their size or ideological leanings." The spokesperson added, "We do not plan to release individual publishers' trust scores because they represent an incomplete picture of how each story's position in each person's feed is determined."
It's also unclear how this would affect small or new outlets that have little to no name recognition because they're small or new. What is clear is that Facebook in the new year is as reluctant as ever to do anything that will cause it institutional discomfort or provoke backlash from its right- and left-aligned users. So long as Facebook remains a corporate monolith with immense control over the entire worldwide media industry, this problem won't go away.
At the outset of the year, Zuckerberg declared it his personal challenge to fix what's broken at his company. Today, he said to everyone: Here, you deal with it.