Turns out Mark Zuckerberg didn’t read the user agreement
This past Friday, Facebook founder and CEO Mark Zuckerberg made a post detailing the social media giant’s plan to alter the way that “public content” — including news articles, branded content, and videos — appears on its users’ feeds.
This plan consists of two main changes to the algorithm that generates Facebook’s news feed. The first is that public content will simply appear less often, comprising about four per cent of the news feed, down from five per cent.
The second is that news outlets themselves will now be rated on the platform according to their trustworthiness, something determined not by Facebook executives or outside experts, but by users themselves as part of Facebook’s “ongoing quality surveys.” “We will now ask people whether they’re familiar with a news source and, if so, whether they trust that source,” wrote Zuckerberg. “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society, even by those who don’t follow them directly.”
These changes are in aid of users’ well-being, wrote Zuckerberg, making Facebook less of an aggregator of news and business content to be passively consumed, and returning it to its original focus as a platform for “meaningful social interactions with family and friends.”
But there are reasons to raise an eyebrow at the details of these changes. Most glaringly, one would think that the proliferation of “fake news” — whether benign clickbait or pointed misinformation — on Facebook and on other social media networks has demonstrated, if anything, the inability of the average user to identify accuracy in reporting. Asking those same users to determine what is or isn’t trustworthy doesn’t read as an improvement. Facebook, however, can hardly be faulted for failing to address the issue in a more thorough or nuanced fashion. Critics accuse the site of fostering ideological echo chambers, of giving its users only more of what they already want — but Facebook is, by design, a virtual echo chamber. It is a place for users to share pictures they like of things they like with people they like. It is not, at heart, a window on the world. It is a bathroom vanity.

That society expects Facebook to be anything else testifies to the remarkable way in which this platform has permeated everyday life. Facebook has gone from being a social network to being the social network. It is where people coordinate, advocate, and agitate. It is where people buy and sell. And it is where people read and discuss current affairs and political issues, much more than they do in cafés, bedrooms, bars, and lecture halls.
While other, more linear platforms like YouTube, Twitter, Instagram, and Snapchat have by and large stuck to their core competencies, Facebook’s all-encompassing interface has become something like a bustling people’s parliament.
True, one is not required by law to join up, just as one is not required by law to have a driver’s licence. But not having a profile and not having a licence are similarly seen as conscious gestures, exceptions to the rule. Facebook is not a branch of government, and Mark Zuckerberg isn’t signing anybody’s social assistance cheques, but citizens, employers, law enforcement officers, and public officials all use Facebook as an ongoing public record. Perhaps Facebook never asked for this great power, or didn’t understand what it involved — they certainly don’t seem to be interested in the great responsibility that came with it. By moving to diminish the role that “public content” plays on the platform, Facebook aims to shrug off its duty to the public good, and also, at least temporarily, to diminish its public role.
Whether this is good news or bad is up for debate. Some have remarked that treating a vanity network as society’s central news portal was disastrous for publishers in the first place. As Jason Koebler wrote for Motherboard, “I hope that I nor any other journalist will have to care for one second longer about Facebook’s news feed.” Still, Facebook’s announcement carries a tinge of disappointment.

Previous statements suggested that, faced with a staggering challenge, the company might actually lean in. Facebook, like Google, wields huge power in its vast reservoirs of user data. What if the company tweaked its algorithm to counter polarization by circulating moderate expressions of opposing viewpoints? What if it engaged a varied team of experts and users to assess the trustworthiness of publishers? What if it provided detailed user guidelines and visible moderation to keep discussions civil and productive?
These half-baked suggestions are hardly revolutionary. Facebook must have considered them, and many more possibilities.
Instead, however, the company is backing off in favour of sunny content from family and friends, hinting that while Mark Zuckerberg wants to do good, he wants to feel good even more.
Facebook might have a passing interest in the betterment of society, but this move should remind everyone that the site exists to generate just two things: the warm fuzzies and ad revenue. Everything else is noise.
Art by Corben Grant
