The Editors | November 18, 2021
Facebook logos on a computer screen are seen in this illustration photo. (CNS photo/Valentin Flauraud, Reuters)

What does a company with more than two billion monthly users—more than a quarter of the human race—owe the rest of us? The recent revelations of internal corporate documents, research and conversations show that Facebook—recently rebranded as Meta—has been wrestling with the negative effects of its platforms and products even while offering public assurances about them.

These materials, leaked by a former Facebook employee, Frances Haugen, have been extensively reported on, first by The Wall Street Journal and then by a consortium of other news organizations. They have also been turned over to the Securities and Exchange Commission and were the subject of congressional hearings.

One takeaway from the flood of headlines about these leaks is that Facebook is concerned about a broad range of negative impacts and fosters an extensive internal discussion about them—very little of which ordinarily makes it into public view. For example, internal Facebook research looked at the ways Instagram damages the mental health of teenagers, how misinformation moves through Facebook’s algorithmic feeds, the involvement of its platforms in disseminating hate speech and encouraging political violence in developing countries, and how its own privileging of reactions like “angry” over the default “like” for registering engagement has amplified toxic and low-quality posts.
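To make that reaction-weighting finding concrete, here is a minimal sketch, in Python, of how weighted engagement scoring works. Every name and number below is an illustrative assumption rather than anything drawn from Facebook's actual code; reporting on the documents indicated that "angry" was at one point weighted as much as five times a "like."

```python
# Illustrative sketch only: hypothetical weights, not Facebook's actual values.
REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 5.0,
    "angry": 5.0,  # weighting anger this heavily rewards posts that provoke outrage
}

def engagement_score(reactions):
    """Sum each reaction count times its weight; higher scores spread further in the feed."""
    return sum(REACTION_WEIGHTS.get(name, 0.0) * count
               for name, count in reactions.items())

# Under these weights, a post that angers 100 people outranks
# one that 400 people merely like (500.0 vs. 400.0).
assert engagement_score({"angry": 100}) > engagement_score({"like": 400})
```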

Alongside these internal discussions, the leaks also revealed that Facebook has a special program that protects "VIP users," including celebrities, politicians and journalists. This category covers millions of accounts, which are exempt from the company's normal enforcement and review processes. And complicating the frequent assertion that Facebook and other social media companies censor conservative political content, the documents show that many of Facebook's own employees argue that the company in fact shields right-wing publishers from even-handed enforcement of its own rules in order to avoid political backlash.

To put a theological gloss on what might seem merely a story of impersonal corporate malfeasance: these revelations are a robust demonstration of the reality of original sin, which seeps through our social media feeds much as it does everywhere else in human society.

Looked at from this angle, the problem is not that Facebook, its employees or its executives are notably more or less corrupt than any other large corporation. Rather, the issue is that Facebook’s business model, built on monetizing human attention while outsourcing human judgment to algorithms, is a uniquely comprehensive and dangerous abdication of responsibility.

Facebook's dependence on algorithms to drive its platforms' feeds allows it to function as a global-scale media company built on the free labor of billions of users, who create and interact with content that Facebook hosts and profits from but refuses to take any substantive responsibility for. While the willingness of Facebook and other large social media platforms to host all this content free of charge may seem to profoundly democratize publishing, the platforms have every incentive to monopolize their users' attention as much as possible.

As a consequence of this incentive, Facebook's algorithms use "engagement" as a proxy for "attention-worthy." What Facebook presents for its users' attention is precisely what it expects will keep them engaged with Facebook, because engagement, unlike worth, is a signal that can be measured without human effort. But the fact that human beings, sinful and subject to temptation as we are, are often prone to engage with the worst in ourselves and one another is not something Facebook has proven itself able to code around.
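A minimal sketch of that proxy logic, again with hypothetical names and numbers rather than anything drawn from Facebook's actual systems: a feed ordered purely by predicted engagement never asks whether a post is true, kind or worth anyone's time.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # a model's guess at clicks, reactions, comments

def rank_feed(posts):
    # Order the feed by whatever is expected to keep users on the platform.
    # No step evaluates truth, quality or harm; engagement is the only
    # signal, because it is the only one measurable without human effort.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("Careful, accurate report", predicted_engagement=0.02),
    Post("Outrage-bait conspiracy claim", predicted_engagement=0.35),
])
print([p.text for p in feed])  # the conspiracy claim ranks first
```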

The effect of privileging engagement—which, remember, is for the sake of Facebook’s shareholders and profit margin, not its users—can be analogized to rubbernecking at a vehicular accident: The more people who pay attention to (“engage” with) the accident, the worse the traffic jam gets. But while a GPS algorithm would likely attempt to route drivers away from the traffic jam for the sake of getting them to their destinations, Facebook’s feed algorithms instead funnel users toward it, since such attention is exactly what Facebook sells.
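The contrast in that analogy can be made concrete with a hypothetical sketch. Both functions below read the same crowding signal; only the objective differs: the router treats a pileup as a cost to avoid, while an engagement-ranked feed treats it as the product to sell.

```python
from dataclasses import dataclass

@dataclass
class Route:
    base_minutes: float
    congestion: float  # 0.0 = empty road, 1.0 = gridlock

@dataclass
class Story:
    relevance: float
    crowd_attention: float  # how many users are already "rubbernecking"

def route_cost(route):
    # Navigation penalizes crowding: congestion raises the cost,
    # so drivers are steered away from the accident.
    return route.base_minutes * (1.0 + route.congestion)

def feed_score(story):
    # An engagement-ranked feed rewards the same crowding: the more users
    # already staring, the more prominently the story is surfaced.
    return story.relevance * (1.0 + story.crowd_attention)
```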

One of the most disturbing revelations from the leaked documents was an experiment in which a Facebook researcher set up a new mock account that started off following mainstream conservative pages. Within days, Facebook’s recommendations algorithm started surfacing QAnon groups and other conspiracy theory content for this account. This is not evidence that Facebook is intentionally biased in favor of conspiracy theories, but rather evidence that conspiracy theories are more “engaging” than the truth—in approximately the same sense that cocaine is more addictive than kale. Engagement is not a reliable proxy for value, except in the narrow sense of monetary value for Facebook’s ad sales.

The first step to reining in Facebook must be a far-reaching commitment by the company to transparency, both about Facebook’s internal research into its negative effects and about such closely guarded secrets as its feed algorithms. Any company at Facebook’s scale has an outsized effect on the common good. It is absolutely necessary to enable others who do not share the profit motives of Facebook executives to evaluate those effects. The same argument also applies to other tech companies whose platforms shape the digital environment that we all share. Facebook’s recent renaming of itself as Meta to emphasize its focus on building the “metaverse,” envisioned as a virtual reality space for, well, everything, only underlines this point.

Along with greater transparency, Facebook should also be pushed to limit its scale and reach in significant ways, including through the threat of antitrust enforcement. But the most important constraint is that decisions about the reach of information in Facebook's feeds must include human judgment in the loop, not merely algorithmic amplification. Facebook tells us, and itself, that its platform is just a tool, and thus value-neutral, committed primarily to free speech: users post content, their engagement feeds the algorithm and they see more of what they pay attention to. This supposed commitment to free speech, however, obscures the fact that the engagement the company seeks means, more precisely, "activity valuable to Facebook." No matter how much Facebook would like that to be neutral, someone needs to take responsibility for how it affects the world.
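One way to picture what "human judgment in the loop" could mean, offered as a hypothetical sketch rather than a design the editorial prescribes: algorithmic distribution proceeds on its own only up to a certain reach, beyond which a person must take responsibility for further amplification.

```python
AMPLIFICATION_THRESHOLD = 10_000  # hypothetical reach at which review is required

def may_amplify(post, predicted_reach, human_review):
    # Below the threshold, the algorithm may distribute a post freely.
    # Beyond it, no further amplification without a human signing off,
    # which makes a named person, not a metric, responsible for reach.
    if predicted_reach < AMPLIFICATION_THRESHOLD:
        return True
    return human_review(post)
```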
