Facebook Wants a Monopoly on Human Connection
Mark Zuckerberg loves to talk about community. His story, the founder of Facebook said in a 2017 commencement address at Harvard University, is that of a “student in a dorm room, connecting one community at a time, and keeping at it until one day we connect the whole world.” Facebook’s formal mission statement is to “give people the power to build community and bring the world closer together.” In Facebook’s latest annual filing with the Securities and Exchange Commission, the word community appears 22 times.
But, of course, there is more to the story. Facebook is a technology story: Millennial whiz kid codes social network. It is a powerful marketing story, too, about a corporation packaging and selling data that details our lives. But it is also selling something as precious to human life as air and water; the desire for community is why the company will soon be worth a trillion dollars. No wonder Mr. Zuckerberg likes to repeat the word that’s made him one of the richest and most powerful people who’ve ever lived, as if he is saying a prayer. You can almost hear him whispering: Community: My precious.
Facebook has scaled, monetized and in many places even monopolized local human connection, much as Walmart did for shopping, Starbucks for coffee, McDonald’s for hamburgers and Uber for cabs. The community we all long for has become a product. “There’s no way of getting around the commodification of community in social media, especially Facebook,” said Katherine Schmidt, a Molloy College theology professor who studies technology.
Now, at the end of its second decade, Facebook nears a turning point. Its future success depends in large part on whether it can successfully continue to market a glossy image of community to users, politicians, journalists and regulators while maintaining a website whose sole purpose is to run a secret algorithm that manages what content you see to keep you coming back for more. Because Facebook’s marketing of community is only half the transaction. The second part relies on the rest of us logging on to the site every day and willingly supplying information about ourselves in exchange for free communication, information and entertainment.
But if all you want to do is scroll through posts about your friends’ pets and babies and political opinions, do the inner workings of the algorithm or the opinions of its founder really matter? They do when the company is as powerful as this one. A recent lawsuit by the Federal Trade Commission and the attorneys general of 46 states, which was dismissed in June but signals the likelihood of other legal challenges, focused on Facebook’s monopoly and how it obstructs competitors by buying them. The company snapped up Instagram in 2012 for $1 billion and WhatsApp in 2014 for $19 billion. Its size gives it what business types call the first-mover advantage, which makes starting a rival social network almost impossible. Why would you join another social network when everybody you love, and almost three billion other people, are all on Facebook? “Facebook’s actions to entrench and maintain its monopoly deny consumers the benefits of competition,” said Ian Conner, the director of the F.T.C.’s Bureau of Competition. Facebook, which declined to comment for this story, said in a statement that it competes “every day to earn people’s time and attention.”
A Pew Research Center study published in June 2021 found that 36 percent of Americans regularly get their news from Facebook, yet the site maintains it is an online platform, not a publisher. Watchdog groups have attacked the site’s permissive attitude toward hate speech, criminal organizing and political propaganda, and blamed it for facilitating the plotting of the Jan. 6 attack on the U.S. Capitol. The thread that runs through so many of these complaints is that, in fact, Facebook is not a safe space for the community it claims to curate.
Trust in the Public Square
For its survival, Facebook needs to prove the opposite. That is what is behind its big investments in artificial intelligence technology to help moderate content, its hiring of more people to do the same, its creation of an Oversight Board that tackles some of the site’s thornier decisions, and Mr. Zuckerberg’s constant invocations of community. The company depends massively on public opinion. It needs us to think it believes in a safe, healthy version of community so that we will keep using the site as a public square.
If it can manage that, it will continue to expand powers unmatched by any private company, with influence over language, politics, religion and speech that dwarfs that of any single national government. The Covid-19 pandemic, which grounded everybody at home for a year, consolidated Facebook’s position as the world’s premier communications platform, allowing friends to talk, churches to worship and educators to teach. As of March 31, 2021, the company reported 2.85 billion monthly active users on its main Facebook site. It also owns the picture-sharing app Instagram and the messaging company WhatsApp.
For 2020, Facebook reported revenues of $86 billion and profits of $29.1 billion. It is one of the world’s four essential technology companies, along with Google, Amazon and Apple. Overall, by market capitalization, it is the world’s sixth biggest company, behind Apple and Saudi Arabia’s oil producer Aramco, but ahead of behemoths like JP Morgan Chase, the Chinese trading platform Alibaba and the car company Tesla. Facebook’s capacity to expand the number of connections you can have is mind-boggling. Think about this: If you have 1,000 Facebook friends (not a crazy number) and each of them has that same number, you have a million friends of friends.
Facebook’s size is hard to fathom. Taking into account that some accounts are used by more than one person, it is estimated that over three billion people use the site. That is more than the populations of North and South America and Europe put together. Mark Zuckerberg is, functionally, a head of state, protected by private bodyguards. When he travels to a country, his visit draws kings and presidents and dominates headlines. Pope Francis gave him an audience.
For students of Mr. Zuckerberg’s life, it all makes sense. As a teenager, he loved the board game Risk and the video game Civilization, in which players compete to build powerful empires. He studied the rulers of ancient Rome. “In particular, he had a fanboy affinity with the emperor Caesar Augustus, whose legacy is a mixed one: a brilliant conqueror and empathetic ruler who also had an unseemly lust for power,” writes Steven Levy in his authoritative book, “Facebook: The Inside Story.” Mr. Zuckerberg even named one of his daughters August.
Central to the empire is community. Even if you discount Aaron Sorkin’s Hollywood account of Facebook’s founding—Mark and his pals wanted an easier way to meet girls—Mr. Zuckerberg himself has said it was desire for connection, for himself and others, that motivated him. Exponential growth came, and thousands, then millions of people, then billions signed up: humans craving community, grandparents, teammates, buddies, crushes. Facebook me. Yes, precious. “Helping people build communities is one of the most important things that we can do,” Mr. Zuckerberg wrote in a Facebook post earlier this year, noting that there are 600 million different Facebook groups. In 2021 he added, “We’re going to continue to focus on helping millions more people participate in healthy communities and we’re going to focus even more on being a force for bringing people closer together.”
But Facebook is selling something it can’t really build. A real community is made up of dozens of people—maybe a bit over a hundred, according to sociologists—who learn to get along over time, who endure each other’s differences because they live near each other or are related. “In a real community, you don’t get to choose who you’re in the boat with,” said Vincent Miller, a professor of theology at the University of Dayton. “You can’t block people at Thanksgiving dinner.” By offering endless opportunities to connect to people with shared interests, Facebook offers a version of community based on choice instead of place. The website amplifies disagreements and keeps people returning to the site, with no avenue for a shared experience that might mitigate them. There is no place to go to just be together in peace, only words clashing for eternity.
In Search of Moderation
In his latest encyclical, “Fratelli Tutti,” Pope Francis writes that digital relationships do not “demand the slow and gradual cultivation of friendships, stable interaction or the building of a consensus that matures over time.” Even if they have “the appearance of sociability,” they “do not really build community.” Contrast that to the vision outlined by Facebook’s chief operating officer, Sheryl Sandberg, in recent testimony. Facebook, she said, “is helping you stay in touch with friends and family and helping you know what’s going on in a very efficient way.” For Pope Francis, digital connectivity, despite its efficiency, “is not capable of uniting humanity.”
To be sure, Facebook, like Walmart and McDonald’s and Starbucks, has added something to our lives. We have reconnected with old friends. Families have discovered blood relatives they had never met. Cousins and siblings on different continents can easily stay in touch. And it is a business that offers a service. As James Martin, S.J., of America told me in an email exchange about Facebook, “participating in an economic society” requires some compromise. “You might as well not have microphones in churches because microphone companies profit from church services,” he wrote.
But the outrageous scale the company has chosen to pursue in order to sell more ads makes it different and gives it outsize leverage. In Mr. Zuckerberg’s telling, the company is a victim of its own idealism clashing with human nature. “The big lesson from the last few years is we were too idealistic and optimistic about the ways that people would use technology for good and didn’t think enough about the ways that people would abuse it,” he told Steven Levy.
The truth, Mr. Levy and others who follow the company told me, is that Facebook has become too big to manage coherently. “In the beginning, it provided an ethical service, and now, like Frankenstein’s monster, it’s outgrown the company’s ability to control it,” said Christopher Michaelsen, a professor of business at the University of St. Thomas. Consider this: Facebook has said that 5 percent of the accounts on its site are fake. That might not sound like a lot, but that could mean as many as 140 million accounts, more than the populations of France and England combined.
Companies that figure out ways of profiting from scale often offer lower prices and convenience. They also destroy something: in the case of Walmart, thousands of department stores and small Main Street shops.
With Facebook, something similar has happened, and we are slowly figuring it out. What has been destroyed? Self-esteem, as we compare and despair; our sense of respect for people with opposing views. Facebook can also be a dangerous weapon. Exotic animal smugglers, human traffickers and other criminal groups organize themselves on the site. The 2016 presidential election was the target of disinformation spread on Facebook by a foreign power.
For Facebook’s critics, who advocate breaking up the company, its business model is inherently destructive. The site prioritizes selling ads by driving user engagement, even though that means its algorithm favors controversial or false posts over the truth. “Facebook talks about building community,” said Barry Lynn, executive director of the Open Markets Institute. “But really what they do is extract from community.” The company makes its money by selling advertising targeted with the information it has gathered about its users.
The growing awareness of the harm Facebook causes has put pressure on the company to come up with a way of regulating the site. This company has inserted itself into human communication and built community on a scale that dwarfs the population of the Roman, British, Soviet and American empires combined. Can it also manage to police content well enough to maintain its attractiveness and meaning for the groups that use it?
Facebook’s response is that it can maintain good oversight of its communities by employing content moderators and arming them with new artificial intelligence. But with 600 million groups, that is more content than the company’s more than 15,000 content moderators can handle.
The company publishes much of its artificial intelligence research on a special blog. Much of the work concerns training machines to avoid bias. “Some common statistical notions of fairness may lead to unintentional and potentially harmful consequences, especially to marginalized groups,” one recent paper noted. “This could be even more important to consider in decision systems that may affect millions or even billions of people.”
The company “has grown so big and so fast that artificial intelligence is the only hope they have for policing what’s happening on the site,” Mr. Levy told me. But as he reports in his book, even the content moderators Facebook has hired do not think it is possible for A.I. to do their work. And hiring the workforce required to manage every community would undercut the company’s promise to shareholders to maintain high profit margins.
Because of Facebook’s size and its libertarian ethos, the company sometimes allows damaging speech. For example, in the months before rioters attacked the U.S. Capitol in Washington, D.C., on Jan. 6, “Americans had been exposed to staggering amounts of sensational misinformation about the election on Facebook’s platform, shunted into echo chambers by Facebook’s algorithms and insulated from counter-speech by Facebook’s architecture,” the Knight First Amendment Institute told the company’s Oversight Board in a statement.
Facebook’s defenders have argued that the local communities that have formed on the site are so varied and distinct that they cannot be monitored by artificial intelligence, and that hiring content moderators for every group would cost too much money. But critics say something must be done. “Facebook doesn’t want to address this because it’s a threat to their business model,” said Katie Paul, director of the Washington, D.C.-based Tech Transparency Project, a watchdog group. “We’re talking about a trillion-dollar company not being held to the same standards as other companies.”
Ms. Paul has campaigned to force Facebook to monitor criminal behavior more diligently, especially in the domain of trafficking antiquities. “The content moderators rely on users to report, but if you join a group for smuggling antiquities, you’re not going to report your clients,” she noted. “And they’ve refused to set up simple algorithms that might deter smugglers.”
The Catholic Church is not beyond the site’s influence. Take a recent community dispute over the closure of a Catholic school in Crafton, Pa. Parents started a Facebook group called “Save St Philip School in Crafton, PA.” But it has gotten ugly, with parishioners hurling insults at local church leaders and calling people “evil.” It is the kind of conversation that needs a more engaged moderator to keep things respectful. Local Facebook groups often have volunteer moderators, a role anybody interested in civil dialogue could consider pursuing. But the role can be difficult, thankless and often draining. And Facebook will not pay to hire professional moderators for every group on the site. “The problem of social media, especially Facebook, is how easy anger, resentment and recrimination spread,” said the Rev. David Poecking, a priest in the Diocese of Pittsburgh. The angry words linger in the discussion thread, poisoning the rest of the dialogue.
Facebook has facilitated communication in communities of faith as well. During the Covid-19 pandemic, many churches, synagogues and mosques depended on Facebook to broadcast services online. Like many other preachers, the Rev. Joseph Satish, a priest in Dayton, Ohio, said he enjoys the reach that Facebook offers. When he celebrated Mass on Facebook during Covid, over 1,000 people tuned in each week. But when he communicates with his parishioners, some people inevitably write mean or spiteful things, he said. “I’ve learned simply never to respond,” he said. “And it would be nice if there were a different social network to choose, but there’s not.”
In 2020, Facebook launched its Oversight Board, a group of 20 senior executives, politicians and experts from around the world. Initial members included the former Guardian editor Alan Rusbridger and the Nobel Peace Prize winner and human rights activist Tawakkol Karman. The supreme court of Facebook is a unique institution, charged with policing speech for the entire world. It made global news when it upheld Facebook’s ban on former President Donald Trump, a showcase of the company’s power, and ordered the company to reassess the prohibition within six months. Another early decision was to overrule the company’s removal of a post about breast cancer because it featured a nipple, which is easy for an A.I. system to detect and delete. That decision suggested that Facebook is overly reliant on computers instead of humans to police content, Helle Thorning-Schmidt, a former prime minister of Denmark who is a member of the board, told The Wall Street Journal.
Tae Wan Kim, an artificial intelligence ethicist at Carnegie Mellon University, told me he credits Facebook for taking steps to enforce community standards but said it is very difficult for artificial intelligence to make nuanced decisions. “It’s basically impossible for a machine to pick up on nuanced conversation,” he said. For example, Facebook wrestled with its decision to take down a piece of feminist satire that said: “Kill all men.”
But what should we do when engaging in social media? What is our responsibility? Is there a way to fight Facebook’s monopoly? Katherine Schmidt, the Molloy College theologian, argues Catholics should engage on Facebook and other social media platforms in a way that eventually moves conversations offline and intentionally fosters real-world community. “You can also argue that we should be setting up a rival Christian social network, but that tends to be a more Protestant logic,” she said.
In a post in February, Mr. Zuckerberg wrote that “we’re building the community infrastructure to support the diversity of communities needed for everyone in the world to join ones that are meaningful in their lives.” Whether or not Facebook can continue to profit and grow, its founder has always been right about community: It is precious.