In June, after a meeting with Pope Francis, Mark Zuckerberg held a “Townhall Q&A” in Rome. When one of the questions touched on whether Facebook is editing the news it provides to its users, I imagine the P.R. team stopped breathing for a few seconds. On stage, in front of the lights, Mr. Zuckerberg took a long pause and a few gulps of water. Composing himself, Facebook’s chief executive continued, “No, we’re a tech company, we’re not a media company.” Within 30 seconds he repeated himself, saying, “We’re a technology company.”
There are two big problems with that answer. The first is plain: if media is about how people consume information, Mr. Zuckerberg is running one of the biggest media companies in the world.
The second problem is more insidious. Claiming to be a tech company should not provide blanket absolution from responsibility. The Silicon Valley wizards should not pretend that their own values and conceptions of good are not getting baked into the products they make. While they are not directly responsible when someone uses a tool in a novel way they did not anticipate, they still share some level of responsibility. The excuse that “we’re a technology company” is not an escape from civic and moral responsibility.
The Facebooks of the world have changed the way we communicate and gather news. They have made it easier to share information and misinformation alike. They have kept us in contact with relatives and friends across time zones and continents and, in doing so, have allowed us to extend and reinforce tribal bubbles and resist new ideas.
As technology becomes a deeper part of our lives, we need to grow up. It is too simplistic to blame technology and not acknowledge that we are in a complicated relationship with it. We make and shape technology, but technology also makes and shapes us. In many parts of our lives we are part of the problem and part of the solution. The same goes for our technology.
Recently Mr. Zuckerberg has begun to stray from the party line that “we’re a tech company.” When Facebook employees noted that sharing some of Donald J. Trump’s Islamophobic statements violated Facebook’s hate speech policy, Mr. Zuckerberg had to step in to allow the shares in the name of free speech. The case forced him and his company to acknowledge the reality: Facebook’s algorithms and policies are engineered to filter and edit the information we see each day. In many cases these decisions are made on moral grounds. Facebook’s willingness to discuss competing values and employ moral reasoning is far better than its typical denial of culpability. It is progress that we need to highlight and replicate across the technology industry.
Implicit moral values baked into site policies, feed algorithms and news filters should be made more explicit and discussed. Even as Facebook promotes free speech for political reasons, The Intercept has accused it of suppressing posts that highlight police brutality. As more and more of our information and news is filtered through Apple, Google, Twitter and Facebook, we should demand honesty and transparency to enable informed dialogue about the effects these information gateways have on public discourse.
No human-designed system or algorithm is perfect. Each has flaws, biases and vulnerabilities. Each can be gamed, and each can be dangerous.
Steve Jobs and Steve Wozniak famously battled over an open Apple ecosystem versus the walled garden that now dominates. Those were technological choices with serious civic and moral consequences. In social media, hate speech and bullying have to be weighed against the values of free speech and open expression. Who makes that call? How is the implicit design of the system already making some of those calls? We cannot pretend that Facebook or technological tools will magically solve our disagreements over the value systems we use to distinguish good from bad. Technology is not an exemption from the hard work of discussing our implicit value systems. We’ve been using Facebook and its analogues for over a decade. We’re getting too old to act so naïve.
Eric Sundrup is associate editor at America and director of audience development.