The Federal Trade Commission levied a record $5 billion fine against Facebook in July for its failure to protect users’ personal information adequately. The company had announced that it was expecting the fine in April and set aside $3 billion for it then. After both the April announcement and the F.T.C.’s July action, Facebook’s stock price rose. The fine is easily within Facebook’s means (the company reported quarterly profits of slightly more than $5 billion), and investors were pleased to see this episode resolved.
One widespread reaction was that even this fine—more than 200 times larger than the previous record, a $22.5 million fine against Google in 2012—was unlikely to damage Facebook or do much to change its behavior. But it is not clear what size fine or what other forms of pressure might have greater effect.
During the past two months, the Justice Department has initiated an antitrust review of major tech companies, and Congress has held hearings examining how much power these internet giants hold and how they use it. Facebook’s own co-founder has called for the company to be broken up.
As legislators and regulators grapple with the enormous power technology companies wield over both private data and public conversation in our democracies, they need tools that cut deeper than large fines and limited oversight. Mark Zuckerberg has frequently argued that Facebook is a technology company, not a media company. While that distinction should not carry much, if any, moral significance, it does offer a point of leverage.
The growth of social media was enabled by the “safe harbor” provision in the 1998 Digital Millennium Copyright Act, which freed companies from liability for copyright-infringing content posted by users as long as they responded quickly to takedown requests. In other words, what it means to be a technology company rather than a media company is that tech companies do not have to review content in advance. Users register, posts go up, and violations get sorted out later. That has allowed these platforms to scale at the speed of their algorithms, leaving humans running behind to catch up.
This model—scaling across a vast number of users with minimal human involvement—makes social media both powerful and dangerous. In addition to fines, oversight and transparency, legislators should consider whether brakes need to be put on this business model. Imagine if Facebook, in order to register a user, had to invest even 1 percent of the effort the Transportation Security Administration expends to enroll someone in PreCheck. No doubt the social media giants would deem such an approach unworkable, but it would get their attention more than a $5 billion fine.