On Jan. 2, 2007, dressed in his trademark turtleneck, jeans and sneakers, Apple chief executive Steve Jobs debuted the iPhone. It was like a door opening directly into the future. Almost overnight Silicon Valley seemed to become the capital of human progress, a Wonka-esque home to possibility and wonder.
Eleven years later, the reputation of our latter-day Athens has curdled. Rather than a mall or town square where you can find anything you can imagine, the internet appears to have become a surveillance system that would surprise even Foucault, with cameras hidden on every page you visit, tracking your choices, the movements of your cursor, even the searches you delete.
As a founding figure of the Internet Corporation for Assigned Names and Numbers, an international nonprofit that oversees the smooth running of the internet, Dr. Paul Twomey watched the online universe explode into life at the turn of the century. Now an international consultant on cybersecurity, privacy and governance, he has followed the growth of social media and search engine platforms, and watched the rise of mining personal data as the internet’s default business strategy. He has been in the room with the people involved.
What he sees there leaves him cold. “There’s no one saying you can’t do some things,” he says of many tech organizations today. “There are no adults in the room. You talk to them about ethics, the concept of political freedom, and they’re like deer in the headlights. They have no idea what you’re talking about.”
It is a point that has come up many times in recent months: How does a company like Google or Facebook evaluate its ethical responsibilities? Does it? In November, the actor Kumail Nanjiani (of the TV series “Silicon Valley”) tweeted about visiting start-up companies for the show and inquiring about products that seemed potentially harmful. “They don’t even have a pat rehearsed answer,” he wrote. “They are shocked at being asked.... ‘We’re not making it for that reason but the way people choose to use it isn’t our fault. Safeguards will develop.’”
“Only ‘Can we do this?’,” Mr. Nanjiani continued. “Never ‘Should we do this?’”
Dr. Twomey agrees. “People complain about the [National Security Agency]. I trust the spooks more than I trust these people,” he says, unfavorably comparing tech companies to surveillance agencies.
He notes with equal concern the growing transfer of important societal functions to computer algorithms. “For the last 200 years,” says Dr. Twomey, “we’ve been developing political systems to ensure an essential set of values around things like fairness. Now [those determinations] are being done increasingly by private companies via algorithms.”
In some cases very poorly: In 2016 a ProPublica study found that the COMPAS software now used in many state court systems to advise on sentencing was only “somewhat more accurate than a coin flip” in predicting the likelihood of an individual’s future criminal activity within the following two years. Further, the software was “likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.” A separate study in 2018 found the software no more accurate than predictions made by people with little or no criminal justice expertise.
Similar problems have been seen in algorithms for everything from teacher performance and hiring practices to loan evaluations.
A key problem, says Dr. Twomey, lies in the background of the creators. “Ninety percent of all algorithms written in this world are written by the same people, twentysomething male gamers. The tech companies say they’re wonderfully diverse, and it’s nonsense. They’re every color of the rainbow, but they’re all the same person, 20-to-30-year-old men coming from mathematics, computer science, maybe physics. And most of them wouldn’t know who Aristotle was if they fell over him.”
“The values of technology creators are deeply ingrained in every button, every link, and every glowing icon that we see,” tech activist Anil Dash wrote in March. So, too, are their assumptions and blind spots. A predictive algorithm used by child protective services ends up disproportionately flagging black children as needing intervention while underreporting similar situations with white children. Another tool, for setting car insurance rates, ends up charging lower-income people more.
External evaluations of algorithms are resisted by both tech companies and state agencies, and they become more difficult as the algorithms grow more complex. Without such audits, coders’ prejudices and assumptions stay buried in the system. Dr. Twomey fears the result will be a “global caste system” in which “your kids are going to get discriminated against” without even knowing it.
How can the church help fight this? To say that the Vatican does not tend to adapt quickly, particularly when it comes to science and technology, is an understatement. (See: its many apologies to long-dead scientific pioneers.) Its general lack of pliancy would seem to be a huge disadvantage when change and innovation are happening so fast.
But Dr. Twomey says that becoming tech-savvy matters less for the church in this moment than continuing to stand clearly with and for those on the margins: “The church needs to understand that the preferential option for the poor in a digital age includes the digital naïve, ignorant and excluded. Someone has to be a voice for their interests, a voice for their safety, a voice [to ensure] they’re being treated fairly.” The digitally excluded need the church’s leaders and scholars as their advocates.
And the church may in fact have much to contribute to the development of algorithms. “Collectively the Catholic Church and her institutions have a Big Data capacity that rivals, if not surpasses, any social network,” tech commentator and consultant Father Robert Ballecer, S.J., points out. “Are we aware of the power in that data? Can we show the world a more responsible way to use it?”
The tools of our information age are “fantastic,” says Dr. Twomey. “They do wonderful things. But we need to have confidence in the principles we think are important. It’s taken us a long time to express them in our political process. We shouldn’t allow some start-up to say they don’t apply to them. This is just hubris.”