The bots were worried about me.
There I was, mindlessly scrolling past Trump memes, soup recipes and baby pictures when my screen abruptly showed me a new page. In a concerned font, Facebook itself asked if I wanted some help. Would I like some resources on depression and suicide prevention? Would I like to try this hotline? No thanks, Facebook. Back to my soup recipes, please.
I was baffled. Did some friend notice I was suffering from a touch of the gloomies and alert my social media overlords? Did some enemy sic them on me as a prank? Eventually, I realized I had recently quipped, “I could drink myself stupid and just feel a little shabby in the morning, but I ate a few spoonfuls of marshmallow fluff, and now I want to die.”
Ding ding! “I want to die” was a red flag, triggering Facebook’s autoconcern. (“Drink myself stupid” probably did not help, either. No word on the bot’s opinion of marshmallow fluff.)
It was weird. On the one hand, I love privacy as much as any 21st-century person (which is to say sometimes a lot and sometimes not at all, depending on what is most convenient). I would rather pretend I am not being constantly monitored.
On the other hand, it was pleasant to see surveillance used for good, for once. Some programmer had thought to build in an automated trigger in hopes of preventing suicide. It was comical to imagine the bots fretting over my high-fructose corn syrup hangover but sobering to imagine someone posting “I want to die” in earnest. The offer of help would not stop someone hell-bent on killing himself, but for someone gently drifting into despair? I have been rescued by stupider things before. Maybe this automated prompt, impersonal though it was, could help someone get in touch with someone who cares.
There is the key: The automatic message was not supposed to substitute for human interaction. It was just a tool, albeit an imprecise one, to connect people. It was not supposed to replace anyone.
But what happens if we do start relying on algorithms to do the work for us? Some of my friends were urged by Facebook to respond to my post with e-cards of encouragement. That was a little weirder, more intrusive and less personal. So much for privacy. And worse, what if my friends start assuming that, if I really am struggling, Facebook will catch it?
There is a name for this phenomenon: “the lulling effect.” In 1972, the Food and Drug Administration started requiring manufacturers to put child-resistant caps on certain kinds of medicine. In the years that followed, accidental poisonings from analgesics increased. Why? Because parents assumed the bottles were now safe, and they stopped keeping them out of children’s reach. They got complacent, too confident in technology, and so they took less personal responsibility. And children died.
The lulling effect makes sober adults drive into lakes, trees, sand pits and houses, just because their GPS devices tell them to. We trust technology more than our own wits and eyes. Why? Do we doubt ourselves and secretly assume technology is always superior to the human mind? Are we lazy and eager to pass off responsibility to a machine? Are we unsure if there even is a hard line between human and machine?
That road gets dark pretty quick. Behold, the rise of the sexbot. Behold, a nation that will not reproduce or accept outsiders and instead builds smiling pandabots to handle its elderly. Behold, the couple who spent so much time perfecting their virtual child, their newborn starved to death.
Sometimes when I turn on my phone, I get an ad that tries hard to sound like a message from a friend. “Thinking of you!” it says, or “I don’t know, maybe you will find this fun.” I see right away there is no human there, and it always makes me feel lonelier than I did before I turned the damned thing on. Magnify that feeling times a million when you have spent the day lavishing love on a wobbly companion childbot that abruptly stops loving you back when its brain battery catches fire.
Cue T. S. Eliot’s “fear in a handful of dust.” It is not technology we should fear. It is ourselves. We are not replaceable, but we are far too willing to behave as if we are. We are hell-bent on extinguishing ourselves, on smothering the immortal spark with the passionless efficiency of bots who cannot breathe, think, know or love, but who imitate love well enough to make us feel fatally deficient.
Recall the cry of poor passionate Gerard Manley Hopkins:
Man, how fast his firedint, his mark on mind, is gone!
Both are in an unfathomable, all is in an enormous dark
Drowned. O pity and indignation!
On the very eve of the industrial revolution, Hopkins saw how transient we are, how easily extinguished. How easy to replace. If Hopkins felt the fear, he also knew the answer. If humans are in danger of allowing ourselves to be replaced, then the remedy is the same as it has always been: to become more human.
To become more human, as Christ did.
All of humanity needed help. All of humanity cried out in earnest, “I want to die!” And Christ responded. He saw history as a whole, as a single story of humans warring hard to blot ourselves out with whatever technology we could get our hands on, from the stick Cain used to bash his brother’s head in to the lithe, latex limbs molded with precision by the techs at the RealDoll lab. He saw us and he conceived the strange and fearful answer: to become more human.
Enough! says Hopkins. In our distress, Christ comes to us, as one of us. And then, says Hopkins:
I am all at once what Christ is, since he was what I am, and
This Jack, joke, poor potsherd, patch, matchwood, immortal diamond,
Is immortal diamond.
We can be what Christ is because he became what we are.
Be more human. Every day, be more human. Write the note yourself. Check in on your friends. Make the phone call. Do not wait for the algorithm to tell you it is time. Offer yourself—your real self—in love. Be less replaceable. Be more human. Be more like Christ, and be more human.
In the ancient Middle East, it was the practice of people to build temples to their local deities and place the images of those deities in them. The Hebrew writers of Genesis, on the other hand, wrote of God creating the Heavens and the Earth as HIS Temple, placing his own Image in it. From this it isn't hard to understand the Jewish prohibition against graven images; it also gives us the nature of our vocation and purpose as humans: to be Bearers of the Image of our Creator, and to treat well the temple in which we have been placed. Needless to say, we often have failed miserably.
If we don't reclaim our vocation of Image Bearers by following Jesus, the PERFECT image of God (and the sender of the Spirit), in the pattern of his life, death and resurrection, this entire Artificial Intelligence thing will end badly.
To paraphrase Paul: we may understand the secrets of the universe, we may be able to harness the power of the sun, and we may construct machines that can perform miracles, but if we don't have Love, we have nothing.
I enjoyed reading this comment so much. Thanks for posting.
Artificial intelligence is still artificial. I am only irritated when my computer makes links for me. And I am thinking about buying a newer car and wonder if I can get one that does not talk and does not have a phone number of its own. The last thing I need or want is an electronic nag.
And this, ladies and gentlemen, is the very heart and mystery of our faith. That God became human, and lives among us. Almost unfathomable.
As Thomas Merton once said, if we could see in ourselves who we really are, we would bow down and worship each other.
Very characteristic of Merton's stupidity and vanity.
What a self-indulgent article on a matter of the self-evident idiocy of our times. Artificial intelligence does not exist and never will exist. An electrical circuit cannot make value judgments and never will be able to make value judgments, any more than a light bulb can contemplate beauty when it lights up a room.