I spent nearly five years, several of them in Italy, seriously discerning a religious vocation. One year, I went on a retreat at a monastery in Umbria, where I got to know the abbot during my five-day stay. One evening, he noticed that I entered the chapel to “pray” carrying a large stack of books under my arm — one of them, unironically, was the spiritual tome “Difficulties in Mental Prayer.”
When I exited the chapel, it was almost midnight, so I was surprised to find the abbot pacing up and down the portico outside, hands behind his back, the hood of his robe covering his head. When he saw me, he uncovered his head with a flourish, smiled, and said, “Maybe next time, leave the books in your room.”
He explained that in his decades-long stint as abbot, he had noticed something unusual among new novices training to enter the order — something he felt was damaging their development. Beginning in the mid-2000s, he saw them start to haul stacks of books into the chapel to use as prayer aids. He suspected, he told me, that it had something to do with the widespread use of computers and smartphones. People were beginning to think of themselves, and of prayer, the way they think about computers. There seemed to be a fear of self-directed thought: a fear that thinking must be “useless” if there is no input, just as a computer cannot run without a program.
This brief anecdote haunted me. The next time I went to sit in the chapel, I went alone, hands empty, and endured a painful and purifying silence. The idea that I was developing a calculating, computer-like mindset that affected me at the deepest levels of my soul was sobering.
The degree to which we humans imitate machines, which are the works of our own hands — things that are necessarily derivative of us, yet that we endow with demigod status due to their automagical power to perform tricks that we cannot — is the degree to which we will lose our distinctively human faculties. We become like that which we imitate.
Their idols are silver and gold,
the work of human hands.
They have mouths, but do not speak;
eyes, but do not see.
They have ears, but do not hear;
noses, but do not smell.
They have hands, but do not feel;
feet, but do not walk;
they make no sound in their throats.
Those who make them are like them;
so are all who trust in them.
—Psalm 115
Technology has the potential to do tremendous good for humanity. We should build more humane tech, the kind that will help solve some of the world’s most pressing problems. But we should not imitate it.
That is a temptation that is hard to resist. Humans are very skilled and complex imitators, as René Girard has shown. We imitate so well and deeply that we can even intuit and mimic our fellow man’s desires. He called our tendency to do so mimetic desire. “Man is the creature who does not know what to desire, and he turns to others in order to make up his mind,” he wrote. These “others” to whom we turn are what Girard calls our all-important “models” of desire.
Now, we are not just turning to other people as our models of desire; we are turning toward our devices.
New ways of living
The models we choose to imitate most naturally are indeed other humans. Dr. Andrew Meltzoff at the University of Washington’s Institute for Learning and Brain Sciences has advanced the “like me” theory of infant imitation. Meltzoff discovered that children come out of the womb immediately capable of imitating their fellow humans; they will not imitate a machine performing the same action. From the earliest moments of life, it seems, a baby recognizes and imitates only what is “like” it. Maybe the infant has a tacit awareness of its own dignity.
There is strong evidence to suggest this is changing, though, as the wise abbot I met in Italy observed. Adults are becoming more like what they imitate — and right now, the preponderance of imitation seems to be of technological devices or frameworks.
Gen. Jim Mattis famously said that “PowerPoint makes us stupid.” Officers in the Pentagon noted that it “stifles discussion, critical thinking, and thoughtful decision-making.” Jeff Bezos banned it from Amazon.
Instagram’s internal researchers found that the social media platform led to a “teen mental health deep dive,” worsening body image issues for one in three teenage girls. On top of the mental health problems, social media platforms like Instagram have made it more difficult for ardent users to stay anchored in reality for more than a few seconds. Everyone has seen someone enter a space in the real world and treat it merely as a source of Instagrammable images. The real-world behaviors mimic the behaviors learned from the device.
And a new generation of porn-addicted people — fueled by the ability to view hardcore porn anywhere, free of cost, in the palm of one’s hand — is now shaping a new aesthetic. Aside from the objectification of the human body, the trickle-up effect of porn aesthetics (norms around grooming, for instance) is a reminder that what people consume has far-reaching consequences. It is not just life imitating “art”; it is people imitating a mediated reality that their devices have made ubiquitous.
The lines between reality and mediated reality are blurring, and Meltzoff’s “like me” hypothesis is taking on added importance. If his hypothesis is correct, and we tend to imitate whatever is most “like us,” then the nature of imitation is different today because we live in a world where it is simply harder to tell what is and is not fully human. This is the root cause of our mimetic crisis: the uniquely human is becoming obscured, harder to recognize and identify.
The predominantly agnostic development of technology — characterized by a disregard for whether or not something contributes to a healthy human ecology — has made it harder to distinguish the human from the subhuman.
The digital politics of spiritual war
It’s helpful to take a brief step back to revisit a couple of classic philosophical ideas to understand the new situation we find ourselves in. Aristotle wrote at length about the concept of technē, or the technical. For the philosopher, the carpenter was the prototypical technologist; his hands were the supreme technology. He could fashion a piece of shapeless wood into a chair. When he was finished, there was nothing more to do but sit in the chair and enjoy it.
The chair was an inanimate object, the work of human hands. The Greeks reserved their wonder for φύσις (physis), or “nature,” which comes from the word φύω (phyō). It signifies something that is born, develops, and has the ability to move on its own according to its own principle of life. If a building had this quality, for example, then an architect could lay the first stone with an image of the completed project in his mind; the stones would then organize and construct themselves into a completed design.
A tool or object has no life principle, initiative, or process. It is not like a tree, which contains its nature within it. For that reason, technology was not a source of wonder for the Greeks; a tree, however, was. Nature is beautiful because it is “other”: it is not something we have forged with our own hands.
The loss of this sense of alterity, of otherness, is at the crux of the predicament we find ourselves in today. It is precisely what Eric Schmidt, the former CEO of Google, meant when he said that the internet would “vanish” because it would become a part of everyday objects and services. “It will be part of your presence all the time,” he said.
This aspect of the modern technological world is what Martin Heidegger referred to as the Gestell, a German word for a frame or rack, signifying the imposition of one thing onto another: a total enframing of reality.

Today, most of the world we experience is technē, and all of it sits within the Gestell. The Gestell, as Heidegger predicted, is a kind of all-encompassing cage, since even our attempts to escape from technology are themselves technological. (Consider the features on most smartphones that now help you monitor your usage and block apps during specific periods.)
The notion of alterity disappears as everything collapses into a new enmeshment of humanity and tech. The difference between physis and technē is no longer as clear as it once was, and that is breeding general confusion about human nature itself.
The materialist and pop anthropologist author Yuval Noah Harari (of “Sapiens” fame), for instance, thinks that humanity is in the process of upgrading itself into “gods,” evolving into a kind of techno-species that might even one day be able to conquer death. For him, the fusion of nature and technology is almost complete.
Harari is wrong about human nature, but he’s right to see that humanity no longer knows what it wants. “The real question facing us is not ‘What do we want to become?’ but ‘What do we want to want?’” he wonders near the end of his book.
Most people no longer know what they want because it’s harder than ever to find powerful models of desire — models of humanity that inspire greatness and show man fully alive.
We won’t achieve this level of humanity by attempting to upgrade ourselves into a higher species, as Harari believes. We will achieve it through an anti-mimetic approach: to reject the predominant models of mediated and mediocre humanity on offer to us today and to adopt new models of humanity at its best.
We need not manufacture them. We need only a more expansive vision: to look further into the future and deeper into the past, as Petrarch did, to beauty so old and so new. Our models can never be the work of human hands.