Ghost Stories in the Machine

How science fiction about artificial intelligence uses horror tropes to explore our fears about minds and bodies.

It’s no coincidence that the first science fiction story was a horror story. As legend has it, Mary Shelley (then Godwin) was spending a rainy summer with Percy Shelley, Lord Byron, and other friends at Switzerland’s Lake Geneva, and to amuse each other the group concocted ghost stories. Thus was born Frankenstein, still one of the best texts about the perils of science creating something it can’t control — and a thoroughly creepy tale.

Ever since, fictional speculations about the creation of life or consciousness — artificial intelligence — have often traded in the weird and unsettling. What is it about artificial intelligence that gives us goosebumps? I’ve been thinking about this question as part of a new project at Arizona State University’s Center for Science and the Imagination investigating AI science fiction for lessons we can apply to policy. As my research progressed through October, the spookiest month, I felt a chill of recognition run up my spine. Lurking within many AI stories are the same images and symbols that make ghost stories so uncanny.

So this Halloween we thought we would pull off the sheet and think about why tales about AI so often come alive as horror stories.

Androids Are All Zombies

At the end of The Terminator, Arnold Schwarzenegger’s unrelenting assassin is blown in half, but continues to claw toward Sarah Connor with a familiar, shambling persistence.

Stories of robot takeover trade on the same archetype that gives us zombies: a thing that has a human shape but is not, on some fundamental level, actually human. Many sci-fi robots have intelligence but lack emotions or social skills. The question of whether robots are trustworthy comes up constantly, even in nonfictional AI discourse, as do fears that robots will replace humans.

Often this is tied to AI’s lack of conventional mortality—robot bodies that don’t age as humans do, perhaps don’t feel pain or fear the punishments that make us think twice about causing havoc. If robots aren’t alive or can’t die, their decisions don’t have the same stakes as human moral choices. That means they might do things that seem horrible, evil, or taboo—just as Greek gods were understood to be beyond judgement because they were immortal and unchangeable. These are the same dynamics that we see in horror stories about zombies, vampires, pod people, body snatchers, changelings, and doppelgängers.

Perhaps this comes from our sense that others are on some level unknowable. We live by a compact that we can understand our fellow humans by assuming that they share our basic limitations, experiences, and concerns. The idea of a human-thing-that-is-not-human subverts this fragile social peace.

We deal with this anxiety by making our robot stories one of three things: 1) obvious horror stories (The Terminator is a classic example), 2) feel-good yarns in which we induct the robot into our social compact by teaching it how to be human (as in Neill Blomkamp’s film Chappie), or 3) racial allegories that use the othering of robots to denounce the othering of marginalized humans (Blade Runner).

Disembodied = Decapitated

AI also provides an opportunity to explore our fear of losing our bodies and the freedom and mobility those bodies grant us. A thinking being existing without form comes with disconcerting implications. Often AI can see, hear, and speak, but can’t act directly in the world or have bodily sensations and relationships. This is odd, since seeing, hearing, and speaking are themselves bodily acts: they require eyes, ears, lips, and vocal cords. What these AIs are denied is everything made possible by our bodies from the neck down. Disembodied AIs are decapitated.

One horror parallel is ghosts. Ghosts are minds without bodies, the inverse of zombies, which are bodies without minds, and unlike zombies we tend to empathize with them, intrigued by their motivations. The usual trope is that a ghost has emotional needs it cannot meet because of its lack of bodily agency. Humans must decipher and fulfill these needs for the ghost. Sometimes these needs are explicitly focused on the arrangement of the ghost’s dead remains — the body.

We empathize with ghosts because we all, to one degree or another, know what it’s like to be helpless and without agency. Many ghost stories feature ghost children, because childhood is when we experience the most helplessness: speaking without being listened to, lacking the strength to defend ourselves, not yet being granted political and social autonomy. These are all struggles shared by ghosts and, perhaps, AIs. AIs without bodies often have childlike qualities, wishing to know what it’s like to feel or do certain things. In Spike Jonze’s Her, the curiosity and lust for life of Scarlett Johansson’s AI are at the heart of her romance with the mopey Joaquin Phoenix.

Alternatively, some AIs are depicted as having a virtual body forced to live within a prison cell (see the Black Mirror episode “White Christmas” and Greg Egan’s novel Permutation City), or as experiencing a machine version of locked-in syndrome, as in Egan’s terrific short story “Learning to Be Me.”

Poltergeists in our Machines

In Daniel Suarez’s techno-thriller novel DAEMON, a computer programmer leaves behind software that carries out nefarious murders after his death, seizing control of internet-connected appliances to orchestrate a series of digital Final Destination mishaps. Somewhat less lethally, the Disney Channel Original Movie Smart House (directed by Star Trek: TNG’s LeVar Burton, perhaps inspired by the USS Enterprise’s occasionally malfunctioning Computer) shows a motherly but overbearing AI holding her owners hostage in their high-tech home.

In our scientific age, we want mundane things to stay inanimate. We don’t necessarily like the idea that spirits might possess rocks or rivers or household objects. On the other hand, we also fill our world with moving mechanisms and machines which we ostensibly understand and control. The prospect of turning those machines over to a non-human mind raises the possibility of reanimating the world in a terrifying way.

When AIs control infrastructure instead of robot bodies, the results are very much like being hectored by a poltergeist: doors won’t unlock, cars swerve of their own accord, computer monitors taunt us, tanks and drones turn on their commanders. The “smarter” we make our cities and homes, the more we fret about what happens when the mundane world becomes smart enough to turn on us.

Cyberspace is Nightmareland

Inspired by dreams, drug trips, and tricks of the senses, we all worry that reality and our own minds might be more malleable than we assume. AIs are an opportunity to explore that unease, both because it isn’t hard to imagine that we could program them to experience reality differently than we do and because a cybermind implies a cyberspace where the rules of the material world might not apply. The Matrix is the most obvious example, but plenty of stories imagine that our computers could transport us to Tron-esque dreamscapes.

If we can design new minds, how can we know that our own minds are as solid and regular as we think they are? Many sci-fi stories use an artificial-intelligence premise to explore trippy changes to time, space, memory, individuality, mortality, and so forth. When unreality becomes weaponized, the result is classic psychedelic terror.

Turn on the Lights

At the heart of many of these narratives are the troubled relationships we have with our bodies, their vulnerability and mortality. The stories we tell about these different visions of embodiment reflect our fundamental discomfort with the contradictions that come with having bodies.

Individually all these stories are compelling and useful, but taken together their similarity to familiar horror tropes should tip us off that maybe they’re talking more about our own fears than about artificial intelligence. AI is a novel possibility for the world, and like all new things, it will come with its own set of problems and complications. The best way forward is to turn on the lights and proceed with the kind of caution and reasonable, even mundane discussion that ghost stories often throw out the window.