'Open the pod bay door, HAL' — here's how AI became a movie villain
This article was written by a human.
That's worth mentioning because it's no longer something you can just assume. Artificial intelligence that can mimic conversation, whether written or spoken, has been in the news a lot this year, delighting some members of the public while worrying educators, politicians, the World Health Organization, and even some of the people developing AI technology.
Misuse of AI is part of what actors and writers are striking about in Hollywood, and the threat of AI is something Hollywood was imagining long before it was real.
In 1968, for instance, the year before humans first set foot on the moon — and a time when astronauts still used pencils and slide rules to calculate re-entry trajectories because their space capsules had less computing power than a digital watch has today — Stanley Kubrick introduced movie audiences to a sentient HAL-9000 computer in 2001: A Space Odyssey.
HAL (for Heuristically Programmed Algorithmic Computer) introduced itself early in the film by saying, "No 9000 computer has ever made a mistake or distorted information. We are all, by any practical definition of the words, foolproof and incapable of error."
'Open the pod bay door, HAL'
So why was HAL acting so strangely? He (it?) was responsible for maintaining all aspects of a months-long space flight, ferrying astronauts to the moons of Jupiter. Programmed to run the mission flawlessly, the computer's behavior had become alarming, and two of the astronauts had decided to shut down some of its functions. Their plan was short-circuited when HAL, lip-reading a conversation they'd managed to keep him from hearing, cast one of them adrift while he was outside the ship repairing an antenna and refused to let the other back on board.
"Open the pod bay door, HAL" became one of the most quoted film lines of the decade when the computer responded, "I'm sorry, Dave, I'm afraid I can't do that. This mission is too important for me to allow you to jeopardize it."
It's hard to articulate what a genuine shock this was for 1960s movie audiences. There'd been films with, say, robots causing havoc, but they were generally robots doing someone else's bidding. Movie robots, at that point, were about brawn, not brain.
And anyway, malevolent robot stories were precisely the sort of B-movie silliness Kubrick was trying to avoid. So his intelligent machine simply observed (with an unblinking red eye) and, when addressed directly, spoke with a calm, modulated voice, not unlike the one that would be adopted four decades later by Siri and Alexa.
Darwin Among the Machines
Earlier literary notions of "artificial" intelligence — and there were not a lot of them at that point — hadn't really caught the public's imagination. Samuel Butler's 1863 article "Darwin Among the Machines" is generally thought to be the origin of this species of writing, and it mostly just notes that while humankind invented machines to assist us — and remember, a really sophisticated machine in 1863 was the steam locomotive — we were increasingly assisting them: tending, fueling, repairing.
Over tens of thousands of years, Butler wondered, might humans not evolve in much the same way Darwin's study of natural selection had just established the rest of the plant and animal kingdoms do, to the point that we would become dependent on our devices?
But even when he incorporated that idea a decade later into a satirical novel called Erewhon, expounding for several chapters on self-replicating machines, Butler barely touched on the notion that those machines would develop consciousness. And neither did the influential 19th-century science fiction writers who followed him. H.G. Wells and Jules Verne invented plenty of unorthodox devices as they sent characters to the center of the Earth, and into space and the recesses of time, without ever considering that those devices might want to do things on their own.
The term "artificial intelligence" wasn't even coined (by American computer scientist John McCarthy) until about a dozen years before Kubrick made his Space Odyssey. But HAL made an impression on the public where scientists had not. Within just a couple of years, movie computers didn't just want spaceship domination; in Colossus: The Forbin Project (1970), they wanted to take over the world.
Malignant machines gone viral
And then this notion of technology-run-wild ran wild. A high school student played by Matthew Broderick nearly started World War III in WarGames (1983) when he thought he was dialing into a computer game company's system but accidentally challenged the Pentagon's defense network to a quick game of "global thermonuclear war." The problem, it soon became clear, was that no one told the defense network they were just "playing."
Elsewhere, mechanical men stopped being all-brawn and got a new dispensation to think for themselves, something fiction had granted them before Hollywood got around to it.
In the 1940s, sci-fi novelist Isaac Asimov came up with "Three Laws of Robotics" that would theoretically keep "independent" machines in line. When Asimov's I, Robot was turned into a film a half-century or so later, those laws should have reassured Will Smith as he stared down thousands of bots. But he had good reason to be skeptical; he was fighting a robot rebellion.
The Terminator movies effectively put all these themes on steroids — cyborgs in the service of a computerized, sentient, civil-defense network called Skynet, designed to function without any human input. A "Nuclear Fire" and three billion human deaths later, what was left of humanity was engaged in a war against the machines that has so far consumed six films, a TV series, a pair of web series, and innumerable games.
And nuclear blasts weren't necessary to make machine intelligence alarming, a fact cyberpunk-noir established definitively in Blade Runner with its "replicants," and in a Matrix series that reduced all of humanity to a mere power source for machines.
Hollywood's still fighting that vision. Who knows what "The Entity" wants in Mission: Impossible – Dead Reckoning Part One (presumably we'll find out next year in Part Two), but whatever it is, it won't bode well for humanity.
It seems not to have occurred to Tinseltown that AI might do the things it's actually doing — make social media dangerous, or make undergrad writing courses unteachable, or screw up relationships by auto-completing incorrectly. None of those are terribly cinematic, so Hollywood concentrates on exploiting our fears — in the late 20th century, we worried about ceding control to technology. In the 21st century, we worry about losing control of technology.
Bring on the droids
Have there also been friendlier film visions of AI? Sure. George Lucas came up with lovable droids R2-D2 and C-3PO for Star Wars, and Pixar gave us Wall-E, a bot who was pluckily determined to clean up an entire planet we'd despoiled.
Spike Jonze's drama Her imagined a sentient, Siri-like personal assistant as a digital girlfriend. Star Trek's Data was not just a Next Generation android version of Mr. Spock, but also a sort of emotion-challenged Pinocchio.
And another Pinocchio — this one fashioned to stand the test of time — would have been Stanley Kubrick's own answer to the question he'd posed with HAL in 1968.
Kubrick labored for decades to hone the script for A.I. Artificial Intelligence, then, just two years before he died, handed the project off to Steven Spielberg. It's the story of David, a robot child who has been programmed to love, and who ends up going beyond that programming.
"Until you were born," William Hurt's Professor Hobby told the bionic child he'd modeled on his own son, "robots didn't dream, robots didn't desire unless we told them what to want." The miracle, he went on, was that though David was engineered rather than born, he shared with humans "the ability to chase down our dreams...something no machine has ever done, until you."
That may not have been enough to make David a real boy, but it put a gentle face on what is perhaps our greatest fear about AI – that we are mortal, and it is not.
In the film, David outlives all of humanity, never growing up, never changing. And perhaps because he was played by Haley Joel Osment, or perhaps because Spielberg was calling the shots, or perhaps because the music swelled ... just so — it didn't feel the least bit threatening.