Is the artificial intelligence in M3GAN a real threat?
She is titanium.
The blonde-haired, sweet-voiced android at the center of M3GAN has been programmed to be the perfect child companion. But then the murders begin.
The new film from Housebound director Gerard Johnstone and Malignant screenwriter Akela Cooper is the latest to tap into cinema’s deep-rooted fear of machine insurgencies. The classic “robots gone haywire” trope stretches from Fritz Lang’s Metropolis all the way to Roland Emmerich’s Moonfall. M3GAN leans closer to Joe Begos’ Christmas Bloody Christmas than to either of those films — think “slasher-level carnage” instead of “extinction-level event.” The shift makes M3GAN’s horror more personal: a malfunctioning smart assistant is scarier than nuclear holocaust because the threat lives at home. Technology is a wonderful thing. But what if the fabulous gadgets and inventions we rely on to ease our daily routines suddenly turned against us?
For scientists making strides in the fields of robotics and artificial intelligence, this is the multi-multi-multi-million-dollar question, which M3GAN asks mainly through Gemma (Allison Williams), a roboticist at a toy company. She doesn’t understand children, but she does understand robots. When her niece Cady (Violet McGraw) is put under her care, Gemma immediately turns to technology as a parenting crutch. Gemma’s first scene in the film frames M3GAN as an ambitious side project — and also as a disaster in progress. A demonstration for her boss, David (Ronny Chieng), ends with prototype M3GAN’s head exploding. But the doll gains new urgency in Gemma’s life once Cady moves in, and M3GAN goes from being Gemma’s ticket to new success to a dire necessity, because Gemma really can’t hack this “guardianship” thing.
Ultimately, M3GAN is good for Cady — and for Gemma, who gets to wash her hands of such inconveniences as playtime and bedtime, now under M3GAN’s purview. This suits Gemma and M3GAN just fine. Gemma is a career woman, and M3GAN is literally made to look after Cady. It’s a win-win-win. Then the bodies start dropping.
AI is creeping closer to ubiquity in everyday life. With that ubiquity comes the potential for dangerous consequences: as we slowly accept new forms of artificial intelligence, from social media monitoring to healthcare management, we invite new, unintended risks to our livelihoods. If that thought keeps you up at night, there’s good news, and there’s bad news. The good news is pretty good, but the bad news is still alarming.
“People in robotics and AI do take seriously the notion that an artificially intelligent agent could become ‘superintelligent’ and wreak havoc,” Tom Silver, a PhD student at MIT CSAIL who works with robotics and AI, tells Inverse. “There is a very active research community on AI safety where these existential risks are explored.”
However, the doomsday scenarios entertained by James Cameron in The Terminator and Terminator 2: Judgment Day, and by the Wachowskis in the Matrix films, are just that: entertainment.
“One reason is that we are still struggling to implement very basic competencies in robotics,” Silver says. “For example, we don't yet have robots that can reliably pick up, move, and place arbitrary objects.” You can’t mow down humans with plasma rifles if holding a flower pot is a Sisyphean task. This should give a measure of relief to paranoiacs stocking up on nonperishables in their remote bunkers. For the rest of us, being hunted by T-800 endoskeletons is the least of our worries.
“In light of ChatGPT, people are more imminently concerned about something like a superintelligent chatbot than they are about a superintelligent embodied robot,” Silver says. Artificial intelligence’s applications, and the fear of artificial intelligence going awry within those applications, are myriad: discrimination, better known in the worlds of robotics and AI as “machine learning bias”; disinformation à la deepfakes; and techno-solutionism, the notion that whatever problem you have, there’s an app out there to fix it, and if there isn’t, then there should be.
Still, M3GAN the film doesn’t argue that we should be afraid of M3GAN the robot. As Silver notes, we’re at a stage in robotics where she couldn’t carry a cup of coffee without likely spilling it on someone’s lap. What we should be afraid of, or at least wary of, is M3GAN’s dramatized learning model.
“The bigger danger, I think, which is with the newer changes in artificial intelligence, is machine learning,” says Ronald Arkin, Regents’ Professor Emeritus at the Georgia Institute of Technology’s College of Computing. “A lot of the work right now in deep learning, which is a very powerful tool and is producing quite interesting results, is the fact that it is not transparent and it often doesn't have explainable capabilities.”
Arkin refers to automated weapons — picture an air defense system knocking a hostile missile clean out of the sky — as a major area of concern for robotics researchers. Granted, M3GAN doesn’t get into this particular area of study, though an android dancing to a Taylor Swift remix, shimmying and cartwheeling down a hallway just to rip the guillotine blade off a paper cutter, is arguably a weapon in its own right. But M3GAN relies on the same form of AI as that air defense system: deep learning, which mimics humans’ way of learning by example. M3GAN is effectively a lifelike mecha-nanny with an advanced deep learning model. Her software is coded to recognize behavioral patterns, which is what allows her to become Cady’s ultimate friend and caregiver, as well as a ruthless, and stylish, killing machine.
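For readers curious what “learning by example” actually looks like under the hood, here is a minimal, purely illustrative sketch of a supervised training loop: a tiny two-layer neural network fit to made-up data with gradient descent. Every feature, label, and number below is invented for demonstration and has nothing to do with the film’s fictional software.

```python
# A toy "learning by example" loop: a tiny neural network trained by
# gradient descent on fabricated data. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical examples: 200 observations, 2 features each, binary label.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)  # the "pattern" to learn

# One hidden layer with 8 units.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(500):
    # Forward pass: compute predictions from the current weights.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: measure the mistakes, then nudge every weight to shrink them.
    grad_logits = (p - y) / len(X)                # gradient of the cross-entropy loss
    grad_W2 = h.T @ grad_logits
    grad_b2 = grad_logits.sum(axis=0)
    grad_h = grad_logits @ W2.T * (1 - h ** 2)    # backpropagate through tanh
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy after 500 steps: {accuracy:.2f}")
```

The shape of the loop is the whole trick: show the model examples, score its mistakes, adjust the weights, repeat. Scaled up by many orders of magnitude, that same recipe drives the deep learning systems Arkin describes.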
People lie in M3GAN, which probably explains how M3GAN develops such alarming proficiency in lying herself. On an ill-fated visit to an alternative school where Gemma intends to enroll Cady, M3GAN chases Brandon (Jack Cassidy), a vicious bully, into oncoming traffic, then dodges questions about her role in his death. Later, she corrupts her own memory files to prevent Gemma from discovering that she’s capable of murder most foul.
Like Silver, Arkin ranks machine learning bias as a greater threat than slasher-movie violence, and it’s a concern that helps shape M3GAN’s brand of horror. “Biases that can be picked up from training data, which has inherent biases in it, often go undetected,” he says.
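A crude, hypothetical illustration of how that happens: a model evaluated only on aggregate accuracy can look perfectly healthy while failing an underrepresented group entirely. The groups, labels, and predictions below are fabricated solely to show why a single headline metric can hide the problem.

```python
# Hypothetical evaluation showing how an aggregate metric can hide bias.
# All data here is fabricated for illustration.
import numpy as np

groups = np.array(["A"] * 900 + ["B"] * 100)        # group B is underrepresented
y_true = np.ones(1000, dtype=int)                   # the correct outcome for everyone
y_pred = np.concatenate([np.ones(900, dtype=int),   # the model is right for group A...
                         np.zeros(100, dtype=int)]) # ...and wrong for all of group B

overall = (y_pred == y_true).mean()
per_group = {g: (y_pred[groups == g] == y_true[groups == g]).mean() for g in ("A", "B")}

print(f"overall accuracy: {overall:.0%}")   # 90%: looks fine on a dashboard
print(f"per-group accuracy: {per_group}")   # A: 100%, B: 0%: the bias nobody checked for
```

Audit the headline number alone and the model ships; break the score out by group and the flaw is obvious, which is exactly why researchers push for disaggregated evaluation.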
And one reason those biases, or other flaws, might go undetected is the marketplace. Just as software and hardware designers are forever updating today’s devices into tomorrow’s, M3GAN takes a contemporary angle on the “killer robot” niche. M3GAN doesn’t kill people; good old-fashioned capitalism and rushed production do. (Okay: She does kill people, but that’s what we pay for when we buy a ticket to a slasher.)
“There's something in ethics called the precautionary principle,” Arkin explains, “which talks about trying to make sure that you have a good understanding of what is going to happen when you release new technology or emerging technology into the world.” People don’t follow this principle to the extent that they ought to, in part because over-caution muzzles innovation, and in part because commercial companies scramble to get their gizmos on shelves as fast as possible.
“You don't wanna lose your competitive edge if you're leading,” Arkin adds, and this is where M3GAN dovetails with reality. Gemma puts M3GAN in front of her company’s board of directors, along with David and their boss, the company president. “I think the world is about to shift on its axis,” he tells them after Gemma’s demonstration leaves everyone in the room a sputtering pile of tears. He wants to fast-track M3GAN’s distribution before Christmas. Cost is no object. Time, however, is. Gemma doesn’t read this dash to retail as a problem at first. She’s all in. It’s not until she catches a whiff of M3GAN’s foul play that she starts having doubts, and by then it’s too late.
Arkin brings up the most relevant commercial AI du jour: the self-driving car. “The goal to get that technology out as fast as it possibly can, which could lead to great benefits, also has significant risks associated with it,” he says. Properly weighing the holistic impact of self-driving cars on society and on individuals is a fundamental responsibility of robotics research.
M3GAN marries that essential point with the indomitable evil of artificial intelligence in revolt. Technology, at its best, allows us access to information, entertainment, and, in a minor stroke of irony, human connection when we need them most. It’s a security blanket for comfort in discomfiting times. M3GAN tugs at that blanket’s loose threads until they unravel completely.
M3GAN is currently playing in theaters.