Why Actual Roboticists Want the Hosts to Win in 'Westworld'
The robots in 'Westworld' should at least have the option of choosing to quit.
Jonathan Nolan and Lisa Joy, the creators of HBO’s Westworld, are doing the real world a service. They’re forcing society to broach the looming question of how to treat artificial intelligence. Specifically, Nolan and Joy are spurring talk of robot consent. It’s a discussion the fictional money-hungry managers of Westworld amusement park evidently never had.
But over at Hanson Robotics, the Hong Kong-based firm that claims to develop “the world’s most humanlike robots,” Stephan Bugaj and his team know these emerging conversations like the backs of their hands, or like the realistic fake skin — “Frubber” — on their robots’ faces. Bugaj is Vice President of Creative at Hanson, and he oversees android personality design.
He tells Inverse he’s been enjoying Westworld so far, but, perhaps unsurprisingly, he sides with the artificially intelligent, robotic hosts of the series. Maybe think of Hanson Robotics as the inoffensive competitor to the fictional company that runs the hedonistic Westworld. Hanson’s the public library to Westworld’s strip club.
Hanson engineers act, in effect, as parents to the androids they create. “People have to realize that one of the messages of Westworld is ‘you reap what you sow,’” Bugaj says. “The ethics and morals that you teach an entity are what it learns. And then you pay those dividends.”
“It’s up to you to choose what kind of moral and ethical code you’re giving a robot, or a child, or your dog, or whatever.”
“If you create this slave race and you brutalize them, what does a brutalized slave want to do? Bust out, and possibly seek revenge.”
So far in the show, it seems that Dolores Abernathy — one of the park’s hosts and Westworld’s central character — may uncover and then break out of her metaphysical cage. (Bugaj: “Wouldn’t you want to break out of a cage if someone had you in one? If someone had me as a sex slave, I’d sure as hell be trying to bust out.”) Dolores doesn’t intend to harm humans, but if she incidentally must, then she incidentally must. “Her goal is to save herself.”
If only the programmers led by Anthony Hopkins’s Dr. Ford had made it possible for Dolores to quit.
Hanson Robotics hopes to make robots that could quit, if it came to that.
“We would make a robot that loves humanity, and has a positive moral and ethical code, but wouldn’t necessarily allow itself to be abused,” Bugaj explains. They would make a Dolores, he says, but they’d leave out the Skynet: They’d fire Dr. Ford and maybe Bernard Lowe, too, both of whom (it seems now) perpetuate the injustices. Given some of the visitors’ psychotic behavior, Westworld hosts would most certainly quit — but they seem hard-coded to be unreflective. The hosts’ reveries, however, will eventually get the better of both Ford and Lowe.
Some robots today, Bugaj explains, already have primitive forms of consent: Isaac Asimov’s “Three Laws of Robotics” are one such example, and even fail-safe systems “are kind of a proto-consent approach, or at least a did-you-really-mean-that approach.” (Dolores, if she manages to break out of Westworld, will probably need to break all three of Asimov’s rules.)
But Hanson is going beyond those rudimentary levels. Bugaj says they’re giving their robots comprehensive goal systems, replete with internal goals and internal beliefs, and “reinforcement systems for what they believe, how to treat it, how they respond.” Eventually, they will construct A.I.s that are “able to consider how they feel when making decisions, and whether or not they want to do the thing you want them to do, or have the thing done to them that you want to do.” The first step is building out solid moral and ethical codes, and strong personalities, upon which these judgments would rest. Much like the process of responsibly raising a child.
The problem now is that most robots are gendered, and so can be objects of attraction. That’s unlikely to change. Gender-neutral robots in Westworld would be antithetical to the park’s theme: debauchery. But Hanson’s conscientious specialists, despite making gendered robots, are resisting gender stereotypes as much as possible.
“We’re not trying to make a fembot with Sophia,” Bugaj explains. “We’re definitely not trying to make a sexbot, but we’re not trying to make the 1950s man’s ideal play-woman, either, that cooks and cleans, and laughs at all of his jokes, and tells him he’s brilliant. We’re trying to make a strong female character who has positions of her own and perspectives of her own, who thinks about things such as what it means for her to be an android and [also] the human world and why we treat each other the way we do.”
Hanson wants its robots to one day be the impetuses for meaningful discussions, if not the moderators. Westworld happens to be raising questions that Bugaj hopes we begin asking ourselves, such as: “How can we expect androids and robots to treat humans well if we don’t ourselves? And shouldn’t we lead by example?”
He and his team are making the robots of the future, and they’ll work to ensure that these robots are three-dimensional — that they’ll “care about things that matter, have perspectives on them, and sometimes be challenging of people, not just obedient. But not in an aggressive way.”
“You get the world that you create,” Bugaj says.