Sex Robot Samantha's 'Moral Judgment' AI is "Nonsense": Experts Tell Us Why
Samantha may one day judge owners based on their morality.
The sex robot Samantha could one day make judgments about your moral values, and use them to determine how much it “wants” to have sex.
That’s according to Samantha’s creator Sergi Santos, whose $5,000 machine has an artificial intelligence framework that could receive an update further down the line. If Samantha judges you to be a good person, for example, that could increase its libido and make it more “interested” in sexual activity.
“It requires a bit more thought than that but the algorithm I designed allows for all of that relatively easily,” Santos tells Inverse. “That’s why I published it, I think it can be adapted easily to do all these things but I need to do them.”
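To picture the kind of mechanism Santos is describing, here is a minimal, hypothetical sketch: a running “moral score” that the robot assigns to its owner feeds a libido parameter, which in turn gates whether the machine is “interested.” The class, numbers, and thresholds below are illustrative assumptions, not details taken from Santos’s published algorithm.

```python
# Hypothetical illustration only; not Santos's published algorithm.
# A "moral score" assigned to the owner scales the robot's libido,
# which in turn decides whether it responds to advances.

from dataclasses import dataclass


@dataclass
class MoralModel:
    base_libido: float = 0.5   # resting libido level, 0.0 to 1.0
    moral_score: float = 0.5   # current judgment of the owner, 0.0 to 1.0

    def update_judgment(self, interaction_rating: float) -> None:
        """Nudge the moral score toward the rating of the latest interaction."""
        self.moral_score += 0.1 * (interaction_rating - self.moral_score)
        self.moral_score = min(max(self.moral_score, 0.0), 1.0)

    def libido(self) -> float:
        """Libido rises or falls with the owner's judged morality."""
        return min(max(self.base_libido * (0.5 + self.moral_score), 0.0), 1.0)

    def interested(self, threshold: float = 0.4) -> bool:
        """The robot only 'wants' sex when libido clears a threshold."""
        return self.libido() >= threshold


model = MoralModel()
model.update_judgment(0.9)   # a "good" interaction raises the score
print(round(model.libido(), 2), model.interested())
```

Even in this toy form, the hard part is obvious: everything hinges on where the interaction ratings come from, which is precisely the judgment the experts below say a robot cannot yet make.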
Sounds fascinating, right? Unfortunately, it doesn’t quite match up with reality. Santos published his work in the International Robotics & Automation Journal, outlining the A.I. that underpins Samantha’s operation. Inverse sent the paper over to experts to see what they made of his claims. The feedback was… not good.
“There is nothing in this work that convinces me that it will work,” Noel Sharkey, robotics professor at the University of Sheffield and a judge on BBC’s Robot Wars, tells Inverse. “It is a paper theory with no simulation and testing and there is missing the needed detail to implement it. This is a clear case of ambition over achievement, but it is quite interesting nonetheless.”
Sharkey’s issue with the paper isn’t that it describes something impossible, but that it doesn’t adequately prove that Santos can create such a moral code.
Santos’ machine, which hit stores in September 2017, has regularly featured in headlines as its creator has made broad statements about its future development. He previously told Inverse that it could one day free humanity from work by acting as an all-purpose assistant, and also said that it would be “not so difficult” to upgrade the machines to make babies.
With his moral code, Santos has been bullish on the development schedule — he gave the International Business Times a timescale of around two to three months. However, the published paper does not explain how this will happen.
Other experts doubt that a robot could even make such sweeping moral judgments.
“I think this is nonsense - quite apart from the deeply questionable morality of sex robots (which I regard as a very bad idea) — there is no way that any robot, let alone a sex robot, can make judgements about the intentions (good or otherwise) of humans,” Alan Winfield, professor of robot ethics at the University of the West of England, tells Inverse. “That would require the robot to have an artificial theory of mind (for a human) and AToM remains a distant research goal.”
A sex robot with a moral code could do some social good, though. John Danaher, a lecturer in law at the National University of Ireland, Galway, and co-author of Robot Sex: Social and Ethical Implications, tells Inverse that A.I. aimed at encouraging good behavior could have a positive social impact. Depending on the implementation, Santos’ system could subvert the image of sex robots as ever-ready sexual partners and encourage users to develop healthier attitudes toward sex.
“The technicalities of this may be an issue, and I would take Sergi Santos’s claims with a pinch of salt (as I would take the claims of any purveyor of technology), but the basic idea is not absurd,” Danaher says.
There is a darker side to all this, though. Danaher notes that a robot that can express reluctance about sexual activity could also enable people to act out violent sexual fantasies. Santos’ system seems to sidestep the issue by merely lowering libido rather than refusing outright, but it’s an ethical conundrum other manufacturers will have to consider.
Responding to the criticism, Santos says that the moral code is “just a concept,” but that defining a sense of right and wrong is “trivial.”
“The North Korea guy claims making weaponry that can destroy the world is good,” Santos says, referring to the country’s leader, Kim Jong Un. “Trump claims that them making it is bad, but that Americans should have it. It’s based on nothingness, just opinions coming from very vulgar biological needs like egocentrism, jealousy and will to power.”
From here, Santos says he will define his own moral code and implement it in Samantha. Contained in each doll’s head is an SD card slot, meaning his manufacturing team, Synthea Amatus, can release software updates to improve the A.I. over time.
“There is much work to be done in the algorithm I proposed to actually make it happen,” Santos says. “It needs work, but in principle, I don’t see why it shouldn’t work. When it’s done, if I manage to do it, then we’ll see.”