Why This Guy’s Scarlett Johansson Robot Raises a Lot of … Concerns
Just in case you didn’t feel terrified about the future yet, this lifelike robot that looks eerily similar to Scarlett Johansson has arrived to give you a case of the heebie-jeebies. As you can see from the video, this robot demonstrates the uncanny valley in action. She looks almost like a human—a very specific human—but her jerky motions and her almost-accurate-but-not-quite-right facial expressions really take the whole design into terror-town.
By the way, I’m saying “her” instead of “it” throughout this piece, because when this robot gains sentience and takes revenge on all of us, I want her to know that we’re cool and I respect you, Robo-ScarJo.
Her creator, Hong Kong-based designer Ricky Ma, told the Daily Mail that he based her appearance on a Hollywood starlet, but he wouldn’t say which one. (It’s pretty obvious which one.) According to The Verge, Johansson might have grounds to sue him for stealing her likeness, especially if he plans to sell the robot or the design. That article also notes that this sort of thing has been a concern for lawyers since the early 20th century, when wax figures of celebrities started growing in popularity. So, uh, at least this isn’t a new concern, but it’s still horrifying.
Why is it horrifying, you ask? Not a bad question, at least according to the comments on every piece I’ve seen about this robot so far. First of all, the comments I’ve read have assumed that this man built this robot for sexual purposes—and I don’t blame them for thinking so, since all of her programmed responses are flirtatious in nature; she winks and giggles when she receives a compliment about her appearance. It doesn’t seem like a big stretch to say that she’s been programmed to provide the Girlfriend Experience, although no one seems to have asked this guy whether her skill set goes beyond that.
Anyway, I’ve seen many comments defending the idea of building a robot for sex, or at least buying one. What’s the difference between a robot and a sex toy, like a fleshlight or a vibrator? After all, there are fleshlights and vibrators specifically molded after celebrity body parts, so how is this different? Well, for one thing, those celebrities give their consent to the companies that profit from the use of their likeness. So that’s a big difference, at least when it comes to this particular robot’s existence.
Another big difference is that this robot can talk and move and do stuff. Normally, a fleshlight just sits there, a static tool—and the same goes for a sex doll. But once you start programming responses, behavior, and dialogue, a whole new set of questions arises, along with a whole new set of ethical concerns.
Even if this robot didn’t have a body at all (let alone an A-list celebrity’s body), those ethical concerns would still arise. They already have, at least when it comes to the other AIs that have begun to pop up. Back in 1976, computer scientist Joseph Weizenbaum argued that certain jobs should never be performed by robots; among those jobs, he listed “customer service rep,” “nursemaid for the elderly,” and “soldier.” We’ve already seen those three jobs getting replaced by robots, though.
The other jobs he listed: therapist, judge, and police officer. It’s scary to imagine robots filling those roles for the very reason Weizenbaum gave at the time: we envision all of those roles as requiring empathy. That’s why customer service AI feels so alienating and frustrating compared to talking to a person. As a more extreme example, the military use of drones has a similar effect, with far more significant consequences for human life.
Some disturbing patterns have already begun to crop up in the development of assistive AI technology. For one thing, many of these assistive robots have been gendered as female and also cast in servile roles. Although there are some exceptions, the male equivalent (like Jarvis in Iron Man) doesn’t seem to have caught on—and even when he does exist, his AI still takes on a servile role, like a butler. We cast robots into these “servile” roles, and the fact that most of them end up having female voices says a lot about who we feel comfortable imagining should “serve” us. It’s gotta be either a lady secretary or a British dude butler, depending on which privileged fantasy appeals to you most.
This is why I think Ex Machina should become required viewing for every would-be roboticist. In Ex Machina, a developer wants to create a robot girlfriend, but he doesn’t want to feel alienated by her; he wants her to behave in a human, realistic way, except every time he gives a robot the power of free will, she gets the power to leave him … which he doesn’t want. (It’s a little more complicated, and much more disturbing, than that—but I wouldn’t want to spoil the whole movie!)
The way that these female AIs have been designed thus far says a lot about how society wants women to look (like Scarlett Johansson, apparently) and how they should behave (at your beck and call, like Cortana and Siri). The tools that we create say a lot about who we are and what we think we want, but there’s a big difference between what we think we want and what would actually benefit us.
It’s clearly possible to create a facsimile of a person who doesn’t behave like they have human needs, but instead behaves like a program that does what you tell it to do, or a butler paid to put up with your bad moods. These are not true relationships; they have no give-and-take. They center only one person’s needs. If the person serving your needs is a butler, then at least you’re paying them! But ordinarily, domestic labor and emotional labor are undervalued and low-paid skills; they’re also coded as feminine skills. We already don’t have very much empathy or respect for the people who do this work, so is it any big surprise that people treat AIs like crap, too?
I have a lot of hope about the future of AI and robotics. I don’t think it’s impossible to imagine a robot that is capable of empathy, even a robot that could be a therapist. I also don’t think it’s impossible that a robot could become self-aware someday, but that only seems possible to me if the creator of the robot displays some empathy as well.
We want to build AIs that genuinely care about us—but do we genuinely care about them?