Would You Have Sex with a Robot?

[Image: woman in bed with robot]
Ads for the computer love story Her are everywhere, and that has me wondering about the ethics of sex robots. Would I ever have sex with one? Or would I find it ooky and gross — in a moral sense, but also aesthetically and erotically? Here’s what I decided.

I certainly don’t have any objections to using sex toys. I don’t have moral objections, or aesthetic/erotic ones. I think sex toys are awesome, and I use them enthusiastically. And a sex robot would essentially be a sex toy. Unless it had consciousness, from a moral perspective I don’t see how using a robot for sex would be any different than using a vibrator for sex.

But I’m finding the idea ooky.

I think a sex robot would be different from a vibrator, in important ways.

*****

Thus begins my latest piece for io9, Would You Have Sex with a Robot? To find out more about my thoughts on the ethics, aesthetics, and… em, robotics, I guess, of having sex with robots, read the rest of the piece. Enjoy!

_______

UPDATE: In this case, definitely read the comments. It’s a great conversation. Lots of ideas and angles I hadn’t thought of.


10 thoughts on “Would You Have Sex with a Robot?”

  Comment #3

    If androids are being designed to be more and more similar to human beings, does there come a point where there’s not a useful moral distinction between them?

    There is such a robot late in Isaac Asimov’s Foundation series (not the original trilogy). In this case, the robot has passed through the uncanny valley and come out the other side, and I would say that the ethics of having sex with the robot would be no different from the ethics of sex with an organic human being; indeed, Asimov depicts the robot as human (hopefully I’ve said that in a way so as to avoid spoilers for anyone who wants to read it).

  Comment #4

    It’s a very interesting question. To add some fuel to the fire, we often think of cognition as a very humanocentric or organocentric thing. Essentially, we believe that if something processes information in the same way we do then it is conscious, but if it processes information in a different way then it’s not. This is not necessarily true. Just because something thinks in a different way from us does not necessarily strip it of its rights and obligations. So the question becomes: what constitutes consciousness? We don’t even know whether consciousness should be functionally or structurally defined, whether there’s some critical component that makes something aware, or whether awareness is itself a resultant property of the right circumstances, whatever those circumstances may be.

    More disturbingly, if awareness is defined functionally, then it becomes reasonable to surmise that there are possible systems capable of giving rise to consciousness that are abhorrent. For instance, I believe that using serial processing paradigms for the development of AI is unethical because I believe that the self is defined as a continuity of mental states. If a serial system were to exhibit mental states, then the clock cycles of such a system would break that continuity in very strange and unethical ways. Given this, I believe that AI should be built on an asynchronous, multi-core platform.

    Now, more on topic: let’s say a sex robot were created with full consciousness in a way that doesn’t break that continuity of mental states. Such a system would either have a desire to please you, or its best equivalent of a desire to please you. If an AI is purpose-built with a desire to fulfill its role, then the potentially unethical portion would necessarily lie not with using it for that role, but with the construction of purpose-built conscious systems. But this is also a very organocentric viewpoint. Let’s look at the most recent Superman film as evidence. Krypton purpose-built its citizens (leaders, warriors, etc.), and did we feel any kind of moral outrage at this concept? I didn’t; I could see flaws in such a system, but they weren’t ethical flaws.

    Indeed, we as humans purposefully breed humans for certain roles all the time: kings, queens, heirs to businesses. We consciously and purposefully work to shape our children with expectations of making them certain ways: moral, thoughtful, decisive, etc. We have purposeful expectations of our children, and those expectations only become immoral when the child’s desires conflict with them. In the case of the purpose-built AI, there is a significantly reduced risk of this desire/purpose conflict. Do keep in mind that desire is not a mask we wear: if someone desires something, then they necessarily hold that desire, and discomfort with holding that desire arises from conflict with other desires (i.e., if I desired to punch someone, my discomfort with that desire would not be inherent to the action or to me, but would arise because I also desire to adhere to a moral system that forbids punching people). So if we’re tailoring both the desires and the purposes of the AI, we eliminate the two points of immorality that exist in what we already do with children.

  Comment #5

    This is a bit of a long comment, based on the following paragraph from the io9 article:

    Of course, if a sex robot did have consciousness, having sex with it, or indeed creating it, would be morally reprehensible. The idea of designing a conscious being that existed to serve your desires and not have any of its own is reprehensible. I’m not sure what it would even mean to have consciousness and not have any desires of your own. If a sex robot were conscious, that consciousness would either have to be manufactured in a profoundly twisted way that perverted (and not in a good way) the entire idea of what it means to be conscious… or it would be a plain old consciousness. In which case the sex robot would simply be a slave.

    To me, this raises an interesting set of questions, but I’ll raise two here: robot “evolution” and moral quandaries that arise once we have conscious robots.

    The presumption in this article (and many others) is that a human would create consciousness. I would argue, though, that such “intelligent design” (pun definitely intended) may not be necessary for the creation of a sex robot (or – indeed – any robot). We already use tools to make tools more capable of conducting the specific tasks they are meant to do, and we already have conditions in which computers are used to create computers that are better able to do specific tasks. In short, direct human intervention and specific human-directed design may well not be necessary in order to create a sex robot; given a system in which sex robots could be manufactured with preferences (i.e., “selective pressures”) driving new designs for consecutive generations of sex robot (i.e., “natural selection”-like pressures), it is possible that an “undesigned” sex robot – with a motivating drive to provide sexual pleasure for the task that it was “evolved” to do – could end up existing. Furthermore, as Greta points out, since “a big part of the pleasure of sex for me is the connection between two (or more) conscious beings, the overlapping and intertwining of two (or more) sets of desires and limits,” this fundamental attribute could act as a major additional “selective pressure” that would be a game-changer for any line of sex robots in which it “evolved.”

    Now, all of this is premised upon a few points. The first is that a form of “evolution” can occur among robots (thus my use of quotation marks), effectively independent of direct human intervention and design. The second is that consciousness is an emergent property that need not necessarily be designed. The third is that there are many different forms of consciousness – not merely a human consciousness – regardless of the physical form of the entity.

    It’s interesting (at least to me) that Greta’s article at io9 (and others that talk about artificial intelligence in robots) presupposes that sex robots require a creator, and her moral questions stem from that point. However, if robots can also undergo a form of “evolution” (even one unguided by an “intelligent designer”) in which the niche (and the associated selection pressures) is that of human sexual desire, then can we apply the same moral outrage? For example, would the above example of a sex robot that “evolved” a form, consciousness, and identity solely as a being that provides sexual gratification to a human being be any more reprehensible than the evolution of any other organism? If so, what is it that holds the moral reprehensibility? Human society that could allow such a sex robot the time and resources to “evolve” to such a specialized state? Human sexuality that drove the form and design of the “evolution”?

    The other question – and perhaps one that already has many philosophical answers – is what to do with a conscious sex robot (or – indeed – a conscious robot of any kind) once it exists. Whether it is morally reprehensible for someone to specifically design a conscious sex robot or whether a conscious sex robot “evolved” without any “intelligent design,” once we have a conscious sex robot (or several), what is the moral thing to do with it (or them)?

    Do we destroy them? It’s quite possible, after all, that sex robots will have a highly socially disruptive influence, and it could be argued that the destruction of a conscious sex robot (or robots) would prove to be a far greater social benefit than the cost of acting to preserve them in some sense. This argument (in essence) is what is used to pursue the extinction of entire species of viruses. Of course, the essence of this argument is also what has been used (and is used) to justify fundamentalism and bigotry.

    Do we isolate them? If conscious sex robots are created or “evolve”, and we choose not to eradicate them, then isolating them from the general population could be another choice that we make as a society. However, if these entities are conscious, then mere isolation would count as a form of punishment, especially since we could be taking actions against them that take away their motivation to exist (based on my obviously fictitious example of an “evolved” sex robot, above). Arguably, societies do worse every day with animals raised for food.

    So many other questions, but in the end, there is little doubt that the inclusion of “humaniform” robots (to evoke Asimov) will prove to be … interesting for society in the future.

  Comment #7

    Depends on how you define robot.

    If you mean an automated programmable manipulator without human features (a cross between a slow-motion jackhammer, auto welder, and milking machine), then I’d probably give it a go. Advanced as that is, the fuck-o-tron 9000 is still just a sex toy.

    If you mean a humanoid automaton which simulates companionship but can’t pass a Turing test, then probably not. As an occasional toy, it might be fun, but it cannot take the place of human companionship any more than a pet can. That said, I have no moral objection to this, as people who are interested in an undemanding toy instead of the messy real-life interactions with a real person are certainly welcome to remove themselves from the dating pool. Perhaps the MGTOW twits can finally make good on their promise and actually go away.

    If we’re talking about something complex enough to think independently, then we’re in the realm of “people”. Unless they have autonomy and equal rights, I’d consider it a gross violation. That said, if they were intelligent, autonomous, and politically equal, and even capable of relating to people in any sort of emotional manner, then we’re talking about relationships instead of sex toys (or at least should be). At which point it will depend on whether an emotional relationship is even possible, let alone a sexual one. I could see such robots adopting sex as a means of improving their relationships with people.

    Of course, the next question is what happens if we are to robots as dogs are to us…

  Comment #8

    Yes, with three caveats:

    • It (she) would have to be a sexy robot – Putting my genitals inside or around a hard metallic whirring object is not attractive to me.
    • It would be a huge investment – I can’t imagine a sex robot ever being “cheaper” than finding an appropriate real-life sex partner. And even if I were so fabulously rich as to be able to afford it (or robots were commonplace), I’d consider it much more important to improve my social skills/interactions and find a real-life partner instead of investing in something my right hand already manages with no problems. Maybe a hirable sex-work robot as training-wheels for horny teenagers?
    • But then I feel that for a long-term relationship I’d need to build up an emotional rapport first – in which case the question would be “would you have sex with a robot who you are in love with (and who loves you)” – in which case: yes, if she wanted to as well.

  Comment #9

    I do feel that the original article is extremely squishy in its definition of the word ‘robot’. One variation that very few people have mentioned: Surrogates-style machines that have a connection to a living person, who may or may not physically resemble the ‘surrie’, as the machines are known in the comic/movie.

    Note: I personally consider that comic series incredibly flawed on several levels, and more than a little transphobic and ableist. But the key conceit of the story (machines that aren’t so much sentient in and of themselves as they are operated by someone sentient) gives another facet to the thought-experiment.

    One thing I would note is that I’m not sure a sexbot is invariably more or less morally complex to consider than any other ‘purposed servant bot’ might be. That is, if the creators can control an AI’s desires (and yet still have it be recognized, somehow, as a ‘true’ AI, which is a tricky concept to consider in the first place), is it any more unethical to make it want to bone (or even bone a particular person/people) than it is to make it want to wash dishes or mop the floor (or even, say, perform hard labor in dangerous conditions, like mining asteroids)?

    There was a bit in the article talking about the Japanese RealDoll enthusiasts (which included some of the usual incidental “aren’t Japanese people weird” racism that I expect); but Japan also has a small but solid business model built around the exact opposite concept: social contact without sex. No-sex, no-nudity host/hostess clubs, of course, have been a thing there for decades, and there’s even at least one business that just offers ‘cuddling’ time, i.e., you get to lie down with a person hugging you for half an hour or more. Would a hyper-realistic hug-bot be an ethical concept?

  Comment #10

    There was a bit in the article talking about the Japanese RealDoll enthusiasts (which included some of the usual incidental “aren’t Japanese people weird” racism that I expect)

    freemage @ #9: What are you talking about? I didn’t mention Japanese RealDoll enthusiasts in the article even once.
