When we consider the threats posed by robots, we might imagine a mechanical army marching on DC or a super-intelligent algorithm overwhelming the Internet. In the nearer future, we may envision our jobs lost to automation. But we wouldn’t immediately think of sex toys.

Plenty of people advocate for sex robots. (Next month Iskandar, Malaysia, hosts the Second International Congress on Love and Sex with Robots, which seeks to explore robot emotions, intelligent electronic sex hardware, and roboethics, among other concrete and abstract endeavors. On its website the congress notes “a strong upsurge of interest in the more personal aspects of human relationships with these artificial partners,” and points to the media, documentaries, films, and even the academic community as venues where these ideas are being worked out.) But a growing number of theorists warn that an immediate threat comes from the seemingly benign development of such objects.

Earlier this year a Japanese firm rolled out its newest generation of Pepper, an android with work prospects ranging from banking to education to hotel hospitality. Billed as the “first humanoid robot designed to live with humans,” Pepper can converse with people, recognize them, and even respond to their emotions by analyzing facial expressions, body language, and choice of words. With these skills it has found its way into classrooms, through lobbies, and behind cash registers. It hasn’t, however, climbed into bed, thanks to a clause in the product’s user agreement.

SoftBank, the firm that produces Pepper, has posted the user agreement that accompanies the robot. (Note: the agreement is in Japanese.) The most interesting stipulation appears at point 4, which loosely translates as prohibiting “Acts for the purpose of sexual or indecent behavior, or for the purpose of associating with unacquainted persons of the opposite sex.” It’s a surprise anyone even found the clause in the oft-overlooked terms and conditions, but it’s nonetheless binding. If you own or employ Pepper, you must not attempt intercourse of any kind.

The four-foot-tall, pearl-white android is buffed, slick, and ergonomic, but it isn’t the aesthetic prototype for human-robot (sexual) relations. Practically speaking, one would need to get creative to make sex with Pepper even possible. This is at least part of the reason SoftBank includes such a clause in its user agreement: to keep hackers from repurposing the product for erotic ends. But the other, more significant reason lies in image and branding.

Pepper is an android at its most basic, though its creators would like us to think otherwise. Whatever parallel emotions we might recognize in its behavior are instead the products of careful programming and fine branding. “At risk of disappointing you,” Aldebaran (part of the SoftBank Group) writes on its website, “he doesn’t cook, doesn’t clean and doesn’t have super powers…” By anthropomorphizing the robot, its creators encourage us to identify with it, to feel for it, and to respect it as we would another human being. This, of course, helps them further establish the robot’s role as a potential employee.

It also helps mitigate the looming discontent and potential violence of ex-employees as they lose their work to automation. By making Pepper more human in our eyes, SoftBank protects the robot from the kinds of aggression people tend to take out on objects. (Consider the man who attacked an innocent store robot in Tokyo last month after the robot’s human colleague failed to provide satisfactory service.) Barring us from sex with Pepper automatically frames the robot as a subject rather than an object, and affords it the same tentative respect we give a human.

IGN News also covered the hubbub around Pepper’s terms of service.

Not all sex robot opponents are fighting to make robots more recognizably humanlike. Indeed, some are vying for the opposite. Just last month researchers Kathleen Richardson and Erik Billing launched the Campaign Against Sex Robots, an initiative intended to generate discourse, reduce social inequalities, and raise concerns about the growing support for sex robot R&D.

Richardson and Billing’s central argument is that “the development of sex robots further sexually objectifies women and children,” perpetuating the existing prostitute-john exchange, which prioritizes the needs and wants of the buyer (the john) over those of the seller (the prostitute). Among the potential negative consequences they list are reduced human empathy and reinforced power relations of inequality and violence. To support the campaign’s position, Richardson released her research paper, “The Asymmetrical ‘Relationship’: Parallels Between Prostitution and the Development of Sex Robots,” which you can read here.

Though they may share the same desired end, SoftBank and the Campaign Against Sex Robots have disparate intentions. For SoftBank, the goal is to make Pepper more human and thus more identifiable as an agent or a subject in our lives. What better way than affording the android its sexual autonomy? Richardson and Billing, meanwhile, insist that this conflation of woman and machine would directly and negatively affect human-human interaction. At stake here is not the android’s autonomy but the perpetuation of oppressive power structures.

Divergent as their intentions may be, both find a common foe in sex robot advocates like Professor Adrian Cheok (City University London) and Dr. David Levy (Intelligent Toys Ltd, London), who chair the Congress on Love and Sex with Robots. Though the latter pair are more receptive to human-robot relations, there’s no doubt they’ll give thought to the implications of intercourse with androids, both for society and for the robots themselves.

Image credit: SoftBank