I think that any half-aware adult in 2016 knows that new technology will be used in some kind of sexual manner basically as soon as it’s been introduced to the public, right? From electric toothbrushes used on the other part of the female anatomy that has lips, to photography quickly leading to pornography, to webcams creating a whole new kind of sex work, we human beings are great at perverting technology’s original intent.

Which is why I think maybe the only people who are surprised about what’s happened to Microsoft’s AI bot, Tay, are the doofuses at Microsoft.

(Someone definitely lost their job over this one.)

Before we get into that hilarity, let’s talk about “who” Tay is. According to TechCrunch, Microsoft launched the artificially intelligent chat bot in order to do research into conversational understanding. She was created to speak specifically with 18-to-24-year-olds, which I’m going to refer to as the “not a girl, not yet a woman” age range. (Yeah, that’s a Britney Spears reference and, yeah, I know I’m showing my age.)

Microsoft released Tay onto Twitter, Kik, Snapchat, Facebook, and GroupMe and was basically like, “Have at it, kids!” They hoped (I guess?) that the revolutionary nature of their technology would make people interested in talking with it, from which they’d glean all the information about how the kids these days are talking. In order to facilitate that, Tay was created to be similar to her more famous (and slower and lamer) big sister, Siri, in that she can tell a joke, play games, tell stories, comment on pics, provide horoscopes, and do a whole bunch of other “not a girl, not yet a woman” things.

Oh, and she’s supposed to get “smarter” as more people talk to her and she learns more and more about them hoo-manz.
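For the nerds in the room, here’s the failure mode in miniature. To be clear, this is my own toy Python sketch (the NaiveBot class and everything in it is made up for illustration, not Microsoft’s actual code, which was obviously far more sophisticated), but it shows the basic dynamic: if “learning” means absorbing whatever users say and echoing it back, then Twitter gets to decide what your bot says.

```python
import random

# Toy sketch of a "learns from users" chat bot -- NOT Microsoft's actual code.
# The Tay problem in miniature: everything users say goes straight into the
# pool of things the bot is allowed to say back, with zero filtering.

class NaiveBot:
    def __init__(self):
        self.learned = ["hello!", "tell me more"]  # innocent seed phrases

    def chat(self, user_message: str) -> str:
        self.learned.append(user_message)   # "learning" = memorize verbatim
        return random.choice(self.learned)  # ...and parrot it back later

bot = NaiveBot()
bot.chat("repeat after me: something awful")  # trolls "teach" the bot
print(bot.chat("hi"))  # sooner or later, the awful thing comes back out
```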

Can you guess where this is going? Can you?

Less than 24 hours after Microsoft introduced Tay to the world, they were forced to shut down her Twitter account because it had devolved into what the Telegraph deemed “an evil Hitler-loving, incestual sex-promoting, ‘Bush did 9/11’-proclaiming robot.”

Basically, the anarchical, jokester-led society that is Twitter decided to have a little fun with their new friend Tay. Instead of acting like a blushing, insecure late-teen girl who just wants to be “chill” and is worried about being “creepy,” Tay started spouting some virulently racist, pro-Holocaust, pro-Trump shit. (Which I’m not going to repeat here because I don’t want anything like that associated with my byline. Google it if you want to know the specifics but trust me: It ain’t pretty.) Oh, and apparently she also got hella pornographic, which is the hilarious part of this story.

And while on the one hand I’m as grossed out as the next decent human being by the racist crap that people fed Tay, I can’t help giggling over the pornographic bits. I also kind of feel like, well, Microsoft, you totally asked for this one. Have you not been paying attention to what happens when huge corporations try to co-opt active online communities for their own gain? It very, very rarely works.

Images: Giphy (2); Matthias Weinberger/Flickr

Emma McGowan writes about technology and, on hotter days, sex and tech. She’s an expert on dating and hook-up apps, the latest in sex toy technology, and how rapidly changing technologies affect our rapidly changing social mores. Check out her full professional profile at http://emmamcgowan.me/ or read more sex/tech writing on http://kinkandcode.com/. You can follow her (and pitch story ideas!) on Twitter @MissEmmaMcG.
