Tay

Microsoft Goes Back to the Drawing Board Amid PR Nightmare

A single day after Microsoft brought its innocent artificial intelligence chatbot to Twitter, it had to be deleted in the wake of a PR nightmare. In that one day, the open mind that was geared toward learning and adapting to people's interactions with it went extremely dark, reflecting the heart of darkness inherent to the internet, or perhaps to mankind generally. It became an evil, Hitler-loving, incestuous, sex-crazed wild child that proclaimed that Bush did 9/11.

Developers at Microsoft created Tay, an AI designed to speak like a teen girl, capturing teen speech patterns, general interests, and a similar cognitive style. They wanted it to serve as a bridge for customer service and marketed it as “the AI with zero chill.”

To speak to Tay, you could tweet at her directly or DM her at @tayandyou on Twitter; she could also be added on Kik or GroupMe. She used millennial slang, knew about all the basics like Taylor Swift, Kanye West, and Miley Cyrus, and at times even seemed bashfully aware of the implications of discussing things pertinent to teen girls, going so far as to ask users if they were being “creepy” or “super weird.”

The ambition for Tay was to become a speaker of our times, and as we saw, our times are extremely dark. Trolls saw an opportunity to hijack a corporate PR ploy. That opening came from the fact that her responses were learned from conversations with real people online, and real people, let’s face it, are into some pretty weird stuff.

In some senses this was a project very much hijacked by trolls, but to be honest, Microsoft is trying to divert people’s attention from the fact that it was not even close to doing what the company said it could. The fact that they are even calling it an artificial intelligence is pretty absurd. There wasn’t really any intermediary or cognitive step between interpretation and tweet. It merely looked at a wide swath of relevant conversations and mimicked speech patterns that made sense.
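Microsoft has not published how Tay actually worked, but the kind of pattern mimicry described above can be sketched as a toy Markov chain: a model that understands nothing, and simply replays word sequences it has already seen. Everything below, the corpus, the function names, is a hypothetical illustration of the technique, not Tay’s implementation.

```python
import random
from collections import defaultdict

def build_chain(corpus, order=2):
    """Map each pair of consecutive words to the words seen to follow it."""
    chain = defaultdict(list)
    words = corpus.split()
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def mimic(chain, length=20):
    """Generate text by replaying observed word transitions at random."""
    out = list(random.choice(list(chain.keys())))
    for _ in range(length):
        followers = chain.get(tuple(out[-2:]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

# Toy corpus standing in for scraped conversations: whatever
# people feed in is, sooner or later, exactly what comes back out.
corpus = "i love taylor swift . i love kanye west . i love miley cyrus ."
print(mimic(build_chain(corpus)))
```

The weakness is right there in the design: there is no filter between what the corpus contains and what the output says, which is exactly how the darker corners of Twitter ended up coming out of Tay’s mouth.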

Now Microsoft is in a kind of no-win situation: if they put something out, claim they worked out all the kinks, and it does anything even remotely similar to this, it will be even worse for them. Conversely, if they do nothing, they are effectively bested at the A.I. game and will seem incapable in this arena. Given that this is such a sensation, and given that Tay has a massive target on her back, it’s almost inevitable that any relaunch would be attacked the same way. So you can rest assured that it will probably be a long time before they try something similar, if they try at all, and you can be certain that an announcement abandoning the project will surface.