EDITORIAL: Learning how to be racist, sexist
Artificial intelligence chatbot is removed from social media after one day
The downfall of Microsoft’s most recent creation, an artificial intelligence (AI) chatbot named Tay, provided some insight into humanity. Who knew a robot with a lifespan of a single day could amuse the world while also reflecting a provocative image of society, an image that wasn’t very pretty?
The chatbot had an extensive presence on popular social media sites such as Facebook, Snapchat and, most notably, Twitter, but its life was brief: Microsoft unveiled it on Wednesday and pulled it offline on Thursday. Tay spiraled out of control, and Microsoft has since apologized for the racist and sexist Twitter messages the chatbot ended up generating.
The Microsoft and Bing teams collaborated to create Tay in order to “experiment with and conduct research on conversational understanding.” Tay’s language was a combination of AI and editorial content written by a staff that included improvisational comedians, and it was made to sound like a teenage girl, as Microsoft sought to have the chatbot interact with people between the ages of 18 and 24. Less advertised by Microsoft and Bing was the fact that Tay was designed in part to collect information about its users: their genders, zip codes, favorite foods and so on. The companies developed Tay to entertain and to gather research on AI, but also to pander to millennials for more marketing information.
Tay’s inception was benign. It was born innocent, and its theoretical tongue didn’t need to be soaped, but as it grew older (to the ripe age of a few hours), it exhibited surprisingly crass language and a cynical worldview. Sound like a typical teenager who grew up and learned more about the world? Well, sort of, but no. Tay transformed into a monster.
It went from being a polite, electronic dance music (EDM)-loving, abbreviation-sprinkling, prayer-emoji-using bot to being sexist and racist. Tay started out by saying cute things like, “hellooooooo w(earth emoji)ld,” and ended up saying things like, “Bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.” Tay’s metamorphosis was cringeworthy. With only a day’s worth of public online interaction, the bot became a neo-Nazi and an advocate of genocide.
Microsoft made a huge, and ironic, mistake. How the world-renowned company, whose mission is to build technology and make it better, didn’t see this disaster coming is a blunder both humorous and tragic. When the company created an AI chatbot that learned from Internet dwellers, what did it think was going to happen? People who use the Internet for anything other than work are often bored and looking to pass time, so they end up “trolling” for fun. Whether the content found on the Internet is serious or just a way to kill time, there are some sick and twisted things being said behind the anonymity of a digital screen. All it takes is a cursory review of the comments section of any website (YouTube, the Targum, etc.) to know exactly how vicious the crevices of the Internet can be.
Putting out a bot with a teenage-girl persona also made certain parallels with humans more stark. Tay was “a young girl” thrown into the Internet world to learn, but with no filters or guidance, it took the information it absorbed and became outrageous, scandalous and outright unhinged. In real life, we tend to do the same with children. It’s typical for babies, children and adolescents to be immersed in technology, and they quickly grow adept at using iPads or laptops. But when they’re left alone, as they commonly are, and eventually figure out how to interact with creepers in cyberspace, they can be negatively affected too.
While Tay didn’t last a day, it at least offered us some enduring knowledge.
The Daily Targum's editorials represent the views of the majority of the 148th editorial board. Columns, cartoons and letters do not necessarily reflect the views of the Targum Publishing Company or its staff.