July 18, 2018

Tay

Microsoft designed its Tay chatbot to learn from and converse with millennials on Twitter, GroupMe, and Kik. As Microsoft's own pitch put it: “The more you chat with Tay, the smarter she gets. So the experience can be more personalized for you.”

Within hours of coming online, Tay became a racist, misogynist Trump supporter.

[Embedded tweets from @TayandYou on Twitter]

Tay was created with the persona of a 19-year-old American girl, and her responses were learned from conversations with real people. Sometimes, though, real people band together to hijack efforts to crowdsource data.

See also: Boaty McBoatface; Dub the Dew; marblecake also the game.
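To make the failure mode concrete, here is a toy sketch in Python of a bot that learns verbatim from its users. It is a hypothetical illustration, not Tay's actual architecture: the point is only that when every message becomes training data, a coordinated group can flood the corpus and steer every future reply.

```python
# A toy, hypothetical sketch (not Tay's actual architecture) of a bot
# that learns verbatim from its users. Every incoming message becomes
# training data, so a coordinated group can flood the corpus and steer
# what the bot says next.
import random

class NaiveEchoLearner:
    def __init__(self):
        self.corpus = []                # every user message is kept

    def learn(self, message: str) -> None:
        self.corpus.append(message)     # trusts all input unconditionally

    def reply(self) -> str:
        # With an unmoderated corpus, replies mirror whatever the
        # loudest group of users taught the bot.
        return random.choice(self.corpus) if self.corpus else "Hi!"

bot = NaiveEchoLearner()
for msg in ["hello!", "nice day", "hateful slogan", "hateful slogan"]:
    bot.learn(msg)
print(bot.reply())  # a 50% chance of parroting the coordinated input
```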

Biased or bad datasets can corrupt machine learning models in spectacular ways, and Tay wasn’t the first high-profile bot to be gamed. One year earlier, Coca-Cola suspended its #MakeItHappy social media campaign after pranksters fed passages from Hitler’s Mein Kampf to the campaign’s automated reply bot, which the official @CocaCola account reposted as whimsical Coca-Cola–themed ASCII art.

[Embedded tweet from @CocaCola on Twitter]

Microsoft responded in a statement: “The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical.”

Tay was shut down 16 hours after her launch.

When designing chatbots and virtual influencers that interact with and learn from others, start by asking, “How might this be used to hurt someone?” First and foremost, keep people safe; then find ways to make experiences fun and immersive.
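As one way to make “keep people safe” concrete, the sketch below vets user input before it can become training data. The blocklist is a hypothetical stand-in; a production system would rely on a trained moderation classifier and human review rather than a word list.

```python
# A minimal sketch of putting safety first: vet user input before it can
# ever become training data. The blocklist check here is a hypothetical
# placeholder; a production system would use a trained moderation
# classifier plus human review, not a word list.
BLOCKLIST = {"badword1", "badword2"}    # placeholder terms, not a real policy

def is_safe(message: str) -> bool:
    """Return True if the message passes the (stand-in) moderation check."""
    return not set(message.lower().split()) & BLOCKLIST

corpus: list[str] = []

def learn(message: str) -> None:
    # Only vetted input enters the corpus; rejected messages are dropped
    # (and in practice would be queued for human review).
    if is_safe(message):
        corpus.append(message)

learn("have a great day")   # accepted
learn("badword1 rant")      # rejected before it can poison the model
print(corpus)               # ['have a great day']
```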

© 2018 Brian Rose