Microsoft chatbot is taught to swear on Twitter - BBC News
Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times
Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post
Microsoft's Rogue Chat Bot 'Tay' Makes Brief Return to Twitter - ABC News
Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch
Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge
Microsoft's racist teen bot briefly comes back to life, tweets about kush
Tay (bot) - Wikipedia
Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours
Microsoft Research and Bing release Tay.ai, a Twitter chat bot aimed at 18-24 year-olds » OnMSFT.com
Why Microsoft's chatbot Tay should make us look at ourselves - Business Insider
Requiem for Tay: Microsoft's AI Bot Gone Bad – The New Stack
Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets
Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian
Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch
Microsoft follows Tay chatbot with fresh bot projects for Cortana and Skype | Cloud Pro
Microsoft's artificial Twitter bot stunt backfires as trolls teach it racist statements | The Drum
After racist tweets, Microsoft muzzles teen chat bot Tay
Tay the 'teenage' AI is shut down after Microsoft Twitter bot starts posting genocidal racist comments that defended HITLER one day after launching | Daily Mail Online
Microsoft deletes racist, genocidal tweets from AI chatbot Tay - Business Insider