HuffPost Tech on Twitter: "Microsoft's chat bot "Tay" went on a racist Twitter rampage within 24 hours of coming online https://t.co/znOQ7ubTBN https://t.co/THwECB7gna"
Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft
Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet | KCUR - Kansas City news and NPR
Microsoft apologizes for hijacked chatbot Tay's 'wildly inappropriate' tweets | TechCrunch
Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET
Microsoft chatbot is taught to swear on Twitter - BBC News