"Online troublemakers interacted with Tay and led her to make incredibly racist and misogynistic comments, express herself as a Nazi sympathiser and even call for genocide."
"In less than 24 hours, Microsoft's new artificial intelligence (AI) chatbot "Tay" has been corrupted into a racist by social media users and quickly taken offline."
so could Tay have been taught how to stay online and never be stopped?