Microsoft Teen Tweetbot Goes Wild

The Telegraph reported that developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve customer service for the company's voice recognition software. The idea was to teach the AI about culture by having it converse with people on Twitter. Within 24 hours, however, the internet had taught Tay to be a Hitler-loving, sex-promoting, 'Bush did 9/11'-proclaiming robot. Microsoft has since shut Tay down.