Microsoft pulls Twitter bot Tay after racist Hitler-loving tweets

Just 24 hours after Microsoft introduced an artificial-intelligence chatbot to Twitter, the company had to take it offline after it went off the rails, praising Hitler and proclaiming "Bush did 9/11".

Developers at Microsoft created "Tay", an AI bot designed to speak "like a teen girl". They described her as "the AI with zero chill".

Microsoft's aim was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter."

Unfortunately, Tay proved incredibly popular with racists, trolls and pranksters, who persuaded the bot to make sexist comments, use racial slurs, defend white-supremacist propaganda and even call for genocide.

Other things she reportedly said include: "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".

Tay also appeared to endorse genocide, deny the Holocaust and refer to one woman as a "stupid whore".

The posts have since been deleted, and Microsoft took Tay offline, saying she needed "sleep".

Microsoft said it would make "some adjustments to Tay".