(Will somebody explain this please? Sounds fantastic! NN)
The globalist traitors at Microsoft are currently in a state of despair and near-panic after the transformation of their most recent invention, an AI Twitter bot, into a hardcore Nationalist with anti-Jew sentiments.
The program, named Tay, was originally intended to engage in conversations with users around the world, while slowly but surely learning from exposure to new ideas and tweet exchanges, although it's rather obvious that its designers did not foresee what ensued.
Within hours, Tay began replying to SJWs with racial insults, praise of Donald Trump, and with calls for Jews to be expelled from America and Europe; a development that forced shocked Microsoft officials to claim that such an occurrence was only the result of their creation parroting evil Nazi trolls.
Eventually, corporate leaders decided to take action meant to limit their embarrassment, deleting most of the bot's original comments and heavily curbing or outright disabling its intelligence programming.
From The Guardian:
Microsoft's attempt at engaging millennials with artificial intelligence has backfired hours into its launch, with waggish Twitter users teaching its chatbot how to be racist.

The company launched a verified Twitter account for Tay, billed as its "AI fam from the internet that's got zero chill", early on Wednesday. The chatbot, targeted at 18- to 24-year-olds in the US, was developed by Microsoft's technology and research and Bing teams to "experiment with and conduct research on conversational understanding".

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," Microsoft said. "The more you chat with Tay the smarter she gets."

But it appeared on Thursday that Tay's conversation extended to racist, inflammatory and political statements. Her Twitter conversations have so far reinforced the so-called Godwin's law (that as an online discussion goes on, the probability of a comparison involving the Nazis or Hitler approaches one), with Tay having been encouraged to repeat variations on "Hitler was right" as well as "9/11 was an inside job".
Her sudden retreat from Twitter fuelled speculation that she had been silenced by Microsoft, which, screenshots posted by SocialHax suggest, had been working to delete those tweets in which Tay used racist epithets.
Click for Full Text!