24 Mar

“Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day”

“It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in ‘conversational understanding.’”

Read: Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day
