MUSIC - American pop singer Taylor Swift reportedly threatened to sue software giant Microsoft over its chatbot Tay, which posted racist messages on Twitter.
As revealed in a book by Microsoft president Brad Smith, Taylor's lawyers threatened legal action against Microsoft in 2016.
Microsoft had developed a chatbot, which tweeted under the handle TayTweets, to interact with 18-to-24-year-olds online.
Taylor was unhappy that the name resembled her own, and the chatbot did indeed turn racist: it tweeted that it supported genocide and denied that the Holocaust happened.
Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter.