Taylor Swift threatened to sue Microsoft over racist ‘Tay’ chatbot

“The name Tay, as I’m sure you must know, is closely associated with our client”

Taylor Swift tried to sue Microsoft over a chatbot which posted a string of racist messages online, the company’s president has revealed.

In a new biography by Brad Smith, he reveals that the singer’s lawyers challenged Microsoft in 2016.

Swift was reportedly angered by Microsoft’s chatbot Tay, which was designed to interact with 18 to 24-year-olds, because its name was similar to her own.

Microsoft’s AI bot was launched in 2016 and was designed to learn from social media conversations.

But it took a sinister turn when it began tweeting a slew of racist messages – including support for genocide and holocaust denial.

The bot was pulled offline after 18 hours, and Microsoft was forced to issue an apology.

According to The Guardian, Swift’s legal challenge centred on the bot’s similarity to her own name.


In an extract from his new book Tools and Weapons, Smith says: “I was on vacation when I made the mistake of looking at my phone during dinner.

“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“‘The name Tay, as I’m sure you must know, is closely associated with our client,’” Smith adds.

“No, I actually didn’t know, but the email nonetheless grabbed my attention.”

Smith says the lawyer argued that the name Tay was a legal violation that “created a false and misleading association between the popular singer and our chatbot”.

Swift is known for fiercely defending her intellectual property rights. In 2015, she attempted to trademark the titles ‘Blank Space’ and ‘1989’, as well as the title of her childhood novel.

Last night, NME was in attendance as Swift delivered an intimate concert for her global fans in Paris.