Microsoft's racist twitter bot gets Swift response
13th September 2019, 09:58   #1
Stefan Mileschin
[M] Reviewer
Join Date: May 2010
Location: Romania
Posts: 153,575

Claimed it took her name in vain

Apple fangirl Taylor Swift threatened to sue Microsoft over a racist AI chatbot which used Twitter.

In 2016, Vole was incredibly embarrassed when its AI chatbot learnt to be a bigot. However, what was not known at the time was that Taylor Swift had threatened legal action because the bot's name was Tay.

Tay was a social media chatbot geared toward teens, first launched in China before adopting the three-letter moniker when it moved to the US. The bot, however, was programmed to learn how to talk from Twitter conversations. In less than a day, the chatbot's automatic tweets had Tay siding with Hitler, promoting genocide, and just generally hating everybody. Microsoft immediately removed the account and apologised.

The popular beat combo singer believed that, since she was also known as Tay, her fans could have confused her with the racist bot. It is an easy mistake to make: when you see the name Tay you automatically think of Taylor Swift, in the same way that when you see a rounded rectangle you always think of Apple.

When the bot was reprogrammed, it was relaunched as Zo, a name less likely to hack off a pop diva if the bot started saying that the Holocaust never happened or that the iPhone 11 was pricey.

The story came out in Tools and Weapons, a book by Microsoft president Brad Smith and Carol Ann Browne, Microsoft's communications director. According to The Guardian, the singer's lawyer threatened legal action over the chatbot's name before the bot broke bad. The singer claimed the name violated both federal and state laws. Rather than get into a legal battle with the singer, Smith writes, the company started considering new names.

https://fudzilla.com/news/49384-micr...swift-response


Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
Twitter bans Congressional candidate after racist image | Stefan Mileschin | WebNews | 0 | 17th February 2018 15:41
Black lawmakers call on Facebook and Twitter to purge racist ads | Stefan Mileschin | WebNews | 0 | 11th October 2017 06:32
PewDiePie in trouble once again for racist outburst | Stefan Mileschin | WebNews | 0 | 14th September 2017 06:28
Tim Cook condemns 'repulsive' racist violence in Charlottesville | Stefan Mileschin | WebNews | 0 | 19th August 2017 10:23
The Facebook president and Zuck's racist rulebook | Stefan Mileschin | WebNews | 0 | 29th May 2017 10:54
Microsoft's Tay AI makes brief, baffling return to Twitter | Stefan Mileschin | WebNews | 0 | 1st April 2016 10:58
It's not Tay's fault that it turned racist. It's ours. | Stefan Mileschin | WebNews | 0 | 26th March 2016 16:09
Microsoft is also bringing Swift to Windows 10 | Stefan Mileschin | WebNews | 0 | 6th May 2015 11:17
Microsoft's Adam Orth Resigns Over Twitter Tirade | Stefan Mileschin | WebNews | 0 | 12th April 2013 09:07
Boffin claims Google ads are racist | Stefan Mileschin | WebNews | 0 | 11th February 2013 07:55
