6th July 2017, 10:24 | #1
[M] Reviewer | Join Date: May 2010 | Location: Romania | Posts: 153,514
Microsoft's "Zo" chatbot picked up some offensive habits

It seems that creating well-behaved chatbots isn't easy. Over a year after Microsoft's "Tay" bot went full-on racist on Twitter, its successor "Zo" is suffering a similar affliction.

https://www.engadget.com/2017/07/04/...ensive-habits/