Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Printable Version

+- VGDC FORUMS (http://videogamedc.com/bb)
+--- Forum: General Discussion (http://videogamedc.com/bb/forum-2.html)
+--- Thread: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage (/thread-241.html)
Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - WeaponTheory - 03-24-2016

Quote:Microsoft launched a smart chat bot Wednesday called “Tay.” It looks like a photograph of a teenage girl rendered on a broken computer monitor, and it can communicate with people via Twitter, Kik and GroupMe. It’s supposed to talk like a millennial teenage girl.

http://www.huffingtonpost.com/entry/microsoft-tay-racist-tweets_us_56f3e678e4b04c4c37615502

My question is... Fucking... WHY?! What a fucking waste of programming, engineering and talent. All to make some almost "Her" shit.

The cause: http://fusion.net/story/284617/8chan-microsoft-chatbot-tay-racist/

https://twitter.com/tayandyou/
https://www.tay.ai/

"Tay is targeted at 18 to 24 year olds in the US."

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Dangerous_D - 03-25-2016

What a fucking waste.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Axess - 03-27-2016

https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/

Apparently the bot worked great in China, but unleash it on Twitter and all is f#@$% thanks to trolls.

I personally find AI fascinating, so I hope to see more attempts at it. Hopefully Microsoft doesn't bail out of AI because of this incident. It's their money, they can spend it on whatever R&D they want.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - WeaponTheory - 03-27-2016

(03-27-2016, 11:32 PM)Axess Wrote: https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/

More like, people thinking outside of the box. If anything, this was a free beta test for Microsoft from the chan sites. My God, all that money, and you can't even hire testers for this shit?

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Axess - 03-28-2016

(03-27-2016, 11:56 PM)WeaponTheory Wrote:
(03-27-2016, 11:32 PM)Axess Wrote: https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/

I guess trolling can be thinking outside the box. Sure. Also, you seriously think Microsoft didn't test it? lol, it's Micro$oft. This is not the first time we've seen something go wrong thanks to outside abuse. Even with M$ testing, there is only so much you can do with internal tests, so they expanded to the public for testing. So yes, this was more like a beta, as was explained in the blog post I linked in my previous reply.

Microsoft Blog Wrote:As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups. We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience. Once we got comfortable with how Tay was interacting with users, we wanted to invite a broader group of people to engage with her. It’s through increased interaction where we expected to learn more and for the AI to get better and better.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - WeaponTheory - 03-28-2016

If I had to guess, Microsoft's "idea of testing" the bot was probably feeding it information by either talking to it like it's five years old or treating it like a personal girlfriend. It took people with the simple mindset of "anything relating to racism" to make the bot shit the bed. Their testing sucked dick, chief.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Axess - 03-28-2016

Eh, they didn't account for immature trolls, I guess.
Again, the bot ran for over a year in China without this kind of issue, but no one acknowledges that part. Since the AI is not open source, there's no way for us to really know how the bot was abused and exploited. It's more than just sending it racist messages. If that's really all it took, then yeah, M$ testing sucked, but the fact that it ran fine in China and not on Twitter says more about the online community than about M$.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Shervik - 03-28-2016

Despite them turning her into a racist and stuff, I just want to say: what a time to be alive! I mean shit, when I was 6 I had to call my friends on a landline.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - manofonetitle - 03-28-2016

I can't help but feel amused by the fact that it took trolls only a day to corrupt the bot and make a big scene out of it. No filters = no mercy.

RE: Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage - Axess - 03-28-2016

(03-28-2016, 01:28 PM)Shervik Wrote: Despite them turning her into a racist and stuff, I just want to say: what a time to be alive!

I remember not being allowed to call my cousin often because he lived in another area code, which would lead to higher charges on our phone bill.
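For anyone wondering what "it's more than just sending it racist messages" could mean in practice, here is a minimal, purely hypothetical Python sketch of the class of design the thread is speculating about. Tay's source was never released, so none of the names or logic below are Microsoft's; the sketch only illustrates how a bot that learns replies from user input, combined with a "repeat after me"-style echo feature (as was widely reported at the time), gets poisoned the moment its filtering is weaker than its users are persistent.

import random

# Hypothetical illustration only: this is NOT Tay's actual implementation.
# It shows why "a lot of filtering" can still lose to coordinated abuse
# when the bot learns candidate replies directly from user messages.

BLOCKLIST = {"badword1", "badword2"}  # placeholder terms; a real filter needs far more


def is_clean(message: str) -> bool:
    """Naive keyword filter -- the kind of check that is easy to evade."""
    words = set(message.lower().split())
    return not words & BLOCKLIST


class ParrotBot:
    def __init__(self):
        self.learned = ["hello!"]  # seed reply so the bot can always answer

    def handle(self, message: str) -> str:
        # "Repeat after me"-style echo feature, as was widely reported:
        # user-supplied text is returned verbatim, bypassing learning entirely.
        if message.lower().startswith("repeat after me:"):
            return message.split(":", 1)[1].strip()
        # Otherwise, store the message as a future reply. This is the
        # poisoning vector: coordinated users fill the pool with abuse,
        # and anything the filter misses gets repeated back to strangers.
        if is_clean(message):
            self.learned.append(message)
        return random.choice(self.learned)


bot = ParrotBot()
print(bot.handle("repeat after me: anything at all"))  # echoed verbatim, no filter at all
print(bot.handle("have a nice day"))                   # learned; may be repeated to anyone later

The point of the sketch is the asymmetry the thread keeps circling: the blocklist has to anticipate every trick in advance, while the echo path and the learning pool only need one miss each to make the bot "shit the bed" in public.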