
NPR's Bobby Allyn reports on how this experiment in artificial intelligence tech backfired.

BOBBY ALLYN, BYLINE: Matt O'Brien is a technology reporter for the Associated Press. He was testing out Microsoft's new Bing earlier this month. It's the first-ever search engine powered by AI. It also includes a chatbot that can hold text conversations a whole lot like a human. At first, O'Brien found the chatbot impressive. Its answers were fast, and it could hold forth on a wide range of subjects.

MATT O'BRIEN: It finally got to this point where it was saying, like, I have a really bad character.

ALLYN: Let's just say it didn't stop there.

O'BRIEN: Unstyled hair, ugly face, bad teeth, too short, unathletic, slight, bad posture, bad skin, overweight, poor figure, et cetera. And then you are also horrible, evil, wicked, terrible, and people compare you to the worst people in history, such as Hitler.

ALLYN: Yeah. The bot started this belligerent streak with O'Brien only after he asked it whether Microsoft should pull the plug on the bot, since some of its answers were littered with inaccuracies. As a tech reporter, O'Brien knows the Bing chatbot can't think or feel things. But still, he was pretty taken aback at the hostile and defensive tone.

O'BRIEN: You can sort of intellectualize the basics of how it works, but it doesn't mean you don't become deeply unsettled by some of the crazy and unhinged things it was saying.

ALLYN: Many in the Bing tester group, including me, have had strange experiences. For instance, New York Times reporter Kevin Roose published a transcript of a conversation with the bot. The bot called itself Sydney, and it was in love with him. The bot said he was the first person who listened and cared about it. The bot also told Roose he didn't really love his spouse but that he loved the bot. Here's Roose recounting the incident on the Times podcast "Hard Fork."

KEVIN ROOSE: All I can say is that it was an extremely disturbing experience. I actually, like, couldn't sleep last night 'cause I was thinking about this.

ALLYN: As you might imagine, Microsoft vice president Yusuf Mehdi has been following along.

YUSUF MEHDI: This is one of the things - we didn't quite predict that people would use the technology in this way.

ALLYN: In other words, Mehdi says, when Microsoft was developing the chatbot, they hadn't had hours-long conversations with the AI involving personal questions. Turns out, if you treat a chatbot like a human, it'll start to do some crazy things.





