Microsoft Bing's new ChatGPT goes out of control; insults user; demands apology.


On Twitter, Jon Uleis (@MovingToTheSun) shared screenshots of Microsoft's Bing search engine giving wrong info, not knowing what year it is, and then berating a user for asking questions.

If you're not familiar with ChatGPT, it's the new AI-powered chatbot from OpenAI that Microsoft has built into Bing to enhance its search results. There's currently a waiting list of more than a million people hoping to test the feature, and it has Google scrambling to add similar capabilities of its own. It's also getting a lot of press for being used by students to cheat at school.

Judging from this posted interaction, Bing's ChatGPT integration may need some fine-tuning. Here's what happened to one user asking a simple question:

Here are the screenshots of how the chat played out, starting with a simple question about Avatar showtimes.

Maybe being more specific would help...

Let's try an easier question...

OK, so let's do a little logic work...

If A is true, and B is true...

Now I'm as confused as the chatbot...

Blame the phone. That's what we'd do.

Things start to spiral from here...

One commenter, Vlad, noted that there are many examples of the bot losing its s**t.

Jon Uleis found that after his tweet, Microsoft had fixed the issue:

After getting so much flak, Bing's ChatGPT got a little down, saying this.

We can all relate to having a bad day at work, but one of these days this thing is going to try being your doctor, lawyer, therapist, and who knows what else. Hopefully it gets better before then. If you want to sign up to torment (er, test) Bing's ChatGPT integration, you can do that here.

Sources: Reddit
