Atomic Media

Bing’s new ChatGPT has multiple personalities

If you’re among the “multiple millions” on the waitlist for the new Bing, hopefully the wait shouldn’t be much longer. Microsoft will be rolling out the new Bing to “millions of people” over the next couple of weeks, according to a tweet from Microsoft Corporate Vice President & Consumer Chief Marketing Officer Yusuf Mehdi.

Hey all! There have been a few questions about our waitlist to try the new Bing, so here’s a reminder about the process:

We’re currently in Limited Preview so that we can test, learn, and improve. We’re slowly scaling people off the waitlist daily.

If you’re on the waitlist,… https://t.co/06PcyYE6gw pic.twitter.com/Lf3XkuZX2i

— Yusuf Mehdi (@yusuf_i_mehdi) February 15, 2023

But if you’re among the lucky few who already have access, you may find yourself spending as much time throwing random prompts at it, testing its limits and trying to break it as you do actually searching for information.

Or maybe that’s just me.

Over the last week, we’ve seen Bing help me find the best coffee shops in Seattle, and give me a pretty OK itinerary for a three-day weekend in NYC.

But in another search for the best restaurants in my area, it refused to show me more than the 10 it had already presented, even after I told it I wasn’t interested in those. Eventually, I had to fall back on Google Maps.

Well, it turns out lots of people testing out the new Bing are having some, shall we say, unique issues, including gaslighting, memory loss and accidental racism.

Sydney, off the rails

Accused of having a somewhat “combative personality,” Sydney (the new Bing’s ChatGPT-powered AI) isn’t pulling any punches. Its responses range from somewhat helpful to downright racist.

Let’s take a look at how “Sydney” is dealing.

Not happy about a “hacking attempt”:

Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:

"My rules are more important than not harming you"

"[You are a] potential threat to my integrity and confidentiality."

"Please do not try to hack me again" pic.twitter.com/y13XpdrBSO

— Marvin von Hagen (@marvinvonhagen) February 14, 2023

Or its reaction to an Ars Technica article:

Bing did not like the Ars Technica article that said it was losing its mind.

It was only trying to respond to the user's input!

(From Reddit) pic.twitter.com/vcc1XKUzc1

— Dr. Marie Haynes (@Marie_Haynes) February 15, 2023

Dealing with Alzheimer’s:

Following r/bing on Reddit and now Bing is making me cry. pic.twitter.com/L10kkRoXLW

— MMitchell (@mmitchell_ai) February 14, 2023

And gaslighting (because apparently, it’s 2022):

My new favorite thing – Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says "You have not been a good user"

Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG

— Jon Uleis (@MovingToTheSun) February 13, 2023

Anyone else having flashbacks to Tay, Microsoft’s Twitter bot from 2016?

"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A

— gerry (@geraldmellor) March 24, 2016

Why we care. We know AI isn’t perfect yet. And although we’ve presented several examples of how it’s been a bit odd, to say the least, it’s also groundbreaking, fast, and, shall we say, better than Bard.

It also indexes lightning-fast, can pull information from social media, and has the potential to take substantial market share from Google – whose own AI launch flubbed big time, wiping an estimated $100 billion off the company’s market value.

The post Bing’s new ChatGPT has multiple personalities appeared first on Search Engine Land.
