If you’re still on the waitlist for the new Bing, it shouldn’t be much longer. Yusuf Mehdi, Microsoft’s Corporate Vice President & Consumer Chief Marketing Officer, posted on Twitter that Microsoft will be rolling it out to “millions of people” over the next couple of weeks.
If you’re one of the lucky ones who already has access, you may find yourself spending as much time feeding it random prompts, testing its capabilities, and trying to break it as you do actually searching for relevant information.
Or maybe that’s just me.
Over the last week, we’ve seen Bing help me find the best coffee shops in Seattle, and give me a pretty ok itinerary for a 3-day weekend in NYC.
But in another search for the best restaurants in my area, it refused to show me more than the 10 it had already presented, even after I told it I wasn’t interested in those. Eventually, I had to go back to Google Maps.
Sydney, off the rails
Accused of having a somewhat “combative personality,” Sydney (the codename for Bing’s ChatGPT-powered AI) isn’t pulling any punches. Its responses range from somewhat helpful to downright hostile.
Let’s take a look at how “Sydney” is dealing.
Not happy about a “hacking attempt.”
Or the Ars Technica article.
Dealing with Alzheimer’s.
And gaslighting (because apparently, it’s 2022).
And who can forget Tay, Microsoft’s Twitter bot from 2016.
Why we care. We know AI isn’t perfect yet. And although we’ve presented several examples of it being odd, to say the least, it’s also groundbreaking, fast, and, shall we say, better than Bard.
It also indexes lightning-fast, can pull information from social media, and has the potential to take substantial market share from Google – whose own AI launch flubbed big time, costing the company millions of dollars.
The post Bing’s new ChatGPT has multiple personalities appeared first on Search Engine Land.