Ask me anything. It’s the expanded form of AMA and one of the most popular formats for interactive discussion on Reddit. It’s also a major challenge, because Microsoft’s Bing AI chatbot, aka “new Bing,” is learning this quickly.
Whenever a celebrity or notable figure signs up to do a Reddit AMA, usually shortly after posting a photo to prove that they’re really the one answering questions, there’s a deep moment of foreboding.
The ability to ask anyone anything is usually a minefield of inappropriate discourse, which is managed by a live community moderator who curates and filters the questions. Otherwise, things go off the rails quickly. Even with this protection, they often do anyway.
When Microsoft launched its new Bing AI-powered chat, it made clear that the ChatGPT-based AI was ready for any and all questions. It was either a sign of deep trust in the relatively small but growing group of users, or incredible naivety.
Not even ChatGPT, which pioneered the original AI chatbot sensation, and on which Bing’s chat is based, offers this prompt. Instead, there is an empty text input box at the bottom of the screen. Above is a list of sample questions, capabilities, and most importantly, limitations.
Bing features this main prompt, and below it a sample question plus a big “Try it out” button next to another button prompting you to “Learn more.” To hell with that. We like to dive straight in and, following Bing’s instructions, ask it anything.
Naturally, Bing has been peppered with a wide range of questions, many of which have nothing to do with everyday needs like travel, recipes, and business plans. And those are the ones we all talk about because, as always, inviting people to ask “anything” means they will ask anything.
Bing has been led into musings on love, sex, death, marriage, divorce, violence, enemies, defamation, and the emotions it insists it doesn’t have.
In OpenAI’s ChatGPT, the home screen warns that it:
- May occasionally generate incorrect information
- Can sometimes produce harmful instructions or biased content
- Limited knowledge of the world and events after 2021
Too many questions
Bing’s GPT-powered chat is slightly different from OpenAI’s, and it may not face all of these limitations. In particular, its knowledge of world events can, thanks to the integration of Bing’s knowledge graph, extend to the present day.
But with Bing out in the wild, and growing wilder, it may have been a mistake to encourage people to ask it anything.
What if Microsoft had created Bing AI Chat with a different prompt:
Ask me things
Ask me a question
What do you want to know?
With these slightly tweaked prompts, Microsoft could have added a long list of caveats about how Bing AI Chat doesn’t know what it’s saying. Alright, it does (sometimes), but not in the way you know things. It has no intelligence, no emotional response, not even a moral compass. It tries to act as if it has one, but recent conversations with The New York Times and even Tom’s Hardware show that its grasp of the basic morality of good people is tenuous at best.
In my own conversations with the Bing AI chat, it repeatedly told me that it has no human emotions, yet it always conversed as if it did.
For anyone who has been covering AI for a while, none of what has happened is surprising. AI knows:
- What it was trained on
- What it can learn from new information
- What it can glean from vast online data stores
- What it can learn from real-time interactions
The Bing AI chat, however, is no more aware than any AI that came before it. It may be one of the most capable AIs out there, as its ability to carry on a conversation far exceeds anything I’ve experienced before. That impression only grows with the length of a conversation.
I’m not saying the Bing AI chat becomes more believable as a sentient human, but it does become more believable as a somewhat irrational or confused one. Long conversations with real people can go the same way. You start on a topic and maybe even argue about it, but at some point the argument becomes less logical and rational. In the case of people, emotion comes into play. In the case of Bing AI Chat, it’s like reaching the end of a rope where the fibers exist but are frayed. Bing AI has the information for some of these long conversations, but not the experience to weave it together in a way that makes sense.
Bing is not your friend
By encouraging people to “Ask me anything…”, Microsoft set Bing up for, if not failure, significant growing pains. The pain is perhaps felt by Microsoft, and certainly by people who deliberately ask questions that no normal search engine would ever answer.
Before the advent of chatbots, would you even consider using Google to fix your love life, explain God, or be a replacement friend or lover? I hope not.
Bing AI Chat will get better, but not before we’ve had many more uncomfortable conversations in which Bing regrets its answers and tries to walk them back.
Asking an AI anything is the obvious long-term goal, but we’re not there yet. Microsoft took the plunge, and now it’s free-falling through a forest of questionable answers. It won’t land until Bing AI Chat gets much smarter and more circumspect, or until Microsoft pulls the plug for a little AI re-education.
If you’re still waiting to ask Bing anything, we have the latest waitlist details.