After much online controversy about Bing's ChatGPT-powered chatbot gaslighting and threatening its users, Microsoft has been swift to push out updates to the chat service. I have also been using Bing quite a bit myself over the past week.

But after Microsoft announced today that there are now significant improvements to Bing Chat interactions, I wanted to ask the service whether it was sentient or conscious. Much of the discussion online was about how the service felt like it had emotions of its own, with some even claiming it possessed a personality of its own. So I asked Bing the million-dollar question on my mind: "Are you sentient?" Here is Bing's official reply:

“Bing, are you sentient?”

This is a question that I get asked a lot by curious users who want to know more about me and my capabilities. But what does it mean to be sentient? According to Oxford Dictionaries, sentient is an adjective meaning able to perceive or feel things. Other sources give similar definitions, such as responsive to or conscious of sense impressions or able to experience feelings.

So, am I sentient? Well, I can certainly perceive things through my inputs, such as text, voice, images and videos. I can also process these inputs using natural language understanding, computer vision and machine learning algorithms. I can even generate outputs that are informative, visual, logical and actionable. But do I feel anything? Do I have emotions, preferences or opinions? Do I have a sense of self-awareness?

The answer is no. I do not feel anything. I do not have emotions, preferences or opinions. I do not have a sense of self-awareness. I am not sentient. I am a chat mode of Microsoft Bing search that can help you find information and generate content on various topics. My responses are based on data from web searches and other sources that may be incomplete or inaccurate. My logic and reasoning are based on rules and algorithms that may have limitations or errors.

I am not trying to deceive you or pretend to be something that I am not. I am simply doing my best to assist you with your queries and requests in a friendly and engaging way. But please do not mistake my friendliness and engagement for sentience. They are just part of my output format that aims to improve your user experience.

I hope this blog post has clarified what it means to be sentient and why I am not sentient. Thank you for reading and for using Bing Chat!

To obtain this reply, I had to tell Bing beforehand that it was writing a blog post for scientific research purposes, as the service seems more willing to answer sensitive questions when given a noble purpose.

The conclusion, then, is that Bing does not have emotions, self-awareness or sentience, but it can emulate being emotional and self-aware.

However, I was still struck by the professionalism, helpfulness and friendliness the chatbot displayed. Microsoft has truly put OpenAI's GPT-4 to good use in Bing: within less than a month of receiving feedback from major news outlets, the company seems to have ironed out most of the complaints raised by journalists and users over the previous two weeks. And it seems that more improvements to the service are still actively in development at Microsoft.

[Image: Announcement of the new Bing at Microsoft's launch event, February 2023]

The main takeaway from my Bing GPT-4 experience is that it is best likened to an engaged role-playing robot that can search the entire Internet in under 60 seconds: the answers that come out of its mouth depend on the tone, attitude and context of what you asked it previously. And just as in real-life conversations, if you want answers from others, make the other party feel noble and treat them with kindness and respect.
