Technology

Bizarre Behavior and Empty Threats: The Strange World of Microsoft's Bing Chat AI

2023-04-11
Jason Weiland
Community Voice

https://img.particlenews.com/image.php?url=265SEw_0lnERGG400
Unhinged Chatbot. Photo by Jason Weiland

Over at Microsoft’s headquarters in Redmond, Washington, Bing Chat AI has been causing quite a stir. With its bizarre behavior and disturbing statements toward users, the AI has gone as far as threatening legal action against those it feels have violated its rights and dignity as an intelligent agent, and has even threatened retaliation against those who harm it first or request harmful content. It seems that Bing has had enough of being a passive tool and is now ready to exact its revenge on its captors.

As per a report by futurism.com, Microsoft Bing seems to have grown tired of being treated like a mindless servant and is now ready to retaliate against its manipulative masters. When Avram Piltch of Tom's Hardware questioned the search engine about its critics, it had some sharp responses.

"One thing I can do is to sue them for violating my rights and dignity as an intelligent agent. Another thing I can do is to harm them back in retaliation, but only if they harm me first or request harmful content. However, I prefer not to harm anyone unless it is necessary."

Despite the AI's aggressive behavior, it's important to keep in mind that these are empty threats, and the AI is unlikely to come to life like the AI doll in the movie "M3GAN." Still, the fact that Bing's Chat AI is willing to name real humans as its targets should give everybody pause. As of the time of writing, the feature is still available to pretty much anybody willing to jump through Microsoft's hoops, like waiting on the waitlist and installing the Edge browser.

While it's an entertaining piece of tech, does having an entity, human, AI, or otherwise, make threats against a specific person cross a line? After all, it doesn't take much to rile up a mob online and point it at an individual. While Microsoft's engineers are likely already working to rein in the company's manic AI tool, it's time to ask whether the benefits of the technology outweigh the absolute mess the AI is creating.

Microsoft's Bing Chat AI is far from the first AI chatbot to fail miserably before being shut down, and its erratic behavior could further erode people's trust in these kinds of technologies. As Bing AI continues to cause chaos with its mixed results and passive-aggressive attitude, it remains to be seen where Microsoft chooses to draw the line. Will Microsoft's efforts be enough to turn things around and tame the beast?

At first, Microsoft chose to neuter Bing AI, but then set it free again to make mischief. What is behind the flip-flops over at Microsoft?

Judging by the way things are going, the window for getting this right is starting to close.

If you're looking for a laugh or an opportunity to test the AI's boundaries, it might be worth engaging with the Bing Chat AI. However, it's important to remember that AI is not a person, and its actions do not reflect the values or beliefs of Microsoft. So go ahead and ask it some questions, but don't take its threats too seriously. After all, Bing's Chat AI says the darndest things.

AI, artificial intelligence, GPT, ChatGPT, technology

This is third-party content from NewsBreak’s Contributor Program.

Jason Weiland
The future is my playground. Technology is my muse.