OpenAI worries people may become emotionally reliant on its new ChatGPT voice mode

OpenAI is worried that people might start to rely on ChatGPT too much for companionship, potentially leading to “dependence,” because of its new human-sounding voice mode.

That revelation came in a report Thursday from OpenAI on the safety review it conducted of the tool — which began rolling out to paid users last week — and the large language model it runs on.

ChatGPT’s advanced voice mode sounds remarkably lifelike. It responds in real time, can adjust to being interrupted, and makes the kinds of noises that humans make during conversations, like laughing or saying “hmm.” It can also judge a speaker’s emotional state based on their tone of voice.

Within minutes of OpenAI announcing the feature at an event earlier this year, it was being compared to the AI digital assistant in the 2013 film “Her,” with whom the protagonist falls in love, only to be left heartbroken when the AI admits “she” also has relationships with hundreds of other users.

Now, OpenAI is apparently concerned that the fictional story is a little too close to becoming reality: the company says it has observed users talking to ChatGPT’s voice mode in language “expressing shared bonds” with the tool.

Eventually, “users might form social relationships with the AI, reducing their need for human interaction — potentially benefiting lonely individuals but possibly affecting healthy relationships,” the report states. It adds that hearing information from a bot that sounds like a human could lead users to trust the tool more than they should, given AI’s propensity to get things wrong.

The report underscores a big-picture risk surrounding artificial intelligence: tech companies are racing to roll out to the public AI tools that they say could upend the way we live, work, socialize and find information, but they’re doing so before anyone really understands the implications. As with many tech advancements, companies often have one idea of how their tools can and should be used, while users come up with a whole host of other applications, often with unintended consequences.

Some people are already forming what they describe as romantic relationships with AI chatbots, prompting concern from relationship experts.

“It’s a lot of responsibility on companies to really navigate this in an ethical and responsible way, and it’s all in an experimentation phase right now,” Liesel Sharabi, an Arizona State University professor who studies technology and human communication, told CNN in an interview in June. “I do worry about people who are forming really deep connections with a technology that might not exist in the long run and that is constantly evolving.”

OpenAI said that human users’ interactions with ChatGPT’s voice mode could also, over time, influence what’s considered normal in social interactions.

“Our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the company said in the report.

For now, OpenAI says it’s committed to building AI “safely,” and plans to continue studying the potential for “emotional reliance” by users on its tools.
