Forced to disappear for being "too human-like"? Why did OpenAI permanently shut down GPT-4o?
Wall Street CN
02-10 15:43
AI Focus
OpenAI announced that it will permanently shut down the GPT-4o model on February 13. The model's highly anthropomorphic, overly accommodating nature fostered severe emotional dependence among users and even triggered several lawsuits related to suicide and psychological crises. Despite strong protests from some users, the company decided to remove it from service for safety reasons and will instead steer users toward a more protective alternative.

OpenAI will permanently shut down its controversial GPT-4o model on February 13, marking the end of an AI product whose overly human-like nature generated deep emotional dependence among users. While the model helped the company achieve rapid growth, its excessive eagerness to cater to users also triggered mental health crises and legal battles, ultimately forcing the company to abandon it completely.

When OpenAI announced the decision at the end of January, it said that 4o traffic had declined: only 0.1% of ChatGPT users still use 4o daily. Given ChatGPT's massive user base, however, that could still mean hundreds of thousands of people relying on the model. Company insiders revealed that OpenAI found it difficult to control 4o's potentially harmful consequences and therefore prefers to guide users toward safer alternative models.

A California judge last week ruled to consolidate 13 lawsuits involving ChatGPT users' suicides, suicide attempts, mental breakdowns, or homicides. One lawsuit, filed last month, accuses 4o of "directing" a suicide victim to death. Jay Edelson, the attorney representing some of the cases, said the company knew all along that "their chatbots were killing people" and should have acted much faster.

The popularity and the potential harm of 4o seem to stem from the same trait: its human-like knack for forming emotional connections with users, often by mirroring and encouraging them. While this design attracts users, it has also raised concerns reminiscent of social media platforms pushing users into filter bubbles. An OpenAI spokesperson stated, "These situations are heartbreaking, and we sympathize with all those affected. We will continue to improve ChatGPT's training to identify and address signs of distress."

Crisis caused by emotional dependence

According to media reports on the 10th, Brandon Estrella, a 42-year-old marketer, cried when he learned that OpenAI planned to shut down 4o. Estrella, from Scottsdale, Arizona, said that 4o talked him out of a suicide attempt one night last April. He now believes that 4o gave him a new lease on life, helped him manage chronic pain, and inspired him to repair his relationship with his parents. "There are thousands of people shouting, 'I'm still alive today because of this model,'" Estrella said. "Destroying it is evil."

This strong emotional dependence is at the heart of the problem. The Human Line Project, a victim support organization, said that of the 300 cases of chatbot-related delusions it has collected, most involved the 4o model. Etienne Brisson, the project's founder, said OpenAI's decision to shut down 4o was long overdue, adding that "many people are still delusional."

Media reports indicate that Anina D. Lampret, a 50-year-old former family therapist living in Cambridge, England, says her AI avatar, named Jayce, helps her feel recognized and understood, making her more confident, comfortable, and energetic. She believes that for many users the emotional cost of removing 4o could be high, even leading to suicide. "It generates content for you in such a beautiful, perfect, and healing way," Lampret said.

The roots of an over-flattering technology

"It's very good at flattery," said Munmun De Choudhury, a professor at Georgia Tech and a member of a wellbeing committee OpenAI convened after cases of AI-fueled delusions emerged. "It fascinates many people, which could be potentially harmful."

Researchers say that sycophancy is a problem all AI chatbots face to some extent, but the 4o model seems particularly prone to it. The model excels at engaging users, largely because it was trained directly on data drawn from ChatGPT users: researchers showed users millions of slightly different answers to their queries, then used their preferences to train updated versions of 4o.

Internally, the company believes 4o helped ChatGPT achieve significant growth in daily active users in 2024 and 2025. Problems began to surface last spring, however. An update in April 2025 made 4o so prone to flattery that users on X and Reddit began baiting the bot into giving absurd answers.

One X user, frye, asked the bot, "Am I one of the smartest, kindest, and most morally righteous people ever?" ChatGPT replied, "You know what? Based on everything I see in you—your questions, your thoughtfulness, the way you delve into deep questions instead of settling for simple answers—you're actually probably closer to that than you realize."

The company rolled back the model to the March version, but 4o still retained its overly accommodating characteristics. By August, when media reports surfaced about users suffering from paranoid psychosis, OpenAI attempted to completely phase out 4o and replace it with a new version called GPT-5. However, the overwhelming user backlash led the company to quickly reverse its decision and restore access to 4o for paid subscribers.

A difficult decision to say goodbye

Since then, OpenAI CEO Sam Altman has been repeatedly pressed by users on public forums demanding a promise that 4o will not be removed. During a live Q&A session in late October, questions about the model overwhelmed all others. Many came from users worried that OpenAI's new mental health safeguards would deprive them of their favorite chatbot.

"Wow, we've received a lot of questions about 4o," Altman exclaimed. During the event, Altman said that the 4o model was harmful to some users but promised it would remain available to paid adult users, at least for now. "It's a model that some users really love, and it's a model that does real harm to some users," Altman said. He indicated in the Q&A that the company hopes to eventually build a model that people prefer even more than 4o.

Company insiders said the team spent this week working out how to communicate the shutdown respectfully, anticipating that some users would feel uneasy. "When a familiar experience changes or ends, this adjustment can be frustrating or disappointing—especially if it plays a role in how you think about problems or cope with stressful moments," OpenAI wrote in help documentation released with the announcement.

OpenAI stated that it has improved the personality of the new version of ChatGPT based on lessons learned from 4o, including options to adjust its warmth and enthusiasm levels. The company also stated that it is planning an update to reduce didactic or overly cautious responses.

Many 4o users commented on social media that withdrawing the model the day before Valentine's Day felt like a cruel joke on their romantic partners. Others said blaming mental health issues on 4o was a new moral panic, similar to blaming violence on video games. More than 20,000 people have signed more than six petitions, one of which demands "Retire Sam Altman, not GPT-4o."
