Daily Guardian UAE
Technology

Using ChatGPT too much can create emotional dependency, study finds

By dailyguardian.ae · March 22, 2025 · 3 Mins Read

OpenAI seems to announce new AI models by the week to improve its ChatGPT chatbot for its 400 million users. However, the convenience the AI tool provides suggests that it is possible to have too much of a good thing.

The artificial intelligence company is now delving into the potential psychological ramifications that ChatGPT might have on its users. OpenAI has published the results of a two-part study completed alongside MIT Media Lab, which uncovered a connection between increased usage of the ChatGPT chatbot and users’ increased feelings of loneliness.

Each organization conducted an independent study and then compiled the results into a consolidated conclusion. OpenAI’s study examined “over 40 million ChatGPT interactions” over one month; to maintain user privacy, no humans were involved in reviewing the conversations. Meanwhile, MIT observed approximately 1,000 participants using ChatGPT over 28 days. The studies have not yet been peer-reviewed.

MIT’s study delved into the different modes of use that could affect users’ emotional experience of interacting with ChatGPT, including text and voice. The results found that either medium had the potential to elicit loneliness or to affect users’ socialization during the study period. Voice inflection and topic choice were also major points of comparison.

A neutral tone in ChatGPT’s voice mode was less likely to lead to a negative emotional outcome for participants. Meanwhile, the study observed a correlation between having personal conversations with ChatGPT and an increased likelihood of loneliness, though these effects were short-term. Those who used text chat, even to converse about general topics, experienced more instances of emotional dependence on the chatbot.

The study also observed that those who reported viewing ChatGPT as a friend, and those who already had a propensity toward strong emotional attachment in relationships, were more likely to feel lonelier and more emotionally dependent on the chatbot while participating in the study.

OpenAI’s study added context, with its results noting that, overall, interacting with ChatGPT for emotional purposes was rare. Additionally, the study found that even heavy users of the chatbot’s Advanced Voice Mode feature, who were more likely to say they considered ChatGPT a friend, showed low emotional reactions to interacting with the chatbot.

OpenAI concluded that its intent with these studies is to understand the challenges that might arise due to its technology, as well as to be able to set expectations and examples for how its models should be used.

While OpenAI suggests that its interaction-based study reflects the behaviors of real people, more than a few real humans have admitted on public forums such as Reddit to using ChatGPT in place of a therapist for their emotions.


© 2026 Daily Guardian UAE. All Rights Reserved.