
Parents sue OpenAI over teen’s suicide, experts call for urgent AI regulation in mental health

Beyond the risk of harmful advice, experts also flag the potential for cognitive decline linked to AI over-dependence

EdexLive Desk

More than a month ago, the parents of 16-year-old Californian Adam Raine sued OpenAI, the maker of ChatGPT, alleging that the chatbot aided and abetted their son’s suicide.

Raine’s death reflects a wider global concern over the role of Artificial Intelligence (AI) in mental health, prompting experts to call for urgent regulation, reports Anubhab Roy of The New Indian Express.

On World Mental Health Day, mental-health professionals warn that engaging with AI during periods of emotional distress can be dangerous.

“It is possible that the user might get access to dangerous and potentially life-threatening information,” said Dr Manoj Kumar Sharma, Professor at the Service for Healthy Use of Technology (SHUT) Clinic, NIMHANS Centre for Well Being. He added that discretion is difficult to expect from teens, who are among the heaviest users of large language models (LLMs) such as ChatGPT.

Beyond the risk of harmful advice, experts point to a second danger: cognitive decline linked to over-dependence on AI.

“As per the latest research by the Massachusetts Institute of Technology (MIT), there is a 47 per cent collapse in brain activity among people who are using ChatGPT. This increases the risk of dementia and the onset of Alzheimer’s disease,” said Dr Shweta Sharma, Student Counsellor at the International School of Management Excellence, Bengaluru.

She added that AI is biased and lacks values and ethical grounding.

Meanwhile, Bengaluru-based Cadabams Group has launched Mindtalk, an AI-driven “deep agent” that offers insights, interventions and coping strategies to people in distress. According to Managing Director Sandesh Cadabam and Executive Director Neha Cadabam, Mindtalk’s datasets are drawn from the group’s 32 years of psychiatric and therapeutic practice, a key difference from generic LLMs like ChatGPT.

But what happens if a user in acute suicidal crisis approaches Mindtalk? “When it becomes serious, there are protocols for de-escalation. It will stop and ask you to seek professional help. The principle on which this application is built is that AI is an enabler, not a replacement,” said Kushal Raju, Chief Technology Officer at Cadabams Group. He noted that the software was initially rolled out to in-house doctors, adding another layer of safety.

Dr Manoj stressed that change also has to occur at the level of the user. At the same time, both he and Dr Shweta called for national-level regulation of AI in healthcare. “There is an urgent need to further define and legislate the use of AI in mental health,” Dr Shweta said.
