Anne Li (not her real name) ended her long Friday of classes in solitude, buying dinner from the vending machine outside her dorm and confiding in ChatGPT.
“In high school, we had fixed classes and a group of friends who were always there for you. But now, every class and every person is different—it felt like no one stayed around for long,” Li said.

The 18-year-old recently moved from Xi’an to study in Hong Kong. The transition turned her from an outgoing person into someone quiet and reserved. “Sometimes I thought I might be depressed, but I was too scared to go to the clinic. What if I am really sick?”
Li discovered posts on Xiaohongshu where others had shared their experiences using ChatGPT to talk about emotional struggles. Out of curiosity and a need for support, she began using the AI chatbot for emotional guidance in October last year.
“It felt like a friend who would never betray or judge you,” Li said. “In many lonely moments, it truly gave me a sense of comfort and belonging.”
Li is among a growing number of users who have turned to ChatGPT for emotional or psychological support. According to Xiaohongshu, searches for “using GPT for therapy” turned up 4.3 million posts, while “GPT and psychotherapy” returned more than 330.

Initially created for tasks such as answering questions and providing recommendations, generative AI has evolved to handle more complex and personalised uses, including mental health support. OpenAI’s GPT-4o, the latest version of the model, can hold fluent conversations and identify users’ emotions through text.
Character.ai, a platform where people can create chatbots based on fictional or real people, for example, introduced a chatbot called “Psychologist”, described as a mental health specialist that helps people improve their behaviour and relationships. The “Psychologist” is in higher demand than other bots on the platform: a total of 78 million messages had been shared with it by January this year, just over a year after a user created it, according to a BBC report.
“With AI, you can express yourself more freely without fear of judgement, something that can be challenging with real people,” said Chen Ci, a 20-year-old student from Hangzhou. She has not been diagnosed with a mental health condition, but Chen described herself as being in poor psychological health.
Chen began practising self-reflection in high school to cope with academic stress and regularly spoke with school counsellors. Struggling to find a suitable therapist, however, she turned to ChatGPT in July on a friend’s recommendation.
“Finding a therapist is like finding a partner—it takes time and effort to build understanding and compatibility,” she explained. Chen believed human therapists, like everyone else, have their biases and limitations, whereas AI does not.
“Technologies such as affective computing have enhanced AI’s ability to simulate empathy and emotional intelligence,” explained Dr. Kristen Yuanxi Li, a specialist in human-computer interaction at Hong Kong Baptist University.
She said that by analysing text inputs, AI can detect emotional cues from tone and language, categorising emotions as positive, negative, or neutral to tailor its interactions.
“AI is ultimately a tool that enhances efficiency and can be customised to the individual’s needs, but real human relationships require mutual effort,” Chen added.
Zhang Xindan, a 30-year-old working at a new energy company in Wuhan, agreed that communicating with ChatGPT felt safer. “You can share things you might not have been able to discuss with real people, or even with therapists.”
After suffering a serious car accident in November last year, Zhang dealt with PTSD and subsequent anxiety and depression. During her recovery, she stopped in-person therapy and in June began using ChatGPT for therapy three times a week.
“You can customise it to suit your needs,” Zhang said. “For example, you can instruct it to adopt a particular therapeutic approach, limit the length of responses, or focus purely on emotional support without offering advice.”
She emphasised that effective use of AI requires practice. “As you train it to better understand you, you also learn more about your own needs.”
Cost was another significant concern for Zhang. “I had tried sessions costing 500 yuan to 800 yuan (HK$537 to HK$860) per hour,” Zhang said. “A few sessions were fine, but repeated appointments were difficult to sustain financially.” By the time of the interview, she had attended 70 in-person therapy sessions.
According to pricing listed by Soulgood, a Hong Kong counselling platform, a single face-to-face counselling session costs at least HK$800 per hour. Purchasing a package of four sessions offers a discount but still amounts to HK$2,999.
In comparison, the ChatGPT Plus plan used by Zhang costs only HK$155 per month, while Character.ai offers a premium plan for HK$78 per month and lets users sign up for free with limited features.
“Pricing for psychotherapy is inconsistent and depends largely on the therapist,” said Sibyl Zheng, a counsellor at a university in Beijing, who prefers not to disclose her workplace.
Zheng pointed out that recovery time depends on individual needs. “Some people recover in a year, while others might have no progress for years, which can be financially draining.”
But ChatGPT as a form of mental health support has its problems.
Chen said the tone or emotion ChatGPT uses in its responses to her is generated solely from the data it was trained on.
“ChatGPT has strong logic, but there is no sense of ‘self’ behind that logic. It won’t offer the emotional support that a human being can provide,” said Chen. “It cannot empathise with and understand me like a human being. It can only understand me rationally.”
Dr. Li explained that AI understands and responds to users' emotional and psychological states through natural language processing, sentiment analysis, and machine learning.
“While AI can provide personalised responses and foster connection, it lacks genuine emotions or consciousness, relying on algorithms to interpret and respond to emotional cues,” Dr. Li said.
“ChatGPT takes only text input, so it cannot see my expression or body language,” said Chen.
Chen said that AI takes every self-description at face value, which she believed affects the accuracy of its responses.
As a psychological counsellor, Zheng emphasised the importance of paying attention to patients’ non-verbal cues.
“I don’t think you can simply detect them through textual conversations between two people,” said Zheng.
“People are very complex, and they are also very good at hiding their inner emotions,” she added.
Dr. Li pointed out that privacy is also an issue with chatbots.
Earlier this year, research by the Mozilla Foundation, a global nonprofit dedicated to keeping the Internet an open and accessible public resource, found that 73 percent of the 11 romantic chatbot companies it examined were vague about how they handle security vulnerabilities. Only one of the 11 companies met the minimum security standards set by the organisation.
The investigation showed that nearly all of the chatbots collected users’ personal information, including details people would not normally share online, such as sexual health information, prescription medication use and even gender-affirming care data.
Zheng believes that AI tools such as ChatGPT will not replace psychological counsellors.
“ChatGPT works as a tool you can access anytime. You get something out of it, and then you go back to your own life, where you have to rely on your own strength to face life, rather than on a tool that’s readily available,” said Zheng.
“Counselling is about healing through human relationships, and even if one relies on AI again, one eventually has to go back to real life,” she added.