There has been a recent rise in teens using social media to self-diagnose mental health issues, including autism.
[Illustration: Many Western teenagers use social media to self-diagnose mental health conditions. (Source: CNN)]
Unlike most teenagers, who surf TikTok and Instagram for entertainment, the 14-year-old daughter of Erin Coleman, a mother in the US, uses social media to search for videos about mental health diagnoses.
Based on what she found on social media, she became convinced she had attention deficit hyperactivity disorder (ADHD), depression, autism, a fear of dirt and germs, and a fear of going outside. “Every week, my daughter came up with a different diagnosis,” Coleman said, “and she was convinced she had it.”
After mental health and medical evaluations, doctors concluded that Coleman's daughter was in fact suffering from severe anxiety.
Mental health crisis
Social media platforms, including TikTok and Instagram, have come under scrutiny in recent years for potentially exposing young users to harmful content and exacerbating the youth mental health crisis.
At the same time, more and more teens are turning to those same platforms to find resources and support for their mental health, and to cope with it in ways that feel right to them.
Using the Internet to self-diagnose is nothing new. With so much information available online, teens can get the mental health information they need and feel less alone.
But self-diagnosis and misdiagnosis exacerbate the problem. Even more dangerous, kids may self-medicate for conditions they don’t have. The more they search for this content, the more similar videos and posts social media algorithms surface.
Dr. Larry D. Mitnaul, an adolescent psychiatrist in Wichita, Kansas, said the most common self-diagnoses he sees in teens, especially since 2021, are ADHD, autism spectrum disorder and dissociative identity disorder, formerly known as multiple personality disorder. “As a result, treatment and intervention are quite complex,” he said, which puts parents in a difficult position because seeking help is not always easy.
Another American parent, Julie Harper, said her daughter was outgoing and friendly, but that changed during the Covid-19 lockdown in 2020, when the then-16-year-old was diagnosed with depression. Although her condition improved with medication, her mood swings worsened and new symptoms emerged after she began spending more time watching TikTok.
Experts say teens often treat social media users who post about mental disorders as credible sources, either because those users say they have the disorder discussed in the video or because they claim to be experts on the topic.
Call to action
In May, the U.S. Surgeon General issued a warning that social media use poses “profound harm” to children, calling for more research on its impact on adolescent mental health and action from policymakers and social media companies. Social media companies should adjust their algorithms to detect users who are consuming too much content on a particular topic, said Alexandra Hamlet, a psychologist in New York City. “They need to have notifications that remind users to pause and think about their online habits,” she said.
“We don’t have specific protections beyond our Community Standards, which prohibit promoting, encouraging, or glorifying things like dieting or self-harm,” Liza Crenshaw, a spokesperson for Instagram’s parent company Meta, said in a statement. Meta has created programs like the Well-being Creator Collective, which guides creators in creating content that is positive, inspiring, and supportive of teens’ physical and mental health. Instagram has introduced tools to curb late-night browsing, prompting teens to move on to something else if they’ve been watching one thing for too long.
Enhanced control
Social networks today have tools to measure the harmful effects of excessive use, especially on young people, but few measures to actually limit it. Some platforms and apps, however, have begun to offer solutions.
For example, Snapchat, one of the most popular messaging and social platforms among young people in the West, has officially launched its “Family Center” feature, which gives parents partial oversight of their children's social media use. Through the feature, parents can see how often their children log in and whom they communicate with, though they cannot view the content of those communications.
Other social networks will likely need to introduce similar features: protecting minors is among the top priorities of social media regulators in Western countries, especially in Europe, as US Surgeon General Vivek Murthy underscored in his May 23 warning.
The continued growth of social networks is inevitable; the goal should be to steer that growth toward transparency and oversight, not suppression. As large technology companies such as Google, Facebook and TikTok gain influence while bearing less accountability to the public, governments have a necessary role to play in tightening regulation. Beyond the responsibility of technology companies, a healthy social media environment also depends on raising the awareness of each individual user and strengthening the vital role of education.