TikTok's new features include an expanded guide aimed at supporting teens' mental health. The guide follows a report that Facebook's own research found its photo-based social media site Instagram could be harmful.
According to a CNET report, the video-sharing app announced several new features early today designed to support users' mental well-being, including guides on how to engage with those who may be struggling and updated warning labels for sensitive content.
In a blog post, TikTok said that while it does not allow content promoting, glorifying, or normalizing suicide, eating disorders, or self-harm, it does support those who choose to share their experiences to raise awareness, help others who might be struggling, and seek support "among our community."
ALSO READ: Research Finds Boys Playing Video Games Have Lower Risk of Depression
More Well-Being Guides
To more safely support such conversations and connections, TikTok has rolled out new well-being guides to help individuals share their personal experiences on the video-sharing platform.
The platform is also sharing a new Safety Center guide about eating disorders for teens, educators, and caregivers.
The guide was designed with experts such as the National Eating Disorders Association, Butterfly Foundation, Bodywhys, and National Eating Disorder Information Center.
In early 2021, TikTok added a feature that directs users searching for terms associated with eating disorders to suitable resources.
Additionally, users searching for words or phrases such as #suicide are referred to local support resources such as the Crisis Text Line helpline, where they can find information on treatment options and support.
Furthermore, TikTok said it is updating its warning label for sensitive content: when a user searches for words or phrases that could surface distressing content, such as "scary makeup," the search results page will display an opt-in viewing screen.
Worry Over Teens' Mental Health
The Wall Street Journal recently reported that in studies carried out over the last three years, Facebook researchers found that Instagram is "harmful" for a sizable percentage of young users, particularly teenage girls.
For several years, child advocates have expressed concern over the mental health effects of sites such as Instagram, where it can be difficult to separate real images from altered ones.
Advocacy groups and lawmakers have long criticized Facebook and Instagram for allegedly hosting harmful content and fostering anxiety and depression, particularly among younger audiences.
A report by the United Kingdom's Royal Society for Public Health found that Instagram is the worst social media platform for young people's mental health.
FB and IG's Response to Criticisms
In May, in response to the criticism, both Facebook and Instagram said they would give all users the option to hide the public like counts on their posts, as well as to choose whether they see the number of likes and reactions on posts in their feed.
Following the Journal's report, Instagram said in a blog post that it stands by its research to understand young people's experience on the app.
According to Karina Newton, Instagram's head of public policy, the question on many people's minds is whether social media is good or bad for people.
She said the research on this is mixed and that it can be both. At Instagram, she continued, they look at both the benefits and the risks of what they are doing.
The Instagram official also explained that the app has done extensive work on suicide, bullying, eating disorders, and self-injury to make the platform a safe place for all.
Related information about mental health and TikTok is shown in TODAY's YouTube video below:
RELATED ARTICLE: Spanking Affects Child's Brain Development; Leads to Mental, Behavioral Issues
Check out more news and information on Mental Health in Science.