Meta, Snap, TikTok work together to ban content on self-harm, suicide

WASHINGTON, SEPT 13 – Three of the biggest social media platforms are working together to tackle online content featuring suicide and self-harm, Meta announced on Thursday, according to a report by United Press International (UPI).

Meta, the owner of Facebook, Instagram and WhatsApp, has teamed up with Snap and TikTok to form Thrive, an initiative designed to remove the stigma around mental health issues and slow the viral spread of online content depicting suicide or self-harm, the company said in a blog post.

“Suicide and self-harm are complex mental health issues that can have devastating consequences,” Meta said in a statement.

“We focus on this content because of its tendency to spread quickly across multiple platforms,” Antigone Davis, Meta’s Global Head of Safety, said in the post.

“This initial signal only represents content and will not include identifiable information about any account or individual,” she said.

This initiative was formed with The Mental Health Coalition, a group of mental health organizations working to remove the stigma around these issues.

Meta, Snap and TikTok will share ‘signals’ with one another, allowing them to compare notes, investigate, and take action when similar content appears on another platform’s apps. Thrive will serve as a database accessible to all participating social media companies.

Meta uses technology developed for Lantern, a program designed to make technology safe for minors. Amazon, Apple, Google, Discord, OpenAI and others are part of that coalition. Meta clarified in its statement that Thrive targets content, not users.

Social media companies will be responsible for reviewing and taking any necessary actions through Thrive, as well as writing an annual report to measure the program’s impact.

Meta said that when content featuring self-harm or suicide is identified, it will be assigned a number, or ‘hash’, which other social media companies can then cross-check to find and remove the same content on their own platforms.
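For readers curious how this kind of hash-sharing works, the minimal Python sketch below illustrates the general idea. It is an illustration only: the SHA-256 digest, the in-memory set standing in for the shared database, and the function names are assumptions made for this example; Meta has not published Thrive’s technical details, and production systems typically use perceptual hashes so that slightly altered copies of an image still match.

import hashlib

# Hypothetical shared store of hashes flagged by participating platforms.
# In reality this would be a shared service, not an in-memory set.
shared_signal_db = set()

def hash_content(content_bytes: bytes) -> str:
    # Turn a piece of content into a fixed-length fingerprint (the "hash").
    return hashlib.sha256(content_bytes).hexdigest()

def flag_content(content_bytes: bytes) -> str:
    # One platform identifies violating content and shares only its hash.
    digest = hash_content(content_bytes)
    shared_signal_db.add(digest)  # no account or user data is shared
    return digest

def is_known_violation(content_bytes: bytes) -> bool:
    # Another platform cross-checks an upload against the shared hashes.
    return hash_content(content_bytes) in shared_signal_db

# Example: platform A flags an image, platform B later sees the same bytes.
flag_content(b"<bytes of a violating image>")
print(is_known_violation(b"<bytes of a violating image>"))   # True
print(is_known_violation(b"<bytes of an unrelated image>"))  # False

Because only the hash travels between companies, the platforms can act on the same piece of content without exchanging any information about the accounts that posted it, consistent with Davis’s statement above.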

A photo taken on March 25 shows the Meta (formerly Facebook) logo on a smartphone in Mulhouse, eastern France. – AFP

Increased use of social media by minors has led to a spike in depression and suicidal behavior, says The Mental Health Coalition. Research also shows that young people who self-harm are more active on social media.

Earlier this year, Meta announced it would begin removing and limiting sensitive content deemed ‘age inappropriate’ from young users’ feeds on its apps.

The company said it plans to hide search results and terms related to suicide, self-harm and eating disorders for all users.

Meta, TikTok, Snapchat and other social media platforms have long been criticized for failing to remove content deemed harmful to teenagers, including videos and images of self-harm.
