Meta launches Thrive program to share signals about violating suicide content
Meta logo | Image credit: Meta
Meta is taking another big step toward removing dangerous content from its platform. The social network giant announced this week that it has teamed up with the Mental Health Coalition to launch Thrive, a program that lets other social networks share signals about violating suicide or self-harm content. Meta explains that “participating companies will start by sharing hashes – numerical codes that correspond to violating content – of images and videos showing graphic suicide and self-harm, and of content depicting or encouraging viral suicide or self-harm challenges.”
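Meta hasn't published Thrive's exact hashing scheme (industry programs of this kind often use perceptual hashes such as PDQ rather than plain cryptographic digests), but the basic idea of sharing hashes instead of the media itself can be sketched with ordinary SHA-256 hashing. All names and the sample byte strings below are hypothetical:

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Return a hex digest that can be shared in place of the content itself."""
    return hashlib.sha256(data).hexdigest()

# Hashes received from other participating platforms (hypothetical values).
shared_violating_hashes = {content_hash(b"example-violating-image-bytes")}

def matches_shared_signal(uploaded: bytes) -> bool:
    """Check an upload against the shared hash list without ever
    transmitting or storing the original media."""
    return content_hash(uploaded) in shared_violating_hashes
```

The key property is that platforms exchange only opaque codes: a match flags known violating content for review, while non-matching uploads reveal nothing about what is on the shared list.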
Meta revealed that between April and June, it took action on more than 12 million pieces of suicide and self-harm content on Facebook and Instagram alone. That is a staggering amount of harmful content, and the combined total is likely higher still once the other participating companies, Snap and TikTok, are taken into account.