Scroll, like, repeat: Meta faces lawsuit over Instagram's impact on teens
It's all fun and games (and likes, and comments), but Meta will have to face a lawsuit from Massachusetts. The allegations: the company purposely made Instagram features addictive for young users and misled the public about the harm it could cause to teens' mental health.
A judge in Boston, Peter Krupp, recently denied Meta's request to throw out the case, according to a Reuters report. The lawsuit, led by Massachusetts Attorney General Andrea Joy Campbell, accuses Meta of violating state consumer protection laws and creating a public nuisance.
Meta argued that the lawsuit should be blocked by a federal law, Section 230 of the Communications Decency Act, which usually protects online companies from being sued over content posted by users. However, the judge said this didn’t apply in this case because the focus is on Meta’s own actions – like allegedly making false claims about Instagram’s safety, its protections for young users, and its age-verification system for kids under 13.
Personally, I'm afraid it's not just Instagram that could be perceived as harmful to teens and kids: it's virtually all the major platforms. They're designed to get you hooked, and that's exactly what they do; it's up to us to resist the temptation to doom-scroll all through the evening. That being said, I completely understand the plaintiffs and their motivation: when you're 15, the "resist the temptation" mindset is as rare as a double rainbow.
Back to the lawsuit, though. Meta disagreed with the ruling; a company spokesperson said the evidence would demonstrate the company's commitment to supporting young people.
This decision followed another ruling earlier in the week by a federal judge in California, who also rejected Meta's attempt to dismiss lawsuits brought by over 30 states. Those lawsuits claim that Meta's platforms are contributing to mental health issues among teenagers by making them addictive.
Massachusetts was one of the few states to file its own lawsuit in state court, rather than join the federal case. The case gained attention due to allegations that Meta CEO Mark Zuckerberg had dismissed concerns that Instagram's design could harm users.
The lawsuit claims that features like push notifications, likes, and endless scrolling were deliberately designed to exploit teens’ psychological vulnerabilities and their "fear of missing out" (FOMO).
Instagram's new features are not enough, apparently
A little over a month ago, Meta introduced yet another teen safety net: Teen Accounts.
These features apply to users under 16, limiting who can message them and providing tools to manage screen time and content exposure.
One key feature restricts messaging access so that only people teens follow or are connected with can message them, blocking strangers from their inbox. While parents can monitor who their teens are in contact with, they won’t have access to the actual conversations, offering a balance between safety and privacy, Meta claims.
To address screen time concerns, Instagram will include a "Daily Limit" feature in the Teen Accounts, reminding teens to take breaks after an hour of use. A "Sleep Mode" can also block notifications between 10 PM and 7 AM, encouraging healthier sleep habits.
Instagram is also tightening its controls on sensitive content, aiming to reduce the likelihood of teens encountering inappropriate posts or accounts in the Explore and Reels sections.
Teens under 16 automatically have these privacy and safety settings enabled, and any changes to the settings will require parental approval. The new Teen Accounts features will initially roll out in the US, UK, Canada, and Australia, with a global rollout planned for 2025.
Furthermore, just days ago, Meta introduced another safety feature on Instagram to protect teens from sextortion (a form of blackmail that exploits sexual content). The updates include measures to block follow requests from suspicious accounts or redirect them to spam, and to warn teens when they receive messages from unfamiliar accounts, especially those based in other countries. If a questionable account follows a teen, it will be restricted from viewing the teen's followers or tagged photos.
Meta is also preventing users from taking screenshots or screen recordings of temporary images in DMs, and rolling out a tool that blurs nude images in private messages. Additionally, Meta has taken action by removing over 800 Facebook groups tied to sextortion scams.
Attorney General Andrea Joy Campbell stated that the ruling allows the state to continue its efforts to hold Meta accountable and push for changes to protect young users on its platforms.