Meta Platforms is stepping up its efforts to make Instagram and Facebook safer for teenagers, introducing new features that shield young users from unwanted messages. This is significant because it shows Meta is responding to concerns about the safety of teens online. At least for now.

Meta was formally accused of knowing that teens were being exposed to harmful content

A few weeks back, Meta, which also owns WhatsApp, decided to keep certain content away from teen users. This decision came after some serious talks with regulators who are really focused on making social media safer for kids. There’s been a lot of worry about harmful stuff teens might see or experience on these apps, and Meta is responding to that.


The pressure really picked up when a former Meta employee testified before the U.S. Senate, claiming that Meta knew about the harm teens were experiencing on its platforms, including harassment, but wasn't doing enough to stop it. That testimony drew wide attention and put the spotlight on Meta to act.

So, what’s changing? For starters, on Instagram, teens will automatically be set up so they can’t get direct messages from people they don’t follow. That’s a big change. Plus, they’ll need their parents to okay any changes to specific settings in the app. This gives both teens and their parents more control over who can reach out to them.

Over on Messenger, the rules are getting tighter too. If you're under 16 (or under 18 in some places), you'll only receive messages from people you're actually friends with on Facebook or who are in your phone contacts. And here's a key point: adults over 19 can't message teens who don't follow them. These new rules aim to make Facebook a safer place, but how consistently they will be enforced remains to be seen. Only time will tell.
