Meta on Thursday began blocking messages from strangers sent directly to young teens using Instagram or Messenger.
By default, teens younger than 16 can now only be messaged or added to group chats by people they already follow or are connected to, the company said in a blog post. Changing the setting will require approval through "parental supervision tools" built into the apps.
Meta added that it is working on a way to prevent teens from seeing unwanted or potentially inappropriate images in all direct messages.
"We'll have more to share on this feature, which will also work in encrypted chats, later this year," Meta said.
Meta earlier this month tightened content restrictions for teens on Instagram and Facebook as it faced increased scrutiny over how its platforms harm young people. The restricted content includes posts that discuss suicide or self-harm, as well as nudity or mentions of restricted goods, the company said.
Restricted goods on Instagram include tobacco products and weapons as well as alcohol, contraception, cosmetic procedures, and weight loss programs, according to its website.
In addition, teens will now be placed in the most restrictive content settings by default on Instagram and Facebook, a policy that previously applied only to new users and will now be extended to existing ones.
The changes come months after dozens of US states accused Meta of damaging the mental health of children and teens, and misleading users about the safety of its platforms.
Internal Meta research leaked by whistle-blower Frances Haugen and reported by the Wall Street Journal has shown that the company was long aware of the risks its platforms pose to young people's mental health.
On the platforms, teens are defined as users under 18, based on the date of birth they give when signing up.