X (formerly Twitter) Shares Steps It Is Taking to Protect Children on the Platform

As online platforms continue to face scrutiny over child safety, X (formerly Twitter) has outlined the steps it is taking to safeguard minors on its service. While the company maintains that the platform is not primarily designed for children, it says multiple measures have been implemented to help protect younger users.
Here is more on it.
X (formerly Twitter) shares how it is trying to protect children on the platform
In a post from its @Safety account, X states that it has several measures in place to protect minors: users under 13 cannot create an account; minor accounts (aged 13–17) default to protected mode; sensitive media is restricted; location sharing is turned off by default; advertisers cannot target users under the age of 18; and, where possible, the company takes a multi-layered age assurance approach to verify or estimate a user's age.
Under protected mode, which is the default for young people:
- DMs are restricted, so by default they receive messages only from accounts they follow
- When someone new wants to follow them, a follow request is sent for them to approve or deny
- Posts are visible only to approved followers
- Followers cannot repost their posts or repost them with a comment
- Protected posts do not appear in third-party search engines such as Google
- They are searchable on X only by the poster and their followers
- Replies sent to accounts that do not follow the minor will not be seen, since only approved followers can see posts from protected accounts
Along with these measures, X works closely with the National Center for Missing & Exploited Children (NCMEC) to report suspected CSAM through its CyberTipline, which enables timely investigation, takedown, and law enforcement action as needed.