
Discord will soon require all users worldwide to verify their age with a face scan or ID if they want to access adult content. The platform, which allows users to create and join interest-based communities, has more than 200 million monthly users.
The new safety measures are designed to protect users by placing everyone in a teen-appropriate environment “by default”. While Discord already enforces age checks in the UK and Australia to comply with local online safety laws, the global rollout begins in early March.
“Nowhere is our safety work more important than when it comes to teen users,” said Discord policy head Savannah Badalich. “Rolling out teen-by-default settings globally builds on Discord’s existing safety system, giving teens strong protections while allowing verified adults flexibility.”
Under the new system, default settings will limit what users can see and how they interact. Only verified adults will be able to join age-restricted communities and view sensitive content. Direct messages from strangers will also be blocked until age verification is completed.
Drew Benvie, head of social media consultancy Battenhall, said the changes are “a positive move” but warned implementation could be challenging given Discord’s millions of communities.
“The platform could lose users if it backfires, but it could attract new users drawn to its enhanced safety standards,” he said.
How age verification works
Users will verify their age by uploading a photo ID or taking a video selfie, with AI estimating their age from the selfie. Discord said no face scans will be stored and ID uploads will be deleted after verification. The company stressed that neither it nor its verification partner will retain users’ personal data.
Privacy advocates have warned such methods carry risks, particularly after an October 2025 incident in which ID photos of around 70,000 users were potentially exposed in a hack.
Industry context and regulatory pressure
Discord’s announcement follows reports that the company is exploring a public listing. Its new measures, including a teen advisory council, mirror similar steps taken by Facebook, Instagram, TikTok and Roblox.
“Forced age verification and safe content by default are changes other social networks are likely to watch closely,” said Benvie.
Social media platforms have faced increasing pressure from lawmakers to protect children and teens. Discord CEO Jason Citron was questioned about child safety measures at a U.S. Senate hearing in 2024 alongside leaders from Facebook, Snap and TikTok.
From economics and politics to business, technology and culture, Kursiv Uzbekistan brings you key news and in-depth analysis from Uzbekistan and around the world. To stay up to date and get the latest stories in real time, follow our Telegram channel.