Discord age verification: face scan or ID from March

Discord will require users worldwide to prove their age to access adult content. The company will switch all profiles to a teen-appropriate experience by default. Rollout begins in early March, with a phased approach for new and existing accounts.

What changes for users

Unverified users will see stricter defaults: age-restricted servers and channels stay hidden, sensitive media stays blurred, and direct messages from unknown users are limited. Adults who verify can adjust these settings and enter mature communities.

How verification will work

Users can confirm their age by uploading a government ID or recording a short video selfie. An AI tool estimates facial age from the video. Discord says it deletes ID uploads after checks and does not retain face scans. Verification applies to adult-only areas and certain settings changes, not basic chat access.
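
To make the two paths concrete, here is a minimal Python sketch of how a platform could route an age check to either an ID parser or a facial age estimator and discard the upload once a decision is made. Everything here is hypothetical: the function names, the stand-in return values, and the 18+ threshold are illustrative assumptions, not Discord's actual implementation.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Method(Enum):
        ID_UPLOAD = auto()
        VIDEO_SELFIE = auto()

    @dataclass
    class VerificationResult:
        is_adult: bool
        method: Method

    def age_from_id_document(id_image: bytes) -> int:
        # Stand-in for document parsing that reads a date of birth; returns a fixed age here.
        return 30

    def age_from_selfie(video: bytes) -> int:
        # Stand-in for a facial age-estimation model run on the short selfie video.
        return 25

    def verify_age(method: Method, upload: bytes) -> VerificationResult:
        age = age_from_id_document(upload) if method is Method.ID_UPLOAD else age_from_selfie(upload)
        # Mirror the stated no-retention policy: drop the upload once the decision exists.
        del upload
        return VerificationResult(is_adult=age >= 18, method=method)

    if __name__ == "__main__":
        print(verify_age(Method.VIDEO_SELFIE, b"fake-video-bytes"))

In a real system the document parser and age estimator would be external services; the point is only that both routes converge on a single adult-or-not decision and the raw media is not kept.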

“Most adults won’t need it”

The platform will use an “age inference” model to reduce prompts. It weighs account tenure, device signals, and high-level activity patterns, never private messages, to estimate whether a user is an adult. Discord says the vast majority of people will keep using the app without extra steps.
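
Discord has not described how the inference model works beyond those broad signal categories, but a toy scoring rule illustrates the general idea: combine coarse, non-content signals into a confidence score and prompt only the users the model is unsure about. The signal names, weights, and threshold below are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class AccountSignals:
        # Illustrative stand-ins for the kinds of coarse signals described, not Discord's actual inputs.
        account_age_days: int
        active_servers: int
        on_new_or_shared_device: bool

    def likely_adult(s: AccountSignals, threshold: float = 0.7) -> bool:
        # Weight long account tenure most heavily, then breadth of activity, then device stability.
        score = min(s.account_age_days / 3650, 1.0) * 0.5
        score += min(s.active_servers / 20, 1.0) * 0.3
        score += 0.0 if s.on_new_or_shared_device else 0.2
        return score >= threshold

    # Confident cases skip the prompt entirely; everyone else falls back to ID or selfie verification.
    print(likely_adult(AccountSignals(account_age_days=2920, active_servers=15,
                                      on_new_or_shared_device=False)))
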

Privacy questions after a past breach

Privacy groups warn about risks tied to ID checks and face estimation. Their concern grew after a third-party age-verification vendor was hacked in October 2025, exposing about 70,000 ID images used in appeals. Discord cut ties with that vendor and said its own systems were not breached.

Why Discord is moving now

Social platforms face pressure to protect teens. Many have rolled out stricter defaults and verification for mature features. Discord tested age checks in the UK and Australia to meet local rules, then expanded. Similar safety pushes have appeared at Meta Platforms' Instagram, TikTok, and Roblox.

The policy context

Lawmakers have intensified scrutiny after high-profile hearings on child safety. In January 2024, Discord CEO Jason Citron testified alongside other tech leaders before the U.S. Senate Judiciary Committee. The committee pressed firms to prevent abuse and to default minors into safer experiences.

What to watch next

Key signals will come in March. First, how many users are asked to verify and how fast the process works. Second, whether teen-by-default settings reduce exposure to adult spaces without harming everyday use. Third, if the company publishes clear, audited data on deletion of IDs and selfie videos. Early communications stress safety by default for teens and minimal friction for verified adults. Execution—and transparency—will determine user trust.
