New Jersey AG sues Discord over alleged child safety failures


New Jersey’s Attorney General Matthew Platkin is suing Discord over the chat company’s child safety features. The lawsuit claims that Discord has “misled parents about the efficacy of its safety controls and obscured the risks children faced when using the application.”

The Office of the Attorney General and the state’s Division of Consumer Affairs concluded that Discord violated New Jersey’s Consumer Fraud Act after a multiyear investigation into the company. The details of the lawsuit are currently sealed, but Platkin’s announcement suggests a few ways he plans to argue Discord’s approach may have endangered children. He says the app uses default settings that “allow users to receive friend requests from anyone on the app” and that it makes it simple to create an account when you’re under 13. According to Platkin, Discord “only requires individuals to enter their date of birth to establish their age when creating an account.”

When asked for comment, Discord offered the following statement:

Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer. Given our engagement with the Attorney General’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today. We dispute the claims in the lawsuit and look forward to defending the action in court.

Discord has introduced multiple features over the years with the express purpose of protecting younger users. Following a report that detailed 35 cases involving Discord in which adults were prosecuted on charges like “kidnapping, grooming or sexual assault,” the company introduced its Family Center tool, which lets adults track what their children do on the app. Teen Safety Assist, also introduced in 2023, added automatic content filters and a new warning system for people who violate the app’s guidelines. In 2025, Discord launched a nonprofit coalition called Roost with the express purpose of developing open-source child safety tools.

Discord, like other social platforms, has faced scrutiny before, and the pressure seems likely to keep increasing. Back in 2024, California lawmakers proposed blocking children’s access to algorithmic social feeds, and just this year Utah passed an age verification law for app stores, a decidedly blunt way to try to guarantee child safety.
