While many conversations about deepfakes still sound theoretical, in K–12, they are not. Over the last year, we have seen a steady shift from “this could happen someday” to “this is happening everywhere,” and the incidents that cause the most damage tend to fall into two categories.

The first is peer-to-peer synthetic nudes. The second is criminal misuse of AI by an adult with trusted access. The first spreads fast; the second can collapse trust across an entire school community.


Peer-to-Peer Synthetic Nudes and the Wildfire Effect

Peer-to-peer synthetic nudes tend to spread quickly because the friction is low and the incentives are high. Students do not need advanced skills or a complicated workflow. In the Thibodaux, Louisiana case, a 13-year-old girl reported that boys were circulating AI-generated nude images of her and other students. The images moved through the same channels every district already struggles with, including Snapchat. The school initially found no evidence and took no action; only after a bus confrontation, which ended with the girl being expelled, did the situation receive serious attention. Law enforcement later charged two boys with unlawfully disseminating the AI-generated images. The sequence is a painful example of how quickly a student can be harmed, then re-victimized by a confused or inconsistent response.

Lancaster Country Day School in Pennsylvania shows how quickly the scope can expand. Authorities announced juvenile charges against two students connected to AI-generated pornographic images of other students, and reporting describes allegations that the students shared images of dozens of victims with each other through Discord. The story has continued to evolve, including renewed attention on whether existing mandatory reporting rules match this new reality. In this case, the district attorney determined that the school had no duty to report the incident, even though school officials are mandatory reporters, because child-on-child harm is not considered child abuse under the current letter of Pennsylvania law. That gap is concerning on its own, but the more important takeaway is that a single student's behavior can create a wide victim pool, a long trail of trauma, and a prolonged cycle of community distrust.


Adult Misuse and the “Structural Fire”

If peer-to-peer synthetic nudes are the wildfire, adult misuse is the structural fire. The Florida UBIC Academy case illustrates why. Florida's Attorney General announced that a sixth-grade teacher, David McKeown, received a 135-year prison sentence for crimes involving child sexual abuse material and other offenses. According to subsequent reporting, investigators said he used AI to generate abusive content from students' photos and then distributed those images through Discord during school hours. This kind of incident quickly turns into a crisis about supervision, reporting, access, and which safeguards failed.

A similar incident in the Austin Independent School District shows that this risk isn't isolated. Local reporting in March 2025 describes a principal's letter stating that new charges against an elementary teacher stemmed from AI-generated materials created using photos of students in his classroom. Whether the school is large or small, the effect is the same: parents do not just want consequences for an individual. They want to know why the system did not protect children sooner.


What Districts Should Do Before the Next Incident

These two archetypes require different instincts. With peer-to-peer synthetic nudes, speed matters most. The harm compounds as the content spreads, so the priority is containment, victim protection, and rapid takedown. The most common failure mode is not a lack of technology. It is a lack of clarity: inconsistent evidence handling, slow escalation, informal “we will look into it” responses, and disciplinary decisions that punish the victim for reacting to humiliation.

With adult misuse, rigor matters most. Evidence handling, reporting pathways, and documentation have to be tight from the first hour, because the stakes are higher and the investigation lasts longer. The failure mode here is operational drift. Too many people get involved, too many versions of the story circulate, and the district’s response becomes reactive instead of controlled.

So what should a district do, practically, before the next incident lands on a Tuesday afternoon?

First, separate the playbooks. Do not rely on one generic “deepfake policy.” Peer-to-peer synthetic nudes need a rapid containment workflow and a trauma-informed student support pathway. Adult misuse needs a strict reporting and documentation workflow with defined roles and limited access to sensitive material.

Second, make the first response humane by design. Students targeted by synthetic sexual harassment should not have to prove their innocence or relive the content to be taken seriously. Minimize exposure, limit viewing to essential adults, and prioritize counseling and safety planning alongside the investigation.

Third, make takedowns easier on your staff and safer for students by pointing families to NCMEC’s Take It Down service and the CyberTipline instead of expecting schools to improvise removal requests in the heat of a crisis. Take It Down lets a young person generate a digital fingerprint of an image or video locally, without uploading the file. It’s not a magic wand, and it doesn’t cover everything, but it gives districts a credible, standardized “do this right now” option when explicit content involving minors is spreading.
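For readers who want to understand the “digital fingerprint” idea, it is essentially hashing: the file is reduced to a short digest on the user’s own device, and only that digest ever leaves it. Take It Down’s actual matching technology is its own (and includes perceptual matching that tolerates minor edits), so the sketch below is only an illustration of the local-hashing principle, using SHA-256 as a stand-in:

```python
import hashlib

def fingerprint_file(path: str, chunk_size: int = 65536) -> str:
    """Illustrative 'digital fingerprint': a SHA-256 digest computed
    locally. The file is read in chunks, so even a large video is
    never loaded fully into memory, and the file itself is never
    transmitted anywhere -- only the short hex digest would be shared."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()
```

The point of the design is privacy: a match service can compare digests against a list of known harmful content without any party having to re-upload or re-view the image itself.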

Finally, don’t let the response hinge on a fast verdict about authenticity. In real incidents, you may not be able to tell whether an image is AI-generated or an actual photo, and waiting to “prove it” can cost you the only thing that matters in the first hours: containment and student safety. Treat any sexualized image involving a student as urgent until it is resolved, move immediately on harm reduction and reporting, and document what you do.

If you want a single takeaway, it is this: some deepfake problems are high-frequency and high-spread, and others are low-frequency but catastrophic. Districts need to be prepared for both, because the costs of improvisation are paid by students first.