Games like Roblox, Minecraft, and Fortnite are immense creative spaces—and busy social networks. The same features that make them fun—open chat, gifting economies, fast friending, countless user-made worlds—also create cover for people who manipulate kids. Over the past year, investigators and child-safety groups have warned about a sharp rise in grooming and financially motivated sextortion that often starts as casual in-game chat and escalates once contact moves off-platform.

How grooming typically unfolds in Roblox-style worlds

Most grooming follows a familiar sequence. A stranger poses as a near-age peer in public chat, then pivots to private channels such as whispers, party voice, private servers, or another app. Once a conversation shifts to Snapchat, Discord, Instagram, WhatsApp, or Telegram, moderation visibility drops and pressure increases. After brief flattery or gifts like Robux, boundaries are tested with "trade pics?" dares and demands for secrecy. If a compromising image is obtained, it often turns into sextortion, with threats to share the content unless more images or money are sent. The FBI reports a surge in these cases, with demands typically intensifying even when victims pay.

On Roblox, communication moves quickly from public chat to DMs, voice, or private servers, then off-platform. Roblox has added parent-link controls, spending caps, content-maturity settings, and tighter messaging limits, but none of that can police conversations that have already migrated. Parents and platforms now stress catching the off-platform pivot ("add me on snap," "what's your discord," "join my server") as an early, actionable red flag.

Language that signals risk

Slang evolves, but certain cues persist. "ASL?" ("age, sex, location") remains a fast way to probe age; "LMIRL/MIRL" ("let's meet in real life") signals a push toward meeting; "S2R" ("send to receive"), "WTTP" ("want to trade pictures?"), "GNOC" ("get naked on cam"), and "NP4NP" ("naked pic for naked pic") are classic "trade" acronyms; "lewd," "NSFW," "Rule 34," and "Rule 63" show up around sexualized fan content; and secrecy phrases like "our little secret" often precede escalation. "POS," "PIR," and "P911" are shorthand for "parent over shoulder," "parent in room," and "parent alert"; all three signal that a young person may be hiding a chat. In practice, the most reliable flags are clusters: age-probing plus an off-platform invite, or sexual slang paired with requests for a "face reveal." Education groups now emphasize teaching kids to pause at those clusters and tell an adult immediately, before an image is shared.

What recent cases are showing

Recent filings and reporting describe the same cross-platform pattern: first contact on Roblox, grooming and pressure on another app, and, too often, real-world harm. Louisiana’s attorney general filed suit against Roblox this month, alleging its systems allowed predators to “thrive.” In Georgia, a suit says predators used Roblox to harass and extort a nine-year-old. In Texas, a separate filing describes initial contact on Roblox at age ten, then a move to Discord before the alleged assault. California plaintiffs likewise say a ten-year-old was groomed and abducted after meeting an adult on Roblox, with Discord used to continue the manipulation. Companies dispute liability and cite safety investments, but the pattern is consistent: quick intimacy, a hop to disappearing messages, escalating demands.

Beyond the U.S., courts are imposing heavy sentences in multi-victim cases tied to game platforms. In Spain this summer, prosecutors sought hundreds of years for a man who contacted more than twenty minors via an online game, using gifts and attention to gain trust before coercing sexual content over video calls.

Sextortion remains the fastest-escalating threat. Federal prosecutors tied a high-profile ring to the 2022 death of Michigan teen Jordan DeMay; multiple defendants have since been sentenced, and related money-laundering cases continue. Investigators stress that boys are increasingly targeted, threats balloon even if payments are made, and quick reporting is critical to stopping the spiral.

What platforms are doing—and what still falls to adults

Roblox announced "Sentinel," an AI system designed to spot subtle grooming patterns in chat. The company says the system contributed to roughly 1,200 reports of potential child exploitation to the National Center for Missing & Exploited Children in the first half of 2025, and it has open-sourced the tool to encourage wider adoption. Recent policy updates are also narrowing access to sexualized spaces and adding verification checks for "adult-like" worlds.
This is progress, not a cure-all. Technical measures work best alongside tighter parental controls, clear rules about off-platform invites, and rapid, shame-free reporting the moment a chat turns sexual or secret.

For families and schools, the response is straightforward but urgent: treat any push off-platform as a reason to pause and check in; assume “trade” language is a coercion precursor; and if a child has shared or is being threatened with images, involve law enforcement and use NCMEC resources rather than paying. The FBI’s guidance is clear: paying doesn’t end extortion—it often accelerates it.

A child-first way forward

Young people use these worlds to belong, be seen, and play. That's why shame-free support matters. Teach kids to refuse off-platform invitations from people they meet in games and to stop, screenshot, and tell an adult the moment a chat turns sexual or secret. If images have been shared or threatened, don't pay; preserve evidence, contact law enforcement, and use removal tools like NCMEC's Take It Down to scrub content across platforms. A shame-free response ("you're not in trouble, and we can fix this") is the difference between a short incident and a spiral. Technical guardrails help, but what most reliably disrupts grooming is attention to the pattern itself: quick intimacy, secrecy, gifts, and the off-platform hop. When kids and adults can name those moves, they can stop them earlier.