Roblox, an immersive gaming and creator platform, announced another big investment in its multi-layered approach to platform safety: real-time multimodal moderation.
While traditional tools moderate individual items (a single 3D object or a snippet of text), Roblox’s new AI system looks at an entire scene simultaneously from the user’s point of view – including 3D objects, avatars, and text – capturing all of these elements together in a specific moment to assess whether the combination of content types breaks the platform rules.
The Impact:
- 5,000 servers shut down daily: Since deployment, the system has shut down roughly 5,000 servers per day for violating Roblox's Community Standards.
- Proactive safety net: While Roblox already uses a combination of AI and a team of human safety experts to review all content uploaded to the platform before it is published, this new system acts as a continuous safety net during gameplay. Now, Roblox can evaluate a combination of problematic text, 3D drawings, or avatar movements in real-time and shut down that specific server immediately – often before a user ever encounters it.
- Targeted: By targeting only the violating server – rather than the entire experience – well-intentioned players can continue their sessions uninterrupted.
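The key idea in the list above – that a combination of individually benign items can jointly violate policy – can be sketched in a few lines. This is a hypothetical illustration only; the scores, the "noisy-OR" combination rule, and the threshold are assumptions, not Roblox's actual system.

```python
# Hypothetical sketch of scene-level multimodal moderation.
# Scores, combination rule, and threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SceneSnapshot:
    text_score: float    # per-modality risk scores in [0, 1],
    object_score: float  # e.g. produced by separate classifiers
    avatar_score: float

def scene_risk(s: SceneSnapshot) -> float:
    # Noisy-OR combination: the scene is risky if ANY modality is,
    # and several moderately risky modalities compound each other.
    safe = 1.0
    for score in (s.text_score, s.object_score, s.avatar_score):
        safe *= (1.0 - score)
    return 1.0 - safe

def should_shut_down(s: SceneSnapshot, threshold: float = 0.85) -> bool:
    return scene_risk(s) >= threshold

# Each item alone scores below the 0.85 threshold, but the
# combination pushes the scene over it: risk = 1 - 0.4*0.5*0.6 = 0.88.
snap = SceneSnapshot(text_score=0.6, object_score=0.5, avatar_score=0.4)
```

The point of the compounding rule is exactly what the article describes: per-item moderation would pass each element, while a joint, scene-level view flags the combination and shuts down only that server.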

This marks a significant shift in how UGC platforms use AI to help keep users safe and appropriately moderate content.
