Connectivity

When it comes to online safety, metaverse platform Roblox isn’t playing around

Policy exec Nicky Jackson Colaco talks AI, content moderation, and privacy legislation.

Roblox bills itself as a playground where kids’ imaginations can run wild, whether it’s re-creating digital versions of chain restaurants or staging entire movies. Now, it’s looking to bring a fresh voice to another (higher-stakes) sandbox: Capitol Hill.

While execs from peer platforms like X, Discord, and Snap have been subpoenaed by Congress to testify about the dangers kids face online—and execs from others like Meta and TikTok have appeared voluntarily—Roblox public policy head Nicky Jackson Colaco said her platform isn’t waiting to be summoned.

In an interview with Tech Brew, Colaco said Roblox aims to be proactive and present itself as a partner for lawmakers who are working to balance child safety with the realities of online platforms. She added that AI will play a big role in encouraging both civility and creativity on the platform.

“I believe that the intention of all these laws that we’re seeing now, and proposed regulations, are legitimately to keep children safe. And the question is, how do we do that in the right way?” she said. “There can be really productive conversations between tech companies and policymakers about how that happens. I think we’d really like to be part of those conversations.”

Put me in, coach

Roblox is inserting itself at a pivotal moment: Senate bills that attempt to address online harms are hot topics on the Hill. The Kids Online Safety Act would require platforms to keep algorithms from recommending content covering potentially harmful topics, like suicide and eating disorders, to minors. And the Children and Teens’ Online Privacy Protection Act, known as COPPA 2.0, would address Big Tech data collection and usage practices that its co-sponsor, Sen. Ed Markey, has said are partly responsible for fueling a youth mental health crisis.

Roblox has met with sponsors of both bills, Colaco said, adding that the platform will continue to provide feedback as the legislation evolves. KOSA’s critics say it could restrict access to LGBTQ+ and conservative content, and COPPA 2.0 has come under fire for potentially requiring platforms to collect even more data to determine users’ ages.

For COPPA 2.0, which would expand the 25-year-old law’s privacy protections to older teens, Roblox’s conversations have centered on determining “what is the right experience” for 13- to 16-year-old internet users, Colaco said.

In the case of KOSA, Roblox’s discussions with legislators targeted language where it might be “overreaching, or it might have more unintended consequences,” she said, particularly around what constitutes harmful content.

She suggested overly broad definitions could have a “chilling effect on imagination and creativity.”

“I just don’t believe in my heart that anybody is introducing this legislation to hurt kids. Everybody wants good outcomes for kids,” Colaco said. “But the question is how we get there, and of course, we want to be part of that conversation.”

Safer spaces?

Roblox isn’t only relying on the legislative process to address kids’ online experiences. It’s also using some advances in AI (what the brand dubs “intelligent automation”) to tackle cyberbullying, a problem that’s arguably as old as the internet itself.

The platform is employing multimodal AI that will eventually be able to evaluate whether combinations of images and text might be offensive or intended to hurt another user, Colaco said. And it’s generating early results: The model can already flag interactions in voice conversations that violate the platform’s rules.

Colaco gave an example of what the future of AI-supported content moderation could look like.

“In a previous world, I could upload a picture of a pig,” she said. “Some kind of automated system could say, ‘That’s a pig.’ And then going forward, I could put text below that and say, ‘Oh hey, that’s Kelcee.’”

Rude! How do we shut that down?

“Multimodal AI, in the future, will be able to understand that there’s a connection between the word Kelcee and the picture [of a] pig, and probably detect that that’s insulting or potentially bullying,” Colaco said. “This concept of multimodal AI will help us be much better at addressing the complexity of safety and moderation.”
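To make the idea concrete, here is a minimal sketch of how an off-the-shelf image-text model such as CLIP could combine the two signals Colaco describes. It is not Roblox’s actual pipeline (which isn’t public); the checkpoint, filename, caption, candidate labels, and threshold are all illustrative assumptions.

```python
# Toy sketch only: NOT Roblox's system, just one way an image signal and a text
# signal can be fused into a single moderation decision.
# Assumes the open CLIP checkpoint "openai/clip-vit-base-patch32" via Hugging Face.
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical inputs: the uploaded picture and the caption a user typed under it.
image = Image.open("upload.png")
caption = "oh hey, that's Kelcee"

# Zero-shot check: how strongly does the image match each candidate description?
candidates = ["a photo of a pig", "a photo of a person", "a photo of a car"]
inputs = processor(text=candidates, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]

image_matches_demeaning_object = probs[0].item() > 0.5      # the "pig" description dominates
caption_targets_a_person = "that's" in caption.lower()      # crude stand-in for name detection

# The multimodal signal: neither input is a violation on its own,
# but the combination may be bullying and is worth escalating.
if image_matches_demeaning_object and caption_targets_a_person:
    print("Flag for human review: caption may pair a person's name with a demeaning image.")
```

A production system would obviously rely on trained moderation models rather than a hand-set threshold, but the sketch shows the core point: the decision comes from the combination of image and text, not from either one alone.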

Creating the future

Colaco sees another job for AI in the near future: handling coding and design work on behalf of end users. Last year, the platform rolled out Roblox Assistant, which employs generative AI and lets creators converse with the chatbot to build scenes based on natural-language prompts.

“I might not need to be able to code to look at an interface and say, ‘Create a red car, make that car drive down a street, jump off a jump, and land in a pool of water,’” she said. “I could design that using my imagination, and have AI tools translate that into a coded experience.”
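As an illustration of the general pattern (not Roblox Assistant’s actual implementation, which isn’t public), a feature like this typically wraps the creator’s plain-English request in a system prompt that constrains a generative model to return only scene-building code. The client, model name, and prompt below are stand-ins for whatever Roblox actually uses.

```python
# Hedged sketch of the prompt-to-scene-code pattern; not Roblox Assistant's API.
# The OpenAI client and model name are assumptions standing in for the platform's own model.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a scene-building assistant for a 3D game platform. "
    "Respond only with script code that creates the objects and behaviors the user "
    "describes; do not include explanations."
)

creator_request = (
    "Create a red car, make that car drive down a street, "
    "jump off a jump, and land in a pool of water."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical stand-in model
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": creator_request},
    ],
)

# The generated script would then be dropped into the creator's project for review and editing.
print(response.choices[0].message.content)
```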

According to Colaco, this functionality can be democratizing.

“It allows the world of what has traditionally been computer science to be accessible to so many more people,” she said. “That will be a really cool thing to witness as the platform continues to grow.”
