The dollars and sense of prioritizing child safety online (or, how to not be fined half a billion dollars)

By Mike Pappas, CEO, Modulate

On December 19, 2022, the FTC announced an agreement by Epic Games, the makers of Fortnite, to pay over half a billion dollars over significant issues in its treatment of children online. A large block of the costs takes the form of reimbursements to customers, in response to what the FTC deemed “dark patterns” around in-game purchases. But the majority of the costs relate to COPPA violations by Epic, and they reveal an evolution in the way the FTC views COPPA protections and child safety issues, one that every online platform should take notice of.

Historically, many online platforms have treated privacy and safety as being in tension with each other. And there is a sliver of truth to this – an absolutely private platform would lack the tools to actually protect its users, and no platform could offer absolute safety without watching every single thing each user did in intimate detail. But it’s a serious mistake to conclude that there’s no synergy to be had between privacy and safety, as we’ll discuss below. Thus far, though, these platforms, faced with what they see as a binary choice, have recognized that regulations like COPPA and GDPR have real teeth in the privacy sphere, whereas safety-focused regulation has, in the industry’s view, lagged behind. As such, online platforms have generally prioritized privacy, while often neglecting safety or falling back on excuses like “we can’t be held accountable for what we weren’t aware of.”

In this light, the Epic case takes on a new level of significance. The FTC’s COPPA complaint certainly noted some deficiencies with respect to direct privacy protections. But the meat of its case actually centered on the ways that a lack of privacy resulted in worse safety outcomes.

Default settings harm children and teens: Epic’s settings enable live on-by-default text and voice communications for users. The FTC alleges that these default settings, along with Epic’s role in matching children and teens with strangers to play Fortnite together, harmed children and teens. Children and teens have been bullied, threatened, harassed, and exposed to dangerous and psychologically traumatizing issues such as suicide while on Fortnite.

FTC Announcement, Dec 19, 2022

To briefly summarize: the FTC argued that safety issues such as harassment, threats, and bullying are prolific in online text and voice chat, and that Epic’s default privacy settings didn’t do enough to protect kids from being targeted through these channels. The conclusion in this case was that Epic must change its default privacy settings so that children cannot voice or text chat without their parents’ consent.

Which does technically solve the problem, except for the part where it cripples a key capability of modern online games – their ability to offer a space for kids to socialize, form bonds, explore their own identities, build confidence, mobilize communities, and so much more.

Yes, you could argue that this capability still exists, just so long as the parents consent to it. But not every child is fortunate enough to have tech-savvy, responsive, or even present parents. In addition, parental control and age verification features have a rocky success rate at best, and are notorious for being circumvented in a variety of top games. Requiring parental consent is a sensible intervention for the moment; but it’s a patch, and it raises the question of what it would mean to really attack this problem at its root.

Well, attacking the problem at the root is fairly straightforward, at least in concept – it would just mean actually curating safe, respectful online chats, instead of the toxicity that gaming and the wider internet have unfortunately become known for.

Many people believe this is impossible; but it’s worth noting that, as a society, we do it all the time. If someone shows up to a playground in real life and begins loudly talking about white supremacy, or requesting photos of genitalia (even from adults, rather than kids), we have a number of ways we correct their behavior, ranging from a gentle reprimand up to and including police intervention. This doesn’t make playgrounds 100%, absolutely, unquestionably safe…but it does prevent kids from picking up questionable ideologies or offensive slurs as a matter of course in the way they often do online.

How do we solve this issue on playgrounds? It’s not by shadowing our kids, listening to every word they say and recording every move they make. But it is by having authority figures – parents, schoolteachers, coaches, etc. – nearby enough that they can notice from afar when trouble seems to be breaking out. Raised voices, an unexpected person joining the group, or even things suddenly going too quiet – these can all be clues to those nearby that something has gone wrong, and the kids need someone to step in for their own sake.

The analogous solution in the online world is known as “proactive moderation.” When done well, proactive moderation enables platforms to act just like the parents keeping an eye on the playground. The platform doesn’t listen to every little detail, doesn’t follow each kid around, and doesn’t necessarily even need to know which kid is which; but it does notice the telltale signs of trouble breaking out, and can then step closer, start actually paying attention to the details, and intervene in whatever way is appropriate.

In text chat, there are a number of existing solutions which offer proactive insights to studios, many of which are able to do so while maintaining due respect for the privacy of the children involved. This problem has historically been more difficult in voice chat, leading many studios, Epic among them, to choose not to moderate voice chat at all (often in the name of preserving player privacy), though doing so has become possible in recent years.

A robust proactive voice moderation system – at least one built with privacy in mind – doesn’t listen to what’s being said in conversations; it starts out monitoring from afar, looking for simple things like heated emotions, uncharacteristic behavior, or participants who aren’t supposed to be there. It’s only after signs of trouble are recognized that it will ‘step closer’ and begin listening more closely to better understand the problem. Even then, the identities of kids and players won’t be recorded, nor will their behavior be logged long-term. Just as a good Samaritan might flag to a teacher that “the kid in the orange shirt looks like he needs help”, a proactive voice moderation tool will flag to online platforms “there’s someone over here you should take a look at” – ensuring the platforms can intervene and protect their users, without putting user privacy at risk.
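To make the “step closer” idea concrete, here’s a minimal sketch in Python of how such a tiered escalation might be structured. The signal names, threshold, and functions below are illustrative assumptions for this article, not a description of Modulate’s actual system or API.

```python
# A minimal sketch of tiered "proactive moderation" escalation.
# All signals and thresholds here are assumed for illustration.

from dataclasses import dataclass


@dataclass
class AmbientSignals:
    """Coarse, content-blind signals computed without transcription."""
    emotion_intensity: float      # 0.0 (calm) to 1.0 (highly heated)
    behavior_anomaly: float       # deviation from the session's usual pattern
    unexpected_participant: bool  # someone joined who shouldn't be there


ESCALATION_THRESHOLD = 0.7  # assumed tuning value


def should_step_closer(signals: AmbientSignals) -> bool:
    """Stage 1: watch from afar. No conversation content is examined,
    only aggregate cues, like a parent scanning the playground."""
    return (
        signals.emotion_intensity > ESCALATION_THRESHOLD
        or signals.behavior_anomaly > ESCALATION_THRESHOLD
        or signals.unexpected_participant
    )


def notify_moderators(session_id: str) -> None:
    """Hand off to the platform's human moderators. The flag references
    the session, not a persistent player identity."""
    print(f"Flag raised for session {session_id}: possible harm in progress")


def review_and_flag(session_id: str, signals: AmbientSignals) -> None:
    """Stage 2 only runs if stage 1 trips; nothing is logged long-term."""
    if should_step_closer(signals):
        notify_moderators(session_id)


# Example: a heated session trips the first stage and gets flagged,
# while a calm one is never examined closely at all.
review_and_flag("session-1832", AmbientSignals(0.9, 0.2, False))
review_and_flag("session-1833", AmbientSignals(0.1, 0.1, False))
```

The design point is the ordering: the cheap, content-blind checks in the first stage gate any closer analysis, so the vast majority of conversations are never examined in detail at all.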

This gives us the opportunity to have our cake and eat it too – to enable both player safety and player privacy, despite the apparent tension between the two. In the physical world, we constantly walk this line between respecting our kids’ privacy and ensuring their safety; online platforms have fallen behind, but that doesn’t mean they can’t live up to the same ideal. And, as this recent case shows, online platforms are rapidly feeling the pressure to hit this middle ground – not forsaking safety in the name of privacy, nor vice versa, but balancing both together, and doing it fast. (Part of the FTC’s focus on Epic came from its frustration that Epic took too long to deploy sufficient safety features.)

At Modulate, we believe in an internet experience that’s safe, private, authentic, and fun. For too long, platforms have assumed we can’t have them all, and left users in the lurch. There IS a way forward, but consumers and regulators won’t be satisfied until we put in the work and deliver experiences that check all the boxes together.
