California has an opportunity to shape how the world protects children online

What was once a nagging worry that your kids were spending too much time gazing and scrolling is now supported by a growing societal awareness of a canon of harms: depression, self-harm, suicide, grooming, deliberately addictive loops, child sexual abuse, and an aggressive presence of hardcore porn in the hands of young children.

The brutal reality is that the online world was simply not designed with children in mind, yet children are everywhere, even on sites that sternly insist users declare they are 18 in order to enter.

A proposed bill for an “Age Appropriate Design Code” was recently approved by the Privacy Committee of the California Assembly by a vote of 10-0 with one abstention. It closely tracks the U.K.’s Age Appropriate Design Code, which came into force six months ago.

The U.K. Code is a paradigm shift in how we tackle tech’s problem with kids. First, it defines a child as a person under the age of 18. The current norm of treating 13-year-olds online as adults fails to protect children during the most vulnerable period of their lives, when they are no longer closely overseen by adults but are not yet adults themselves.

The U.K. Code applies to all services “likely to be accessed” by children. That means where children are, not where we might like them to be. It covers social media, games, and services that were not designed for children yet have a large proportion or number of children as users.

Perhaps most crucially, it introduces a product safety approach in which the tech provider holds responsibility for considering the impact of its product on kids.

No law can make a complex system 100% safe, nor will one law tackle all the ills of the digital world. Instead of trying to define rigid legal boundaries, the Code imposes a principles-based duty that, as legal scholar Philip Howard observes, has the virtue of driving providers towards doing what’s responsible in the current environment, rather than rote compliance with specific rules that may have been written for an environment that no longer exists.

Instead of narrowly specifying rules about content, which can easily shade into censorship, the Code imposes a duty on companies to ask themselves: What would you do differently if you knew your end user was a child?

Among its 15 provisions are requirements ensuring that companies can no longer turn a blind eye when their algorithms recommend self-harm or suicide material, coax children into addictive loops to maximize attention, or share children’s real-time location.

While the U.K. Code authorizes large fines, it is structured to identify risks and fix them: a collaborative regulatory model that solves problems as they arise rather than a framework focused on “gotcha.”

The platforms’ responses to the U.K. Code have been encouraging. YouTube turned off autoplay for users under 18. TikTok stopped sending notifications through the night, giving the app a bedtime of 9 p.m. for users under 16 and 10 p.m. for those aged 16 and 17. Google has made safe search the default for users under 18 and removed 18+ apps from its store for accounts registered to children. Instagram and TikTok have (finally) prevented unknown adults from directly messaging kids. These are just a handful of the hundreds of changes made to comply with the Code, covering privacy, sharing, and profiling, as well as wellbeing measures and age-specific information.

The principles-based code has the virtue of looking forward, not fighting the last war. It encourages good practices based on what we know at any point. It applies to the next TikTok, as well as this one. It requires that the metaverse be designed with children in mind.

Tech companies should not be incentivized to circle their wagons to defend the past but rather to look ahead and embrace their responsibility to keep children safe whatever the nature of the service or the technology used.

The harm to young people from our current “anything goes” approach is now undeniable. The U.S. has the power to create a much healthier global norm. The U.K. Code has shown the way, and California lawmakers have taken a bold step by backing this new paradigm. Parents are desperately calling for change, and kids themselves are increasingly asking for a kinder and safer environment in which to grow up.

The question now is: Will big tech come to the table, or will the companies simply continue to run ads proclaiming how much they care about kids while using their power to keep those same kids hooked on damaging content?

The introduction of the California Code is a first step toward protecting children in California. If it pressures the platforms to roll out these changes everywhere, it would be a giant leap toward protecting children around the globe.

There are over 600 million children online. It's time that we compel tech companies to design their products with children in mind.

Beeban Kidron OBE is a crossbench peer in the U.K. House of Lords, and the architect of the U.K.’s Age Appropriate Design Code. Jonathan Haidt is a social psychologist at New York University’s Stern School of Business, and a co-author of The Coddling of the American Mind.
