At least 15 states have enacted or are pursuing legislation that would require online companies to protect the safety and privacy of kids using their platforms, putting pressure on Congress to pass more unifying federal legislation.
California in 2022 was among the first to do so when it enacted a measure that requires social media and online companies to “prioritize” the health and well-being of children before launching apps and services publicly.
The California law was halted after NetChoice, a tech industry trade group, sued to block it. In September, a U.S. District Court ruled that parts of the law probably violated First Amendment rights to free speech. California Attorney General Rob Bonta has appealed the ruling to the U.S. Court of Appeals for the 9th Circuit.
Despite that setback, however, legislators in other states have proposed bills modeled on California’s approach, known as age-appropriate design, as well as other measures that require parental consent for kids using online services.
Advocates for children’s online safety are hoping that Congress will enact federal legislation rather than allow a piecemeal, state-by-state approach. They want to rein in tech platforms designed to keep kids online for hours every day, blaming those platforms for a host of mental health problems, sleeplessness and eating disorders.
A study by the Harvard T.H. Chan School of Public Health released in late December found that the social media companies Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter) and YouTube collectively generated $11 billion in ad revenue in 2022 from U.S.-based users younger than 18. Of that, about $2.1 billion came from users 12 and under, who are not permitted on such platforms under their terms of service, the study found.
Unlike federal data privacy legislation, which has stalled in Congress while states have enacted measures, kids’ online safety is a more tangible issue for voters and…