The Kids Online Safety Act, or KOSA, first introduced in 2022, would impose sweeping new obligations on an array of digital platforms, including requiring that companies “exercise reasonable care” to prevent their products from endangering kids.
The safeguards would extend to their use of design features that could exacerbate depression, sexual exploitation, bullying, harassment and other harms.
The bill would also require that platforms enable their most protective privacy and safety settings by default for younger users, and that they offer parents greater tools to monitor their kids’ activity. It is the first major consumer privacy or child online safety measure to clear a chamber of Congress in decades.
What would KOSA require technology companies to do?
“Exercise reasonable care” when designing features, to avoid causing or exacerbating problems such as depression, bullying and harassment.
Limit who can talk to youths through their online accounts.
Limit design features, such as infinite scrolling or notifications, that keep younger users online.
Make it easy for youths to delete their accounts and data.
Make younger users’ accounts default to the most protective privacy and safety settings.
Provide easy-to-use parental controls, including the ability to see and change their children’s privacy settings.
Allow parents to see the total time their children are spending on a platform and set restrictions.
Allow parents and educators to easily report harm to the companies.
The proposal gained significant traction in Washington amid mounting bipartisan concern that social media platforms could deepen mental health issues among kids and teens and expose children to dangerous material online.