Mar 18 2024

The Kids Online Safety Act (KOSA) Explained: What the Drafted Bill Could Mean for Your Online Platform

Post By: Safer / 3 min read


The Kids Online Safety Act (KOSA) is a comprehensive children’s online safety bill that seeks to establish legal standards to protect minors and require platforms to better mitigate online harms. The bill passed out of the Senate Commerce, Science, and Transportation Committee in mid-December 2023 and, as of mid-March 2024, is pending consideration on the Senate floor. The bill recently drew renewed attention after undergoing revisions in mid-February and gaining co-sponsorship from 62 senators (since grown to 65).

What are the key provisions in KOSA?

Duty of Care Provision

KOSA would establish a “duty of care” for covered platforms, meaning platforms would be required to “exercise reasonable care” in the creation and implementation of any design or feature to prevent or mitigate various harms:

  • Mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors
  • Patterns of use that indicate or encourage addiction-like behaviors
  • Physical violence, online bullying, and harassment of minors
  • Sexual exploitation and abuse
  • Promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol
  • Predatory, unfair, or deceptive marketing practices, or other financial harms

Safeguards for Minors

KOSA would require covered platforms to put many features in place, enabled by default, to safeguard minors, specifically:

  • Restricted communications
  • Restricted public access to personal data
  • Restricted sharing of geolocation data
  • Limited features that result in “compulsive usage” of the platform

KOSA would also provide minors with the ability to opt out of algorithmic recommendations, delete their account and associated data, and limit the amount of time spent on the platform.

Tools for Parents

KOSA would require covered platforms to introduce tools for parents to manage a minor’s privacy and account settings, restrict purchases, and view metrics on usage, and to ensure that both parents and minors have access to easy-to-use reporting mechanisms to report harms on the platform.

Transparency Requirement

KOSA would require covered platforms to undergo independent, third-party audits and issue public transparency reports detailing foreseeable risks to minors, as well as the prevention and mitigation efforts they have taken to address these harms.

What are the potential paths forward?

As of mid-March 2024, KOSA has gained strong bipartisan support, with co-sponsorship from 65 senators, which would theoretically be enough to pass a Senate floor vote; however, a vote has not yet been held or scheduled. If KOSA passes the Senate, the House of Representatives would then take up the bill. The path forward in the House is uncertain, and no similar or identical bill has been introduced there to date.

It is important to note that earlier versions of the bill gave broader enforcement powers to state Attorneys General (AGs), which concerned many LGBTQ+ groups. The February revisions to KOSA restricted the enforcement powers of state AGs to only certain parts of the bill and, notably, gave sole enforcement of the “duty of care” provision to the Federal Trade Commission (FTC). The revisions, which are reflected in the current version of the bill in the Senate, led many LGBTQ+ groups to drop their opposition to KOSA.

If KOSA gains support in both chambers, what could this mean for covered platforms?

Note that KOSA would apply only to “covered platforms,” which are defined as an “online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” There are several exclusions to this definition, including common carriers, broadband internet services, email services, video conferencing services, wireless messaging services, nonprofits, schools, libraries, news websites, B2B software, and VPNs.

The duty of care provision could require platforms to rethink their approach to platform design, community guidelines, and content moderation policies, with the prevention of various harms to minors top of mind.

The safeguards for minors provision could require platforms to implement new tools or functionalities for users. For platforms looking to implement new safeguards, Safety by Design should be top of mind, and Red Teaming may help ensure that minors or bad actors cannot easily circumvent those safeguards.

The transparency requirement would require platforms to provide more transparency into their efforts to combat various harms, including online child sexual exploitation and abuse. Regardless of this particular bill's future, online platforms seeking to build trust with the public can consider proactively building a framework for transparency. Our partners at TechCoalition have developed such a framework, specific to online child sexual exploitation and abuse.
