
Kids Online Safety Act (KOSA)

Discussion in 'BBS Hangout: Debate & Discussion' started by Amiga, Feb 15, 2024.

  1. Amiga

    Amiga 10 years ago...
    Supporting Member

    Joined:
    Sep 18, 2008
    Messages:
    21,963
    Likes Received:
    18,709
    Important. Hopefully, the House doesn't block it.

    Bill Text

    This bill, called the Kids Online Safety Act, appears to establish new regulations for online platforms and social media companies regarding the safety and privacy of minors (children under age 17).

    Some key points:
    • Requires online platforms to exercise "reasonable care" to prevent harms to minors like mental health issues, addiction, bullying, exploitation, etc. Platforms must provide tools for minors and parents to control privacy settings, limit use, report issues, etc.
      • Platforms must try to prevent and mitigate certain harms to minors in the design and implementation of any features, including:
        • Mental health issues like anxiety, depression, and eating disorders
        • Addiction and compulsive usage behaviors
        • Physical violence, bullying, and harassment
        • Sexual exploitation and abuse
        • Promotion of drugs, tobacco, gambling, and alcohol
      • Platforms must provide minors with accessible options to:
        • Limit communications from other users
        • Prevent public access to their data
        • Limit design features that increase engagement (like infinite scroll and notifications)
        • Control personalized recommendation systems
        • Restrict sharing of their geolocation
        • Delete their account and personal data
        • Limit time spent on the platform
      • Platforms must enable parental controls over minors' accounts to manage settings, limit purchases, and monitor time spent. Parental tools must be enabled by default for young minors.
      • Platforms must provide an easy way for parents and others to report harms to minors, and must address reports promptly.

    • Requires transparency reports from large online platforms assessing risks to minors and the steps taken to mitigate those risks.
      • Large online platforms (over 10 million monthly active US users) that are primarily focused on user-generated content and discussion must issue annual transparency reports.
      • The reports must assess the risks of harm to minors on the platform and the steps taken to address those risks, including:
        • An estimate of the number and ages of minor users
        • The median time minors spend on the platform
        • Reports received related to different categories of harm (e.g., bullying, exploitation)
        • Use of design features that increase engagement and compulsive usage
        • How minors' personal data is collected and used
        • An evaluation of the effectiveness of safety features and parental controls
      • The reports must be based on independent third-party audits that examine the platform's algorithms, research, data, and other information relevant to child safety risks.
      • The transparency reports aim to provide enhanced public accountability for platforms' impacts on minors and their efforts to address those impacts.
      • The FTC can issue orders requiring select platforms to provide data to inform additional research on social media harms to minors.

    • Requires platforms that use "opaque algorithms" (those based on user data) to allow users to easily switch to a chronological feed, and prohibits differential pricing based on algorithm choice.
      • An opaque algorithm is one that selects and ranks content based on user-specific data that wasn't expressly provided for that purpose, such as past searches, browsing history, and location history.
      • Platforms must notify users that they use opaque algorithms and provide information about the types of user data they collect and use.
      • Platforms must offer users the ability to easily switch to an "input-transparent algorithm," which uses only data the user expressly provides, like search terms, filters, and preferences.
      • Platforms cannot charge users differently or otherwise discriminate based on their choice of algorithm.
      • These requirements aim to give users more control over how platforms use their data to curate content in non-transparent ways.
      • After a one-year transition period, platforms may use opaque algorithms only if they comply with these requirements.
      • Enforcement is handled by the Federal Trade Commission under its authority over unfair and deceptive practices.

    • Preempts state laws only where they directly conflict with the Act's provisions; otherwise allows states to enact stronger protections.
    • Gives enforcement authority to the Federal Trade Commission (FTC) and state attorneys general.
    In summary, the bill aims to increase child safety and privacy protections on social media and other online platforms, with requirements for safety features, transparency, and algorithm controls. Oversight and enforcement are handled by the FTC and state governments.

    https://www.washingtonpost.com/technology/2024/02/15/kids-online-safety-act-kosa-senate/

    With more than 60 backers, an updated Kids Online Safety Act finally has a path to passage in the Senate but faces uncertainty in the House.


    After months of negotiations, senators announced Thursday that a sprawling bill to expand protections for children online had secured more than 60 backers, clearing a path to passage for what would be the most significant congressional attempt in decades to regulate tech companies.

    The Kids Online Safety Act, or KOSA, first introduced in 2022, would impose sweeping new obligations on an array of digital platforms, including requiring that companies “exercise reasonable care” to prevent their products from endangering kids. The safeguards would extend to their use of design features that could exacerbate depression, sexual exploitation, bullying, harassment and other harms.

    If passed, it would become the first major consumer privacy or child online safety measure to clear a chamber of Congress in decades. Congress has failed to pass major new internet laws despite years-long attempts to rein in Silicon Valley giants.

    The proposal gained significant traction in Washington amid mounting bipartisan concern that social media platforms could deepen mental health issues among kids and teens and expose children to dangerous material online.

    The push gained major backers, including President Biden, who endorsed the bill in July, saying of KOSA: “Pass it. Pass it. Pass it.”

    The measure already had backing from nearly half the Senate, but senators announced new support from key lawmakers, including Sen. Ted Cruz (Tex.), the top Republican on the commerce committee, and Senate Majority Leader Charles E. Schumer (D-N.Y.), who ultimately decides whether and when bills get taken up on the floor. Senators made the proposal a significant focus during a recent high-profile hearing on child safety with the CEOs of Meta, TikTok and other tech firms.

    While senators have largely focused on advancing more stringent protections for children and teens online, House lawmakers have devoted their energy to attempting to pass a so-called comprehensive data privacy bill that would expand safeguards for all users, not just kids. The key House committee in 2022 cleared a landmark privacy bill, but the push has since stagnated.

    The impasse between House and Senate leaders has created a regulatory vacuum that state legislators have increasingly sought to fill by passing their own privacy and child safety bills. But states’ child safety efforts have faced numerous legal setbacks, with industry groups winning early legal challenges to halt laws that would otherwise impose stricter safety obligations on tech companies or require that parents sign off on their teens’ social media use.

    Efforts to broaden protections for children online gained momentum after Facebook whistleblower Frances Haugen disclosed internal research in 2021 showing that the company’s platforms at times worsened body image issues among some teenage girls. After the disclosures, Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) launched an investigation that included hearing from Haugen at a blockbuster session and ultimately led to the creation of KOSA.





     

