1,285 enforcement actions from 14 federal and state jurisdictions. Every event traced back to its official government source.
1,285 Total Actions
14 Jurisdictions
$35.3B+ Total Fines Tracked
The Supreme Court upheld a Texas law requiring pornography websites to implement age-verification measures to protect children from explicit content. Attorney General Ken Paxton is enforcing the law with fines for violations and has sued Aylo Global Entertainment for non-compliance.
Florida Attorney General James Uthmeier filed a lawsuit against Snap, Inc., operator of Snapchat, for violating Florida’s HB3 child social media protection law and the Florida Deceptive and Unfair Trade Practices Act (FDUTPA). The suit alleges Snap knowingly allowed children under 13 to create accounts, failed to obtain parental consent for 14- and 15-year-old users, deployed addictive dark-pattern design features to children, and deceived parents about platform risks including predator access, drug sales, and harmful content. The legal action seeks to hold Snap accountable for noncompliance with Florida child safety and privacy laws.
The Connecticut Office of the Attorney General released an updated enforcement report on the Connecticut Data Privacy Act (CTDPA) for 2024, summarizing investigations into companies handling connected vehicles, genetic data, palm recognition, teen messaging apps, and facial recognition. The report outlines expanded enforcement priorities around opt-out practices and dark patterns, and includes legislative recommendations to strengthen the CTDPA.
The New Jersey Attorney General filed a lawsuit against Discord, Inc. for deceptive business practices under the Consumer Fraud Act. The suit alleges Discord misrepresented its Safe Direct Messaging and age-verification features, failing to protect children from harmful content and online predators.
Florida Attorney General James Uthmeier issued a subpoena to Roblox on April 16, 2025, as part of an investigation into the gaming platform’s child-protection policies and children’s data practices. The subpoena demands documents related to Roblox’s marketing to children, age-verification procedures, chat moderation, and processing of minors’ personal data, following reports of children being exposed to harmful content and predatory actors on the platform. No fines or remedies have been imposed yet, as the investigation is ongoing.
Connecticut Attorney General William Tong announced proposed legislation to protect minors from addictive social media features. The bill would prohibit exposing minors to harmful algorithms without parental consent, set default usage limits and notification restrictions, and require annual reporting by social media companies. This follows ongoing legal actions against Meta and TikTok for youth addiction concerns.
Texas Attorney General Ken Paxton defended House Bill 1181 at the U.S. Supreme Court, which requires online pornography sites to verify users' ages to protect children from harmful content. The law was challenged by pornography distributors, but Texas won at the Fifth Circuit and is now defending its constitutionality. Texas has also sued Aylo Global Entertainment for non-compliance, leading to Pornhub's shutdown in Texas.
Texas Attorney General Ken Paxton filed a lawsuit against TikTok for deceptively promoting its app as safe for children despite the prevalence of inappropriate and explicit content. The action alleges violations of the SCOPE Act, which protects children's online privacy, and follows a previous lawsuit regarding data privacy issues.
Texas Attorney General Ken Paxton launched investigations into Character.AI and 14 other companies, including Reddit, Instagram, and Discord, over potential violations of children’s privacy and safety laws. The investigations focus on compliance with the SCOPE Act and Texas Data Privacy and Security Act (TDPSA), which require parental consent for sharing minors’ data and mandate notice and consent requirements for children’s personal information. No fines or remedies have been imposed as the investigations are ongoing.
Texas Attorney General Ken Paxton announced investigations into 15 companies, including Character.AI, Reddit, Instagram, and Discord, for potential violations of the SCOPE Act and TDPSA concerning children's privacy. The investigations target practices such as unauthorized sharing of minors' personal data and failure to provide parental controls. This action is part of Texas's broader initiative to enforce data privacy laws.
New York Attorney General Letitia James and California Attorney General Rob Bonta led a bipartisan coalition of 14 attorneys general in filing lawsuits against TikTok on October 8, 2024, alleging the platform harmed children’s mental health through addictive features and violated COPPA by collecting and monetizing data from users under 13 without parental consent. The lawsuits seek to halt TikTok’s harmful practices, impose financial penalties including disgorgement of profits from illegal practices, and secure damages for affected users. TikTok is also accused of misrepresenting the effectiveness of its safety tools and failing to warn users about harms from dangerous viral challenges and beauty filters.
Texas Attorney General Ken Paxton filed a lawsuit against TikTok for violating the Securing Children Online through Parental Empowerment (SCOPE) Act by sharing minors’ personal identifying information without parental consent and failing to provide parents with tools to manage their children’s account privacy settings. The lawsuit seeks civil penalties of up to $10,000 per violation and injunctive relief to prevent future violations. TikTok is accused of prioritizing profit over the online safety and privacy of Texas children.
The FTC staff report examined data practices of nine major social media and video streaming companies and found they engaged in vast surveillance of users with lax privacy controls and inadequate safeguards for children and teens. The report recommends limiting data collection, restricting targeted advertising, and strengthening protections for young users, and calls for comprehensive federal privacy legislation.
The Federal Trade Commission filed an amicus brief in a lawsuit where parents sued IXL Learning for allegedly collecting and selling children's data without proper consent. The FTC argued that under COPPA, school district agreements to arbitration do not bind parents. The brief opposes IXL Learning's attempt to compel arbitration.
The FTC and DOJ sued TikTok and ByteDance for violating COPPA by collecting personal information from children under 13 without parental consent. The complaint alleges that TikTok knowingly allowed millions of children on its platform and failed to comply with a 2019 consent order. The lawsuit seeks civil penalties and a permanent injunction.
Texas Attorney General Ken Paxton announced a settlement with Multi Media, LLC, operator of Chaturbate, for violating Texas age-verification law HB 1181. The company agreed to implement an age-verification service on its website to prevent minors from accessing adult content. No monetary penalty was imposed in this settlement.
The FTC has proposed amendments to the COPPA Rule to enhance children's privacy protections. Key changes include requiring separate parental consent for targeted advertising, prohibiting conditioning access on data collection, limiting push notifications, strengthening data security and retention requirements, and restricting commercial use in educational technology. The proposal shifts responsibility from parents to companies to safeguard children's data.
The FTC proposed modifications to its 2020 privacy order with Meta, alleging violations including non-compliance with the order, misleading parents about Messenger Kids, and unauthorized data sharing. The proposed changes include banning monetization of youth data, pausing new product launches, and strengthening privacy requirements.
New Jersey is co-leading a multistate investigation into TikTok to determine if the platform violates consumer protection laws by using techniques that increase engagement among young users, potentially causing mental and physical harm. The investigation will examine what TikTok knows about these harms to children, teenagers, and young adults.
New Jersey is co-leading a nationwide investigation into whether Instagram and its parent company Meta Platforms, Inc. are violating state consumer protection laws by employing techniques that induce children, teenagers, and young adults to use the platform in potentially harmful ways. The bipartisan coalition of attorneys general is examining the potential mental and physical health harms resulting from extended engagement, including depression, anxiety, and body image issues.
A caseworker with the New Jersey Division of Child Protection and Permanency was charged with criminal offenses for allegedly accessing and disclosing confidential DCF database records without authorization. The charges include Computer Theft and Unlawful Access and Disclosure. The investigation was conducted by the New Jersey State Police.
The FTC removed Aristotle International, Inc. from its list of approved COPPA Safe Harbor programs due to insufficient monitoring of member companies' compliance with COPPA guidelines. This action prevents operators from using Aristotle's program for favorable regulatory treatment and marks the first such removal since COPPA's inception.
The FTC finalized a settlement with Miniclip, S.A. for falsely claiming it was a member of the CARU COPPA safe harbor program. Miniclip is prohibited from misrepresenting its participation in privacy programs and subject to compliance and recordkeeping requirements.
Unixiz, Inc. agreed to shut down its i-Dressup teen social website and pay $98,618 in civil penalties to settle allegations that it violated COPPA by collecting personal information from over 2,500 New Jersey children without parental consent and failed to safeguard user data, leading to a 2016 data breach affecting more than 24,000 New Jersey residents.
$99K
The New Jersey Attorney General settled with Dokogeo, the developer of the Dokobots app, for violating COPPA by collecting personal information from children without parental consent. The settlement requires Dokogeo to disclose its data practices, stop collecting children's data, delete existing children's data, and pay a suspended $25,000 penalty.
$25K
All data sourced from official government enforcement pages.