1,285 enforcement actions from 14 federal and state jurisdictions. Every event is traced back to its official government source.
Total Actions: 1,285
Jurisdictions: 14
Total Fines Tracked: $35.3B+
Connecticut’s legislature passed House Bill 5312, creating new civil enforcement mechanisms for deepfake digital sexual assault, including unauthorized dissemination of synthetically created intimate images and AI-generated child pornography. The bill establishes a private right of action for victims and empowers the Connecticut Attorney General to pursue civil injunctions and penalties against abusers and platforms hosting illegal content. This builds on prior Connecticut laws criminalizing unauthorized intimate image dissemination.
Connecticut Attorney General William Tong issued a statement on May 1, 2026, announcing the final passage of bipartisan legislation targeting youth social media addiction and artificial intelligence harms. The legislation imposes new obligations on social media companies regarding minor account settings, parental consent, and reporting, as well as requirements for AI chatbot operators and employers using automated decision tools. The statement also references ongoing enforcement actions against Meta and TikTok for allegedly designing addictive platform features for youth.
Florida Attorney General James Uthmeier opened a civil investigation into Discord and issued a subpoena demanding documents related to its marketing to children, age-verification processes, content moderation, parental controls, and reporting of child exploitative activity. The investigation alleges potential violations of Florida’s Deceptive and Unfair Trade Practices Act, citing the platform’s widespread use by child predators to target minors. Discord must produce records on its child safety practices, minor user data, and complaint handling related to child exploitation.
Virginia Attorney General Jay Jones announced intent to enforce new provisions of the Virginia Consumer Data Protection Act that limit minors' social media usage to one hour per day without parental consent. The law, effective January 1, 2026, requires age verification and verifiable parental consent to change time limits, with potential penalties up to $7,500 per violation and injunctive relief. This follows a motion to dismiss a lawsuit by NetChoice challenging the law.
Florida Attorney General James Uthmeier filed a lawsuit against Snap, Inc., operator of Snapchat, for violating Florida’s HB3 child social media protection law and the Florida Deceptive and Unfair Trade Practices Act (FDUTPA). The suit alleges Snap knowingly allowed children under 13 to create accounts, failed to obtain parental consent for 14-15 year old users, deployed addictive dark pattern design features to children, and deceived parents about platform risks including predator access, drug sales, and harmful content. The legal action seeks to hold Snap accountable for noncompliance with Florida child safety and privacy laws.
The New Jersey Attorney General filed a lawsuit against Discord, Inc. for deceptive business practices under the Consumer Fraud Act. The suit alleges Discord misrepresented its Safe Direct Messaging and age-verification features, failing to protect children from exposure to sexual and violent content and from contact with predatory adults on the platform.
New York Attorney General Letitia James settled with Saturn Technologies, developer of the Saturn social networking app for high school students, over failures to protect young users’ privacy. The Office of the Attorney General found the company disabled required email verification for thousands of schools, used inadequate age and identity checks, retained user contact data after access was revoked, and failed to maintain proper privacy records. Saturn will pay $650,000 in penalties and implement enhanced privacy protections for minor users, including mandatory bi-annual privacy setting reviews and data deletion requirements.
Connecticut Attorney General William Tong announced proposed legislation to protect minors from addictive social media features. The bill would prohibit exposing minors to harmful algorithms without parental consent, set default usage limits and notification restrictions, and require annual reporting by social media companies. This follows ongoing legal actions against Meta and TikTok for youth addiction concerns.
Texas Attorney General Ken Paxton filed a lawsuit against TikTok for deceptively promoting its app as safe for children despite the prevalence of inappropriate and explicit content. The action alleges violations of the SCOPE Act, which protects children's online privacy, and follows a previous lawsuit regarding data privacy issues.
Texas Attorney General Ken Paxton launched investigations into Character.AI and 14 other companies, including Reddit, Instagram, and Discord, over potential violations of children’s privacy and safety laws. The investigations focus on compliance with the SCOPE Act and Texas Data Privacy and Security Act (TDPSA), which require parental consent for sharing minors’ data and mandate notice and consent requirements for children’s personal information. No fines or remedies have been imposed as the investigations are ongoing.
New York Attorney General Letitia James and California Attorney General Rob Bonta led a bipartisan coalition of 14 attorneys general in filing lawsuits against TikTok on October 8, 2024, alleging the platform harmed children’s mental health through addictive features and violated COPPA by collecting and monetizing data from users under 13 without parental consent. The lawsuits seek to halt TikTok’s harmful practices, impose financial penalties including disgorgement of profits from illegal practices, and secure damages for affected users. TikTok is also accused of misrepresenting the effectiveness of its safety tools and failing to warn users about harms from dangerous viral challenges and beauty filters.
Texas Attorney General Ken Paxton filed a lawsuit against TikTok for violating the Securing Children Online through Parental Empowerment (SCOPE) Act by sharing minors’ personal identifying information without parental consent and failing to provide parents with tools to manage their children’s account privacy settings. The lawsuit seeks civil penalties of up to $10,000 per violation and injunctive relief to prevent future violations. TikTok is accused of prioritizing profit over the online safety and privacy of Texas children.
The FTC staff report examined data practices of nine major social media and video streaming companies and found they engaged in vast surveillance of users with lax privacy controls and inadequate safeguards for children and teens. The report recommends limiting data collection, restricting targeted advertising, and strengthening protections for young users, and calls for comprehensive federal privacy legislation.
The FTC and DOJ sued TikTok and ByteDance for violating COPPA by collecting personal information from children under 13 without parental consent. The complaint alleges that TikTok knowingly allowed millions of children on its platform and failed to comply with a 2019 consent order. The lawsuit seeks civil penalties and a permanent injunction.
New Jersey, leading a coalition of 41 other attorneys general, sued Meta for knowingly designing addictive Instagram and Facebook features targeting children and teens while falsely claiming the platforms were safe. The lawsuit alleges Meta collected personal data from users under 13 without parental consent, violating the federal Children's Online Privacy Protection Act (COPPA) and state consumer protection laws like the New Jersey Consumer Fraud Act.
A coalition of 42 attorneys general filed a federal lawsuit against Meta, alleging that the company designed addictive features that harm youth mental health and violated COPPA by collecting children's data without parental consent. The lawsuit seeks injunctive relief, monetary penalties, and restitution.
The FTC proposed modifications to its 2020 privacy order with Meta, alleging violations including non-compliance with the order, misleading parents about Messenger Kids, and unauthorized data sharing. The proposed changes include banning monetization of youth data, pausing new product launches, and strengthening privacy requirements.
New Jersey is co-leading a multistate investigation into TikTok to determine if the platform violates consumer protection laws by using techniques that increase engagement among young users, potentially causing mental and physical harm. The investigation will examine what TikTok knows about these harms to children, teenagers, and young adults.
New Jersey is co-leading a nationwide investigation into whether Instagram and its parent company Meta Platforms, Inc. are violating state consumer protection laws by employing techniques that induce children, teenagers, and young adults to use the platform in potentially harmful ways. The bipartisan coalition of attorneys general is examining the potential mental and physical health harms resulting from extended engagement, including depression, anxiety, and body image issues.
Unixiz, Inc. agreed to shut down its i-Dressup teen social website and pay $98,618 in civil penalties to settle allegations that it violated COPPA by collecting personal information from more than 2,500 New Jersey children without parental consent and failed to safeguard user data, a failure that led to a 2016 data breach affecting more than 24,000 New Jersey residents.
All data sourced from official government enforcement pages.