1,285 enforcement actions from 14 federal and state jurisdictions, with every event traced back to its official government source.
1,285 Total Actions · 14 Jurisdictions · $35.3B+ Total Fines Tracked
Connecticut’s legislature passed House Bill 5312, creating new civil enforcement mechanisms for deepfake digital sexual assault, including unauthorized dissemination of synthetically created intimate images and AI-generated child pornography. The bill establishes a private right of action for victims and empowers the Connecticut Attorney General to pursue civil injunctions and penalties against abusers and platforms hosting illegal content. This builds on prior Connecticut laws criminalizing unauthorized intimate image dissemination.
Connecticut Attorney General William Tong issued a statement on May 1, 2026, announcing the final passage of bipartisan legislation targeting youth social media addiction and artificial intelligence harms. The legislation imposes new obligations on social media companies regarding minor account settings, parental consent, and reporting, as well as requirements for AI chatbot operators and employers using automated decision tools. The statement also references ongoing enforcement actions against Meta and TikTok for allegedly designing addictive platform features for youth.
The FTC settled with Humor Rainbow, Inc. (operator of OkCupid) and Match Group Americas over allegations that OkCupid deceived users by sharing personal data including photos and location information with an unauthorized third party, contrary to its privacy policy promises to inform users and provide opt-out opportunities. The settlement permanently prohibits the companies from misrepresenting their data collection, use, disclosure, and privacy control practices. No monetary penalty was imposed.
Florida Attorney General James Uthmeier opened a civil investigation into Discord and issued a subpoena demanding documents related to its marketing to children, age-verification processes, content moderation, parental controls, and reporting of child-exploitation activity. The investigation alleges potential violations of Florida's Deceptive and Unfair Trade Practices Act, citing the platform's widespread use by child predators to target minors. Discord must produce records on its child safety practices, minor user data, and complaint handling related to child exploitation.
Florida Attorney General James Uthmeier filed a lawsuit against Snap, Inc., operator of Snapchat, for violating Florida's HB3 child social media protection law and the Florida Deceptive and Unfair Trade Practices Act (FDUTPA). The suit alleges Snap knowingly allowed children under 13 to create accounts, failed to obtain parental consent for 14- and 15-year-old users, deployed addictive dark-pattern design features against children, and deceived parents about platform risks including predator access, drug sales, and harmful content. The legal action seeks to hold Snap accountable for noncompliance with Florida child safety and privacy laws.
New York Attorney General Letitia James settled with Saturn Technologies, developer of the Saturn social networking app for high school students, over failures to protect young users’ privacy. The Office of the Attorney General found the company disabled required email verification for thousands of schools, used inadequate age and identity checks, retained user contact data after access was revoked, and failed to maintain proper privacy records. Saturn will pay $650,000 in penalties and implement enhanced privacy protections for minor users, including mandatory bi-annual privacy setting reviews and data deletion requirements.
Texas Attorney General Ken Paxton launched investigations into Character.AI and 14 other companies, including Reddit, Instagram, and Discord, over potential violations of children’s privacy and safety laws. The investigations focus on compliance with the SCOPE Act and Texas Data Privacy and Security Act (TDPSA), which require parental consent for sharing minors’ data and mandate notice and consent requirements for children’s personal information. No fines or remedies have been imposed as the investigations are ongoing.
New York Attorney General Letitia James and California Attorney General Rob Bonta led a bipartisan coalition of 14 attorneys general in filing lawsuits against TikTok on October 8, 2024, alleging the platform harmed children’s mental health through addictive features and violated COPPA by collecting and monetizing data from users under 13 without parental consent. The lawsuits seek to halt TikTok’s harmful practices, impose financial penalties including disgorgement of profits from illegal practices, and secure damages for affected users. TikTok is also accused of misrepresenting the effectiveness of its safety tools and failing to warn users about harms from dangerous viral challenges and beauty filters.
Texas Attorney General Ken Paxton filed a lawsuit against TikTok for violating the Securing Children Online through Parental Empowerment (SCOPE) Act by sharing minors’ personal identifying information without parental consent and failing to provide parents with tools to manage their children’s account privacy settings. The lawsuit seeks civil penalties of up to $10,000 per violation and injunctive relief to prevent future violations. TikTok is accused of prioritizing profit over the online safety and privacy of Texas children.
The FTC staff report examined data practices of nine major social media and video streaming companies and found they engaged in vast surveillance of users with lax privacy controls and inadequate safeguards for children and teens. The report recommends limiting data collection, restricting targeted advertising, and strengthening protections for young users, and calls for comprehensive federal privacy legislation.
The FTC and DOJ sued TikTok and ByteDance for violating COPPA by collecting personal information from children under 13 without parental consent. The complaint alleges that TikTok knowingly allowed millions of children on its platform and failed to comply with a 2019 consent order. The lawsuit seeks civil penalties and a permanent injunction.
Meta captured facial recognition data from millions of Texans without consent, violating Texas biometric privacy laws. The company agreed to pay $1.4 billion over five years to settle the case. This is the largest privacy settlement obtained by a single state.
A coalition of 42 attorneys general filed a federal lawsuit against Meta, alleging that the company designed addictive features that harm youth mental health and violated COPPA by collecting children's data without parental consent. The lawsuit seeks injunctive relief, monetary penalties, and restitution.
New Jersey, leading a coalition of 41 other attorneys general, sued Meta for knowingly designing addictive Instagram and Facebook features targeting children and teens while falsely claiming the platforms were safe. The lawsuit alleges Meta collected personal data from users under 13 without parental consent, violating the federal Children's Online Privacy Protection Act (COPPA) and state consumer protection laws like the New Jersey Consumer Fraud Act.
The FTC proposed modifications to its 2020 privacy order with Meta, alleging violations including non-compliance with the order, misleading parents about Messenger Kids, and unauthorized data sharing. The proposed changes include banning monetization of youth data, pausing new product launches, and strengthening privacy requirements.
The FTC charged Facebook with deceiving consumers about its privacy practices and violating a 2012 consent order. In July 2019, Facebook agreed to pay a $5 billion civil penalty and accept comprehensive new privacy restrictions.
All data sourced from official government enforcement pages.