
Privacy Enforcement Tracker

1,285 enforcement actions from 14 federal and state jurisdictions. Every event traced back to its official government source.

1,285 Total Actions · 14 Jurisdictions · $35.3B+ Total Fines Tracked

Access this data programmatically: MCP Server · API Docs
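The tracker's API schema is not documented on this page, but every card below shares the same shape: a jurisdiction badge, an action type, a target, a summary, and a set of tags. A minimal sketch of filtering records of that shape locally, assuming a hypothetical `Action` record inferred from the cards (the sample entries mirror ones shown below; consult the MCP Server and API Docs for the real interface):

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    # Hypothetical record shape inferred from the tracker cards below.
    jurisdiction: str          # e.g. "TX", "FL", "FTC"
    action_type: str           # e.g. "Enforcement Action", "Investigation"
    target: str                # named company or category
    tags: list = field(default_factory=list)

# Sample entries mirroring cards shown on this page (not a live export).
actions = [
    Action("TX", "Enforcement Action", "TikTok",
           ["Children's Data", "Consent Failure"]),
    Action("FL", "Investigation", "Discord",
           ["Children's Data", "Consent Failure"]),
    Action("FTC", "Enforcement Action", "TikTok and ByteDance",
           ["Children's Data", "Consent Failure", "Notice Failure"]),
]

def filter_actions(records, jurisdiction=None, tag=None):
    """Return records matching an optional jurisdiction and/or tag."""
    out = []
    for r in records:
        if jurisdiction and r.jurisdiction != jurisdiction:
            continue
        if tag and tag not in r.tags:
            continue
        out.append(r)
    return out

texas = filter_actions(actions, jurisdiction="TX")
print([a.target for a in texas])  # -> ['TikTok']
```

The same two-filter pattern (jurisdiction plus tag) reproduces the faceted views the tracker itself exposes, e.g. all "Children's Data" actions across jurisdictions.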
CT · New Law

Bad actor platforms

Connecticut’s legislature passed House Bill 5312, creating new civil enforcement mechanisms for deepfake digital sexual assault, including unauthorized dissemination of synthetically created intimate images and AI-generated child pornography. The bill establishes a private right of action for victims and empowers the Connecticut Attorney General to pursue civil injunctions and penalties against abusers and platforms hosting illegal content. This builds on prior Connecticut laws criminalizing unauthorized intimate image dissemination.

Low · Consent Failure · Children's Data
CT · New Law

Social media companies

Connecticut Attorney General William Tong issued a statement on May 1, 2026, announcing the final passage of bipartisan legislation targeting youth social media addiction and artificial intelligence harms. The legislation imposes new obligations on social media companies regarding minor account settings, parental consent, and reporting, as well as requirements for AI chatbot operators and employers using automated decision tools. The statement also references ongoing enforcement actions against Meta and TikTok for allegedly designing addictive platform features for youth.

Low · Children's Data · AI/Automated Decisions · Consent Failure
FL · Investigation

Discord

Florida Attorney General James Uthmeier opened a civil investigation into Discord and issued a subpoena demanding documents related to its marketing to children, age-verification processes, content moderation, parental controls, and reporting of child exploitative activity. The investigation alleges potential violations of Florida’s Deceptive and Unfair Trade Practices Act, citing the platform’s widespread use by child predators to target minors. Discord must produce records on its child safety practices, minor user data, and complaint handling related to child exploitation.

Low · Children's Data · Consent Failure
VA · Enforcement Action

Social Media Platforms

Virginia Attorney General Jay Jones announced intent to enforce new provisions of the Virginia Consumer Data Protection Act that limit minors' social media usage to one hour per day without parental consent. The law, effective January 1, 2026, requires age verification and verifiable parental consent to change time limits, with potential penalties up to $7,500 per violation and injunctive relief. This follows a motion to dismiss a lawsuit filed by NetChoice challenging the law.

Low · Children's Data
FL · Enforcement Action

Snap, Inc.

Florida Attorney General James Uthmeier filed a lawsuit against Snap, Inc., operator of Snapchat, for violating Florida’s HB3 child social media protection law and the Florida Deceptive and Unfair Trade Practices Act (FDUTPA). The suit alleges Snap knowingly allowed children under 13 to create accounts, failed to obtain parental consent for 14-15 year old users, deployed addictive dark pattern design features to children, and deceived parents about platform risks including predator access, drug sales, and harmful content. The legal action seeks to hold Snap accountable for noncompliance with Florida child safety and privacy laws.

Low · Children's Data · Consent Failure · Notice Failure
NJ · Enforcement Action

Discord, Inc. (Discord)

The New Jersey Attorney General filed a lawsuit against Discord, Inc. for deceptive business practices under the Consumer Fraud Act. Discord allegedly misrepresented its Safe Direct Messaging and age verification features, failing to protect children from harmful content and predatory users on its platform.

Low · Children's Data · Security Failure
CT · New Law

Social Media Companies

Connecticut Attorney General William Tong announced proposed legislation to protect minors from addictive social media features. The bill would prohibit exposing minors to harmful algorithms without parental consent, set default usage limits and notification restrictions, and require annual reporting by social media companies. This follows ongoing legal actions against Meta and TikTok for youth addiction concerns.

Low · Children's Data · Dark Patterns
TX · Enforcement Action

TikTok

Texas Attorney General Ken Paxton filed a lawsuit against TikTok for deceptively promoting its app as safe for children despite the prevalence of inappropriate and explicit content. The action alleges violations of the SCOPE Act, which protects children's online privacy, and follows a previous lawsuit regarding data privacy issues.

Low · Children's Data
TX · Investigation

Character.AI, Reddit, Instagram, Discord, and 14 other companies

Texas Attorney General Ken Paxton launched investigations into Character.AI and 14 other companies, including Reddit, Instagram, and Discord, over potential violations of children’s privacy and safety laws. The investigations focus on compliance with the SCOPE Act and Texas Data Privacy and Security Act (TDPSA), which require parental consent for sharing minors’ data and mandate notice and consent requirements for children’s personal information. No fines or remedies have been imposed as the investigations are ongoing.

Low · Children's Data · Consent Failure · Notice Failure
NY · Enforcement Action · Multistate

TikTok

New York Attorney General Letitia James and California Attorney General Rob Bonta led a bipartisan coalition of 14 attorneys general in filing lawsuits against TikTok on October 8, 2024, alleging the platform harmed children’s mental health through addictive features and violated COPPA by collecting and monetizing data from users under 13 without parental consent. The lawsuits seek to halt TikTok’s harmful practices, impose financial penalties including disgorgement of profits from illegal practices, and secure damages for affected users. TikTok is also accused of misrepresenting the effectiveness of its safety tools and failing to warn users about harms from dangerous viral challenges and beauty filters.

Low · Children's Data · Consent Failure
TX · Enforcement Action

TikTok

Texas Attorney General Ken Paxton filed a lawsuit against TikTok for violating the Securing Children Online through Parental Empowerment (SCOPE) Act by sharing minors’ personal identifying information without parental consent and failing to provide parents with tools to manage their children’s account privacy settings. The lawsuit seeks civil penalties of up to $10,000 per violation and injunctive relief to prevent future violations. TikTok is accused of prioritizing profit over the online safety and privacy of Texas children.

Low · Children's Data · Consent Failure · Unauthorized Data Sharing
FTC · Guidance

Major Social Media and Video Streaming Companies (Amazon, Meta, YouTube, X, Snap, TikTok, Discord, Reddit, WhatsApp)

The FTC staff report examined data practices of nine major social media and video streaming companies and found they engaged in vast surveillance of users with lax privacy controls and inadequate safeguards for children and teens. The report recommends limiting data collection, restricting targeted advertising, and strengthening protections for young users, and calls for comprehensive federal privacy legislation.

Low · Children's Data · Opt-Out Failure · Unauthorized Data Sharing
FTC · Enforcement Action

TikTok and ByteDance (TikTok)

The FTC and DOJ sued TikTok and ByteDance for violating COPPA by collecting personal information from children under 13 without parental consent. The complaint alleges that TikTok knowingly allowed millions of children on its platform and failed to comply with a 2019 consent order. The lawsuit seeks civil penalties and a permanent injunction.

Low · Children's Data · Consent Failure · Notice Failure
FTC · Administrative Order

Meta

The FTC proposed modifications to its 2020 privacy order with Meta, alleging violations including non-compliance with the order, misleading parents about Messenger Kids, and unauthorized data sharing. The proposed changes include banning monetization of youth data, pausing new product launches, and strengthening privacy requirements.

Low · Children's Data · Consent Failure · Notice Failure
NJ · Investigation · Multistate

TikTok

New Jersey is co-leading a multistate investigation into TikTok to determine if the platform violates consumer protection laws by using techniques that increase engagement among young users, potentially causing mental and physical harm. The investigation will examine what TikTok knows about these harms to children, teenagers, and young adults.

Low · Children's Data
NJ · Investigation · Multistate

Meta Platforms, Inc. (Meta)

New Jersey is co-leading a nationwide investigation into whether Instagram and its parent company Meta Platforms, Inc. are violating state consumer protection laws by employing techniques that induce children, teenagers, and young adults to use the platform in potentially harmful ways. The bipartisan coalition of attorneys general is examining the potential mental and physical health harms resulting from extended engagement, including depression, anxiety, and body image issues.

Low · Children's Data
NJ · Consent Decree

Unixiz, Inc. (Unixiz)

Unixiz, Inc. agreed to shut down its i-Dressup teen social website and pay $98,618 in civil penalties to settle allegations that it violated COPPA by collecting personal information from over 2,500 New Jersey children without parental consent and failed to safeguard user data, leading to a 2016 data breach affecting more than 24,000 New Jersey residents.

Low · Children's Data · Security Failure

Fine: $99K
