FTC Scales Back Big Ambitions While Doubling Down on Kids, AI, and Enforcement

Last week’s summit on protecting kids online wasn’t your typical policy wonk gathering. Titled “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families,” the event was more than just a public airing of grievances; it was a tell by the administration. The Republican-led Federal Trade Commission (FTC), chaired by Andrew Ferguson, clearly signaled a sharp pivot away from Lina Khan’s FTC and toward one that blends a moral crusade with a recalibration of regulatory strategy.
The Kids Are the Catalyst
Kids are no longer just a sidebar; they’re the spark driving the FTC’s next wave of enforcement. The agency made it crystal clear: protecting children online is Priority Number One. The Commission’s focus goes well beyond basic screen-time concerns. It is also zeroing in on how AI interacts with minors, the risks of data collection, and the harm caused by addictive or deceptive algorithms. EdTech companies, in particular, will face intense scrutiny over how they handle student data, with calls for greater transparency and stronger parental consent mechanisms. Age verification is becoming table stakes, with courts and states pushing for businesses to know who’s behind the screen.
Enforcement Expected Against COPPA and Take It Down Act Violators
Forget waiting for sweeping new federal privacy laws: the FTC is wielding existing tools, and the laws behind them, with precision. Aggressive enforcement of the Children’s Online Privacy Protection Act (COPPA) is underway, including large settlements like the $20 million Cognosphere case. New COPPA rules also require parental opt-in consent before sharing kids’ data with third parties, ramping up transparency and control. The amended Rule is effective June 23, 2025, and enforcement will kick in shortly thereafter.
The Commission also plans to rely on the recently enacted Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (the “Take It Down Act”), which criminalizes the non-consensual sharing of intimate images and forces platforms to act fast on removals.
The FTC is also cracking down on deceptive apps that exploit minors, wielding its Section 5 authority to ban harmful practices. Age verification tech—think more IDs and selfies, and not just “click to confirm”—is gaining traction as a must-have for compliance.
EdTech providers are also expected to face additional fire, with enforcement aimed at ensuring they don’t monetize student data or evade parental consent. The FTC’s stance is clear: these companies can’t treat children’s data like a commodity anymore.
Significant Scaling Back from the Khan-Era Ambitions
The FTC, under Lina Khan, came in swinging, chasing sweeping rulemaking and big-picture reforms. But at the recent summit, the FTC telegraphed that this era was over. FTC Chair Andrew Ferguson stated:
"We must carefully limit regulation and enforcement to the noble goal of protecting children, and ensure that we do not squelch the entrepreneurial spirit that makes America great. But we cannot stand idly by and invoke innovation as a reason not to take steps to protect children. The whole point of innovation in a well-ordered society is to promote the flourishing of ordinary families in that society. We therefore cannot make our children’s futures just another tradeoff for technological innovation. That would defeat the purpose of promoting innovation in the first place."
The message was pretty clear: Ferguson’s FTC is prioritizing targeted, public enforcement actions, especially where kids and AI are involved, rather than pursuing more lofty, open-ended policy quests that risk stifling innovation. That means we’re more likely to see actions against deceptive UX designs, dark patterns, and AI misuse that targets children, as well as age gates for keeping kids out of the darker places on the internet. Conversely, the FTC will be considerably more hands-off when it comes to regulating other areas, like privacy, consumer protection, and overall AI threats and impacts. To sum it up: the FTC recognizes innovation’s role but insists that children’s safety won’t be negotiable.
What It Means for Businesses
If your tech, AdTech, EdTech, or AI products target kids or teens, listen up:
- It’s time to pay attention. Enforcement is in; vague guidance and signals are out.
- Expect enforcement. Companies should brace for investigations, penalties, and public scrutiny for missteps. And it’s not just the FTC on the move: State attorneys general are also becoming active players in this space, adding further layers of risk.
- Know your customer(’s age). If you are in the business of selling to kids, age verification needs to be top-of-mind and meaningful. Ditch those flimsy “click to confirm” mechanisms for stronger verification methods.
- AI use will be scrutinized. AI use involving minors will be under a microscope, with regulators demanding transparency and accountability. And when it comes to data misuse, the FTC is treating it like economic harm—and no longer just collateral damage.
The FTC’s new enforcement approach is fast, focused, and politically supported from both sides of the aisle. Kids’ privacy and protection are unquestionably bipartisan issues in 2025. If your business is aimed at children, get your data practices, AI policies, and age verification processes in order now. The spotlight on kid-focused online safety is only getting brighter.