The Tigers at the Gate: Moving Privacy Forward Through Proactive Transparency
US privacy regulators don’t want a cyberpunk future, let alone a grimdark one. They are genuinely trying to understand new technologies and ways of doing business online. A little proactive transparency and courage to cooperate goes a long way.
In April, I attended the IAB Tech Lab’s conference, which focused on the perfect storm of regulatory and market pressures facing the adtech ecosystem. The theme was clear: adapt or be left behind in a privacy-forward operating environment. The urgency to navigate these pressures with integrity and transparency was echoed at the NAI Summit 2024. But with sobering caveats.
The first, expressed by Alan Chapell, NAI Board Chair, was for industry to have courage in the face of difficult decisions. Profile-based targeting -- dubbed ‘surveillance advertising’ by the Center for Democracy & Technology, among other advocacy groups -- may well need to go away unless done through a combination of privacy-enhancing technologies (PETs).
The second, offered by Gena Feist, Assistant AG of the State of New York, was for industry to take enforcers’ olive branches seriously. Her no-nonsense advice? “Don’t get cute with the regulators.”
The FTC, the CPPA and state AGs don’t want a cyberpunk future, let alone a grimdark one. They are genuinely trying to understand new technologies and ways of doing business online. Yet their jobs become unnecessarily harder when companies (or their counsel) play games. That’s when the teeth come out.
“We [AG staff] have law degrees. We can see what’s happening [when we get runarounds]” – Gena Feist, AAG, New York
Credibility Crash
US regulators are not only becoming more knowledgeable but also more proactive. They are reading privacy policies, scrutinizing data handling practices and looking to engage on frontier matters of policy.
Take Opt-Out Preference Signals (OOPS) like the Global Privacy Control (GPC), for instance. GPC is easy to set up and listen for because it is a simple HTTP header. Perhaps too simple, as it only conveys whether the choice mechanism is turned on or off (GPC=1 or GPC=0). Unlike IAB Europe’s more complex Transparency and Consent String (TC String), the GPC signal itself does not encode information about the source of the opt-out, or provide details about how the signaling mechanism was implemented or presented to users.
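To make that concrete, here is a minimal sketch of how a site or ad server might listen for the signal. It assumes a Node.js/Express stack (my choice of illustration, not anything discussed at the summit) and the Sec-GPC request header defined in the GPC proposal:

    // Minimal sketch: honoring GPC server-side, assuming Express.
    import express from "express";

    const app = express();

    app.use((req, res, next) => {
      // Per the GPC proposal, a participating browser sends "Sec-GPC: 1" when the user
      // has enabled the control; absence of the header conveys nothing either way.
      res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
      next();
    });

    app.get("/", (req, res) => {
      res.send(res.locals.gpcOptOut ? "Opt-out signal received and honored" : "No opt-out signal");
    });

    app.listen(3000);

The simplicity is the point, and the problem: nothing in that single header tells the recipient who set it, how it was presented, or whether it can be trusted.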
In discussing the “smorgasbord of different privacy signals proliferating in the marketplace”, Joshua Koran, Chief Product Officer at InMarket, and Don Marti, VP of Ecosystem Innovation at Raptive, described how unscrupulous actors could spoof GPC=0 (not opted out) signals to inflate ad revenues, and how advertisers and their buy-side partners would not be able to authenticate the signal (i.e. whether it was being adulterated by a made-for-advertising (MFA) or pirate site) without cross-industry tools and information sharing.
They also raised a dystopian twist on fraudulent ID bridging and email bombing, in which opt-out signals are weaponized against competitors. As sensational as this all sounds, it speaks to a genuine unease within industry about being required by law and policy to respect GPC and other such mechanisms while having few protections for businesses. For example, Colorado requires that universal opt-out mechanisms not be turned on by default, but has yet to offer standards for evaluating valid implementations.
Regulators need to understand that these scenarios can happen and be open to industry-led guardrails such as those built into the IAB Tech Lab’s Global Privacy Platform.
To borrow from the sessions with CPPA Executive Director Ashkan Soltani and FTC Privacy and Identity Protection Attorney Ronnie Solomon, at a minimum there needs to be a “feedback loop between industry and regulators.”
The Regulators Who Were Plugged In
On the subject of self-regulation, the issue of sensitive data management is perhaps one that the NAI and policymakers have been the most aligned on. In the aftermath of the Dobbs decision overturning Roe v. Wade, the FTC has been pushing toward an express affirmative consent standard for sensitive data. Some statehouses have shown a willingness to go further, with Maryland prohibiting the non-essential use and sale of sensitive characteristics, irrespective of user consent.
While definitional and operational differences remain, there is a converging view that the future of US privacy policy lies in European-style data self-restraint. Indeed, as CPPA Director Soltani framed it, data minimization is an “underlying fence” that state and federal policymakers are keen to lean further into.
To them, minimization is a virtuous forcing function: “No one ever wants to get rid of personal data.” But by creating rigor around “What data do you have? What’s in the systems?”, businesses will also put themselves in a better position to write meaningful privacy policies and respond to consumer privacy requests, table stakes that the NY, CT and NJ AG panelists said they will continue to coordinate actions over.
The sentiment was echoed by the FTC’s Solomon, who described how the agency was working hard to signal to industry what conduct was not OK, such as tying browsing data to sensitive locations like health clinics.
But even absent precise location (lat/long) or other sensitive characteristics, Solomon underscored, the FTC has been advancing the view that all browsing data is inherently sensitive.
“Firms do not have free license to market, sell, and monetize people’s information beyond purposes to provide their requested product or service… The Commission will use all of its tools to continue to protect Americans from abusive data practices and unlawful commercial surveillance.” – FTC, in its Avast and related enforcement actions
This is as much about how ubiquitously browsing data is collected, by basically everyone, as about the relative ease with which revealing insights, at times sensitive ones, can be inferred from it over time. Remember the story from 2012 where Target learned a teen was pregnant based on her shopping habits before she could tell her parents? Regulators do too, and they are looking to reshape the incentives driving unfettered data commoditization.
The Broke Company
But these issues aren’t so clear cut and place industry at a crossroads with the FTC.
For one, because browsing data (e.g. HTTP headers) is automatically transmitted through standard online interactions, including transactional ones among advertising services, its collection is effectively unavoidable (at least initially). And the industry already has codes and enhanced notice-and-consent standards for ‘sensitive’ data uses, as well as minimization standards for advertising to particularly vulnerable health audiences.
For another, the issue raises uncomfortable questions playing out in the EU today over the validity of ‘Consent or Pay’ business models. Was the user’s consent truly free? Did they really read or understand that long privacy notice? At a minimum, these issues require clearer classifications of data that is explicitly sensitive (i.e. by law) or implicated as ‘sensitive’ through its origin, context and intended use.
Regarding sensitive locations, as the Census Bureau knows, it isn’t so easy to map, let alone filter out, Sensitive Points of Interest. The ad industry has been following the NAI’s “prescient” location practice guidance, but challenging questions remain. How do you draw a polygon around a protest, exactly? And what if it’s a security firm protecting a clinic from said protestors? How much data is ‘enough’? Should exemptions apply? What if it’s a law enforcement agency doing the collecting?
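To see why, consider even a deliberately crude approach: a radius check against a list of sensitive points of interest. The sketch below is purely illustrative, with placeholder coordinates, an arbitrary 100-meter radius and no claim to reflect NAI guidance or any regulator’s expectations:

    // Hypothetical radius-based filter around sensitive points of interest (POIs).
    // The POI list, coordinates and radius are illustrative assumptions, not a standard.
    interface LatLng { lat: number; lng: number; }

    const SENSITIVE_POIS: LatLng[] = [
      { lat: 39.7392, lng: -104.9903 }, // placeholder coordinates for a hypothetical clinic
    ];

    const EARTH_RADIUS_M = 6_371_000;

    // Great-circle distance between two points (haversine formula), in meters.
    function haversineMeters(a: LatLng, b: LatLng): number {
      const toRad = (deg: number) => (deg * Math.PI) / 180;
      const dLat = toRad(b.lat - a.lat);
      const dLng = toRad(b.lng - a.lng);
      const h =
        Math.sin(dLat / 2) ** 2 +
        Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
      return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
    }

    // Drop any location ping that falls within the radius of a listed POI.
    function filterSensitivePings(pings: LatLng[], radiusM = 100): LatLng[] {
      return pings.filter(
        (ping) => !SENSITIVE_POIS.some((poi) => haversineMeters(ping, poi) < radiusM)
      );
    }

Even this toy version surfaces the questions above: who curates the POI list, how large a radius is ‘enough’, and what do you do about a protest, which has no fixed coordinates at all?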
To the extent blanket limitations on collection are impractical, there should be more robust dialogue about the role these and future standards can play in addressing the FTC’s valid concerns.
Rule Runners
In William Gibson’s masterpiece, Neuromancer, information is currency and subterfuge is an essential survival tool. We are certainly not there yet, and I hope we never will be. Policymakers don’t want that future either. Whether it’s through inquiries or press releases, they are “trying to be careful and thoughtful” about how they engage industry. If part of a regulator’s job is to inform businesses how they can do better, it is part of industry’s job to educate regulators about the gray spaces while addressing regulatory concerns in good faith.
A little proactive transparency and courage to cooperate goes a long way. “We all like to have clarity” about practices and limitations, offered Solomon. “It’s all about reasonableness.”