Don’t judge a book by its cover – how a technology is named doesn’t tell you how it is used. This is the case with Data Clean Rooms (“DCRs”), which are not rooms, do not clean data, and have complicated implications for user privacy, despite their squeaky-clean name.
Data Clean Rooms are cloud data processing services that let companies exchange and analyze data, constrained by rules that limit how the data can be used. They are typically used when two companies want to exchange limited information about their customers. For example, a newspaper and a grocery store might use a DCR to evaluate the efficacy of an advertisement by identifying grocery sales made to newspaper subscribers. However, a close examination of DCRs yields an evergreen lesson: privacy-enhancing technologies alone can’t protect privacy, and even if they address some privacy risks, they can contribute to others.
In some cases, DCRs can add privacy protections to the handling of consumer data. In others, disclosure of consumer data via DCRs presents the same privacy risks as disclosure through other means like tracking pixels. DCRs, like other technologies that claim to protect privacy, can also be used to obfuscate privacy harms.
Companies shouldn’t view DCRs as a way to get around their obligations under the law or the promises they have made to consumers. DCRs don’t automatically prevent impermissible disclosure or use of consumer data, and unlawful disclosure or use of data is unlawful regardless of whether a DCR is involved. The FTC remains vigilant in policing any unfair practices or deceptive claims about data collection, disclosure, sale, or use – regardless of the technologies employed.
---
By default, most services that provide DCRs are not privacy-preserving.
In common usage, a cleanroom is a space designed to be fully isolated from the area outside of it. While the term “Data Clean Room” might imply a highly controlled and sterilized environment, the function of DCRs isn’t to isolate data or ensure data quality. Instead, they are used to combine and analyze data from different companies and export a subset of records or a derivative analysis of that data.
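To make that concrete, consider the newspaper and grocery store from the introduction. The sketch below is purely illustrative – the data and names are invented, and real DCR services expose their own query interfaces – but it shows the basic shape of the operation: two parties’ records are combined inside the room, and only a derivative aggregate is exported.

```python
# Hypothetical illustration of a DCR-style analysis; the data and names are
# invented for this example, not any vendor's actual interface.
subscribers = {"alice@example.com", "bob@example.com"}  # newspaper's data
sales = [("bob@example.com", 42.50),                    # grocer's data
         ("carol@example.com", 13.20)]

# The two datasets are combined inside the clean room, but the output that
# leaves the room is a single derivative figure, not the matched records.
revenue_from_subscribers = sum(
    amount for email, amount in sales if email in subscribers
)
print(revenue_from_subscribers)  # 42.5
```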
What differentiates a DCR from a standard transfer of data between companies are the “constraints”: rules that limit the analysis of data in the clean room, as well as what can be exported out of the clean room. When constraints are appropriately designed, implemented, and monitored, a DCR can limit the use and disclosure of data about the people represented in the datasets.
However, these protections are not typically automatic. Instead, companies must intentionally configure and deploy each constraint for it to be effective. Additionally, DCR services often default to allow both parties full access to all of the data, which makes mistakes and misconfigurations costly. Since data transfers facilitated by DCRs are so contingent on the specifics of their constraints, the use of a DCR itself isn’t a reliable guarantor of privacy.
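As a concrete illustration of what such a constraint might look like, here is a minimal sketch; the function, column names, and thresholds are hypothetical, not any real service’s configuration API. It enforces two common kinds of rules: blocking raw identifier columns from export and suppressing aggregate rows that describe too few people.

```python
# Hypothetical sketch of DCR-style export constraints; the names and
# thresholds here are illustrative, not any real service's API.

BLOCKED_COLUMNS = {"email", "phone", "device_id"}  # raw identifiers never leave
MIN_GROUP_SIZE = 50  # aggregate rows describing fewer people are suppressed

def check_export(rows, requested_columns):
    """Allow an export only if it satisfies the configured constraints."""
    identifiers = BLOCKED_COLUMNS & set(requested_columns)
    if identifiers:
        raise PermissionError(f"export of identifier columns blocked: {identifiers}")
    # Suppress any aggregate row small enough to single out an individual.
    return [row for row in rows if row["group_size"] >= MIN_GROUP_SIZE]
```

Both protections hold only because someone configured them: with BLOCKED_COLUMNS empty and MIN_GROUP_SIZE set to 1, the same function passes every row through untouched.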
DCRs facilitate the priorities of the companies that use them, which may not include privacy.
Ironically, the very features that can enable DCRs to protect privacy also make them excellent tools for selling data in ways that can jeopardize privacy. DCRs can provide a neutral environment for companies to compare user data and selectively purchase the precise subset that can augment the profiles they already possess. The use of a DCR limits the exchange to the information that was paid for, instead of requiring each company to share all of its data to find the matching records.
While selling data through a DCR protects each company’s business interests, it doesn’t always protect the privacy of the people represented in the data. Instead, DCRs can be used in ways that make it easier to identify and track people, both across the web and in the real world. Because they provide a pathway for information exchange between parties that don’t trust each other, DCRs can increase the volume of data disclosed and sold, a trend accelerated by bulk interfaces and industry standards. DCRs, like any technology, only protect privacy when companies choose to prioritize it – the technology itself isn’t inherently protective.
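To make the matching step concrete: many identity-matching workflows join on hashed email addresses. The sketch below is a minimal illustration under that assumption – the normalization and hashing scheme, and the data, are hypothetical – and it shows why hashing an identifier doesn’t anonymize it: the same email always yields the same key, so both parties learn exactly which people they share.

```python
import hashlib

def match_keys(emails):
    """Build match keys the way many identity-matching workflows do:
    normalize, then hash. Hashing does not anonymize here - the same
    email always yields the same key, so records still link to a person."""
    return {hashlib.sha256(e.strip().lower().encode()).hexdigest()
            for e in emails}

newspaper = match_keys(["alice@example.com", "bob@example.com"])
grocer = match_keys(["bob@example.com", "carol@example.com"])

shared_customers = newspaper & grocer  # both parties can now identify
print(len(shared_customers))           # the people they have in common
```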
DCRs can add more places where data is accessible, increasing the risk of leaks and breaches.
As they enable more companies to exchange more data, DCRs can also add new avenues for data leaks and breaches. Giving another system access to a dataset expands the perimeter that needs to be defended against attack and error. DCR services (which are often offered as part of larger data warehousing services) can present a new site of data leaks and thefts – whether through misconfiguration, attack, or lapses in security practices. As an example, a DCR service that does not require its clients to use two-factor authentication when enabling access to their data would create the opportunity for a simple phishing or credential stuffing attack to compromise the privacy of millions of people.
In this ever-changing technical landscape, the Federal Trade Commission will continue to safeguard consumers against unfair and deceptive disclosures and uses of their data.
The FTC has brought numerous cases alleging violations of the FTC Act for sharing consumers’ data without authorization or using it in ways consumers would not expect. For instance, the FTC has brought cases against digital health platforms including BetterHelp and GoodRx for allegedly disclosing sensitive health information to advertisers without consumer knowledge. The agency also alleged that data aggregator InMarket combined geolocation data with other attributes to draw sensitive inferences for targeted advertising and that data broker X-Mode sold precise geolocation data that could identify consumers at sensitive locations.
Liability for violations of the FTC Act isn’t magically mitigated by clever technology. Simply using a DCR to disclose consumer data won’t help a company avoid liability if the disclosure or use would violate the FTC Act when carried out by any other method.
The FTC also has a history of vigilance against the use of inaccurate technological claims to obfuscate privacy harms. In 2014, the FTC reached a settlement with Snapchat, alleging in part that it had misrepresented the privacy of its disappearing messages feature. In 2020, the FTC reached a settlement with Zoom, including allegations that Zoom had overstated the degree of security its encryption provided to users. As these examples show, the Federal Trade Commission holds companies accountable for their privacy claims and isn’t distracted by the technologies employed.
DCRs, like any technology, are not silver bullets for privacy and don’t change a company’s obligations to consumers to safeguard their data and faithfully disclose its collection, use, transfer, and sale.
***
Thank you to staff from across the agency who collaborated on this post: Grady Ward, Michael Sherling, Simon Fondrie-Teitler, Amritha Jayanti, Stephanie Nguyen, Leah Frazier, Mark Suter, Ben Wiseman.