Worried about AI Surveillance? Obliterate the Third-Party Doctrine

The showdown between the Department of War (Defense) and Anthropic over a government contract reveals a lot about American society’s relationship with technology and its governing institutions.
For one, we’re dealing with the advent of innovative technology that consumers and businesses can deploy to improve their lives. But we’re also seeing that same technology being readily adopted by the largest and most consequential parts of the US government.
The limits of how that technology can and should be used are up for debate, and a smart society will set rules that ensure liberties are protected while giving consumers and innovators the freedom to choose for themselves.
Government power and bulk surveillance
When it comes to the government’s use of these technologies, especially in the arcane world of procurement and state contracts, we should admit that it is hard to draw larger lessons or make forthright assumptions. The AI ecosystem is evolving so quickly that we should be wary of locking in harmful rules by regulatory force.
But that’s complicated by government agencies using these tools more expansively than consumers would ever require. Many intelligent writers and commentators will handle this more deftly (check out Dean Ball’s essay “Clawed” at Lawfare for the specifics on the government’s “risky” vendor status for Anthropic’s products).
For consumer advocates such as myself who care about how technology can be accessed and used by the rest of us, the bigger question is more about how we remain free when these technologies are used by our own governments.
How about revising current law and jurisprudence to better address this?
As a simple fix, we need to overturn the third-party doctrine that enables much of what is in contention in this story. If the government could not route around constitutional protections by obtaining information from third parties, none of this would be an issue.
The Sin of Third-Party Doctrine
The Fourth Amendment plainly states “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause…”
Since at least the 1960s, court rulings have attempted to offer direction on how to interpret these protections as technologies and arrangements evolve. In Katz v. United States, the Court opened the door to protecting reasonable expectations of privacy, even in public places. In United States v. Miller and later Smith v. Maryland, the Supreme Court then created legal tests that demarcated the limits of Fourth Amendment protections when information is freely given to others, such as telephone companies or banks.
It was from these later rulings that the third-party doctrine was born, holding that once individuals shared information or data with third parties, they no longer had a reasonable expectation of privacy in it.
The particulars of these arguments are obviously more complicated and nuanced than addressed here (have at it, lawyers), but it stands that the loose interpretation of the third-party doctrine has been a boon to a government that no longer needs judicial warrants and legal process to access certain data of interest.
In a modern context, this means that law enforcement can rely on accessing cell phone tower data, our browsing history, our bank records, and perhaps even the logs of our conversations with AI chatbots because we freely give that up as consumers. No Fourth Amendment need apply.
The only reason we have concerns about the Department of War (Defense) gaining access to bulk data on Americans is because we have allowed the government to bypass centuries-old judicial warrant procedures by purchasing or accessing data from private companies. Whether it’s the NSA, your local police department, or DHS, the legal thresholds for accessing your private data are thinner than they’ve ever been.
We see much the same in the debates over FISA and Section 702, which enable government snooping on Americans and on which my colleague James Czerniawski has testified before Congress to urge reform.
The Gorsuch Path
It’s time to rethink the third-party doctrine so this question can be removed from public debate and rights restored to individuals. If we can settle whether our privacy protections apply when we entrust our property to third parties, then we can focus on the larger regulatory questions around AI.
As Ashley Baker has argued, we can look to one definitive Supreme Court opinion for guidance. The dissenting opinion of Associate Justice Neil Gorsuch in Carpenter v. United States is our golden arrow. He dissented from the majority, but only because he believed it applied the wrong legal foundation, one that would not go far enough in protecting privacy.
For Gorsuch, privacy protections do not suddenly disappear because property or data is held by another person. Rather, this is a “bailment,” or custody arrangement, in which another person holds your property for a specific purpose. Property rights still apply and must be respected, and Fourth Amendment protections do not become non-existent simply because someone else holds your things.
Presented with the right case and the right arguments, there is a chance that the current Supreme Court could obliterate the third-party doctrine once and for all. And we’d be better off for it.
This doesn’t mean that all of the issues brought to the debate on Anthropic vs. OpenAI vs. the Trump Administration would be settled. But it does mean that consumer privacy would be better protected. And in this time of partisan debate, rapidly evolving technology, and an ever-growing government, that’s a solid win.
Published at the Consumer Choice Center.