On November 13, 2007 the Miller Center for Public Affairs is hosting "Privacy vs. National Security," the second event in its National Discussion and Debate Series. I have been invited to argue in support of the above-titled proposition. I preview my argument here.
Read an opposing viewpoint by Marc Rotenberg here.
Maintaining privacy expectations that tolerate the planning or commission of catastrophic terrorist acts in secret by denying government access to available information that could help prevent such acts is inherently unreasonable and inconsistent with protecting liberty.
My argument is in three parts: first, potentially catastrophic terrorist attacks require preemptive security strategies; second, modern information technology alters basic assumptions on which many privacy expectations are premised; and, third, certain privacy expectations, especially for electronic data, do not reflect these altered circumstances and are unreasonable.
Privacy expectations are unreasonable when they are premised on simply prohibiting government from using available information. For example, they are unreasonable where they would preclude "connecting the dots" among disparate information sources for counterterrorism purposes by maintaining an artificial "wall" between law enforcement and foreign intelligence activities, or where they would arbitrarily make certain kinds of data - for example, library records - "off limits" for counterterrorism needs.
Privacy expectations are also unreasonable when they are premised exclusively on establishing independently-derived "probable cause" prior to monitoring any electronic records even in those circumstances where the data itself may be the first -- or only -- available evidence of suspicious behavior, and where it would be wholly consistent with the Fourth Amendment to investigate or monitor such behavior under alternative standards, such as "reasonable suspicion." For example, privacy expectations are unreasonable where they would preclude programmatic monitoring of suspected terrorist communication channels or the routine surveillance of explosives purchases.
To be clear, I am not arguing for a "rebalancing" of security interests over privacy interests but rather proposing the fundamental need to rethink how traditional principles, policies, and expectations can be applied under these changed circumstances. Indeed, the very metaphor of balancing is itself misleading because security and liberty are dual obligations - not dichotomous rivals to be traded one for the other - and there is no fulcrum at which point the correct amount of security and liberty can be achieved. As Thomas Powers has pointed out, they are a duality: "In a liberal republic, liberty presupposes security; [and] the point of security is liberty."
I. The Changing Nature of the Threat
Even the most strident civil libertarians concede that threats with the potential for catastrophic outcomes - for example, nuclear terrorism - require a preemptive approach in which terrorists are identified and stopped before they can act. Preemption requires anticipating and countering potential future events (that is, developing "actionable intelligence").
However, short of clairvoyance, future events can only be anticipated by examining current or past associations or behaviors. Since catastrophic-scale terrorist attacks will generally require communications and precursor behaviors likely to be observable or recorded in information systems, counterterrorism intelligence in part requires surveillance (observation) and analysis of data to help identify (anticipate) potentially catastrophic threats so that limited counterterrorism resources can be allocated more successfully.
Further, simple "line at the border" perimeter-based defenses are no longer sufficient against terrorists who hide among civilian populations and within normal migration flows to mask their own organization and activities.
Thus, preemptive security strategies necessary to prevent catastrophic attacks inherently involve some form of preventative surveillance and investigation, which will require tempering rigid privacy expectations that are based simply or arbitrarily on keeping information secret.
II. The Changing Nature of Information Availability
Modern information technology has changed fundamental assumptions about information availability and usefulness upon which certain privacy expectations were previously based.
Networked digital information systems and distributed search applications have brought an end to the "practical obscurity" that existed when information was difficult to find or access. Now, information routinely available for one purpose is more easily found and retrieved for other uses, like preventing terrorist attacks.
Further, technical means of information acquisition, storage, and processing are capital-intensive rather than labor- or physical-space-intensive; thus, the economic cost per unit of information has decreased and will continue to decrease. Therefore, whether or not government itself collects information directly, data evidencing associations or behaviors that previously was simply not recorded or stored at all will increasingly exist somewhere, and will be recoverable and subject to analysis where necessary.
In addition, powerful new technologies are being developed to help "make sense of [this] data" by automating certain analytic techniques (including link analysis and pattern recognition). Although critics tend to denigrate their potential, these technologies have already proven themselves in myriad analytic situations previously requiring sophisticated human judgments. These technologies, together with predictive or statistical modeling, are increasingly necessary to allocate limited counterterrorism resources.
Information that previously simply did not exist or was difficult to find, retrieve, or analyze is now increasingly available and useful. Where such information is needed to prevent catastrophic outcomes, government must have authorized access under appropriate procedures. Thus, privacy expectations premised on outdated assumptions that data was simply unavailable or incomprehensible will have to adapt -- that is, expectations previously based on the absolute protection afforded by inefficiencies in information availability or processing must now accommodate themselves to more diluted, procedure-based protections.
III. Unreasonable Expectations
Privacy means different things to different people, but much of the public debate seems to take place within an unexamined mythology of privacy that deifies absolute secrecy and allows no tolerance for even innocuous intrusions or inevitable errors.
Exhorted by a privacy lobby with an institutional fetish for insisting on absolute secrecy over procedural protections (evidenced, for example, by opposition to data retention practices), many privacy expectations are unrealistically inflated, based on a presumed privacy entitlement for electronic data that exceeds what real-world experience or constitutional requirements demand.
Further, technology is increasingly burdened with impossible - hence, unreasonable - expectations of proving effectiveness before development and of guaranteeing perfection prior to use. But, opposition to research on the basis that "it might not work" is absurdly shortsighted since the point of research is to determine efficacy. And, demanding perfection -- that is, brooking no possibility for any error -- before employing technical systems even where they would most certainly provide incremental improvement over existing or alternative methods is, to paraphrase Voltaire, making the perfect the enemy of the better.
The risk of catastrophic outcomes - including nuclear terrorism - requires the use of preemptive strategies and new technologies against certain threats. Developments in information technology make more information available and more useful. In appropriate circumstances and for legitimate purposes government must have access to the information necessary to prevent catastrophic attacks. To the extent that unrealistic or unreasonable privacy expectations clash with these needs, they will have to be lowered, if only because more flexible procedural protections must be accepted in place of the absolute protections previously afforded by the unavailability or incomprehensibility of data.
Join us online to participate in the debate.
K. A. (Kim) Taipale is the executive director of the Center for Advanced Studies in Science and Technology Policy and a senior fellow at the World Policy Institute.
The arguments set forth in this post, together with a discussion of how technology can help increase security while still protecting privacy interests, are discussed in greater detail in K. A. Taipale, "Technology, Security and Privacy: The Fear of Frankenstein, the Mythology of Privacy, and the Lessons of King Ludd" 7 Yale J. L. & Tech. 123; 9 Intl. J. Comm. L. & Pol'y 8 (Dec. 2004) available here.