Open Access
Security and the Claim to Privacy
Author(s) -
Amoore, Louise
Publication year - 2014
Publication title -
International Political Sociology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.128
H-Index - 41
eISSN - 1749-5687
pISSN - 1749-5679
DOI - 10.1111/ips.12044
Subject(s) - united states national security agency , agency (philosophy) , computer security , order (exchange) , internet privacy , national security , terrorism , computer science , task (project management) , probabilistic logic , balance (ability) , law and economics , sociology , law , political science , business , economics , psychology , social science , management , finance , artificial intelligence , neuroscience
When US President Barack Obama publicly addressed the data mining and analysis activities of the National Security Agency (NSA), he appealed to a familiar sense of the weighing of the countervailing forces of security and privacy. “The people at the NSA don't have an interest in doing anything other than making sure that where we can prevent a terrorist attack, where we can get information ahead of time, we can carry out that critical task,” he stated. “Others may have different ideas,” he suggested, about the balance between “the information we can get” and the “encroachments on privacy” that might be incurred (Obama 2013). In many ways, conventional calculations of security weigh the probability and likelihood of a future threat on the basis of information gathered on a distribution of events in the past. Obama's sense of a trading-off of security and privacy shares this sense of a calculation of the tolerance for the gathering of data on past events in order to prevent threats in the future. In fact, though, the very NSA programs he is addressing precisely confound the weighing of probable threat, and the conventions of security and privacy that adhere to strict probabilistic reasoning. The contemporary mining and analysis of data for security purposes invites novel forms of inferential reasoning such that even the least probable elements can be incorporated and acted upon. I have elsewhere described these elements of possible associations, links, and threats as “data derivatives” (Amoore 2011) that are decoupled from underlying values and do not meaningfully belong to an identifiable subject. The analysis of data derivatives for security poses significant difficulties for the idea of a data subject with a recognizable body of rights to privacy, to liberty, and to justice.
