2016 | Original Paper | Book Chapter

A Precautionary Approach to Big Data Privacy

Authors: Arvind Narayanan, Joanna Huey, Edward W. Felten

Published in: Data Protection on the Move

Publisher: Springer Netherlands


Abstract

Once released to the public, data cannot be taken back. As time passes, data analytic techniques improve and additional datasets become public that can reveal information about the original data. It follows that released data will become increasingly vulnerable to re-identification unless methods with provable privacy properties are used for the data release. We review and draw lessons from the history of re-identification demonstrations; explain why the privacy risk of data protected by ad hoc de-identification is not just unknown, but unknowable; and contrast this situation with provable privacy techniques such as differential privacy. We then offer recommendations for practitioners and policymakers. Because ad hoc de-identification methods make the probability of a future privacy violation essentially unknowable, we argue for a weak version of the precautionary approach, in which the burden of proof falls on data releasers and policies incentivize them not to default to full, public releases of datasets protected only by ad hoc de-identification. We discuss the levers that policymakers can use to influence data access and the options for narrower releases of data. Finally, we present advice for six of the most common use cases for sharing data. Our thesis is that the problem of “what to do about re-identification” unravels once we stop looking for a one-size-fits-all solution; for each of the six cases, we offer a solution that is tailored, yet principled.


Footnotes
1
Though many of the examples are U.S.-centric, the policy recommendations have widespread applicability.
 
2
Executive Office of the President, President’s Council of Advisors on Science and Technology, Report to the President: Big Data and Privacy: A Technological Perspective (Washington, DC: 2014): 38–39.
 
3
Ed Felten, “Are pseudonyms ‘anonymous’?,” Tech@FTC, April 30, 2012, https://​techatftc.​wordpress.​com/​2012/​04/​30/​are-pseudonyms-anonymous/​.
 
4
Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,” UCLA Law Review 57 (2010): 1742–43, http://​uclalawreview.​org/​pdf/​57-6-3.​pdf.
 
5
Solon Barocas and Helen Nissenbaum, “Big Data’s End Run Around Anonymity and Consent,” in Privacy, Big Data, and the Public Good: Frameworks for Engagement, ed. Julia Lane, Victoria Stodden, Stefan Bender, and Helen Nissenbaum (New York: Cambridge University Press, 2014), 52–54.
 
6
Paul Lewis and Dominic Rushe, “Revealed: how Whisper app tracks ‘anonymous’ users,” The Guardian, October 16, 2014, http://​www.​theguardian.​com/​world/​2014/​oct/​16/​-sp-revealed-whisper-app-tracking-users.
 
7
Ibid. A poster self-identified as the CTO of Whisper reiterated this point: “We just don’t have any personally identifiable information. Not name, email, phone number, etc. I can’t tell you who a user is without them posting their actual personal information, and in that case, it would be a violation of our terms of service.” rubyrescue, October 17, 2014, comment on blackRust, “How Whisper app tracks ‘anonymous’ users,” Hacker News, October 17, 2014, https://​news.​ycombinator.​com/​item?​id=​8465482.
 
8
This is consistent with the database having a technical property called k-anonymity, with k = 10. Latanya Sweeney, “k-anonymity: A Model for Protecting Privacy,” International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 10, no. 5 (2001): 557–70. Examples like this show why k-anonymity does not guarantee privacy.
 
9
Heuristics such as l-diversity and t-closeness account for such privacy-violating inferences, but they nevertheless fall short of the provable privacy concept we discuss in the next section. Ashwin Machanavajjhala et al., “l-diversity: Privacy beyond k-anonymity,” ACM Transactions on Knowledge Discovery from Data (TKDD) 1, no. 1 (2007): 3; Ninghui Li, Tiancheng Li, and Suresh Venkatasubramanian, “t-closeness: Privacy beyond k-anonymity and l-diversity,” in IEEE 23rd International Conference on Data Engineering, 2007 (Piscataway, NJ: IEEE, 2007): 106–15.
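To make footnotes 8 and 9 concrete, the following is a minimal sketch (toy data and illustrative column names, not drawn from any cited study) of how one might measure k-anonymity over quasi-identifiers, and why a k-anonymous table can still disclose a sensitive attribute when a group is homogeneous, the gap that l-diversity targets.

```python
# Minimal sketch: k-anonymity over quasi-identifiers, and a homogeneity leak.
from collections import Counter, defaultdict

records = [
    # (zip_prefix, age_band, sex, diagnosis) -- toy, illustrative values
    ("081**", "20-29", "F", "flu"),
    ("081**", "20-29", "F", "flu"),
    ("081**", "20-29", "F", "flu"),
    ("085**", "30-39", "M", "flu"),
    ("085**", "30-39", "M", "HIV"),
    ("085**", "30-39", "M", "diabetes"),
]

QUASI_IDS = slice(0, 3)   # zip_prefix, age_band, sex
SENSITIVE = 3             # diagnosis

def k_anonymity(rows):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    return min(Counter(r[QUASI_IDS] for r in rows).values())

def homogeneous_groups(rows):
    """Quasi-identifier groups whose members all share one sensitive value;
    knowing someone is in such a group reveals that value outright."""
    groups = defaultdict(set)
    for r in rows:
        groups[r[QUASI_IDS]].add(r[SENSITIVE])
    return [g for g, vals in groups.items() if len(vals) == 1]

print("k =", k_anonymity(records))            # k = 3: every group has >= 3 rows
print("leaky groups:", homogeneous_groups(records))
# The ("081**", "20-29", "F") group is 3-anonymous yet tells us that every
# matching woman in the table has the flu.
```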
 
10
Latanya Sweeney, “Simple Demographics Often Identify People Uniquely” (Data Privacy Working Paper 3, Carnegie Mellon University, Pittsburgh, Pennsylvania, 2000), http://​dataprivacylab.​org/​projects/​identifiability/​paper1.​pdf.
 
11
Salvador Ochoa et al., “Reidentification of Individuals in Chicago’s Homicide Database: A Technical and Legal Study” (final project, 6.805 Ethics and Law on the Electronic Frontier, Massachusetts Institute of Technology, Cambridge, Massachusetts, May 5, 2001), http://​mike.​salib.​com/​writings/​classes/​6.​805/​reid.​pdf.
 
12
Latanya Sweeney, “Matching Known Patients to Health Records in Washington State Data” (White Paper 1089-1, Data Privacy Lab, Harvard University, Cambridge, Massachusetts, June 2013), http://​dataprivacylab.​org/​projects/​wa/​1089-1.​pdf.
 
13
Latanya Sweeney, Akua Abu, and Julia Winn, “Identifying Participants in the Personal Genome Project by Name” (White Paper 1021-1, Data Privacy Lab, Harvard University, Cambridge, Massachusetts, April 24, 2013), http://​dataprivacylab.​org/​projects/​pgp/​1021-1.​pdf. Sweeney and her team matched 22 % of participants based on voter data and 27 % based on a public records website.
 
14
Ben Adida, “Don’t Hash Secrets,” Benlog, June 19, 2008, http://​benlog.​com/​2008/​06/​19/​dont-hash-secrets/​; Ed Felten, “Does Hashing Make Data ‘Anonymous’?,” Tech@FTC, April 22, 2012, https://​techatftc.​wordpress.​com/​2012/​04/​22/​does-hashing-make-data-anonymous/​; Michael N. Gagnon, “Hashing IMEI numbers does not protect privacy,” Dasient Blog, July 26, 2011, http://​blog.​dasient.​com/​2011/​07/​hashing-imei-numbers-does-not-protect.​html.
 
15
Chris Whong, “FOILing NYC’s Taxi Trip Data,” March 18, 2014, http://​chriswhong.​com/​open-data/​foil_​nyc_​taxi/​.
 
16
Vijay Pandurangan, “On Taxis and Rainbows: Lessons from NYC’s improperly anonymized taxi logs,” Medium, June 21, 2014, https://​medium.​com/​@vijayp/​of-taxis-and-rainbows-f6bc289679a1.
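A minimal sketch of the failure described in footnotes 14 through 16: when the space of possible identifiers is small and the hash is unsalted, an attacker simply enumerates candidates. The four-character identifier format below is an assumption made for illustration, not the actual NYC medallion scheme.

```python
# Brute-forcing an unsalted hash of a low-entropy identifier.
import hashlib
from itertools import product
from string import ascii_uppercase, digits

def md5_hex(s):
    return hashlib.md5(s.encode()).hexdigest()

# Pretend this hash appeared as a "de-identified" ID in a public release.
leaked = md5_hex("7A23")

def crack(target_hash):
    """Enumerate the (assumed) digit-letter-digit-digit identifier space."""
    for parts in product(digits, ascii_uppercase, digits, digits):
        candidate = "".join(parts)
        if md5_hex(candidate) == target_hash:
            return candidate
    return None

print(crack(leaked))  # -> "7A23" after at most 26,000 guesses;
                      # the hash was a pseudonym, not anonymization
```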
 
17
Michael Barbaro and Tom Zeller, Jr., “A Face Is Exposed for AOL Searcher No. 4417749,” New York Times, August 9, 2006, http://​www.​nytimes.​com/​2006/​08/​09/​technology/​09aol.​html.
 
18
Ratan Dey, Yuan Ding, and Keith W. Ross, “The High-School Profiling Attack: How Online Privacy Laws Can Actually Increase Minors’ Risk” (paper presented at the 13th Privacy Enhancing Technologies Symposium, Bloomington, IN, July 12, 2013), https://​www.​petsymposium.​org/​2013/​papers/​dey-profiling.​pdf; Arvind Narayanan and Vitaly Shmatikov, “De-anonymizing Social Networks,” in Proceedings of the 2009 30th IEEE Symposium on Security and Privacy (Washington, D.C.: IEEE Computer Society, 2009): 173–87.
 
19
Melissa Gymrek et al., “Identifying Personal Genomes by Surname Inference,” Science 339, no. 6117 (January 2013): 321–24, doi:10.​1126/​science.​1229566.
 
20
Philippe Golle and Kurt Partridge, “On the Anonymity of Home/Work Location Pairs,” in Pervasive ’09 Proceedings of the 7th International Conference on Pervasive Computing (Berlin, Heidelberg: Springer-Verlag, 2009): 390–97, https://​crypto.​stanford.​edu/​~pgolle/​papers/​commute.​pdf.
 
21
Alessandro Acquisti, Ralph Gross, and Fred Stutzman, “Faces of Facebook: Privacy in the Age of Augmented Reality” (presentation at BlackHat Las Vegas, Nevada, August 4, 2011). More information can be found in the FAQ on Acquisti’s website: http://​www.​heinz.​cmu.​edu/​~acquisti/​face-recognition-study-FAQ/​.
 
22
“In the case of high-dimensional data, additional arrangements [beyond de-identification] may need to be pursued, such as making the data available to researchers only under tightly restricted legal agreements.” Ann Cavoukian and Daniel Castro, Big Data and Innovation, Setting the Record Straight: De-identification Does Work (Toronto, Ontario: Information and Privacy Commissioner, June 16, 2014): 3.
 
23
The median Facebook user has about a hundred friends. Johan Ugander, Brian Karrer, Lars Backstrom, and Cameron Marlow, “The anatomy of the Facebook social graph,” (arXiv Preprint, 2011): 3, http://​arxiv.​org/​pdf/​1111.​4503v1.​pdf.
 
24
There are roughly ten million single nucleotide polymorphisms (SNPs) in the human genome; SNPs are the most common type of human genetic variation. “What are single nucleotide polymorphisms (SNPs)?,” Genetics Home Reference: Your Guide to Understanding Genetic Conditions, published October 20, 2014, http://​ghr.​nlm.​nih.​gov/​handbook/​genomicresearch/​snp.
 
25
DHS Data Privacy and Integrity Advisory Committee FY 2005 Meeting Materials (June 15, 2005) (statement of Latanya Sweeney, Associate Professor of Computer Science, Technology and Policy and Director of the Data Privacy Laboratory, Carnegie Mellon University), http://www.dhs.gov/xlibrary/assets/privacy/privacy_advcom_06-2005_testimony_sweeney.pdf.
 
26
Anthony Tockar, “Riding with the Stars: Passenger Privacy in the NYC Taxicab Dataset,” Neustar: Research, September 15, 2014, http://​research.​neustar.​biz/​2014/​09/​15/​riding-with-the-stars-passenger-privacy-in-the-nyc-taxicab-dataset/​.
 
27
Ibid. Tockar goes on to explain how to apply differential privacy to this dataset.
 
28
James Siddle, “I Know Where You Were Last Summer: London’s public bike data is telling everyone where you’ve been,” The Variable Tree, April 10, 2014, http://​vartree.​blogspot.​com/​2014/​04/​i-know-where-you-were-last-summer.​html.
 
29
Arvind Narayanan and Vitaly Shmatikov, “Robust de-anonymization of large sparse datasets,” in Proceedings 2008 IEEE Symposium on Security and Privacy, Oakland, California, USA, May 18–21, 2008 (Los Alamitos, California: IEEE Computer Society, 2008): 111–25. The Netflix Prize dataset included movies and movie ratings for Netflix users.
 
30
Yves-Alexandre de Montjoye, et al., “Unique in the Crowd: The privacy bounds of human mobility,” Scientific Reports 3 (March 2013), doi:10.​1038/​srep01376.
 
31
Other studies have confirmed that pairs of home and work locations can be used as unique identifiers. Golle and Partridge, “On the anonymity of home/work location pairs;” Hui Zang and Jean Bolot, “Anonymization of location data does not work: A large-scale measurement study,” in Proceedings of the 17th International Conference on Mobile Computing and Networking (New York, New York: ACM, 2011): 145–156.
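The uniqueness measurements behind footnotes 30 and 31 can be illustrated with a small simulation; the population size, grid size, and uniform placement below are assumptions for illustration, not parameters from either study.

```python
# How often is a coarse (home cell, work cell) pair unique in a population?
import random
from collections import Counter

random.seed(0)
N_USERS, N_CELLS = 100_000, 2_000   # assumed toy population and grid size

# Each synthetic user gets a (home cell, work cell) pair, uniformly at random.
users = [(random.randrange(N_CELLS), random.randrange(N_CELLS))
         for _ in range(N_USERS)]

pair_counts = Counter(users)
unique = sum(1 for u in users if pair_counts[u] == 1)
print(f"{unique / N_USERS:.1%} of users are unique on (home, work) alone")
# With 2,000 x 2,000 possible pairs but only 100,000 users, most pairs occur
# once, so the pair acts as a fingerprint even though neither cell alone
# identifies anyone.
```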
 
32
A similar type of chaining in a different context can trace a user’s web browsing history. A network eavesdropper can link the majority of a user’s web page visits to the same pseudonymous ID, which can often be linked to a real-world identity. Steven Englehardt et al., “Cookies that give you away: Evaluating the surveillance implications of web tracking” (paper accepted at 24th International World Wide Web Conference, Florence, May 2015).
 
33
Sean Hooley and Latanya Sweeney, “Survey of Publicly Available State Health Databases” (White Paper 1075-1, Data Privacy Lab, Harvard University, Cambridge, Massachusetts, June 2013), http://​dataprivacylab.​org/​projects/​50states/​1075-1.​pdf.
 
34
“Thus, while [Sweeney’s re-identification of Governor Weld] speaks to the inadequacy of certain de-identification methods employed in 1996, to cite it as evidence against current de-identification standards is highly misleading. If anything, it should be cited as evidence for the improvement of de-identification techniques and methods insofar as such attacks are no longer feasible under today’s standards precisely because of this case.” Cavoukian and Castro, De-identification Does Work: 5.
“Established, published, and peer-reviewed evidence shows that following contemporary good practices for de-identification ensures that the risk of re-identification is very small. In that systematic review (which is the gold standard methodology for summarizing evidence on a given topic) we found that there were 14 known re-identification attacks. Two of those were conducted on data sets that were de-identified with methods that would be defensible (i.e., they followed existing standards). The success rate of the re-identification for these two was very small.” Khaled El Emam and Luk Arbuckle, “Why de-identification is a key solution for sharing data responsibly,” Future of Privacy Forum, July 24, 2014, http://​www.​futureofprivacy.​org/​2014/​07/​24/​de-identification-a-critical-debate/​.
 
35
Gary McGraw and John Viega, “Introduction to Software Security,” InformIT, November 2, 2001, http://​www.​informit.​com/​articles/​article.​aspx?​p=​23950&​seqNum=​7.
 
36
Anup K. Ghosh, Chuck Howell, and James A. Whittaker, “Building Software Securely from the Ground Up,” IEEE Software (January/February 2002): 14–16.
 
37
For example, the description for a 2012 conference notes that communication between researchers and practitioners is “currently perceived to be quite weak.” “Is Cryptographic Theory Practically Relevant?,” Isaac Newton Institute for Mathematical Sciences, http://​www.​newton.​ac.​uk/​event/​sasw07. In addition, “[m]odern crypto protocols are too complex to implement securely in software, at least without major leaps in developer know-how and engineering practices.” Arvind Narayanan, “What Happened to the Crypto Dream?, Part 2,” IEEE Security & Privacy 11, no. 3 (2013): 68–71.
 
38
El Emam and Arbuckle, “Why de-identification is a key solution.”
 
39
Philippe Golle, “Revisiting the Uniqueness of Simple Demographics in the US Population,” in Proceedings of the 5th ACM Workshop on Privacy in Electronic Society (New York, New York: ACM, 2006): 77–80.
 
40
Cavoukian and Castro, De-identification Does Work: 4.
 
41
See Sect. 2.1.
 
42
Khaled El Emam et al., “De-identification methods for open health data: the case of the Heritage Health Prize claims dataset,” Journal of Medical Internet Research 14, no. 1 (2012): e33, doi:10.​2196/​jmir.​2001.
 
43
Cavoukian and Castro, De-identification Does Work: 11.
 
44
El Emam et al., “Heritage Health”.
 
45
Arvind Narayanan, “An Adversarial Analysis of the Reidentifiability of the Heritage Health Prize Dataset” (unpublished manuscript, 2011).
 
46
The dataset “contains information on utilization, payment (allowed amount and Medicare payment), and submitted charges organized by National Provider Identifier (NPI), Healthcare Common Procedure Coding System (HCPCS) code, and place of service.” “Medicare Provider Utilization and Payment Data: Physician and Other Supplier,” Centers for Medicare & Medicaid Services, last modified April 23, 2014, http://​www.​cms.​gov/​Research-Statistics-Data-and-Systems/​Statistics-Trends-and-Reports/​Medicare-Provider-Charge-Data/​Physician-and-Other-Supplier.​html.
 
47
“Guidance Regarding Methods for De-identification of Protected Health Information in Accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule,” U.S. Department of Health & Human Services, http://​www.​hhs.​gov/​ocr/​privacy/​hipaa/​understanding/​coveredentities/​De-identification/​guidance.​html.
 
48
The following sources contain introductions to differential privacy. Cynthia Dwork et al., “Differential Privacy—A Primer for the Perplexed” (paper presented at the Joint UNECE/Eurostat work session on statistical data confidentiality, Tarragona, Spain, October 2011); Erica Klarreich, “Privacy by the Numbers: A New Approach to Safeguarding Data,” Quanta Magazine (December 10, 2012); Christine Task, “An Illustrated Primer in Differential Privacy,” XRDS 20, no. 1 (2013): 53–57.
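As a complement to the primers cited above, here is a minimal sketch of the Laplace mechanism for a counting query, one standard building block of differential privacy; the dataset and epsilon value are illustrative.

```python
# Laplace mechanism for a count query with sensitivity 1.
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverse-CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon):
    """Differentially private count: adding or removing one record changes a
    count by at most 1 (sensitivity 1), so Laplace(1/epsilon) noise suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 35, 41, 29, 62, 57, 44, 31]                 # illustrative data
print(dp_count(ages, lambda a: a >= 40, epsilon=0.5))   # noisy answer near 4
```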
 
49
Ohm, “Broken Promises of Privacy”: 1752–55.
 
50
Cass R. Sunstein, “The Paralyzing Principle,” Regulation 25, no. 4 (2002): 33–35.
 
51
Noah M. Sachs, “Rescuing the Strong Precautionary Principle from Its Critics,” Illinois Law Review 2011 no.4 (2011): 1313.
 
52
Alternatively, a data provider could show that the expected benefit outweighs the privacy cost of complete re-identification of the entire dataset. In other words, the data provider would need to show that there still would be a net benefit from releasing the data even if the names of all individuals involved were attached to their records in the dataset. This standard would be, in most cases, significantly more restrictive.
 
53
“OnTheMap,” U.S. Census Bureau, http://onthemap.ces.census.gov/; Klarreich, “Privacy by the Numbers.”
 
54
Úlfar Erlingsson, Vasyl Pihur, and Aleksandra Korolova, “RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response,” in Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security (Scottsdale, Arizona: ACM, 2014): 1054–67.
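RAPPOR builds on randomized response; the following is a minimal sketch of the basic randomized-response idea (not RAPPOR itself, which adds Bloom filters and two reporting rounds), with illustrative parameters: each individual report is deniable, yet the population rate remains estimable.

```python
# Basic randomized response and unbiased estimation of the population rate.
import random

P_KEEP = 0.75   # illustrative; real deployments tune this privacy/utility knob

def randomize(truth):
    """Report the true bit with probability P_KEEP, otherwise a fair coin flip."""
    return truth if random.random() < P_KEEP else random.random() < 0.5

def estimate_rate(reports):
    """Invert E[report] = P_KEEP * true_rate + (1 - P_KEEP) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - P_KEEP) * 0.5) / P_KEEP

truths = [random.random() < 0.3 for _ in range(100_000)]   # true rate: 30%
reports = [randomize(t) for t in truths]
print(round(estimate_rate(reports), 3))   # close to 0.30, yet any single
                                          # report is plausibly a coin flip
```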
 
55
Jakob Edler and Luke Georghiou, “Public procurement and innovation—Resurrecting the demand side,” Research Policy 36, no. 7 (September 2007): 949–63.
 
56
Charles Edquist and Jon Mikel Zabala-Iturriagagoitia, “Public Procurement for Innovation as mission-oriented innovation policy,” Research Policy 41, no. 10 (December 2012): 1757–69.
 
57
Elvira Uyarra and Kieron Flanagan, “Understanding the Innovation Impacts of Public Procurement,” European Planning Studies 18, no. 1 (2010): 123–43.
 
58
Solove, among others, has discussed how privacy is traditionally viewed as an individual right but also has social value. Daniel J. Solove, “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy,” San Diego Law Review 44 (2007): 760–64.
 
59
Information Commissioner’s Office, Big data and data protection (July 28, 2014): 5–6, 33–37.
 
60
The White House, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (Washington, D.C.: February 2012): 47.
 
61
Ibid.
 
62
Of course, simply providing information can be insufficient to protect users. It may not “be information that consumers can use, presented in a way they can use it,” and so it may be ignored or misunderstood. Lawrence Lessig, “Against Transparency,” New Republic, October 9, 2009. Alternatively, a user may be informed effectively but the barriers to opting out may be so high as to render the choice illusory. Janet Vertesi, “My Experiment Opting Out of Big Data Made Me Look Like a Criminal,” Time, May 1, 2014. Still, we believe that concise, clear descriptions of privacy protecting measures and re-identification risks can aid users in many circumstances and should be included in the options considered by policymakers.
 
63
For example, patients in clinical trials or with rare diseases might wish to have their data included for analysis, even if the risk of re-identification is high or if no privacy protecting measures are taken at all. Kerstin Forsberg, “De-identification and Informed Consent in Clinical Trials,” Linked Data for Enterprises, November 17, 2013, http://​kerfors.​blogspot.​com/​2013/​11/​de-identification-and-informed-consent.​html.
 
64
For example, the Network Advertising Initiative’s self-regulatory Code “provides disincentives to the use of PII for Interest-Based Advertising. As a result, NAI member companies generally use only information that is not PII for Interest Based Advertising and do not merge the non-PII they collect for Interest-Based Advertising with users’ PII.” “Understanding Online Advertising: Frequently Asked Questions,” Network Advertising Initiative, http://www.networkadvertising.org/faq.
 
65
Balachander Krishnamurthy and Craig E. Wills, “On the Leakage of Personally Identifiable Information Via Online Social Networks,” in Proceedings of the 2nd ACM Workshop on Online Social Networks (New York, New York: ACM, 2009): 7-12, http://​www2.​research.​att.​com/​~bala/​papers/​wosn09.​pdf.
 
66
Data aggregation replaces individual data elements by statistical summaries.
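A minimal illustration of such aggregation, with made-up values: only group-level summaries are released, and small groups still need suppression.

```python
# Replacing individual rows with per-group counts and means.
from collections import defaultdict
from statistics import mean

# (zip_code, income) rows -- made-up individual records, never released
individual_rows = [
    ("10025", 54_000), ("10025", 61_000), ("10025", 58_500),
    ("10027", 47_000), ("10027", 52_000),
    ("10032", 75_000),
]

groups = defaultdict(list)
for zip_code, income in individual_rows:
    groups[zip_code].append(income)

# The released table carries only counts and means per group.
released = {z: {"n": len(v), "mean_income": round(mean(v))}
            for z, v in groups.items()}
print(released)
# The "10032" cell has n = 1: an aggregate over one person is that person's
# value, so real releases suppress or merge small cells.
```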
 
67
Cynthia Dwork and Deirdre K. Mulligan, “It's not privacy, and it's not fair,” Stanford Law Review Online 66 (2013): 35.
 
68
Julia Angwin, “The web’s new gold mine: Your secrets,” Wall Street Journal, July 30, 2010.
 
69
Aniko Hannak et al., “Measuring Price Discrimination and Steering on E-commerce Web Sites,” in Proceedings of the 2014 Conference on Internet Measurement Conference (Vancouver: ACM, 2014): 305–318.
 
70
Solon Barocas and Andrew D. Selbst, “Big Data's Disparate Impact,” California Law Review 104 (forthcoming); Ryan Calo, “Digital Market Manipulation,” George Washington Law Review 82 (2014): 995.
 
71
Directive 95/46/EC, of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Art. 2(a), 1995 O.J. (C 93).
 
72
Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Art. 4(1)-(2) (January 25, 2012).
 
73
Ohm, “Broken Promises of Privacy”: 1704, 1738–41.
 
74
Pablo Barberá, “How Social Media Reduces Mass Political Polarization: Evidence from Germany, Spain, and the U.S.” (unpublished manuscript, October 18, 2014), https://files.nyu.edu/pba220/public/barbera-polarization-social-media.pdf; Amaney Jamal et al., “Anti-Americanism and Anti-Interventionism in Arabic Twitter Discourses” (unpublished manuscript, October 20, 2014), http://scholar.harvard.edu/files/dtingley/files/aatext.pdf; Margaret E. Roberts, “Fear or Friction? How Censorship Slows the Spread of Information in the Digital Age” (unpublished manuscript, September 26, 2014), http://scholar.harvard.edu/files/mroberts/files/fearfriction_1.pdf.
Computational social scientists can also generate their own self-reported data online. Matthew J. Salganik and Karen E.C. Levy, “Wiki surveys: Open and quantifiable social data collection” (unpublished manuscript, October 2, 2014), http://arxiv.org/abs/1202.0500.
 
75
Kevin J. Boudreau, Nicola Lacetera, and Karim R. Lakhani, “Incentives and Problem Uncertainty in Innovation Contests: An Empirical Analysis,” Management Science 57, no. 5 (2014): 843–63, doi: 10.​1287/​mnsc.​1110.​1322.
 
76
Ibid., 860–61.
 
77
Lars Bo Jeppesen and Karim R. Lakhani, “Marginality and Problem Solving Effectiveness in Broadcast Search,” Organization Science 21, no. 5 (2010): 1016–33.
 
78
Researchers already have developed methods for creating such synthetic data. Avrim Blum, Katrina Ligett, and Aaron Roth, “A Learning Theory Approach to Non-Interactive Database Privacy,” in Proceedings of the 40th ACM SIGACT Symposium on Theory of Computing (Victoria, British Columbia: ACM, 2008).
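For illustration only, the sketch below generates synthetic rows by sampling from per-column marginals of a toy table. This is far simpler than the cited learning-theoretic method and adds no formal privacy guarantee; it only shows the basic idea of releasing generated records instead of real ones.

```python
# Crude synthetic data: sample each column from its observed marginal.
import random
from collections import Counter

random.seed(0)

# Toy "real" records: (sex, age band, diagnosis). Never released directly.
real = [
    ("F", "20-29", "flu"), ("M", "30-39", "flu"),
    ("F", "20-29", "cold"), ("M", "40-49", "diabetes"),
]

def column_marginal(rows, i):
    """Values and counts observed in column i."""
    counts = Counter(r[i] for r in rows)
    values, weights = zip(*counts.items())
    return list(values), list(weights)

marginals = [column_marginal(real, i) for i in range(len(real[0]))]

def synthetic_row():
    """Sample each column independently from its marginal distribution."""
    return tuple(random.choices(vals, wts)[0] for vals, wts in marginals)

synthetic = [synthetic_row() for _ in range(1000)]
print(synthetic[:3])
# Independent sampling preserves one-way marginals only; cross-column
# correlations are lost, and no noise is added, so this is illustration,
# not a privacy mechanism.
```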
 
79
“If there are privacy concerns I can imagine ensuring we can share the data in a ‘walled garden’ within which other researchers, but not the public, will be able to access the data and verify results.” Victoria Stodden, “Data access going the way of journal article access? Insist on open data,” Victoria’s Blog, December 24, 2012, http://​blog.​stodden.​net/​2012/​12/​24/​data-access-going-the-way-of-journal-article-access/​.
 
80
Genomics researchers have proposed one such system. Bartha Maria Knoppers, et al., “Towards a data sharing Code of Conduct for international genomic research,” Genome Medicine 3 (2011): 46.
 
81
HCUP, SID/SASD/SEDD Application Kit (October 15, 2014), http://​www.​hcup-us.​ahrq.​gov/​db/​state/​SIDSASDSEDD_​Final.​pdf.
 
82
Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (Washington, DC: March 2012): 21, http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.
 
References
Acquisti, Alessandro, Ralph Gross, and Fred Stutzman. 2011. Faces of Facebook: Privacy in the age of augmented reality. Presentation at BlackHat, Las Vegas, Nevada, 4 Aug 2011.
Angwin, Julia. 2010. The web’s new gold mine: Your secrets. Wall Street Journal, 30 July 2010.
Barocas, Solon, and Andrew D. Selbst. Big data’s disparate impact. California Law Review 104 (forthcoming).
Barocas, Solon, and Helen Nissenbaum. 2014. Big data’s end run around anonymity and consent. In Privacy, big data, and the public good: Frameworks for engagement, ed. Julia Lane, Victoria Stodden, Stefan Bender, and Helen Nissenbaum, 44–75. New York: Cambridge University Press.
Blum, Avrim, Katrina Ligett, and Aaron Roth. 2008. A learning theory approach to non-interactive database privacy. In Proceedings of the 40th ACM SIGACT symposium on theory of computing. Victoria, British Columbia: ACM.
Boudreau, Kevin J., Nicola Lacetera, and Karim R. Lakhani. 2014. Incentives and problem uncertainty in innovation contests: An empirical analysis. Management Science 57(5): 843–863. doi:10.1287/mnsc.1110.1322.
Calo, Ryan. 2014. Digital market manipulation. George Washington Law Review 82: 995–1051.
Cavoukian, Ann, and Daniel Castro. 2014. Big data and innovation, setting the record straight: De-identification does work. Toronto, Ontario: Information and Privacy Commissioner.
de Montjoye, Yves-Alexandre, César A. Hidalgo, Michel Verleysen, and Vincent D. Blondel. 2013. Unique in the crowd: The privacy bounds of human mobility. Scientific Reports 3. doi:10.1038/srep01376.
Directive 95/46/EC, of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Art. 2(a), 1995 O.J. (C 93).
Dwork, Cynthia, Frank McSherry, Kobbi Nissim, and Adam Smith. 2011. Differential privacy—a primer for the perplexed. Paper presented at the Joint UNECE/Eurostat work session on statistical data confidentiality, Tarragona, Spain.
Dwork, Cynthia, and Deirdre K. Mulligan. 2013. It’s not privacy, and it’s not fair. Stanford Law Review Online 66: 35–40.
Edler, Jakob, and Luke Georghiou. 2007. Public procurement and innovation—Resurrecting the demand side. Research Policy 36(7): 949–963.
Edquist, Charles, and Jon Mikel Zabala-Iturriagagoitia. 2012. Public procurement for innovation as mission-oriented innovation policy. Research Policy 41(10): 1757–1769.
El Emam, Khaled, Luk Arbuckle, Gunes Koru, Benjamin Eze, Lisa Gaudette, Emilio Neri, Sean Rose, Jeremy Howard, and Jonathan Gluck. 2012. De-identification methods for open health data: The case of the Heritage Health Prize claims dataset. Journal of Medical Internet Research 14(1): e33. doi:10.2196/jmir.2001.
Englehardt, Steven, Dillon Reisman, Christian Eubank, Peter Zimmerman, Jonathan Mayer, Arvind Narayanan, and Edward W. Felten. 2015. Cookies that give you away: Evaluating the surveillance implications of web tracking. Paper accepted at the 24th International World Wide Web Conference, Florence.
Erlingsson, Úlfar, Vasyl Pihur, and Aleksandra Korolova. 2014. RAPPOR: Randomized aggregatable privacy-preserving ordinal response. In Proceedings of the 2014 ACM SIGSAC conference on computer and communications security, 1054–1067. Scottsdale, Arizona: ACM.
Executive Office of the President, President’s Council of Advisors on Science and Technology. 2014. Report to the President: Big data and privacy: A technological perspective. Washington, DC.
Ghosh, Anup K., Chuck Howell, and James A. Whittaker. 2002. Building software securely from the ground up. IEEE Software (January/February 2002): 14–16.
Golle, Philippe. 2006. Revisiting the uniqueness of simple demographics in the US population. In Proceedings of the 5th ACM workshop on privacy in electronic society, 77–80. New York: ACM.
Hannak, Aniko, Gary Soeller, David Lazer, Alan Mislove, and Christo Wilson. 2014. Measuring price discrimination and steering on e-commerce web sites. In Proceedings of the 2014 conference on internet measurement conference, 305–318. Vancouver: ACM.
Information Commissioner’s Office. 2014. Big data and data protection, 28 July 2014.
Jeppesen, Lars Bo, and Karim R. Lakhani. 2010. Marginality and problem solving effectiveness in broadcast search. Organization Science 21(5): 1016–1033.
Klarreich, Erica. 2012. Privacy by the numbers: A new approach to safeguarding data. Quanta Magazine, 10 Dec 2012.
Knoppers, Bartha Maria, Jennifer R. Harris, Anne Marie Tassé, Isabelle Budin-Ljøsne, Jane Kaye, Mylène Deschênes, and Ma’n H Zawati. 2011. Towards a data sharing code of conduct for international genomic research. Genome Medicine 3: 46.
Lessig, Lawrence. 2009. Against transparency. New Republic, 9 Oct 2009.
Li, Ninghui, Tiancheng Li, and Suresh Venkatasubramanian. 2007. t-closeness: Privacy beyond k-anonymity and l-diversity. In IEEE 23rd international conference on data engineering, 106–115. Piscataway, NJ: IEEE.
Machanavajjhala, Ashwin, Johannes Gehrke, Daniel Kifer, and Muthuramakrishnan Venkitasubramanian. 2007. l-diversity: Privacy beyond k-anonymity. ACM Transactions on Knowledge Discovery from Data (TKDD) 1(1): 3.
Narayanan, Arvind. 2011. An adversarial analysis of the reidentifiability of the Heritage Health Prize dataset. Unpublished manuscript.
Narayanan, Arvind, and Vitaly Shmatikov. 2009. De-anonymizing social networks. In Proceedings of the 2009 30th IEEE symposium on security and privacy, 173–187. Washington, D.C.: IEEE Computer Society.
Narayanan, Arvind, and Vitaly Shmatikov. 2008. Robust de-anonymization of large sparse datasets. In Proceedings of the 2008 IEEE symposium on security and privacy, Oakland, California, 18–21 May 2008, 111–125. Los Alamitos, California: IEEE Computer Society.
Narayanan, Arvind. 2013. What happened to the crypto dream? Part 2. IEEE Security and Privacy 11(3): 68–71.
Ochoa, Salvador, Jamie Rasmussen, Christine Robson, and Michael Salib. 2001. Reidentification of individuals in Chicago’s homicide database: A technical and legal study. Final project, 6.805 Ethics and Law on the Electronic Frontier, Massachusetts Institute of Technology, Cambridge, Massachusetts, 5 May 2001. http://mike.salib.com/writings/classes/6.805/reid.pdf.
Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Art. 4(1)-(2), 25 Jan 2012.
Sachs, Noah M. 2011. Rescuing the strong precautionary principle from its critics. Illinois Law Review 2011(4): 1313.
Solove, Daniel J. 2007. ‘I’ve got nothing to hide’ and other misunderstandings of privacy. San Diego Law Review 44: 760–764.
Sunstein, Cass R. 2002. The paralyzing principle. Regulation 25(4): 33–35.
Sweeney, Latanya. 2001. k-anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems 10(5): 557–570.
Task, Christine. 2013. An illustrated primer in differential privacy. XRDS 20(1): 53–57.
The White House. 2012. Consumer data privacy in a networked world: A framework for protecting privacy and promoting innovation in the global digital economy. Washington, D.C.
Uyarra, Elvira, and Kieron Flanagan. 2010. Understanding the innovation impacts of public procurement. European Planning Studies 18(1): 123–143.
Vertesi, Janet. 2014. My experiment opting out of big data made me look like a criminal. Time, 1 May 2014.
Zang, Hui, and Jean Bolot. 2011. Anonymization of location data does not work: A large-scale measurement study. In Proceedings of the 17th international conference on mobile computing and networking, 145–156. New York: ACM.
Metadata
Title
A Precautionary Approach to Big Data Privacy
Authors
Arvind Narayanan
Joanna Huey
Edward W. Felten
Copyright Year
2016
Publisher
Springer Netherlands
DOI
https://doi.org/10.1007/978-94-017-7376-8_13
