Published in:

2014 | OriginalPaper | Chapter

1. Privacy Versus Security… Are We Done Yet?

Author : Sophie Stalla-Bourdillon

Published in: Privacy vs. Security

Publisher: Springer London


Abstract

It is often assumed that privacy and security are alternative values that cannot be pursued together; hence the strength of the “nothing-to-hide” argument: if you have nothing to hide, you have nothing to fear. Besides, ensuring the security of the network itself is said to require a detailed analysis of network flows. Reasonable expectations of privacy should thus progressively disappear in cyberspace. While it is true that the enforcement of legal rules is a real challenge when communications are transmitted through a borderless network, the evolution of the case law of the European Court of Human Rights, recently followed by the Court of Justice of the European Union, does show that the right to respect for private life should have important implications online and, in particular, should significantly restrict the systematic collection and retention of content and traffic data by both public and private actors such as Internet service providers. At a time when data-gathering and data-matching technologies are more sophisticated than ever, as illustrated by Snowden’s revelations, it is crucial to fully comprehend the interaction between the protection of privacy and the furtherance of security in order to set appropriate limits to surveillance practices. The purpose of this chapter is therefore twofold: first, to shed light upon the European approach to privacy and explain the interplay between privacy law, data protection law and data retention law; second, to explain how the values of privacy and security should be balanced and, in particular, how privacy law should serve to scrutinise the appropriateness of measures implemented to ensure the security of the social group at large.


Footnotes
1
CCTV stands for closed-circuit television as opposed to broadcast television. CCTV cameras are set up to transmit signals to a restricted number of monitors.
 
2
Several authors have, however, shown that the security trend had begun before the 9/11 events. See, for example, Lyon [1], Seamon and Gardner [2] at 372, Schulhofer [3].
 
3
A data mining programme meant to track terrorists.
 
4
Not to mention Operation TIPS, standing for Terrorism Information and Prevention System, conceived as a domestic intelligence-gathering programme under which US workers with access to people’s homes would report suspicious activities. Operation TIPS was, however, abandoned after becoming known to the public.
 
5
The PRISM/Tempora scandal generated public outcry after former CIA technical assistant Edward Snowden disclosed secret US Government programmes, conducted by the US National Security Agency (NSA), aimed at monitoring online communications. The first document published by the Guardian was a secret court order allowing the NSA to collect the telephone records of millions of US customers of Verizon, a large US telecoms provider. It then became clear that other methods of data collection were being used, such as upstream collection by tapping into fibre-optic cables and downstream collection through access to the servers of US Internet companies such as Google, Facebook, Apple and Yahoo. Huge amounts of data are at stake, since much of the world’s communications pass through the USA. The Snowden documents also revealed the existence of Tempora, a programme set up by the UK Government Communications Headquarters (GCHQ) in 2011 to collect data relating to phone and Internet traffic, once again by tapping into fibre-optic cables. GCHQ and the NSA were described as collaborating. The Snowden documents also show that US and UK intelligence agencies have managed to weaken much of online encryption. The compatibility of these programmes with the European Convention on Human Rights has been challenged by Big Brother Watch, Open Rights Group and English PEN, together with the German Internet “hacktivist” and academic Constanze Kurz, who have decided to bring their case before the ECtHR. In the UK, Privacy International has filed two cases against GCHQ: the first in relation to the mass surveillance programmes Tempora, Prism and Upstream, and the second in relation to GCHQ’s use of computer intrusion capabilities and spyware. See, e.g. the statement of grounds submitted before the Investigatory Powers Tribunal by Privacy International on 8 July 2013, https://www.privacyinternational.org/sites/privacyinternational.org/files/downloads/press-releases/privacy_international_ipt_grounds.pdf. They have been followed by seven ISPs, who have also filed a complaint against GCHQ for the attacks allegedly run on their networks.
 
6
Schneier [5] (Schneier).
 
7
Schneier.
 
8
Schneier.
 
9
See, for example, in the UK the long list to be found in The Regulation of Investigatory Powers (Communications Data) Order 2010 No. 480, available at http://www.legislation.gov.uk/uksi/2010/480/made.
 
10
The Regulation of Investigatory Powers (Communications Data) Order 2010 No. 480.
 
11
For a history of private life, see the 5 volumes edited by Philippe Ariès and Georges Duby, Histoire de la vie privée, Seuil, Univers Historique.
 
12
Notably in France, for example, a few legislative provisions dealt with the protection of the secrecy of private life, but in a very limited way. See Loi Relative à la Presse, 11 Mai 1868, Article 11 («Toute publication dans un écrit périodique relative à un fait de la vie privée constitue une contravention punie d’une amende de cinq cent francs»: any publication in a periodical relating to a fact of private life constitutes an offence punishable by a fine of five hundred francs). Rivière, Codes Français et Lois Usuelles, App. Code Pén., p. 20.
 
13
See nevertheless the comments of François Rigaux, who recalls that in the UK, France and Germany several cases were already going in the direction advocated by Warren and Brandeis. François Rigaux [6].
 
14
Warren and Brandeis at 214–218.
 
15
Richards and Solove at 132 citing Warren and Brandeis at 211.
 
16
Warren and Brandeis at 218.
 
17
See The American Law Institute, Restatement of the Law, Second, Torts, §652; William Prosser [10], William Prosser et al. [11]. Note that US privacy law is a composite body of law which comprises privacy torts and the Fourth Amendment of the US Constitution. See the important ruling of the US Supreme Court in the case Riley v California 2014 WL 2864483 (U.S.Cal.) (it is now clear that a warrant is required for searches of cell phone data).
 
18
See, e.g. Campbell v MGN Ltd [2004] UKHL 22 at [43]; Wainwright v Home Office [2003] UKHL 53 at [35] (“I would reject the invitation to declare that since at the latest 1950 there has been a previously unknown tort of invasion of privacy” per Lord Hoffmann).
 
19
Loi n°70–643 du 17 juillet 1970—art. 22 JORF 19 juillet 1970.
 
20
Richards and Solove [8], at 176 (2007). See the US case U.S. West, Inc. v Federal Communications Commission, 182 F.3d 1224 (10th Cir. 1999) in which a telecommunications carrier criticized the privacy regulations of the Federal Communications Commission (“FCC”) limiting the use and disclosure of customers' personal information in the absence of customers’ consent. It was stated by the Court that: “A general level of discomfort from knowing that people can readily access information about us does not necessarily rise to the level of a substantial state interest, for it is not based on an identified harm. Our names, addresses, types of cars we own, and so on are not intimate facts about our existence, certainly not equivalent to our deeply held secrets or carefully guarded diary entries. In cyberspace, most of our relationships are more like business transactions than intimate interpersonal relationships”. In Smith v Maryland, 442 U.S. 735 (1979), it had been held earlier that installation and use of a pen register (to record the numbers dialled on a telephone by monitoring the electrical impulses caused when the dial on the telephone is released) by a telephone company at the request of the police does not constitute a “search” within the meaning of the Fourth Amendment to the US Constitution.
 
21
See, e.g. Solove [12], 497 ff (2006).
 
22
Formally the Convention for the Protection of Human Rights and Fundamental Freedoms, Rome, 4.XI.1950. The Convention has been amended by several protocols.
 
23
Text adopted by the General Assembly resolution 217 A (III) on 10 December 1948.
 
24
Other international instruments also include this right in their lists of human rights: the International Covenant on Civil and Political Rights (art. 17), entered into force in 1976; the American Convention on Human Rights (art. 11), entered into force in 1978; the New York Convention on the Rights of the Child (art. 16), entered into force in 1990.
 
25
Montenegro and Serbia ratified the Convention in 2003, and Monaco ratified in 2004.
 
26
Article 12 of the Universal Declaration of Human Rights is mainly concerned with illegitimate interference with private life. It reads as follows: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks”.
 
27
For quite some time, cases had first to be brought before the European Commission of Human Rights (established in 1954). If the case was deemed admissible, it would then proceed before the ECtHR. Initially, the European Commission of Human Rights was more a political body than a legal one. It was thus abolished by Protocol 11 (ETS No. 155), which came into force on 1 November 1998. The creation of a single Court was intended to prevent duplication of work, to avoid certain delays and to strengthen the judicial elements of the system. Other protocols have nonetheless followed, such as Protocol 14 (Strasbourg, 13.V.2004).
 
28
The Committee of Ministers of the Council of Europe, the Council of Europe’s decision-making body, monitors the execution of judgements, to make sure in particular that damages are actually awarded to winning applicants.
 
29
Note, however, that in some cases the enforcement mechanism of ECHR rights long remained ineffective. Take the example of the United Kingdom: for many years, British citizens could not rely upon ECHR rights before national courts. In this sense, the ECHR was not a full part of British law, and British citizens had to take a case to the ECtHR. Since the Human Rights Act came into force on 2 October 2000, ECHR rights can be asserted before national courts. More generally, see Sweet and Keller [14] and Sweet and Keller [15].
 
30
[2007] 3 W.L.R. 194 at [11].
 
31
See e.g. Moreham [16] (Moreham).
 
32
Moreham at 45.
 
33
This distinction has been used by authors such as François Rigaux, La vie privée, Une liberté parmi les autres, Larcier, Bruxelles, 1992, and Pierre Kayser, Protection de la vie privée, Economica, Paris, 2005 (Kayser). Interestingly, although Pierre Kayser acknowledges the usefulness of such a distinction right from the beginning, he decides to focus upon the protection of the secrecy of private life and not upon the protection of the liberty of private life. This is the reason he gives: «Nous n’envisagerons pas la protection de la liberté de la vie privée, qui soulève plus de difficultés encore parce que cette liberté consiste à la fois dans un principe et dans des libertés particulières s’exerçant dans cette partie de la vie, en particulier la liberté corporelle et la liberté de conscience» (we shall not consider the protection of the liberty of private life, which raises still more difficulties because this liberty consists both of a principle and of particular liberties exercised in this part of life, in particular bodily liberty and liberty of conscience). Kayser n°62. In other words, because what Kayser calls the liberty of private life is intimately connected with the liberty of conscience, its scope and components are more difficult to identify. Besides, this author questions the fact that both dimensions of private life have the same ground. Kayser n°1.
 
34
Application n°6825/74 (1976) 5 DR 86.
 
35
Niemietz v Germany (A/251-B) (1993) 16 E.H.R.R. 97 (Niemietz).
 
36
Niemietz, at [57].
 
37
Niemietz, at [30]. However, Member States’ leeway under Art. 8(2) “might well be more far-reaching where professional or business activities or premises were involved than would otherwise be the case”. Niemietz at [31]. See also Reiss v Austria (1995) 20 E.H.R.R. CD90.
 
38
Gillan and Quinton v United Kingdom (2010) 50 E.H.R.R. 45, at [61].
 
39
Warren and Brandeis.
 
40
See Campbell and Fell (1985) 7 E.H.R.R. 165; Boyle and Rice (1988) 10 E.H.R.R. 425; McCallum v United Kingdom (1991) 13 E.H.R.R. 597; Pfeifer and Plank v. Austria (1992) 14 E.H.R.R. 692; Peers v Greece (2001) 33 E.H.R.R. 51; AB v Netherlands (2003) 37 E.H.R.R. 48; Stojanovic v Serbia [2010] 1 Prison L.R. 286. See also Herczegfalvy v Austria (1993) 15 E.H.R.R. 437.
 
41
Silver v United Kingdom (A/161) 25 March 1983 (1983) 5 E.H.R.R. 347, at [86–104].
 
42
Van Kück v Germany (2003) 37 E.H.R.R. 51 at [70]. See also Mosley v United Kingdom (2011) 53 E.H.R.R. 30 at [106]–[108], Von Hannover v Germany (2005) 40 E.H.R.R. 1 at [57]; Stubbings v United Kingdom (1997) 23 E.H.R.R. 213 at [60], McGinley and Egan v United Kingdom (1998) 27 E.H.R.R. 1 at [98]; X and Y v The Netherlands (1986) 8 E.H.R.R. 235 at [23] and Airey v Ireland (1979) 2 E.H.R.R. 305 at [32].
 
43
Van Kück v Germany (2007) 37 E.H.R.R. 51 at [71]. See also Mosley v United Kingdom (2011) 53 E.H.R.R. 30 at [120], Karakó v Hungary (2011) 52 E.H.R.R. 36 at [19], Von Hannover (2005) 40 E.H.R.R. 1 at [57]; Rees v United Kingdom (1987) 9 E.H.R.R. 56 at [37]; Gaskin v United Kingdom (1989) 12 E.H.R.R. 36 at [42]; See also Hatton v United Kingdom (36022/97) (2003) 37 E.H.R.R. 28 at [119].
 
44
Moreham [16] at 46 (2008).
 
45
Goodwin (2002) 35 E.H.R.R. 18 at [90] and I (2003) 36 E.H.R.R. 53 at [70]. See also R (On the Application of Perdy) v DPP [2009] UKHL 45 at [71] and [82], R (on the application of G) v Nottinghamshire Healthcare NHS Trust at [94], Pretty v United Kingdom (2002) 35 E.H.R.R. 1 at [61] and Van Kück v United Kingdom (2007) 37 E.H.R.R. 51 at [69].
 
46
Pretty v United Kingdom (2002) 35 E.H.R.R. 1 at [65]. See also Wilkinson v Kitzinger [2006] EWHC 835 (Fam) at [11] and I v United Kingdom (2003) 36 E.H.R.R. 53 at [70].
 
47
YF v Turkey (2004) 39 E.H.R.R. 34 at [33]. “On October 20, 1993, following her detention in police custody, Ms F was examined by a doctor, who reported that there were no signs of ill-treatment on her body. On the same day she was taken to a gynaecologist for a further examination. The police requested that the report should indicate whether she had had vaginal or anal intercourse while in custody. Despite her refusal, Ms F was forced by the police officers to undergo a gynaecological examination. The police officers remained on the premises while Ms F was examined behind a curtain. The doctor reported that the applicant's wife had not had any sexual intercourse in the days preceding the examination” [12]. See also MAK v United Kingdom (2010) 51 E.H.R.R. 14 and Storck v Germany (2006) 43 E.H.R.R. 6.
 
48
Wainwright v United Kingdom (2007) 44 E.H.R.R. 40 at [47–48] (Wainwright). See also J Council v GU [2012] EWHC 3531 (COP).
 
49
Wainwright at [44].
 
50
House of Lords, Select Committee on the Constitution, Surveillance: Citizens and the State, February 2009, available at http://www.publications.parliament.uk/pa/ld200809/ldselect/ldconst/18/18.pdf, p. 12 (HL Surveillance).
 
51
HL Surveillance p. 12.
 
53
Compare D. Solove’s taxonomy of privacy with the case law of the ECtHR. Solove [12] (Solove).
 
54
Solove at 497. This author mentions the case of Nader v General Motors Corp. 225 N.E.2d 765 (N.Y. 1970) (in particular at 768–771), in which General Motors had wiretapped Ralph Nader’s phone and extensively monitored his behaviour while he was in public. Although the Court acknowledged that in principle surveillance in public places cannot amount to an invasion of privacy, in certain cases it may be “so ‘overzealous’ as to render it actionable”. That being said, surveillance was only actionable to the extent that concealed facts had been revealed as a result of pervasive public monitoring. As a result, surveillance in public places is not considered illegitimate per se. See also Solove [17, 18].
 
55
For a discussion about the strength of the secrecy paradigm in the UK see Stalla-Bourdillon, Sophie (2011) Privacy is dead, long live privacy! Breach of confidence and information privacy: towards a more progressive action for breach of confidence? In, Kierkegaard, Sylvia (ed.) Law Across Nations: Governance, Policy & Statutes, IAITL.
 
56
See in particular Peck v United Kingdom (2003) 36 E.H.R.R. 41 at [57]. See also PG and JH v United Kingdom (App. No.44787/98), judgment of September 25, 2001 at [56].
 
57
See, for example, Gillan and Quinton v United Kingdom (2010) 50 E.H.R.R. 45 at [61], Perry v United Kingdom (2004) 39 E.H.R.R. 3 at [36] and Von Hannover v Germany (2005) 40 E.H.R.R. 1 at [50].
 
58
Malone v United Kingdom (1985) 7 E.H.R.R. 14 at [19] (Malone).
 
59
(1979–80) 2 E.H.R.R. 214 (Klass).
 
60
(1990) 12 E.H.R.R. 547 (Kruslin).
 
61
Klass at [41]; Malone at [64]: “telephone conversations are covered by the notions of 'private life' and ‘correspondence’ within the meaning of Article 8”. See also Huvig v France (1990) 12 E.H.R.R. 528 at [25].
 
62
PG and JH v United Kingdom (2008) 46 E.H.R.R. 51 at [60]; Chalkley v United Kingdom (2003) 37 E.H.R.R. 30 at [24]; Lewis v United Kingdom (2004) 39 E.H.R.R. 9 at [18]; Elahi v United Kingdom (2007) 44 E.H.R.R. 30 at [17–18]; Hewitson v United Kingdom (2003) 37 E.H.R.R. 31 at [20]; Khan v United Kingdom (2001) 31 E.H.R.R. 45 at [25]; Armstrong v United Kingdom (2003) 36 E.H.R.R. 30 at [19]; Allan v United Kingdom (2003) 36 E.H.R.R. 12 at [35]; Wood v United Kingdom (App. No.23414/02), [2004] Po. L.R. 326 at [33].
 
63
Kruslin at [33].
 
64
A few exceptions to the right to respect of correspondence have been recognized by the ECtHR in particular in cases brought by prisoners. In Campbell v UK (1993) 15 E.H.R.R. 137 at [48]: “the prison authorities may open a letter from a lawyer to a prisoner when they have reasonable cause to believe that it contains an illicit enclosure which the normal means of detection have failed to disclose. The letter should, however, only be opened and should not be read. Suitable guarantees preventing the reading of the letter should be provided, e.g. opening the letter in the presence of the prisoner. The reading of a prisoner's mail to and from a lawyer, on the other hand, should only be permitted in exceptional circumstances when the authorities have reasonable cause to believe that the privilege is being abused in that the contents of the letter endanger prison security or the safety of others or are otherwise of a criminal nature. What may be regarded as ‘reasonable cause’ will depend on all the circumstances but it presupposes the existence of facts or information which would satisfy an objective observer that the privileged channel of communication was being abused”.
 
65
See, e.g. Malone at [67] ff; Kruslin at [27] ff.
 
66
(1983) 5 E.H.R.R. 347. The ECtHR ruled that “a law which confers a discretion must indicate the scope of that discretion” at [88].
 
67
In this case, the applicant, an antiques dealer, had been prosecuted for a number of offences relating to the dishonest handling of stolen goods but was then acquitted. During the trial, his telephone conversations had been intercepted and monitored on the authority of a warrant issued by the Secretary of State for the Home Department. After his acquittal, the applicant initiated proceedings and claimed that the tapping of his telephone had been done in violation of Article 8.
 
68
Malone at [67].
 
69
Malone at [68].
 
70
In Malone, there was no overall statutory code governing the interception of communications, although various statutory provisions were applicable thereto. That being said, the bulk of the provisions at stake were mere “administrative” provisions coming from a Government White Paper of 1980 entitled 'The Interception of Communications in Great Britain', presented to Parliament by the then Home Secretary in April 1980, Cmnd. 7873. Such administrative rules had no binding effect. Importantly, the law of England and Wales did not expressly require the issuance of a warrant for the purposes of exercising the power to intercept communications. As a result, “it cannot be said with any reasonable certainty what elements of the powers to intercept are incorporated in legal rules and what elements remain within the discretion of the executive”. Malone at [79].
 
71
Kruslin at [34].
 
72
Kruslin at [35].
 
73
But obviously not each individual belonging to this category.
 
74
Klass at [16].
 
75
Klass at [51].
 
76
Klass at [51].
 
77
Klass at [51].
 
78
Klass at [60].
 
79
The reason given to justify such a solution was that not informing individuals who had been under surveillance was necessary to ensure the efficacy of the interception. Klass at [58].
 
80
Klass at [56]. Judicial review had been replaced by a two-tiered system: first, an official qualified for judicial office would review the implementation of the surveillance measure and make sure only relevant information was kept; at a later stage, a Parliamentary Board and the G 10 Commission would oversee the whole process.
 
81
Klass at [55].
 
82
The reactive and minimalist approach taken by the UK legislator has, however, been heavily criticized. See e.g. B. Goold, Liberty and others v The United Kingdom: a new chance for another missed opportunity, 2009 Public Law 5.
 
83
(2011) 52 E.H.R.R. 4 (Kennedy).
 
84
Kennedy at [128].
 
85
Kennedy at [159].
 
86
The ECtHR relies upon a definition to be found in a 1986 Report: “[A]ctivities which threaten the safety or well-being of the State, and which are intended to undermine or overthrow Parliamentary democracy by political, industrial or violent means”. See Kennedy at [33].
 
87
See Sect. 81 of RIPA 2000.
 
88
Kennedy at [160]. See Sect. 8(1) which provides that “an interception warrant must name or describe either—(a) one person as the interception subject; or (b) a single set of premises as the premises in relation to which the interception to which the warrant relates is to take place”.
 
89
Kennedy at [161].
 
90
Kennedy at [14].
 
91
Kennedy at [155].
 
92
(2009) 48 E.H.R.R. 1 at [69–70] (Liberty).
 
93
(2008) 46 E.H.R.R. SE5.
 
94
See infra pp. 54–55.
 
95
Weber at [95].
 
96
It was not possible to use catchwords containing features allowing the interception of specific telecommunications and thereby expressly monitor specific individuals. Weber at [32].
 
97
Weber at [33–35].
 
98
The applicants were claiming that between 1990 and 1997 “all public telecommunications, including telephone, facsimile and email communications, carried on microwave radio between the two British Telecom radio stations (at Clwyd and Chester), a link which also carried much of Ireland's telecommunications traffic” had been intercepted. Liberty at [5].
 
99
Malone at [83].
 
100
Malone at [83].
 
101
Malone at [84].
 
102
Malone at [84].
 
103
Malone at [84].
 
104
Malone at [87].
 
105
(2007) 45 E.H.R.R. 37 (Copland).
 
106
Copland at [41].
 
107
Copland at [13].
 
108
Copland at [11].
 
109
Copland at [43].
 
110
Copland at [43].
 
111
Copland at [49]. The Regulation of Investigatory Powers Act 2000, and thereby the Telecommunications (Lawful Business Practice) Regulations 2000, came into force after the facts of the case.
 
112
Copland at [43].
 
113
Malone at [84].
 
114
See infra pp.
 
115
Copland at [54].
 
116
Copland at [42].
 
117
See e.g. Niemietz.
 
118
In most cases, users are not in practice able to freely express a choice in relation to the terms of online service providers’ privacy policies.
 
119
Köpke v Germany (2011) 53 E.H.R.R. SE26 at [37]. See also ECtHR Gillan and Quinton v United Kingdom Judgment 12 January 2010, at [61]; PG v United Kingdom (2008) 46 E.H.R.R. 51 (PG) at [57]; Perry v United Kingdom (2004) 39 E.H.R.R. 3 at [37]. But see Halford v United Kingdom (1997) 24 E.H.R.R. 523 at [45].
 
120
(2009) 48 E.H.R.R. 50 (S and Marper).
 
121
S and Marper, at [67].
 
122
(ETS No. 108). The Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was drawn up within the Council of Europe by a committee of governmental experts under the authority of the European Committee on Legal Co-operation (CDCJ) and was opened for signature by the member States of the Council of Europe on 28 January 1981 in Strasbourg.
 
123
S and Marper, at [71].
 
124
S and Marper, at [72].
 
125
S and Marper, at [104].
 
126
S and Marper, at [116–117].
 
127
S and Marper, at [117].
 
128
S and Marper, at [119]: “The material may be retained irrespective of the nature or gravity of the offence with which the individual was originally suspected or of the age of the suspected offender; fingerprints and samples may be taken—and retained—from a person of any age, arrested in connection with a recordable offence, which includes minor or non-imprisonable offences. The retention is not time limited; the material is retained indefinitely whatever the nature or seriousness of the offence of which the person was suspected. Moreover, there exist only limited possibilities for an acquitted individual to have the data removed from the nationwide database or the materials destroyed; in particular, there is no provision for independent review of the justification for the retention according to defined criteria, including such factors as the seriousness of the offence, previous arrests, the strength of the suspicion against the person and any other special circumstances”. Besides, the interests of minors were not specifically taken into account.
 
129
Von Hannover v Germany [2012] E.M.L.R. 16 at [96]. See also Reklos v Greece [2009] E.M.L.R. 16 at [40].
 
130
PG v United Kingdom (2008) 46 E.H.R.R. 51 (PG).
 
131
PG at [59–60]. Therefore, the interference was not “in accordance with the law” as required by para. 2 of Art. 8. PG at [63]. Compare with Friedl (1996) 21 E.H.R.R. 83 at [49–51], in which the Commission considered that the retention of anonymous photographs that had been taken at a public demonstration did not interfere with the right to respect for private life. In so deciding, it attached special weight to the fact that the photographs concerned had not been entered in a data-processing system and that the authorities had taken no steps to identify the persons photographed by means of data processing. In a more recent case, the mere collection of personal information was prohibited. Reklos v Greece [2009] E.M.L.R. 16. The applicant was, however, a child.
 
132
PG at [57].
 
133
Peck v United Kingdom [2003] E.M.L.R. 15 (Peck). See also Perry v United Kingdom (2004) 39 E.H.R.R. 3 (Perry) at [40–43], in which the ECtHR found that, because the footage of the applicant had been taken for use in a video identification procedure and, potentially, as evidence prejudicial to his defence at trial, the recording and the use of the footage amounted to a violation of Article 8(1). Besides, it was not possible to justify the violation on the ground of Article 8(2), for it was not done “in accordance with the law”. Perry at [49].
 
134
At night, the applicant had been walking towards a central junction in the centre of Brentwood with a kitchen knife in his hand and attempted suicide by cutting his wrists. He had stopped at the junction and leaned over a railing facing the traffic with the knife in his hand. He was unaware that a CCTV camera, mounted on the traffic island in front of the junction, was filming his movements. The CCTV footage later disclosed did show the applicant in possession of a knife; the cutting of the wrists was not part of the footage.
 
135
Peck at [67].
 
136
Peck at [79].
 
137
Peck at [80–84].
 
138
Peck at [85].
 
139
A public figure.
 
140
Von Hannover v Germany [2004] E.M.L.R. 21. Notably although only the publication of the photos seemed to be at stake, the applicant had alleged that “as soon as she left her house she was constantly hounded by paparazzi who followed her every daily movement, be it crossing the road, fetching her children from school, doing her shopping, out walking, practising sport or going on holiday”. Von Hannover at [44].
 
141
Köpke at [36].
 
142
Köpke at [46].
 
143
Köpke at [42].
 
144
Guaranteed by art. 1 of Protocol No. 1 to the Convention.
 
145
Köpke at [44].
 
146
Köpke at [50].
 
147
Köpke at [53].
 
148
[2009] EWCA Civ 414 (Wood).
 
149
Wood at [69].
 
150
Wood at [22] as per Laws LJ, dissenting in this case. This approach is, however, dangerous in so far as it amounts to lowering the requirement of legality when the interference appears to be modest. Indeed, if modest interferences are repeated, it is possible to create a very detailed profile of the individuals monitored. See Wood at [54]. In addition, it can lead to the formulation of an oversimplified test like the one used by Lord Hope of Craighead DPSC in Kinloch v HM Advocate [2012] UKSC 62. The answer is to be found not only by considering whether the alleged victim had a reasonable expectation of privacy while she was in public view but also by considering whether the interference was a measure of surveillance and whether the information had subsequently been stored, in particular by means of an automated process.
 
151
Wood at [35] as per Laws LJ dissenting.
 
152
Wood at [98].
 
153
Wood at [98].
 
154
The only relevant information found was the following: “The Metropolitan Police Service (‘MPS’) is committed to providing MPS personnel with a particularly useful tactic to combat crime and gather intelligence and evidence relating to street crime, anti-social behaviour and public order. It may be used to record identifiable details of subjects suspected of being involved in crime or anti-sociable [sic] behaviour such as facial features, visible distinctive marks e.g., tattoos, jewellery, clothing and associates for the purposes of preventing and detecting crime and to assist in the investigation for all alleged offences. This tactic may also be used to record officers' actions in the following circumstances”. Wood at [13].
 
155
Wood at [97].
 
156
Wood at [84] per Dyson LJ.
 
157
Leander v Sweden (1987) 9 E.H.R.R. 433 (Leander).
 
158
Leander at [48].
 
159
It seems, at least according to the Swedish government, that two mistakes had been made by the Director of the Museum: first, he took the decision to employ the applicant before the results of the personnel control had been communicated and, second, the post had not properly been declared vacant. Leander at [10].
 
160
Besides, it was not included in the material submitted to the Court. Leander at [14].
 
161
Leander at [19].
 
162
The ECtHR observed that “[r]egarding his personal background, [the applicant] furnished the following details to the Commission and the Court. At the relevant time, he had not belonged to any political party since 1976. Earlier he had been a member of the Swedish Communist Party. He had also been a member of an association publishing a radical review—Fib/Kulturfront. During his military service, in 1971–1972, he had been active in the soldiers’ union and a representative at the soldiers' union conference in 1972 which, according to him, had been infiltrated by the security police. His only criminal conviction stems from his time in military service and consisted of a fine of 10 Skr. for having been late for a military parade. He had also been active in the Swedish Building Workers' Association and he had travelled a couple of times in Eastern Europe. The applicant asserted however that, according to unanimous statements by responsible officials, none of the abovementioned circumstances should have been the cause for the unfavourable outcome of the personnel control”. Leander at [17].
 
163
Leander at [48].
 
164
Leander at [51].
 
165
Leander at [51].
 
166
Leander at [55]: “[T]he Ordinance contains explicit and detailed provisions as to what information may be handed out, the authorities to which information may be communicated, the circumstances in which such communication may take place and the procedure to be followed by the National Police Board when taking decisions to release information”.
 
167
Leander at [57].
 
168
Leander at [67].
 
169
Leander at [67]. The ECtHR expressly stated that “[t]he interference to which Mr. Leander was subjected [could not] therefore be said to have been disproportionate to the legitimate aim pursued”. Leander at [67].
 
170
Leander at [64].
 
171
By looking at the way the ECtHR applied Article 8(2) in both Leander (decided in 1987) and Kruslin (decided in 1990), it could appear that the framework used by the ECtHR to assess the legitimacy of the interference at stake has evolved and has become more stringent over time. While in Leander the ECtHR did not take into account the existing safeguards to assess the quality of the legal basis but examined them to determine whether the interference was proportionate to the aim pursued, in Kruslin, the safeguards were mentioned and analysed at an earlier stage, to determine whether the legal basis was foreseeable. Therefore, in Kruslin, the ECtHR did not need to address the issue of proportionality.
 
172
(2007) 44 E.H.R.R. 2 (Segerstedt-Wiberg).
 
173
Although the safeguards to be found in national law seem to have been examined at the stage of the proportionality test more than at the stage of the quality of the legal basis test, the scrutiny does appear to be stricter.
 
174
The only exceptions made were for the benefit of researchers. From 1 July 1996, it was also possible to allow exemptions if the Government held the view that there were extraordinary reasons for an exemption to be made from the main rule of absolute secrecy.
 
175
Segerstedt-Wiberg at [49].
 
176
Segerstedt-Wiberg at [74].
 
177
Segerstedt-Wiberg at [70].
 
178
Segerstedt-Wiberg at [72].
 
179
Segerstedt-Wiberg at [72].
 
180
Segerstedt-Wiberg at [90].
 
181
Segerstedt-Wiberg at [91]: “In this case, the KPML(r) party programme was the only evidence relied upon by the Government. Beyond that they did not point to any specific circumstance indicating that the impugned programme clauses were reflected in actions or statements by the party’s leaders or members and constituted an actual or even potential threat to national security when the information was released in 1999, almost 30 years after the party had come into existence”.
 
182
Rotaru v Romania (2000) 8 B.H.R.C. 449 (Rotaru).
 
183
Rotaru at [53]. The ECtHR expressly stated that it had “doubts as to the relevance to national security of the information held on the applicant”.
 
184
Rotaru at [57–62]. No provision of domestic law defined the kind of information to be recorded, the categories of people against whom surveillance measures could be taken, the circumstances in which such measures could be taken or the procedure to be followed. Nor had the collection and storage of the information been limited to a certain period of time. Nothing had been said as regards the possibility of accessing the information contained in these files. No system of supervision of the surveillance activities had been put in place.
 
185
Rotaru at [43]. See also Amann v. Switzerland [GC] ECHR 2000-II at [65].
 
186
Rotaru at [43].
 
187
(2000/C 364/01) OJEC C 364/1 18 December 2000. The Charter of Fundamental Rights of the European Union was signed and proclaimed in 2000 by the European Parliament, the Council of the European Union and the European Commission. It includes civil, political, economic and social rights and certain “third generation” rights. With the coming into force of the Treaty of Lisbon in December 2009, the Charter became a binding instrument and is therefore directly enforceable by the Court of Justice of the European Union (CJEU) and national courts. Under Art. 6(1) of the Treaty on European Union “the Union recognises the rights, freedoms and principles set out in the Charter of Fundamental Rights”.
 
188
“Everyone has the right to respect for his or her private and family life, home and communications”.
 
189
“1. Everyone has the right to the protection of personal data concerning him or her. 2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified. 3. Compliance with these rules shall be subject to control by an independent authority”.
 
190
This disconnection has also been condemned by other commentators. See, e.g. Costa and Poullet [19] at 255.
 
191
The similarity and complementarity of traditional privacy law and data protection law are not always clearly understood. See e.g. Robinson et al. [20, p. 27]. This is how the authors of this report frame the problem: “[t]he scope of the Directive has been criticised because the relationship between privacy protection and data protection is vague: not all acts of personal data processing as covered by the Directive have a clear or noticeable privacy impact, and we must ask if this is a weakness in its focus. Should the impact on privacy be a relevant criterion for determining the applicability of data protection rules?” This is a pertinent concern only to the extent that the right to respect for private life is quite limited in scope, which does not seem to be the case.
 
192
See Pearce and Platten [21] at 533. See also in France Proposition de résolution sur la proposition de directive du Conseil des Communautés européennes relative à la protection des personnes physiques à l’égard du traitement des données à caractère personnel et à la libre circulation de ces données, présentée par Robert Pandraud et Pierre Mazeaud, Assemblée nationale, 27 avril 1993, n°117.
 
193
81/679/EEC. Commission Recommendation of 29 July 1981 relating to the Council of Europe Convention for the protection of individuals with regard to automatic processing of personal data Official Journal L 246, 29/08/1981 p. 31.
 
194
This Convention has, as of the time of writing, been ratified by 45 countries, among them all the Member States of the European Union. Notably, this Convention is open for signature to non-members of the Council of Europe, but no third-party country has taken that opportunity.
 
195
Article 2.
 
196
See Article 3(1): “The Parties undertake to apply this convention to automated personal data files and automatic processing of personal data in the public and private sectors”.
 
197
See Article 8.
 
198
Preamble.
 
199
See in particular Article 3.
 
200
This is true in particular as regards sensitive data. See Article 6: “Personal data revealing racial origin, political opinions or religious or other beliefs, as well as personal data concerning health or sexual life, may not be processed automatically unless domestic law provides appropriate safeguards. The same shall apply to personal data relating to criminal convictions”.
 
201
Rec(87)15E 17 September 1987 regulating the use of personal data in the police sector.
 
202
Additional Protocol to the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data regarding supervisory authorities and transborder data flows, Strasbourg, 8.XI.2001.
 
203
See in particular Article 2.
 
204
Recommendation of the Council concerning guidelines governing the protection of privacy and transborder flows of personal data (23 September 1980) followed by the 1985 Declaration on Transborder Data Flows and the 1998 Ministerial Declaration on the Protection of Privacy on Global Networks.
 
205
Guidelines for the Regulation of Computerized Personal Data Files, G.A. res. 44/132, 44 U.N. GAOR Supp. (No. 49) at 211, U.N. Doc. A/44/49 (1989).
 
206
For a presentation of Member States’ regimes at the time of the adoption of Directive 95/46/EC see e.g. Mayer-Schönberger [22], Reidenberg [23], at S148–S164. See also Proposal for a Council Directive concerning the protection of individuals in relation to the processing of personal data 1990 COM(90) 314 final.
 
207
See also CJEU 28 March 1996 Opinion 2/94 Accession by the Community to the European Convention for the Protection of Human Rights and Fundamental Freedoms at [27]: “No Treaty provision confers on the Community institutions any general power to enact rules on human rights or to conclude international conventions in this field”.
 
208
See Serge Gutwirth, Raphael Gellert and Rocco Bellanova, VUB, Michael Friedewald and Philip Schütz, Fraunhofer ISI, David Wright, Trilateral Research & Consulting, Emilio Mordini and Silvia Venier, Legal, social, economic and ethical conceptualisations of privacy and data protection, D1 Prescient research project, at http://prescient-project.eu/prescient/inhalte/download/PRESCIENT-D1—final.pdf (23 March 2011), pp. 3–4, who oppose data protection law to Article 8 and state that the two bodies of law have distinct rationales. While I agree that data protection law had initially been conceived as a means to facilitate the free flow of information, the main difference lies in the nature of the regulatory strategy underlying the two bodies of rules. One is reactive, the other proactive.
 
209
Confirmed by the CJEU in Lindqvist C-101/01 [2003] ECR I-12971 at [96] (Lindqvist): “[t]he harmonisation of those national laws is therefore not limited to minimal harmonisation but amounts to harmonisation which is generally complete. It is upon that view that Directive 95/46 is intended to ensure free movement of personal data while guaranteeing a high level of protection for the rights and interests of the individuals to whom such data relate”.
 
210
Council Directive 95/46/EC of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L281/31.
 
211
Legal persons are excluded from the scope of the protection, for here we are only concerned with individuals.
 
212
Article 2(a).
 
213
See recital 14 of the data protection Directive.
 
214
See recital 26: “to determine whether a person is identifiable, account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify the said person”.
 
215
What matters is whether the user browsing the web is identifiable. Vidal-Hall et al. v Google Inc. [2014] EWHC 13 (QB) at [117] (Vidal). There is a strong argument that the English case law up until Vidal has been too restrictive. See in particular Michael John Durant v Financial Services Authority [2003] EWCA Civ 1746 and Lord and R (Department of Health) v Information Commissioner [2011] EWHC 1430 (Admin); Johnson v Medical Defence Union [2004] EWHC 347. In Durant Auld L.J. stated that “not all information retrieved from a computer search against an individual’s name or unique identifier is personal data within the meaning of the Act”. One could also argue, though, that Tugendhat J’s interpretation of private information in Vidal remains too narrow.
 
216
Article 2(b).
 
217
In Lindqvist, the CJEU held at [25] that “[a]ccording to the definition in Article 2(b) of Directive 95/46, the term ‘processing’ of such data used in Article 3(1) covers ‘any operation or set of operations which is performed upon personal data, whether or not by automatic means’. That provision gives several examples of such operations, including disclosure by transmission, dissemination or otherwise making data available. It follows that the operation of loading personal data on an Internet page must be considered to be such processing”.
 
218
The European Commission, Proposal for a Regulation on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 Final.
 
219
Article 2(2)(d). Recital 15 adds that “[t]he exemption should also not apply to controllers or processor which provide the means for processing personal data for such personal or domestic activities”.
 
220
Article 2(d).
 
221
See the answer given by the Article 29 Working Party Opinion 5/2009 on online social networking (12.06.2009) at p. 5, which states “SNS [Social Network Service] providers are data controllers under the Data Protection Directive. They provide the means for the processing of user data and provide all the ‘basic’ services related to user management (e.g. registration and deletion of accounts). SNS providers also determine the use that may be made of user data for advertising and marketing purposes—including advertising provided by third parties”. Besides, “[a]pplication providers may also be data controllers, if they develop applications which run in addition to the ones from the SNS and users decide to use such an application”. With regard to users, they are in most cases considered as mere data subjects because of the application of the household exemption. See more generally Opinion 1/2010 on the concepts of “controller” and “processor” (16.02.2010). The Article 29 Working Party nevertheless has mere advisory status.
 
222
See Article 29 Working Party Opinion 05/2012 on Cloud Computing (01.07.2012) at pp. 7–8 which states “[t]he cloud client determines the ultimate purpose of the processing and decides on the outsourcing of this processing and the delegation of all or part of the processing activities to an external organisation. The cloud client therefore acts as a data controller. … The cloud provider is the entity that provides the cloud computing services in the various forms discussed above. When the cloud provider supplies the means and the platform, acting on behalf of the cloud client, the cloud provider is considered as a data processor”.
 
223
This is not to deny that both the provisions of the data protection Directive and Article 8 of the ECHR can have ex ante and ex post effects. I am just trying to highlight the regulatory strategy underlying the data protection Directive which differs from that of Article 8 of the ECHR.
 
224
It does, however, include demanding record-keeping obligations. See Articles 22 and 28.
 
225
It is worth mentioning that under UK law (SI 2000/188) processing for staff administration, processing for advertising, marketing and public relations, and processing for accounts and record keeping do not require prior notification to the Information Commissioner. This does not mean that data controllers do not have to comply with the rest of the Data Protection Act 1998.
 
226
It is not clear whether the formulation adopted by the proposed general data Regulation will be interpreted in a more restrictive manner. See Article 2 about the material scope of the Regulation.
 
227
However, some shared minimum standards existed, for all Member States had ratified Convention 108 and the Council of Europe had adopted a sectoral recommendation to regulate personal data processing by the police: Recommendation R (87) 15.
 
228
Article 17.
 
229
Article 18.
 
230
Article 32. The language of rights is not used in this article, though. This said, it goes beyond Article 20 of the data protection Directive, which simply requires Member States to “determine the processing operations likely to present specific risks to the rights and freedoms of data subjects and (...) check that these processing operations are examined prior to the start thereof”. In the UK, this provision has not been fully implemented. Section 22 of the Data Protection Act 1998 does grant the Secretary of State the power to determine categories of assessable processing, but so far no determination has been made.
 
231
Article 34(2).
 
232
Proposal for a Directive of the European Parliament and of The Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data COM/2012/010 final—2012/0010 (COD).
 
233
Council Framework Decision 2008/977/JHA on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters OJ L 350, 30.12.2008 pp. 60–71. This Framework Decision has a limited scope of application, since it only applies to cross-border data processing. Besides, it gives Member States a large leeway to implement its provisions. Nor does it contain provisions relating to supervision by an advisory group similar to the Article 29 Working Party or oversight by the European Commission to ensure a common approach in its implementation. See also Council Framework Decision 2006/960/JHA of 18 December 2006 on simplifying the exchange of information and intelligence between law enforcement authorities of the Member States of the European Union OJ L 386, 29.12.2006, pp. 89–100.
 
234
The creation of the area of freedom, security and justice is based on the Tampere (1999–04), Hague (2004–09) and Stockholm (2010–14) programmes. It derives from Title V of the TFEU, which regulates the “Area of freedom, security and justice”.
 
235
See 29 March 2004 COM(2004) 221; 16 June 2004 COM(2004) 429; Proposal for a Council framework decision on the protection of personal data processed in the framework of police and judicial co-operation in criminal matters COM(2005) 475; COM(2005) 490.
 
236
Proposal for a Council framework decision on the protection of personal data processed in the framework of police and judicial co-operation in criminal matters COM(2005) 475 Recital 6.
 
237
Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions safeguarding privacy in a connected world—A European data protection framework for the 21st Century COM(2012) 9 final.
 
238
Articles 12 to 14. There is also a right to information (Articles 11 and 29), a right to rectification (Article 15) and a right to erasure (Article 16). The right of access is already included in the Framework Decision 2008/977/JHA, but its modalities of exercise, such as the right to request an assessment of the lawfulness of the processing, are not regulated. See Article 17 of the Framework Decision 2008/977/JHA.
 
239
Article 14.
 
240
“The other grounds for lawful processing in Article 7 of Directive 95/46/EC are not appropriate for the processing in the area of police and criminal justice” states the Commission in COM(2012) 10 final, p. 7.
 
241
This is a new provision which was present neither in the data protection Directive nor in the Framework Decision 2008/977/JHA. However, it finds its roots in the Council of Europe’s Recommendation No R (87) 15. See also Europol Decision 2009/371/JHA (Article 14) and Eurojust Decision 2009/426/JHA (Article 15).
 
242
See infra pp. 63−64
 
243
See Article 13 of the data protection Directive and Article 2 of the proposed general data protection Regulation. Note, however, that Article 13 has disappeared in the proposed general data protection Regulation, which limits the range of exceptions that Member States can carve out.
 
244
Klass at [34]. See also Weber at [78] and Liberty at [56–57].
 
245
The Guardian and the Washington Post have also revealed that the UK authorities have asked the US National Security Agency (NSA) to supply data intercepted or accessed in the US. In this case RIPA 2000 is not applicable.
 
246
Under s.20 of RIPA 2000, “external communication” means a communication sent or received outside the British Islands.
 
247
Section 8(1). See also Section 8(2) and (3) concerning information relating to the apparatus to be used and the communication to be intercepted.
 
248
Kennedy at [169–170].
 
249
It is reported that “[f]ull content of transmissions is preserved for 3 days and metadata for 30” in P. Bump, The UK Tempora Program Captures Vast Amounts of Data—and Shares with NSA, The Wire, 21 June 2013, http://www.thewire.com/national/2013/06/uk-tempora-program/66490/. See also the statements of grounds submitted before the Investigatory Powers Tribunal by Privacy International on 8 July 2013, https://www.privacyinternational.org/sites/privacyinternational.org/files/downloads/press-releases/privacy_international_ipt_grounds.pdf.
 
250
Except to the extent to which it requires compliance with the conditions in Schedules 2 and 3.
 
251
Section 27 RIPA 2000.
 
252
Section 26(3) RIPA 2000.
 
253
This is not a consensus interpretation, and it does seem that in practice law-enforcement officers do not agree with such a finding, for they argue that private life for the purposes of the application of RIPA 2000 differs from the ECtHR’s understanding of private life. It is also to be noted that in England a violation of the right to respect for private life does not necessarily render the evidence inadmissible.
 
254
Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) OJ L 201, 31.7.2002, p. 37, amended twice: by Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 and by Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009.
 
255
Article 3.
 
256
Article 1(2).
 
257
Indeed, Recital 12 of the proposed Regulation states that “[t]he protection afforded by this Regulation concerns natural persons, whatever their nationality or place of residence, in relation to the processing of personal data. With regard to the processing of data which concern legal persons and in particular undertakings established as legal persons, including the name and the form of the legal person and the contact details of the legal person, the protection of this Regulation should not be claimed by any person. This should also apply where the name of the legal person contains the names of one or more natural persons”.
 
258
Article 6(2) of the e-privacy Directive: “Traffic data necessary for the purposes of subscriber billing and interconnection payments may be processed. Such processing is permissible only up to the end of the period during which the bill may be lawfully challenged or payment pursued”.
 
259
Router providers have recently taken the initiative to develop proactive technologies, primarily to increase the security of their systems and networks. Interestingly, however, while these technologies are intended to monitor traffic from users’ home networks to the Internet and block access to infectious programs, they can occasionally be used to detect, react upon or prevent allegedly unlawful content. A new generation of “precautionary” intermediaries is thus emerging, developing security technologies that rely on systematic data collection and information-blocking mechanisms both to secure their systems and networks and to combat cybercrime. See e.g. http://www.neowin.net/news/cisco-locks-users-out-of-their-routers-requires-invasive-cloud-service.
 
260
Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union COM(2013) 48 final.
 
261
Article 14 of the proposed Directive. Market operators include “provider[s] of information society services which enable the provision of other information society services” (under Article 3).
 
262
See Article 14 of the proposal for a Directive concerning measures to ensure a high common level of network and information security across the Union. Service providers targeted by the e-privacy Directive and Directive 2002/21/EC of the European Parliament and of the Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services (“Framework Directive”) (see Article 4 and Article 13a, respectively) are already under such obligations. There is thus, over time, an extension of the scope of security obligations. The protection of the network has become a major concern and is distinct from the protection of personal data. Compare with Article 17 of the data protection Directive.
 
263
Under Article 3(3) risk is defined as “any circumstance or event having a potential adverse effect on security”.
 
264
Article 6(3) of the e-privacy Directive.
 
265
Recital 18 of the e-privacy Directive. Customers’ enquiries in general should probably be part of the category of value-added services, although Art. 6(5) seems to distinguish between them.
 
266
Article 6(3) of the e-privacy Directive.
 
267
See Article 5 of the e-privacy Directive.
 
268
RIPA 2000 s.3(3).
 
269
Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws, OJ L 337/11, 18.12.2009, pp. 11–36.
 
270
Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC OJ L 105, 13/04/2006 pp. 54–63. Interestingly, data retention obligations are not present in US law. As service providers do retain data and there is not a comprehensive data protection law, data retention obligations are in fact not really needed.
 
271
CJEU Joined Cases C-293/12 and C-594/12 Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources et al. and Kärntner Landesregierung, Michael Seitlinger, Christof Tschohl and others of 8 April 2014 (Digital Rights Ireland).
 
272
Which arguably are also traffic data within the meaning of the e-privacy Directive, Article 9.
 
273
Article 6 of the data retention Directive.
 
274
Article 1(1) of the data retention Directive.
 
275
And more broadly the system of legitimizing ends to be found in the data protection Directive.
 
276
The proposed Directive on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data seems to be a bit more prescriptive though.
 
277
See Article 12–14 of the Council Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L178/1.
 
278
Feiler [24], at 2–3 (Feiler).
 
279
Feiler at 3.
 
280
See also Recital 6 of the e-privacy Directive which evokes a wide range of electronic communications services: “[t]he Internet is overturning traditional market structures by providing a common, global infrastructure for the delivery of a wide range of electronic communications services. Publicly available electronic communications services over the Internet open new possibilities for users but also new risks for their personal data and privacy”. Notably in England, the Court’s ruling in Chambers v DPP [2012] EWHC 2157 (Admin) seems to show that the term “public electronic communications network” must be interpreted extensively and include privately owned networks.
 
281
It does not seem that the new definition of personal data to be found in the proposed general data protection Regulation alters these findings.
 
282
See the example of AOL, which in 2006 published the logs of 675,000 Americans with the intent of feeding university research. The names of users had been replaced by cookie numbers in order to anonymise the logs. However, by combining all the searches corresponding to the same cookie numbers, some journalists managed to identify some users. More generally, see the search facility still offered at www.aolstalker.com.
 
283
See, e.g. Article 5 of the data retention Directive and Articles 2 and 6 of the e-privacy Directive 2002/58/EC.
 
284
See Article 1(2) of the data retention Directive.
 
285
In the UK, this issue has been heavily debated. In the end, it does seem that content data are the winning qualification. This results from a code of practice on acquisition and disclosure of communications data adopted in 2007 under Chapter II of Part I of the Regulation of Investigatory Powers Act 2000. HOME OFFICE, Acquisition and disclosure of communications data, available at http://www.homeoffice.gov.uk/publications/counter-terrorism/ripa-forms/code-of-practice-acquisition. It provides guidance on the procedures to be followed when acquisition of communications data takes place under those provisions. §2.20 reads as follows: “[i]n relation to internet communications, this means traffic data stops at the apparatus within which files or programs are stored, so that traffic data may identify a server or domain name (web site) but not a web page”. However, law-makers are often less nuanced. See, for example, the statements made in the introduction to the UK Draft Communications Data Bill: “[c]ommunications data is very different from communications content—for example, the text of an email or a telephone conversation, and arrangements for the police and security agencies to intercept the content of a communication are very different from arrangements to get access to communications data”.
 
286
Digital Rights Ireland at [34].
 
287
Digital Rights Ireland at [37].
 
288
Digital Rights Ireland at [39].
 
289
Digital Rights Ireland at [48].
 
290
Digital Rights Ireland at [57].
 
291
Digital Rights Ireland at [58].
 
292
Digital Rights Ireland at [59].
 
293
Digital Rights Ireland at [60–61].
 
294
Digital Rights Ireland at [62].
 
295
Digital Rights Ireland at [64].
 
296
Digital Rights Ireland at [66–68].
 
297
See in particular Digital Rights Ireland at [53].
 
298
See Article 33.
 
299
See, for example, Brian Wheeler, How much privacy can smartphone owners expect? (BBC News, Washington, 22 November 2011), available at http://www.bbc.co.uk/news/magazine-15730499; and Emma Barnett, Facebook’s Mark Zuckerberg says privacy is no longer a “social norm” (Telegraph, 11 January 2010), available at http://www.telegraph.co.uk/technology/facebook/6966628/Facebooks-Mark-Zuckerberg-says-privacy-is-no-longer-a-social-norm.html. The role of privacy in the computer age was already being questioned in 1975; see Sol Wachtler, The right to privacy 14 Judges Journal 7, at 8 (1975).
 
300
“Every second, 18 adults become a victim of cybercrime—that’s more than one and a half million cybercrime victims each day globally” says the Symantec website, at http://​www.​symantec.​com/​corporate_​responsibility/​topic.​jsp?​id=​cybercrime.
 
301
See also Article 3 of the Universal Declaration of Human Rights.
 
302
The formulation of the European Charter is far more succinct, though: “[e]veryone has the right to liberty and security of person”.
 
303
See, e.g. Post [26], Simitis [27], Cohen [28], Schwartz [29], Gavison [30], Wong [31], Oliver [32].
 
304
Klass at [48].
 
305
Doswald-Beck [33, p. 456] (Doswald-Beck).
 
306
Doswald-Beck p. 456.
 
307
See Segerstedt-Wiberg at [90–91]. See also Rotaru at [53].
 
308
Weber at [114].
 
309
Weber at [112].
 
310
See, e.g. as regards the definition of terrorism: Report of the Special Representative on human rights defenders, UN Doc A/58/380, 18 September 2003; Report of the Independent Expert on the protection of human rights and fundamental freedoms while countering terrorism, Robert K Goldman, UN Doc E/CN.4/103, 7 February 2005; Report of the United Nations High Commissioner for Human Rights on the protection of human rights and fundamental freedoms while countering terrorism, UN Doc A/HRC/8/13, 2 June 2008; Report of the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, Martin Scheinin, UN Doc A/64/211, 3 August 2009.
 
311
See, e.g. the website of the UK security service, the famous MI5, stating that “As a matter of Government policy, the term ‘national security’ is taken to refer to the security and well-being of the United Kingdom as a whole. The ‘nation’ in this sense is not confined to the UK as a geographical or political entity but extends to its citizens, wherever they may be, and its system of government” at https://​www.​mi5.​gov.​uk/​home/​about-us/​what-we-do/​protecting-national-security.​html.
 
312
The survival of the nation-state should be at stake here. See nevertheless the words of the UK MI5, which defines espionage as “a process which involves human sources (agents) or technical means to obtain information which is not normally publically available. It may also involve seeking to influence decision makers and opinion-formers to benefit the interests of a foreign power.” MI5, What is Espionage?, 2013, available at https://www.securityservice.gov.uk/home/the-threats/espionage/what-is-espionage.html.
 
313
Report of the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, Martin Scheinin, Ten areas of best practices in countering terrorism, 22 December 2010, UN Doc A/HRC/16/51, at [28].
 
314
EU Council Framework Decision 2002/475/JHA on combating terrorism, 13 June 2002, OJ L 164, 22.6.2002, pp. 3–7 and EU Council Framework Decision 2008/919/JHA amending Framework decision 2002/475/JHA on combating terrorism, 28 November 2008, OJ L 330 of 9.12.2008.
 
315
Not all authors distinguish between national security and public security. See, e.g. Acquilina [34], at 131 who adopts the following definition: “Public security as understood in this paper encompasses a fourfold categorisation: (i) national (or state) security, (ii) public safety, (iii) the economic well-being of a country, and (iv) protection from other criminal offences directed at the public at large such as those relating to disorder, health, morals and the rights and freedoms of others”.
 
316
See, e.g. Article 8(2) of the ECHR.
 
317
See, e.g. proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union, COM (2013) 48.
 
318
See, e.g. the documents listed in Rita Tehan, Cybersecurity: Authoritative Reports and Resources, April 17, 2013, available at http://​www.​fas.​org/​sgp/​crs/​misc/​R42507.​pdf.
 
319
Joint communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Cybersecurity Strategy of the European Union: An Open, Safe and Secure Cyberspace, JOIN/2013/01 final, in fn 4 (Cybersecurity Strategy of the European Union).
 
320
Cybersecurity Strategy of the European Union. See also the French White Paper Livre Blanc sur la Défense et la Sécurité nationale, Paris, Editions Odile Jacob-La Documentation Française, 2008.
 
321
Cybersecurity Strategy of the European Union at [1.1].
 
322
This explains why the term cybersecurity, like its cousin cyberspace, may be improper inasmuch as it seems to imply that the regulation to be applied to that space should differ in kind.
 
323
Interestingly, five of the biggest industrial players (Cisco Systems, IBM, Intel, Juniper Networks and Microsoft) have formed a coalition, the Industry Consortium for Advancement of Security on the Internet, to find adequate answers to security threats. See www.​icasi.​org.
 
324
See Council Framework Decision 2005/222/JHA of 24 February 2005 on attacks against information systems OJ L 69, 16.3.2005, introducing offences such as illegal access to information systems, illegal system interference and illegal data interference, and Directive 2013/40/EU of the European Parliament and of the Council of 12 August 2013 on attacks against information systems and replacing Council Framework Decision 2005/222/JHA, OJ L 218, 14.8.2013, pp. 8–14. The Directive makes illegal interception a criminal offence in order to address new modes of committing cybercrimes such as the use of botnets (networks of computers that have been infected by computer viruses).
 
325
ISO/IEC 27032:2012 Information technology—Security techniques—Guidelines for cybersecurity, available at http://​www.​iso.​org/​iso/​catalogue_​detail?​csnumber=​44375. See also the definition of the International Telecommunications Union (“the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user’s assets. Organization and user’s assets include connected computing devices, personnel, infrastructure, applications, services, telecommunications systems, and the totality of transmitted and/or stored information in the cyber environment. Cybersecurity strives to ensure the attainment and maintenance of the security properties of the organization and user’s assets against relevant security risks in the cyber environment. The general security objectives comprise the following: availability; integrity, which may include authenticity and non-repudiation; and confidentiality”) in ITU—Recommendation X.1205 (04/2008) available here: http://​www.​itu.​int/​rec/​T-REC-X.​1205-200804-I.
 
326
[35]
 
327
In this line see Solove [36] (Solove, Nothing to hide).
 
328
Solove, Nothing to hide, at 751. D Solove quotes the words of Richard Posner in Richard A. Posner, Economic Analysis of Law 46 (5th ed. 1998).
 
329
Solove, Nothing to hide at 753.
 
330
See e.g. BBC, Terror attacks levelled out, says 10-year study 4 December 2012, available at http://​www.​bbc.​co.​uk/​news/​world-20588238; and “On the one hand, terrorism is relatively infrequent and hard to predict; on the other hand, when it starts to happen there is a tendency for it to happen in the same place a lot”. The Institute for Economics and Peace Global Terrorism Index: Capturing the Impact of Terrorism for the Last Decade (2012), available at http://​www.​visionofhumanity​.​org/​wp-content/​uploads/​2012/​12/​2012-Global-Terrorism-Index-Report1.​pdf.
 
331
Special Eurobarometer 404, November 2013, Report on Cyber-security, survey conducted by TNS Opinion and Social, available at http://​ec.​europa.​eu/​public_​opinion/​archives/​ebs/​ebs_​404_​en.​pdf. In the USA, in 2007 at least, identity theft seemed to be the most common online fraud: “[t]he United States Federal Trade Commission has labelled identity theft as the most common type of consumer fraud, affecting thousands of people every day. In fact, approximately 40 % of the frauds reported to the United States Federal Trade Commission (2007) over the last few years has involved some type of identity theft”. Chad Albrecht, Conan Albrecht and Shay Tzafrir, How to protect and minimize consumer risk to identity theft 18 Journal of Financial Crime 405, at 405 (2011).
 
332
Generally understood as a “form of online identity theft that aims to steal sensitive information from users such as online banking passwords and credit card information. Phishing attacks use a combination of social engineering and technical spoofing techniques to persuade users into giving away sensitive information (e.g. using a web form on a spoofed web page) that the attacker can then use to make a financial profit.” Kirda and Kruegel [37], at 554.
 
333
The aforementioned definition of identity theft is too narrow, for it does not include a more recent trend relying upon the acquisition of data voluntarily released by data subjects.
 
334
This is the case in England. See Oxford v Moss (1979) 68 Cr. App. R. 183 in fine. The Computer Misuse Act of 1990 is indeed of limited use here.
 
335
Marshal and Tompsett [38] (Marshal and Tompsett). For a broader approach to the issue of identity theft not merely focusing upon online identity theft, see, e.g. Albrecht et al. [39].
 
336
Marshal and Tompsett at 131.
 
337
Marshal and Tompsett at 132.
 
338
Marshal and Tompsett at 132–133.
 
339
Marshal and Tompsett at 129.
 
340
Data acquisition should constitute a fraud punished by criminal sanctions in the same way as the use of email scams or malicious software. Under s.2 of the UK Fraud Act 2006, a fraud by false representation is committed when a person “(a) dishonestly makes a false representation, and (b) intends, by making the representation [either] (i) to make a gain for himself or another, or (ii) to cause loss to another or to expose another to a risk of loss”. However, contrary to the use of phishing techniques, it is only when the personal data are used that a misrepresentation will occur, hence the importance of data protection rules. See on the legal implications of phishing techniques Savirimuthu and Savirimuthu [40].
 
341
Understood in modern times as a synonym of encryption, which involves the transformation of intelligible information into unintelligible information.
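The definition above describes encryption as a reversible transformation of intelligible information into unintelligible information. A toy sketch can make this concrete (an illustrative XOR one-time pad written for this note, not production cryptography and not any scheme discussed in the chapter):

```python
# Toy illustration of the definition in this footnote: encryption turns
# intelligible plaintext into unintelligible ciphertext, and only the
# holder of the key can reverse the transformation.
# NOT production cryptography -- a pedagogical one-time-pad sketch only.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte; applying the same
    # key twice restores the original input (XOR is its own inverse).
    return bytes(b ^ k for b, k in zip(data, key))

plaintext = b"private communication"
key = secrets.token_bytes(len(plaintext))  # random key as long as the message

ciphertext = xor_bytes(plaintext, key)  # unintelligible without the key
recovered = xor_bytes(ciphertext, key)  # intelligible again with the key

assert recovered == plaintext
```

This is also why key disclosure powers such as those in Part III of RIPA, discussed below, matter: without the key, the protected information stays unintelligible.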
 
342
Anderson and Landrock [41], at 342 (1996) (Anderson and Landrock).
 
343
Anderson and Landrock at 342.
 
344
Anderson and Landrock at 342. For an overview of the debate at the end of the 1990s see e.g. Anderson and Landrock [41], Gerard and Broze [42], Ward [43], Akdeniz and Walker [44], Sundt [45], Bert-Jaap Koops [46], Kennedy et al. [47].
 
345
See, e.g. in the UK, the proposal of the Department of Trade and Industry to set up a key escrow system in DTI, Licencing of Trusted Third Parties for the Provision of Encryption Services, March 1997, available at http://​www.​fipr.​org/​polarch/​ttp.​html. For a critique of these techniques, see Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption, 1997, available at http://​www.​schneier.​com/​paper-key-escrow.​pdf.
 
346
Part III of RIPA did not enter into force until 2007. S.49(2) of Part III ensures that so long as “any person with the appropriate permissions under schedule 2” has reasonable grounds to believe:
(a)
“that a key to the protected information is in the possession of any person,
 
(b)
that the imposition of a disclosure requirement in respect of the protected information is—
i.
necessary on grounds falling within subsection (3), or
 
ii.
necessary for the purpose of securing the effective exercise or proper performance by any public authority of any statutory power or statutory duty,
 
 
(c)
that the imposition of such a requirement is proportionate to what is sought to be achieved by its imposition, and
 
(d)
that it is not reasonably practicable for the person with the appropriate permission to obtain possession of the protected information in an intelligible form without the giving of a notice under this section”,
 
they may impose a “disclosure requirement in respect of the protected information” by notice to the person “whom he believes to have possession of the key”. The necessary grounds mentioned in s.49(3) comprise the protection of the interests of national security, the prevention or detection of crime and the pursuit of the interests of the economic well-being of the United Kingdom.
 
347
For example, The Netherlands, France and Belgium.
 
348
The Convention (ETS 185). At the time of writing, 42 States are Parties to the Convention and 11 States have signed it.
 
349
Article 19(4).
 
350
Article 19(1).
 
351
See in the UK, the case R v S(F) and A(S) [2008] EWCA Crim 2177.
 
352
It is important to distinguish between the technology itself and the way the technology is used in order to assess its legitimacy.
 
353
Solove, Nothing to hide at 756. D. Solove uses the metaphor developed in Franz Kafka’s novel The Trial to explain that at least two distinct concerns are at stake because of the use of surveillance techniques: the inhibition or chilling of individuals’ behaviour and the frustration of individuals being unable to understand the reasons for the decisions affecting their interests. They are thus in a state of “helplessness and powerlessness”. See also Solove [48].
 
354
For a deeper analysis of the legal scrutiny deployed to confine deep packet inspection practices, see Stalla-Bourdillon et al. [49].
 
355
See, e.g. the way the Cleanfeed technology functions as explained in Richard Clayton, Anonymity and traceability in cyberspace, 2005, Technical Report, pp. 120–121, available at http://​www.​cl.​cam.​ac.​uk/​techreports/​UCAM-CL-TR-653.​html.
 
356
See the development of an updated version of Einstein intrusion-detection programme (Einstein 3), which relies upon deep packet inspection of the “.gov” traffic, to detect attacks and malware, especially associated with e-mail. The aim is to “not only be able to detect malicious traffic targeting Federal Government networks, but also prevent malicious traffic from harming those networks. This will be accomplished through delivering intrusion prevention capabilities as a Managed Security Service provided by Internet Service Providers (ISP). Under the direction of DHS [Department of Homeland Security], ISPs will administer intrusion prevention and threat-based decision-making on network traffic entering and leaving participating federal civilian Executive Branch agency networks”, http://​www.​dhs.​gov/​privacy-documents-national-protection-and-programs-directorate-nppd. See, however, the privacy impact assessment conducted for Einstein 3 available at http://​www.​dhs.​gov/​sites/​default/​files/​publications/​privacy/​PIAs/​PIA%20​NPPD%20​E3A%20​20130419%20​FINAL%20​signed.​pdf.
 
357
See Recital 26 of the data protection Directive. See also Article 29 Working Party Opinion 1/2008 (WP148), p. 8 (“An individual's search history is personal data if the individual to which it relates, is identifiable. Though IP addresses in most cases are not directly identifiable by search engines, identification can be achieved by a third party. Internet access providers hold IP address data. Law enforcement and national security authorities can gain access to these data and in some Member States private parties have gained access also through civil litigation. Thus, in most cases—including cases with dynamic IP address allocation—the necessary data will be available to identify the user(s) of the IP address”).
 
358
See supra pp. 22 ff.
 
359
EMI Records (Ireland) Ltd and Others v UPC Communications Ireland Ltd, [2010] IEHC 377, §107. The Copysense technology was at stake in this case.
 
360
See, for example, the wording of RIPA 2000, which provides in its s.3(3) that “(3) Conduct consisting in the interception of a communication is authorised by this section if—
(a)
it takes place for purposes connected with the provision or operation of that service or with the enforcement, in relation to that service, of any enactment relating to the use of postal services or telecommunications services”.
 
 
361
Even if deep packet inspection is used to safeguard the security of the network, it is in fact likely to operate as a measure of surveillance.
 
362
The vagueness of Article 15(2), which is aimed at carving out exceptions to Article 15(1), could mean that in practice the right not to be subjected to a decision based solely on automated processing is merely a formal protection.
 
363
Phorm has had several Internet access providers as partners, in particular the UK service providers BT, TalkTalk and Virgin Media. Public outcry at the national level, followed by the condemnation of the EU Commission, convinced UK Internet service providers to stop implementing this technology. Outside the UK, Phorm has been dealing with Oi and Telefonica in Brazil, TTNET-Türk Telekom in Turkey, and Romtelecom in Romania.
 
364
See the website of the company for a description of the technology used, available at http://​www.​phorm.​com/​technologies.
 
365
Twentieth Century Fox Film Corp v BT [2011] EWHC 1981 (Ch).
 
366
See supra p. 4 and pp. 54–55.
 
367
See e.g. Marx [50], Lyon [51], Hildebrandt and Gutwirth [52], Koutsias [53], Schermer [54] (Schermer). See also Fulda [55], Tien [56], Kawakami and McCarty [57], Slobogin [58], Rubinstein et al. [59], Solove [60].
 
368
Schermer, at 45 citing Frawley et al. [61, p. 58].
 
369
Schermer, at 45.
 
370
Schermer, at 45.
 
371
As aforementioned, the means at the disposal of the data controller must be taken into account to determine whether the person is identifiable.
 
372
See Article 8, which provides notably that “1. Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life. 2. Paragraph 1 shall not apply where:… (e) the processing relates to data which are manifestly made public by the data subject or is necessary for the establishment, exercise or defence of legal claims”. The same exclusion is to be found in the proposed general data protection Regulation. See Article 9(e).
 
373
For a similar view, see [62].
 
374
It is interesting to note that in the USA, the Fourth Amendment protection does not seem to apply to data mining unless it amounts to a search. See e.g. Tien [56], Kawakami and McCarty [57], Slobogin [63].
 
375
Bruce Schneier [64] (Schneier, Data Mining).
 
376
“A false positive is when the system identifies a terrorist plot that really isn't one”. Schneier, Data Mining.
 
377
“A false negative is when the system misses an actual terrorist plot”. Schneier, Data Mining.
 
378
Schneier, Data Mining.
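The base-rate arithmetic behind Schneier’s critique can be sketched briefly; all figures below (population size, number of real plotters, error rates) are hypothetical assumptions chosen for illustration, not Schneier’s own:

```python
# Why even an accurate screening system yields overwhelmingly many false
# positives when the targeted behaviour is extremely rare.
# All figures are hypothetical assumptions for illustration only.

population = 60_000_000      # people whose communications are scanned
real_threats = 10            # actual plotters among them (a rare event)
true_positive_rate = 0.99    # P(flagged | real threat)
false_positive_rate = 0.001  # P(flagged | innocent) -- 0.1%, very optimistic

innocents = population - real_threats
true_positives = real_threats * true_positive_rate
false_positives = innocents * false_positive_rate

# Precision: probability that a flagged person is actually a threat
precision = true_positives / (true_positives + false_positives)

print(f"false alarms: {false_positives:,.0f}")
print(f"chance a flagged person is a real threat: {precision:.4%}")
```

With these assumed figures, roughly 60,000 innocent people are flagged for every handful of real threats, so the chance that any given alert concerns a genuine plot is a small fraction of one per cent, which is the core of the false-positive objection summarised in these footnotes.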
 
379
Presented to Parliament by the Secretary of State for the Home Department by Command of Her Majesty I June 2012 Cm 8359 Available at http://​www.​parliament.​uk/​draft-communications-bill.
 
380
Under clause 28 communications data comprise subscriber data (“information…held or obtained by a person providing a telecommunications service about those to whom the service is provided by that person”), use data (“information about the use made by a person of a telecommunications service or in connection with the provision to or use by any person of a telecommunications service”) and traffic data (“data which is comprised in, attached to or logically associated with a communication (whether by the sender or otherwise) for the purposes of a telecommunication system by means of which the communication is being or may be transmitted…”).
 
381
Not only Internet access providers seemed to be concerned, but also over-the-top service providers, since providers of services merely facilitating the use of telecommunications systems are also targeted. See clause 28, which defines telecommunications operators in the following manner: “a person who (a) controls or provides a telecommunication system, or (b) provides a telecommunications service”. A telecommunication system is “a system (including the apparatus comprised in it) that exists (whether wholly or partly in the United Kingdom or elsewhere) for the purpose of facilitating the transmission of communications by any means involving the use of electrical or electro-magnetic energy”. And a telecommunication service is a “service that consists in the provision of access to, and of facilities for making use of, a telecommunication system (whether or not one provided by the person providing the service)”.
 
382
Clause 1 is couched in very generic terms: “[t]he Secretary of State may by order (a) ensure that communications data is available to be obtained from telecommunications operators by relevant public authorities in accordance with Part 2, or (b) otherwise facilitate the availability of communications data to be so obtained from telecommunications operators”. The Home Office stated in the Explanatory Notes to the draft Bill that in practice this would mean that telecommunications operators would be required to generate all “necessary” communications data for the services or systems they provide; to retain “necessary” communications data; and even to process the data to ease the efficient and effective obtaining of the data by public authorities. Data mining techniques could thus be imposed upon such providers. There is, however, no indication as to what “necessary” communications data means. Joint Committee of the House of Lords and House of Commons, Report on the Draft Communications Data Bill, December 2012, available at http://www.parliament.uk/draft-communications-bill/, p. 23.
 
383
Clause 14 provides a power to establish filtering arrangements to make it easier for public authorities to acquire communications data. Public authorities would be able to submit one request through the Request Filter, which would then interrogate the databases of several telecommunications operators and automatically select the data, supplying public authorities with “only” the relevant data. Joint Committee of the House of Lords and House of Commons, Report on the Draft Communications Data Bill, December 2012, available at http://www.parliament.uk/draft-communications-bill/, p. 34.
 
384
Home Office written evidence, paragraphs 13 and 15.
 
385
Home Office written evidence, paragraph 16.
 
386
Joint Committee of the House of Lords and House of Commons, Report on the Draft Communications Data Bill, December 2012, available at http://www.parliament.uk/draft-communications-bill/, p. 16.
 
387
See BBC, Nick Clegg: No “web snooping” bill while Lib Dems in government, 25 April 2013, available at http://www.bbc.co.uk/news/uk-politics-22292474. See nonetheless BBC, “Fresh proposals” planned over cyber-monitoring, 8 May 2013, available at http://www.bbc.co.uk/news/uk-politics-22449209.
 
388
See Fighting cyber crime.
 
Literature
1.
go back to reference Lyon, D. (2003). Surveillance after September 11. Cambridge: Polity Press. Lyon, D. (2003). Surveillance after September 11. Cambridge: Polity Press.
2.
go back to reference Seamon, R., Gardner, W. (2004). The Patriot Act and the wall between foreign intelligence and law enforcement. Harvard Journal of Law & Public Policy, 28, 319. Seamon, R., Gardner, W. (2004). The Patriot Act and the wall between foreign intelligence and law enforcement. Harvard Journal of Law & Public Policy, 28, 319.
3.
go back to reference Schulhofer, S. (2006). The new world of foreign intelligence surveillance. Stanford Law & Policy Review, 17, 531. Schulhofer, S. (2006). The new world of foreign intelligence surveillance. Stanford Law & Policy Review, 17, 531.
6.
go back to reference Rigaux François, M. (1980). La liberté de la vie privée. Revue internationale de droit comparé. Juillet-septembre, 43(3), 539–563. Rigaux François, M. (1980). La liberté de la vie privée. Revue internationale de droit comparé. Juillet-septembre, 43(3), 539–563.
7.
go back to reference Warren, D., Brandeis, L. (1890/1891). The right to privacy. Harvard Law Review, 4, 193. Warren, D., Brandeis, L. (1890/1891). The right to privacy. Harvard Law Review, 4, 193.
8.
go back to reference Richards, N., & Solove, D. (2007). Privacy other’s path: recovering the law of privacy. Georgetown Law Journal, 96, 123. Richards, N., & Solove, D. (2007). Privacy other’s path: recovering the law of privacy. Georgetown Law Journal, 96, 123.
9.
go back to reference Whitman, J. (2004). The two western cultures of privacy: dignity versus liberty. The Yale Law Journal, 113, 1151.CrossRef Whitman, J. (2004). The two western cultures of privacy: dignity versus liberty. The Yale Law Journal, 113, 1151.CrossRef
10.
go back to reference Prosser, W. (1971). The law of torts (4th ed.). St. Paul: West Publishing Company. Prosser, W. (1971). The law of torts (4th ed.). St. Paul: West Publishing Company.
11.
go back to reference Prosser, W., et al. (1984). Prosser and Keeton on torts (5th ed.). Stavanger: West Group. Prosser, W., et al. (1984). Prosser and Keeton on torts (5th ed.). Stavanger: West Group.
12.
go back to reference Solove, D. J. (2006). A taxonomy of privacy. University of Pennsylvania Law Review, 154, 477.CrossRef Solove, D. J. (2006). A taxonomy of privacy. University of Pennsylvania Law Review, 154, 477.CrossRef
13.
go back to reference Kayser, P. (2005). Protection de la vie privée, Economica, Paris, n 8. Kayser, P. (2005). Protection de la vie privée, Economica, Paris, n 8.
16.
go back to reference Moreham, N. A. (2008). The right to respect for private life in the European convention on human rights: a re-examination. European Human Rights Law Review, 44(1), 19. Moreham, N. A. (2008). The right to respect for private life in the European convention on human rights: a re-examination. European Human Rights Law Review, 44(1), 19.
17.
go back to reference Solove, D. (2002). Conceptualizing privacy. California Law Review, 90, 1087.CrossRef Solove, D. (2002). Conceptualizing privacy. California Law Review, 90, 1087.CrossRef
18.
go back to reference Solove, D. (2006). The digital person: technology and privacy in the information age. New York: New York University Press. Solove, D. (2006). The digital person: technology and privacy in the information age. New York: New York University Press.
19.
go back to reference Costa, L., & Poullet, Y. (2012). Privacy and the regulation of 2012. Computer Law & Security Review, 28, 254.CrossRef Costa, L., & Poullet, Y. (2012). Privacy and the regulation of 2012. Computer Law & Security Review, 28, 254.CrossRef
21.
go back to reference Pearce, G., & Platten, N. (1998). Achieving personal data protection in the European Union. Journal of Common Market Studies, 36, 529–547.CrossRef Pearce, G., & Platten, N. (1998). Achieving personal data protection in the European Union. Journal of Common Market Studies, 36, 529–547.CrossRef
22.
go back to reference Mayer-Schönberger, V. (1998). Generational development of data protection in Europe. In P. E. Agre & M. Rotenberg (Eds.), Technology and privacy: the new landscape (pp. 219–229). Cambridge: The MIT Press. Mayer-Schönberger, V. (1998). Generational development of data protection in Europe. In P. E. Agre & M. Rotenberg (Eds.), Technology and privacy: the new landscape (pp. 219–229). Cambridge: The MIT Press.
23.
go back to reference Reidenberg, J. R. (1992). The privacy obstacle course: hurdling barriers to transnational financial services. Fordham Law Review, 60, S137. Reidenberg, J. R. (1992). The privacy obstacle course: hurdling barriers to transnational financial services. Fordham Law Review, 60, S137.
24.
go back to reference Feiler, L. (2010). The legality of the data retention directive in light of the fundamental rights to privacy and data protection. European Journal of Law and Technology, 1(3), 1. Feiler, L. (2010). The legality of the data retention directive in light of the fundamental rights to privacy and data protection. European Journal of Law and Technology, 1(3), 1.
25.
go back to reference Anderson, R. (2008). Security engineering (2nd ed., p. 782). New York: Wiley. Anderson, R. (2008). Security engineering (2nd ed., p. 782). New York: Wiley.
26.
go back to reference Robert, C. (1989). Post, the social foundations of privacy: community and self in the common law tort. California Law Review, 77, 957.CrossRef Robert, C. (1989). Post, the social foundations of privacy: community and self in the common law tort. California Law Review, 77, 957.CrossRef
27.
go back to reference Simitis, S. (1987). Reviewing privacy in an information society. University of Pennsylvania Law Review, 135, 707–709.CrossRef Simitis, S. (1987). Reviewing privacy in an information society. University of Pennsylvania Law Review, 135, 707–709.CrossRef
28.
go back to reference Cohen, J. E. (2000). Examined lives: informational privacy and the subject as object. Stanford Law & Policy Review, 52, 1373–1438.CrossRef Cohen, J. E. (2000). Examined lives: informational privacy and the subject as object. Stanford Law & Policy Review, 52, 1373–1438.CrossRef
29.
go back to reference Schwartz, P. M. (1999). Privacy and democracy in cyberspace. Vanderbilt Law Review, 52, 1609–1613. Schwartz, P. M. (1999). Privacy and democracy in cyberspace. Vanderbilt Law Review, 52, 1609–1613.
30.
go back to reference Gavison, R. (1980). Privacy and the limits of law. The Yale Law Journal, 89, 421.CrossRef Gavison, R. (1980). Privacy and the limits of law. The Yale Law Journal, 89, 421.CrossRef
31.
go back to reference Wong, S. (1996). The concept, value and right of privacy. UCL Jurisprudential Review, 3, 165. Wong, S. (1996). The concept, value and right of privacy. UCL Jurisprudential Review, 3, 165.
32.
go back to reference Oliver, H. (2002). Email and internet monitoring in the workplace: information privacy and contracting-out. Industrial Law Journal, 31(4), 321.CrossRefMathSciNet Oliver, H. (2002). Email and internet monitoring in the workplace: information privacy and contracting-out. Industrial Law Journal, 31(4), 321.CrossRefMathSciNet
33.
go back to reference Doswald-Beck, L. (2011). Human rights in times of conflict and terrorism (p. 456). Oxford: Oxford University Press.CrossRef Doswald-Beck, L. (2011). Human rights in times of conflict and terrorism (p. 456). Oxford: Oxford University Press.CrossRef
34.
go back to reference Acquilina, K. (2010). Public security versus privacy in technology law: a balancing act? Computer Law & Security Review, 26(2), 140. Acquilina, K. (2010). Public security versus privacy in technology law: a balancing act? Computer Law & Security Review, 26(2), 140.
35.
go back to reference Bigo, D. Boulet, G. Bowden, C. et al., (2012). Fighting cyber crime and protecting privacy in the cloud. Study 2012 for the Directorate-General for Internet Policies, p.22 (Fighting cyber crime) Bigo, D. Boulet, G. Bowden, C. et al., (2012). Fighting cyber crime and protecting privacy in the cloud. Study 2012 for the Directorate-General for Internet Policies, p.22 (Fighting cyber crime)
36. Solove, D. (2007). “I’ve got nothing to hide” and other misunderstandings of privacy. San Diego Law Review, 44, 745.
37. Kirda, E., & Kruegel, C. (2006). Protecting users against phishing attacks. The Computer Journal, 49(5), 554.
38. Marshall, A., & Tompsett, B. (2005). Identity theft in an online world. Computer Law & Security Report, 21, 128.
39. Albrecht, C., Albrecht, C., & Tzafrir, S. (2011). How to protect and minimize consumer risk to identity theft. Journal of Financial Crime, 18, 405.
40. Savirimuthu, A., & Savirimuthu, J. (2007). Identity theft and systems theory: the Fraud Act 2006 in perspective. SCRIPTed—A Journal of Law, Technology & Society, 4(4), 437.
41. Anderson, M., & Landrock, P. (1996). Encryption and interception. Computer Law and Security Report, 12, 342.
42. Gerard, P., & Broze, G. (1997). Encryption: an overview of European policies: I.T., telecoms and broadcasting. Computer and Telecommunications Law Review, 3(4), 168.
43. Ward, C. (1997). Regulation of the use of cryptography and cryptographic systems in the United Kingdom: the policy and practice. Computer and Telecommunications Law Review, 3, 105.
44. Akdeniz, Y., & Walker, C. (1998). UK Government policy on encryption: trust is the key? Journal of Civil Liberties, 3, 110.
45. Sundt, C. (1999). Law enforcement and cryptography: a personal view. Computer and Telecommunications Law Review, 187.
46. Koops, B.-J. (1996). A survey of cryptography laws and regulations. Computer Law and Security Report, 12, 349.
47. Kennedy, G. (2000). Codemakers, codebreakers and rulemakers: dilemmas in current encryption policies. Computer Law and Security Report, 16(4), 240.
48. Solove, D. (2004). The digital person: technology and privacy in the information age (p. 47). New York: New York University Press.
49. Stalla-Bourdillon, S., Papadaki, E., & Chown, T. From porn to cybersecurity passing by copyright: how mass surveillance techniques are gaining legitimacy (forthcoming).
50. Marx, G. T. (1985). The surveillance society: the threat of 1984-style techniques. The Futurist, 19, 21–26.
51. Lyon, D. (2003). Surveillance as social sorting: privacy, risk and digital discrimination. New York: Routledge.
52. Hildebrandt, M., & Gutwirth, S. (2008). Profiling the European citizen. New York: Springer.
53. Koutsias, M. (2012). Privacy and data protection in an information society: how reconciled are the English with the European Union privacy norms? Computer and Telecommunications Law Review, 18, 261.
54. Schermer, B. W. (2011). The limits of privacy in automated profiling and data mining. Computer Law and Security Review, 27, 45.
55. Fulda, J. (2000/2001). Data mining and privacy. Albany Law Journal of Science & Technology, 11, 105.
56. Tien, L. (2004). Privacy, technology and data mining. Ohio Northern University Law Review, 30, 389.
57. Kawakami, S., & McCarty, S. (2004–2005). Privacy year in review: privacy impact assessments, airline passenger pre-screening, and government data mining. I/S: A Journal of Law and Policy, 1, 219.
58. Slobogin, C. (2008). Government data mining and the fourth amendment. The University of Chicago Law Review, 75, 317.
59. Rubinstein, I. S., Lee, R. D., & Schwartz, P. M. (2008). Data mining and internet profiling: emerging regulatory and technological approaches. The University of Chicago Law Review, 75, 261.
60. Solove, D. (2008). Data mining and the security-liberty debate. The University of Chicago Law Review, 75, 343.
61. Frawley, W. J., Piatetsky-Shapiro, G., & Matheus, C. J. (1992). Knowledge discovery in databases: an overview. AI Magazine, Fall, p. 58.
62. Winton, A., & Cohen, N. (2012). The general data protection regulation as it applies to online advertising, e-commerce and social media. Computer and Telecommunications Law Review, 18, 97, at p. 98.
63. Slobogin, C. (2008). Government data mining and the fourth amendment. The University of Chicago Law Review, 75, 317.
Metadata
Title: Privacy Versus Security… Are We Done Yet?
Author: Sophie Stalla-Bourdillon
Copyright Year: 2014
Publisher: Springer London
DOI: https://doi.org/10.1007/978-1-4471-6530-9_1