
Addressing privacy requirements in system design: the PriS method

  • Original Article
  • Published in: Requirements Engineering

Abstract

A major challenge in software engineering is making users trust the software they use in their everyday activities, whether professional or recreational. Trust depends on several factors, one of which is the protection of user privacy. Protecting privacy means complying with users' wishes about how their personal information is handled; users' privacy can also be defined as their right to determine when, how, and to what extent information about them is communicated to others. Current research stresses the need to address privacy issues during system design rather than during implementation. To this end, this paper describes PriS, a security requirements engineering method that incorporates privacy requirements early in the system development process. PriS treats privacy requirements as organisational goals that need to be satisfied and adopts privacy-process patterns as a way to: (1) describe the effect of privacy requirements on business processes; and (2) facilitate the identification of the system architecture that best supports the privacy-related business processes. In this way, PriS provides a holistic approach from 'high-level' goals to 'privacy-compliant' IT systems. The PriS way-of-working is formally defined, thus enabling the development of automated tools to assist its application.


Figs. 1–5


Notes

  1. Since pseudonymity can be considered part of anonymity, both are addressed in one pattern.

References

  1. Lunheim R, Sindre GS (1994) Privacy and computing: a cultural perspective. Security and control of information technology. In: Sizer R (ed) A Society (A-43)/x. Elsevier, North Holland, pp 25–40


  2. Fischer-Hübner S (2001) IT-security and privacy, design and use of privacy enhancing security mechanisms. Lect Notes Comp Sci, vol. 1958. Springer, Berlin

  3. Cannon JC (2004) Privacy, what developers and IT professionals should know. Addison-Wesley, Reading

  4. Koorn R, van Gils H, ter Hart J, Overbeek P, Tellegen R (2004) Privacy enhancing technologies, white paper for decision makers. Ministry of the Interior and Kingdom Relations, The Netherlands, December 2004

  5. University of the Aegean, E-Vote: an Internet-based electronic voting system. University of the Aegean, Project Deliverable D 7.6, IST Programme 2000#29518, 21 October 2003, Samos

  6. Kavakli E, Gritzalis S, Kalloniatis C (2007) Protecting privacy in system design: the electronic voting case. Transf Gov People Process Policy 1(4):307–332. doi:10.1108/17506160710839150


  7. Kavakli E, Kalloniatis C, Loucopoulos P, Gritzalis S (2006) Incorporating privacy requirements into the system design process: the PriS conceptual framework. Internet Research, special issue on privacy and anonymity in the digital era: theory, technologies and practice 16(2):140–158

  8. Kalloniatis C, Kavakli E, Gritzalis S (2005) Dealing with privacy issues during the system design process. 5th IEEE International Symposium on Signal Processing and Information Technology, 18–21 December 2005, Athens, Greece

  9. Loucopoulos P, Kavakli V (1999) Enterprise knowledge management and conceptual modelling. LNCS, vol. 1565. Springer, Berlin, pp 123–143

  10. Loucopoulos P (2000) From information modelling to enterprise modelling. In: Information systems engineering: state of the art and research themes. Springer, Berlin, pp 67–78

  11. Kalloniatis C, Kavakli E, Gritzalis S (2007) Using privacy process patterns for incorporating privacy requirements into the system design process, Workshop on Secure Software Engineering (SecSe 2007) in conjunction with the International Conference on Availability, Reliability and Security (ARES 2007), April 2007, Vienna, Austria

  12. Kavakli V (2002) Goal oriented requirements engineering: a unifying framework. Requir Eng 6(4):237–251

  13. META Group Report v1.1 (2005) Privacy Enhancing Technology. March 2005

  14. Code of Fair Information Practices (The) (1973), US Department of Health, Education and Welfare

  15. Chung L (1993) Dealing with security requirements during the development of information systems. CAiSE '93, the 5th International Conference on Advanced Information Systems Engineering, Paris, France, pp 234–251

  16. Mylopoulos J, Chung L, Nixon B (1992) Representing and using non-functional requirements: a process oriented approach. IEEE Trans Softw Eng 18:483–497. doi:10.1109/32.142871


  17. Liu L, Yu E, Mylopoulos J (2003) Security and privacy requirements analysis within a social setting, 11th IEEE International Requirements Engineering Conference (RE’03), Monterey Bay, California, USA, pp 151–161

  18. Mouratidis H, Giorgini P, Manson G (2003) An ontology for modelling security: the Tropos project, Proceedings of the KES 2003 Invited Session Ontology and Multi-Agent Systems Design (OMASD’03), UK, University of Oxford, Palade V, Howlett RJ, Jain L (eds) Lecture Notes in Artificial Intelligence 2773, Springer 2003, pp 1387–1394

  19. Mouratidis H, Giorgini P, Manson G (2003) Integrating Security and Systems Engineering: towards the modelling of secure information systems, CAiSE ‘03, LNCS 2681. Springer, Berlin, pp 63–78


  20. van Lamsweerde A, Letier E (2000) Handling obstacles in goal-oriented requirements engineering. IEEE Trans Softw Eng 26:978–1005. doi:10.1109/32.879820


  21. Liu L, Yu E, Mylopoulos J (2002) Analyzing security requirements as relationships among strategic actors, (SREIS’02), e-proceedings available at http://www.sreis.org/old/2002/finalpaper9.pdf, Raleigh, North Carolina

  22. He Q, Antón IA (2003) A framework for modelling privacy requirements in role engineering. International Workshop on Requirements Engineering for Software Quality (REFSQ), 16–17 June 2003, Klagenfurt/Velden, Austria, pp 115–124

  23. Moffett DJ, Nuseibeh AB (2003) A framework for security requirements engineering. Report YCS 368, Department of Computer Science, University of York

  24. Antón IA (1996) Goal-based requirements analysis. ICRE '96, IEEE, Colorado Springs, Colorado, USA, pp 136–144

  25. Antón IA, Earp BJ (2000) Strategies for developing policies and requirements for secure electronic commerce systems. 1st ACM Workshop on Security and Privacy in E-Commerce (CCS 2000), 1–4 November 2000, unnumbered pages

  26. Bellotti V, Sellen A (1993) Design for privacy in ubiquitous computing environments. In: Michelis G, Simone C, Schmidt K (eds) Proceedings of the Third European Conference on Computer Supported Cooperative Work—ECSCW 93, pp 93–108

  27. Hong JI, Ng J, Lederer S, Landay JA (2004) Privacy risk models for designing privacy-sensitive ubiquitous computing systems, Designing Interactive Systems, Boston MA

  28. Jensen C, Tullio J, Potts C, Mynatt DE (2005) STRAP: a structured analysis framework for privacy, GVU Technical Report

  29. Anonymizer, available at www.anonymizer.com

  30. Reiter KM, Rubin DA (1998) Crowds: anonymity for web transactions. ACM Trans Inf Syst Secur 1(1):66–92. doi:10.1145/290163.290168


  31. Reiter KM, Rubin DA (1999) Anonymous web transactions with crowds. Commun ACM 42(2):32–38. doi:10.1145/293411.293778


  32. Reed M, Syverson P, Goldschlag D (1998) Anonymous connections and Onion Routing. IEEE J Sel Areas Comm 16(4):482–494. doi:10.1109/49.668972


  33. Goldschlag D, Syverson P, Reed M (1999) Onion Routing for anonymous and private Internet connections. Commun ACM 42(2):39–41. doi:10.1145/293411.293443


  34. Chaum D (1985) Security without identification: transactions systems to make Big Brother Obsolete. Commun ACM 28(10):1030–1044. doi:10.1145/4372.4373


  35. Chaum D (1988) The dining cryptographers problem: unconditional sender and recipient untraceability. J Cryptol 1(1):65–75. doi:10.1007/BF00206326


  36. Chaum D (1981) Untraceable electronic mail, return addresses, and digital pseudonyms. Commun ACM 24(2):84–88. doi:10.1145/358549.358563


  37. Pfitzmann A, Waidner M (1987) Networks without user observability. Comput Secur 6(2):158–166


  38. Shields C, Levine NB (2000) A protocol for anonymous communication over the Internet. In: Samarati P, Jajodia S (eds) Proceedings of the 7th ACM Conference on Computer and Communications Security. ACM Press, New York, 33–42

  39. Bennett K, Grothoff C (2003) GAP: practical anonymous networking. In: Proceedings of the Workshop on Privacy Enhancing Technologies (PET 2003). Available at http://www.citeseer.nj.nec.com/bennett02gap.html

  40. Dingledine R, Mathewson N, Syverson PT (2004) Tor: the second-generation onion router. In: Proceedings of the 13th USENIX Security Symposium, San Diego, CA, USA

  41. Amoroso EG (1994) Fundamentals of computer security technology. Prentice Hall PTR, ISBN 0-13-108929-3

  42. Schneier B (1999) Attack trees. Dr Dobb's J Softw Tools 24(12):21–29

  43. McDermott J, Fox C (1999) Using abuse case models for security requirements analysis. 15th Annual Computer Security Applications Conference (ACSAC '99), p 55

  44. Sindre G, Opdahl AL (2005) Eliciting security requirements with misuse cases. Requir Eng 10(1):34–44. doi:10.1007/s00766-004-0194-4


  45. Sindre G, Opdahl AL (2002) Templates for misuse case description. In: Proceedings of the Seventh International Workshop on Requirements Engineering: Foundations for Software Quality (REFSQ'2001), Camille BA et al (eds) Essener Informatik Beiträge, University of Essen, Germany, pp 125–136

  46. Alexander I (2003) Use/misuse case analysis elicits non-functional requirements. Comput Contr Eng J 14(1):40–45. doi:10.1049/cce:20030108


  47. Firesmith D (2003) Security use cases. J Object Technol 2(1):53–64


  48. Lin L, Nuseibeh B, Ince D, Jackson M, Moffett JD (2003) Introducing abuse frames for analysing security requirements. Requirements Engineering 2003, 11th IEEE International Conference on Requirements Engineering (RE 2003), 8–12 September 2003, Monterey Bay, CA, USA. IEEE Computer Society 2003, pp 371–372

  49. European Parliament and the Council (1995) Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data, October 1995


Author information


Corresponding author

Correspondence to Christos Kalloniatis.

Appendix 1


Figure 6 presents the process pattern for addressing the authentication requirement; it describes the activities needed to realise that requirement. Every time a user submits a request, the system checks it and, if authentication is needed, the user must provide the proper authentication data, on the basis of which access is granted or denied.

Fig. 6 Authentication pattern
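As an illustration, the control flow of the authentication pattern can be sketched in a few lines of Python. This is a minimal sketch under assumed names: the credential store, resource names, and `handle_request` are hypothetical and not part of PriS or its tooling.

```python
# Minimal sketch of the authentication pattern's control flow (Fig. 6).
# All names (CREDENTIALS, PUBLIC, handle_request) are illustrative.

CREDENTIALS = {"alice": "secret"}    # hypothetical credential store
PUBLIC = {"home", "help"}            # requests that need no authentication

def handle_request(resource, user=None, password=None):
    """Check the request; demand authentication data only when needed."""
    if resource in PUBLIC:           # no authentication required
        return "granted"
    # authentication required: verify the supplied authentication data
    if user is not None and CREDENTIALS.get(user) == password:
        return "granted"
    return "denied"
```

A request for a public page is granted without credentials, while a protected resource is granted only when the supplied authentication data match.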

The authorisation process pattern is presented in Fig. 7. According to the authorisation requirement, a user's private data should only be accessed by authorised users. Therefore, when a user submits a request, the nature of the request is checked first, since it is not legitimate, for example, to ask a user to log in for a service that requires no identification. If the user requests specific services, or access to data, that need authorisation, then he/she should pass the authentication process and then, according to his/her rights, be granted or refused the privileges for accessing the specific service or data.

Fig. 7 Authorisation pattern
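The decision sequence just described can likewise be sketched. All names here (`PUBLIC`, `RIGHTS`, `authorise`) are assumptions introduced for illustration, not part of the PriS method.

```python
# Sketch of the authorisation pattern (Fig. 7). The nature of the request is
# checked first, so no login is demanded for services needing no identification.

PUBLIC = {"home"}                                       # services open to everyone
RIGHTS = {"alice": {"read"}, "bob": {"read", "write"}}  # hypothetical rights table

def authorise(user, action, resource, authenticated=False):
    if resource in PUBLIC:
        return "granted"        # no authorisation needed for this request
    if not authenticated:
        return "denied"         # must pass the authentication process first
    if action in RIGHTS.get(user, set()):
        return "granted"        # privileges granted according to the user's rights
    return "denied"
```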

The pattern corresponding to the identification requirement is presented in Fig. 8. The role of identification is twofold: first, to protect both the user that accesses a resource or service and the user's data that are stored in the system; and second, to allow only authorised people to access them.

Fig. 8 Identification pattern

As shown in Fig. 8, when a user submits a request, the identification process checks whether an identity is required. If not, the system returns the requested information without asking for any kind of digital identity. If the request involves access to private information or personalised services, the authorisation process is triggered. It should be noted that user anonymity is not thereby ensured, since this is not an anonymity service but merely a transaction in which no identity is provided. If anonymity is also required, the relevant process pattern described below should be applied as well.
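This branching can be sketched minimally as follows. The resource names and the `identification` function are hypothetical; the authorisation step is passed in as a callable so the two patterns stay decoupled.

```python
# Sketch of the identification pattern (Fig. 8); names are illustrative.

PRIVATE = {"medical-record", "profile"}   # requests that require an identity

def identification(resource, authorisation):
    if resource not in PRIVATE:
        # no identity asked for; note this alone does not ensure anonymity
        return "served-without-identity"
    return authorisation(resource)        # trigger the authorisation process
```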

Figure 9 presents the data protection process pattern. The aim here is to ensure that every transaction involving personal data is realised according to the system's privacy regulations and Directive 95/46/EC [49] on the processing of personal data and the free movement of such data.

Fig. 9 Data protection pattern

When a user tries to access private data, an identification process is triggered to identify the user and to grant him/her the rights of reading, processing, storing, or deleting private data. Subsequently, if the user asks to perform any of these tasks, the system checks whether doing so complies with the privacy regulations, and the request is granted or denied accordingly. Thus, there are two intermediate "inspections" before a user can actually perform any task on other users' private data.
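The two "inspections" can be expressed as successive guards. This is only a sketch: the regulation table is invented for illustration and does not reproduce any actual regulation or Directive 95/46/EC.

```python
# Sketch of the data protection pattern (Fig. 9): two inspections precede any
# task on private data. Roles, tasks, and the table are illustrative.

REGULATIONS = {("clerk", "read"), ("officer", "process"), ("officer", "delete")}

def access_private_data(user_role, task, identified):
    if not identified:                        # inspection 1: identification
        return "denied"
    if (user_role, task) not in REGULATIONS:  # inspection 2: compliance check
        return "denied"
    return "granted"
```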

The next pattern (Fig. 10) addresses the anonymity and pseudonymity requirements. The two are addressed in one pattern since pseudonymity can be considered part of anonymity.

Fig. 10 Anonymity and pseudonymity pattern

As shown in Fig. 10, the user's request is first checked in order to decide whether an identity is needed. If the user's identity must be known, the identification process is triggered. If not, the user not only receives his/her information without providing any personal data, but specific techniques for protecting his/her anonymity are also applied. Thus, identification may form a subpart of anonymity, depending on whether specific identity data are requested for processing. Anonymity itself, on the other hand, is a privacy requirement that needs protection, and specific technologies should be used to anonymise the user both while he/she accesses the system and throughout the communication. Pseudonymity is used when anonymity cannot be provided, but again with the purpose of protecting the user's anonymity.
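The branching just described might be sketched as follows, with illustrative names throughout (none of these identifiers come from PriS).

```python
# Sketch of the anonymity/pseudonymity pattern (Fig. 10).

def serve(request, needs_identity, anonymity_supported=True):
    """Decide how a request is served under the anonymity requirement."""
    if needs_identity(request):
        return "identification"     # identity needed: trigger identification
    if anonymity_supported:
        return "anonymous-channel"  # apply an anonymisation technique
    return "pseudonym"              # anonymity unavailable: fall back to a pseudonym
```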

Finally, the patterns for the unlinkability and unobservability requirements (Figs. 11, 12, respectively) are presented below. The two patterns have a similar structure: the user submits a request and, if the system's requirements call for one or both of these requirements to be realised, appropriate unlinkability or unobservability techniques are used when connecting the user to the system.

Fig. 11 Unlinkability pattern

Fig. 12 Unobservability pattern
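The shared structure of the two patterns can be sketched as a simple requirement-to-technique mapping. The technique names are illustrative examples only (e.g. onion routing [32] for unlinkability, dummy traffic in the spirit of [37] for unobservability), not choices prescribed by PriS.

```python
# Shared structure of the unlinkability and unobservability patterns
# (Figs. 11, 12); the technique names are illustrative examples.

TECHNIQUES = {
    "unlinkability": "onion-routing",    # e.g. Onion Routing [32]
    "unobservability": "dummy-traffic",  # e.g. dummy messages hiding activity [37]
}

def connect(required):
    """Select a protection technique for each privacy requirement in force."""
    return [TECHNIQUES[r] for r in required]
```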


About this article

Cite this article

Kalloniatis, C., Kavakli, E. & Gritzalis, S. Addressing privacy requirements in system design: the PriS method. Requirements Eng 13, 241–255 (2008). https://doi.org/10.1007/s00766-008-0067-3

