
This Springer brief provides the necessary foundations to understand differential privacy and describes practical algorithms enforcing this concept for the publication of real-time statistics based on sensitive data. Several scenarios of interest are considered, depending on the kind of estimator to be implemented and the potential availability of prior public information about the data, which can be used to greatly improve the estimators' performance. The brief encourages the proper use of large datasets of private data obtained from individuals in the world of the Internet of Things and participatory sensing. For the benefit of the reader, several examples are discussed to illustrate the concepts and evaluate the performance of the algorithms described. These examples relate to traffic estimation, sensing in smart buildings, and syndromic surveillance to detect epidemic outbreaks.

### Chapter 1. Defining Privacy-Preserving Data Analysis

Abstract
With the growing focus on instrumenting our environment and monitoring our activities, there is a need to implement privacy-preserving algorithms in our technological systems. Defining privacy formally is a delicate task, but it is a necessary first step toward providing clear guarantees to the individuals being monitored. In this chapter, after discussing the pitfalls of naive approaches to data privacy, we review the notion of differential privacy, a state-of-the-art definition of privacy that we adopt in the rest of this monograph, and which provides guarantees against adversaries with arbitrary side information. Privacy-preserving data analysis has a relatively long history in fields such as econometrics and statistics, and in the processing of sensitive static data stored, for example, in medical databases. Current trends emphasize the need to work with streams of data originating from many sources and requiring sanitization in real time, which brings new challenges to the field.
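
For reference, the core guarantee can be stated as follows (a generic sketch; the precise formulation used in the monograph, including the adjacency relations on signal spaces, is developed in the chapter). A randomized mechanism $$M$$ is $$\varepsilon$$-differentially private if, for all adjacent datasets $$d, d'$$ and all measurable sets $$S$$ of outputs,

$$\mathbb{P}(M(d) \in S) \le e^{\varepsilon}\, \mathbb{P}(M(d') \in S).$$

The adjacency relation captures which variations (typically, a change in one individual's data) must be protected, and the guarantee holds regardless of the adversary's side information.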
Jerome Le Ny

### Chapter 2. Basic Differentially Private Mechanisms

Abstract
This chapter presents simple mechanisms for signal filtering under differential privacy constraints, which add white noise directly on the sensitive input signals or at the output of a desired filter. We introduce concrete examples of adjacency relations for individual and collective privacy-sensitive input signals. We then describe the Laplace and Gaussian mechanisms to enforce $$\varepsilon$$- or $$(\varepsilon , \delta )$$-differential privacy with respect to these adjacency relations, by adding Laplace and Gaussian noise respectively. For these mechanisms, adding noise at the output of the desired filter requires computing the sensitivity of this filter with respect to the signal variations allowed by the chosen adjacency relation.
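
As an illustration of the Laplace mechanism described above, the following sketch adds Laplace noise calibrated to a query's sensitivity. The occupancy signal, the adjacency assumption (one individual changes a single sample by at most one), and the parameter values are hypothetical, not taken from the book.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with epsilon-differential privacy by adding
    zero-mean Laplace noise of scale sensitivity / epsilon."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.laplace(0.0, sensitivity / epsilon, size=np.shape(value))
    return value + noise

# Hypothetical per-period room-occupancy counts (smart-building sensing).
occupancy = np.array([12.0, 15.0, 14.0, 18.0, 20.0])

# Assumed adjacency: one occupant's presence changes a single count by
# at most 1, so the per-sample sensitivity of this input mechanism is 1.
noisy = laplace_mechanism(occupancy, sensitivity=1.0, epsilon=0.5,
                          rng=np.random.default_rng(0))
```

Smaller values of epsilon give stronger privacy at the cost of larger noise scale, here 1/0.5 = 2 per sample.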
Jerome Le Ny

### Chapter 3. A Two-Stage Architecture for Differentially Private Filtering

Abstract
This chapter presents an architecture generalizing the input and output mechanisms to process dynamic data streams while enforcing differential privacy. A privacy-sensitive signal that we want to process in order to publish real-time statistics is first shaped by a pre-filter, then perturbed to obtain a differentially private signal, and finally post-filtered to mitigate the effect of the noise and the pre-filter. A general methodology is provided for the design of such two-stage architectures, and an example demonstrates the significant performance improvements achievable over the input and output mechanisms.
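
A toy instance of this pipeline can be sketched as follows: a one-pole pre-filter, Laplace perturbation calibrated to the pre-filter's $$\ell _1$$ sensitivity, and a post-filter that simply inverts the pre-filter. The filter choices and parameters here are illustrative placeholders; designing the two stages well is precisely the subject of the chapter's methodology.

```python
import numpy as np

def ema(u, a):
    """One-pole pre-filter: v[k] = a*v[k-1] + (1-a)*u[k]."""
    v, prev = np.empty(len(u)), 0.0
    for k, uk in enumerate(u):
        prev = a * prev + (1 - a) * uk
        v[k] = prev
    return v

def inverse_ema(v, a):
    """Post-filter inverting the pre-filter: u[k] = (v[k] - a*v[k-1])/(1-a)."""
    vp = np.concatenate(([0.0], v[:-1]))
    return (v - a * vp) / (1 - a)

def two_stage(u, epsilon, a=0.6, per_sample_bound=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    v = ema(u, a)                     # stage 1: shape the sensitive signal
    # The pre-filter's impulse response (1-a)*a**k has l1 norm 1, so a
    # one-sample change of size per_sample_bound in u changes the whole
    # trajectory v by at most per_sample_bound in l1 norm.
    sens = per_sample_bound
    noisy = v + rng.laplace(0.0, sens / epsilon, size=len(v))  # perturb
    return inverse_ema(noisy, a)      # stage 2: post-filter
```

With no noise the post-filter reconstructs the input exactly; the chapter's design problem is to trade off this inversion against noise amplification.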
Jerome Le Ny

### Chapter 4. Differentially Private Filtering for Stationary Stochastic Collective Signals

Abstract
This chapter builds on the two-stage architecture for differentially private filtering and presents mechanisms with better performance than the zero-forcing equalization mechanism, for the situation where we have some knowledge about the statistics of the privacy-sensitive input signals, which moreover are assumed to be stationary. The mechanisms described use a Wiener filter as the second stage of the architecture, and the performance of the overall mechanism is then optimized, following the general methodology outlined in Chap. 3.
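
A minimal stationary example, not taken from the book: a white, zero-mean signal with known variance is released through the Laplace input mechanism, and the published signal is post-filtered with the linear MMSE (Wiener) estimator, which for a white signal reduces to a constant gain. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000
sig_var = 1.0                                # assumed known signal variance
x = rng.normal(0.0, np.sqrt(sig_var), n)     # white stationary signal

# Laplace input mechanism: assumed per-sample sensitivity 1, epsilon = 1,
# so the noise scale is b = 1 and its variance is 2*b**2 = 2.
b = 1.0
y = x + rng.laplace(0.0, b, n)
noise_var = 2.0 * b**2

# For a white signal the Wiener post-filter is memoryless: a constant
# gain sig_var / (sig_var + noise_var) applied to each noisy sample.
g = sig_var / (sig_var + noise_var)
x_hat = g * y

mse_raw = np.mean((y - x) ** 2)        # error of the raw private signal
mse_wiener = np.mean((x_hat - x) ** 2) # error after Wiener post-filtering
```

For correlated stationary signals the Wiener filter has memory, and the chapter shows how to co-design it with the first stage.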
Jerome Le Ny

### Chapter 5. Differentially Private Kalman Filtering

Abstract
This chapter is concerned with the design of model-based differentially private filters, when the privacy-sensitive signal to be processed can be modeled as the output of a linear finite-dimensional system with publicly known parameters. Such models can capture, for example, known physical laws that govern the behavior of the input signal, e.g., a kinematic model linking position and velocity measurements obtained from individual users. In the absence of privacy constraints, Kalman filtering provides a solution to the problem of estimating the state of the system while minimizing the mean square error. Here we adapt this filter to accommodate differential privacy constraints, for various scenarios involving either individual or collective signals.
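
The simplest such scheme, sketched below under illustrative assumptions, privatizes the measurements first and then filters: a constant-velocity kinematic model, position measurements perturbed by the Gaussian mechanism (with an assumed sensitivity of 1 for the adjacency relation), and a Kalman filter whose measurement-noise covariance is inflated by the privacy noise variance. None of the numerical values come from the book.

```python
import numpy as np

def kalman_filter(zs, A, C, Q, R, x0, P0):
    """Standard Kalman filter; R already includes the privacy noise."""
    x, P, est = x0.copy(), P0.copy(), []
    I = np.eye(len(x0))
    for z in zs:
        x = A @ x                          # predict
        P = A @ P @ A.T + Q
        S = C @ P @ C.T + R                # update
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (z - C @ x)
        P = (I - K @ C) @ P
        est.append(x.copy())
    return np.array(est)

rng = np.random.default_rng(3)
A = np.array([[1.0, 1.0], [0.0, 1.0]])     # position-velocity model
C = np.array([[1.0, 0.0]])                 # only position is measured
Q = 0.01 * np.eye(2)                       # process noise covariance
r_meas = 0.1                               # sensor noise variance

# Gaussian mechanism: sigma = sens * sqrt(2 ln(1.25/delta)) / eps
eps, delta, sens = 1.0, 1e-3, 1.0          # assumed privacy parameters
sigma_dp = sens * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

T, x_true = 400, np.array([0.0, 1.0])
pos, z_priv = [], []
for _ in range(T):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    y = C @ x_true + rng.normal(0.0, np.sqrt(r_meas), 1)
    z_priv.append(y + rng.normal(0.0, sigma_dp, 1))    # privatized output
    pos.append(x_true[0])

R_total = np.array([[r_meas + sigma_dp**2]])
est = kalman_filter(z_priv, A, C, Q, R_total, np.zeros(2), 10.0 * np.eye(2))
```

The filter exploits the kinematic model to average out the comparatively large privacy noise; the chapter studies how to do this optimally and in more general scenarios.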
Jerome Le Ny

### Chapter 6. Differentially Private Nonlinear Observers

Abstract
This chapter introduces tools for the design of differentially private nonlinear dynamic observers. Indeed, the dynamic models useful to estimate and predict the characteristics of a population, originating for example from epidemiology or the social sciences, are often nonlinear. The main issue discussed is the computation of the sensitivity of a class of nonlinear observers, which is necessary to design differentially private output perturbation or two-stage mechanisms, when the first stage is nonlinear. We use here contraction analysis in order to design convergent observers with appropriately controlled sensitivity. Two examples are also discussed for illustration purposes: estimating the edge formation probabilities in a social network using a dynamic stochastic block model, and syndromic surveillance relying on an epidemiological model.
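
The role of contraction in bounding sensitivity can be illustrated with a toy scalar example, not from the book: an observer for a discrete-time SIS-type epidemic model whose update map contracts with rate $$\rho < 1$$, so a measurement perturbation of size $$\delta$$ at a single step changes the observer trajectory by at most $$L\delta /(1-\rho )$$ in $$\ell _1$$ norm. The model, gain, and parameters are assumed for illustration.

```python
import numpy as np

beta, gamma, L = 0.3, 0.1, 0.9   # infection rate, recovery rate, observer gain

def step(xh, y):
    """Observer update for the model x+ = x + beta*x*(1-x) - gamma*x,
    with measurement correction L*(y - xh)."""
    return xh + beta * xh * (1 - xh) - gamma * xh + L * (y - xh)

# The Jacobian wrt xh is 1 + beta*(1-2*xh) - gamma - L, which for
# xh in [0,1] lies in [-0.3, 0.3]: the observer contracts with rate 0.3.
rho = max(abs(1 + beta - gamma - L), abs(1 - beta - gamma - L))

# Simulate the epidemic and two adjacent measurement streams differing
# by delta at one time step k0.
T, delta, k0 = 200, 0.05, 50
x = 0.2
y = np.empty(T)
for k in range(T):
    y[k] = x
    x = x + beta * x * (1 - x) - gamma * x
y2 = y.copy()
y2[k0] += delta

xh1 = xh2 = 0.5
dev = 0.0
for k in range(T):
    xh1, xh2 = step(xh1, y[k]), step(xh2, y2[k])
    dev += abs(xh1 - xh2)

bound = L * delta / (1 - rho)    # l1-sensitivity bound from contraction
```

The bound on `dev` is what calibrates the noise in an output-perturbation mechanism; the chapter develops such sensitivity computations rigorously for a class of nonlinear observers.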
Jerome Le Ny

### Chapter 7. Conclusion

Abstract
This monograph has introduced a number of techniques for privacy-preserving signal processing, enforcing a state-of-the-art notion of privacy, differential privacy. Here, we summarize some of the main ideas and briefly suggest some directions for future work.
Jerome Le Ny