
2018 | Book

Data Privacy Management, Cryptocurrencies and Blockchain Technology

ESORICS 2018 International Workshops, DPM 2018 and CBT 2018, Barcelona, Spain, September 6-7, 2018, Proceedings

Edited by: Joaquin Garcia-Alfaro, Jordi Herrera-Joancomartí, Dr. Giovanni Livraga, Ruben Rios

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


Table of Contents

Frontmatter

CBT Workshop: Smart Contracts

Frontmatter
Succinctly Verifiable Sealed-Bid Auction Smart Contract

The recently growing tokenization of digital and physical assets on the Ethereum blockchain requires a convenient trade and exchange mechanism. Sealed-bid auctions are powerful trading tools due to the advantages they offer compared to their open-cry counterparts. However, the inherent transparency and lack of privacy on the Ethereum blockchain conflict with the main objective behind sealed-bid auctions. In this paper, we tackle this challenge and present a smart contract protocol for a succinctly verifiable sealed-bid auction on the Ethereum blockchain. In particular, our approach utilizes several cryptographic primitives: zero-knowledge Succinct Non-interactive Arguments of Knowledge (zk-SNARKs), Multi-Party Computation (MPC), a Public-Key Encryption (PKE) scheme, and a commitment scheme. First, the proving and verification keys for the zk-SNARK are generated via an MPC protocol between the auctioneer and the bidders. Then, when the auction process starts, the bidders submit commitments to their bids to the smart contract. Subsequently, each bidder individually reveals the opening of her commitment to the auctioneer using the PKE scheme. Then, according to the auction rules, the auctioneer claims a winner and generates a proof off-chain based on the proving key, the commitments, which serve as public inputs, and their underlying openings, which constitute the auctioneer's witness. Finally, the auctioneer submits the proof to the smart contract, which in turn verifies its validity based on the public inputs and the verification key. The proposed protocol scales efficiently as it has a constant-size proof and constant verification cost regardless of the number of bidders. Furthermore, we provide an analysis of the smart contract design, in addition to the estimated gas costs associated with the different transactions.

Hisham S. Galal, Amr M. Youssef
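To make the commit-then-reveal flow above concrete, the following toy Python sketch mimics the bookkeeping of the bidding and reveal phases; the zk-SNARK proof, the MPC setup and the PKE step are deliberately left out, and all class and function names are illustrative rather than taken from the paper.

```python
# Toy sketch of commit-then-reveal bookkeeping for a sealed-bid auction.
# zk-SNARK, MPC and PKE are out of scope; all names are illustrative.
import hashlib
import secrets

def commit(bid: int, nonce: bytes) -> str:
    """Hash commitment to a bid; hiding relies on the random nonce."""
    return hashlib.sha256(nonce + bid.to_bytes(8, "big")).hexdigest()

class SealedBidAuction:
    def __init__(self):
        self.commitments = {}   # bidder -> commitment (bidding phase)
        self.openings = {}      # bidder -> (bid, nonce) (reveal phase)

    def submit_commitment(self, bidder: str, c: str):
        self.commitments[bidder] = c

    def reveal(self, bidder: str, bid: int, nonce: bytes):
        # In the paper this opening is sent encrypted to the auctioneer;
        # here we simply check it against the recorded commitment.
        assert commit(bid, nonce) == self.commitments[bidder], "bad opening"
        self.openings[bidder] = (bid, nonce)

    def winner(self):
        # The auctioneer would prove this claim succinctly off-chain;
        # the sketch just computes the highest revealed bid.
        return max(self.openings, key=lambda b: self.openings[b][0])

# Usage
auction = SealedBidAuction()
alice_nonce, bob_nonce = secrets.token_bytes(32), secrets.token_bytes(32)
auction.submit_commitment("alice", commit(42, alice_nonce))
auction.submit_commitment("bob", commit(57, bob_nonce))
auction.reveal("alice", 42, alice_nonce)
auction.reveal("bob", 57, bob_nonce)
print(auction.winner())  # -> "bob"
```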
Blockchain-Based Fair Certified Notifications

Many traditional applications can be redefined thanks to the benefits of Blockchain technologies. One such service is the provision of fair certified notifications. Certified notification is an application that requires a fair exchange of values: a message and a non-repudiation of origin proof in exchange for a non-repudiation of reception evidence. To the best of our knowledge, this paper presents the first blockchain-based certified notification system. We propose two solutions that allow sending certified notifications when confidentiality is required or when it is necessary to register the content of the notification, respectively. First, we present a protocol for Non-Confidential Fair Certified Notifications that satisfies the properties of strong fairness and transferability of the proofs thanks to the use of a smart contract and without the need for a Trusted Third Party. Then, we present a DApp for Confidential Certified Notifications with a smart contract that allows an optimistic exchange of values with timeliness, using a stateless Trusted Third Party.

Macià Mut-Puigserver, M. Magdalena Payeras-Capellà, Miquel A. Cabot-Nadal
On Symbolic Verification of Bitcoin’s script Language

Validation of Bitcoin transactions relies upon the successful execution of scripts written in a simple and effective language that is non-Turing-complete by design, simply called script. This makes the validation of closed scripts, i.e. those associated with actual transactions and bearing full information, straightforward. Here we address the problem of validating open scripts, i.e. the validation of redeeming scripts against the whole set of possible inputs: under which general conditions can Bitcoins be redeemed? Although script is likely not among the most complex languages, and its validation not among the most demanding verification problems, we advocate the merit of formal verification for the Bitcoin validation framework. We propose a symbolic verification theory for open scripts, a verifier tool-kit, and illustrate examples of use on Bitcoin transactions. Contributions include (1) a formalisation of (a fragment of) the script language; (2) a novel symbolic approach to script verification, suitable, e.g., for the verification of newly defined and non-standard payment schemes; and (3) building blocks for a larger verification theory for the developing area of Bitcoin smart contracts. The verification of smart contracts, i.e. agreements built as transaction-based protocols, is currently a problem that is difficult to formalise and computationally demanding.

Rick Klomp, Andrea Bracciali
Self-reproducing Coins as Universal Turing Machine

Turing-completeness of smart contract languages in blockchain systems is often associated with a variety of language features (such as loops). We show, on the contrary, that Turing-completeness of a blockchain system can be achieved by unwinding recursive calls across multiple transactions and blocks instead of performing them within a single one. We prove this by constructing a simple universal Turing machine using a small set of language features in the unspent transaction output (UTXO) model, with explicitly given relations between input and output transaction states. Neither unbounded loops nor possibly infinite validation time are needed in this approach.

Alexander Chepurnoy, Vasily Kharin, Dmitry Meshkov
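The following hedged sketch illustrates the core idea in ordinary Python (it is not the paper's construction): each "coin" carries a machine configuration, and its spending condition only checks a single Turing-machine step, so an unbounded computation unrolls across a chain of transactions while every individual validation stays bounded.

```python
# Toy illustration: one TM step is validated per transaction, so a long
# run unrolls across many transactions with no loop in the validation rule.

# A tiny TM: transition[(state, symbol)] = (new_state, new_symbol, move)
TRANSITIONS = {
    ("A", 0): ("B", 1, +1),
    ("A", 1): ("A", 1, +1),
    ("B", 0): ("HALT", 0, 0),
    ("B", 1): ("A", 0, -1),
}

def tm_step(cfg):
    """One deterministic step on a configuration (state, head, tape-dict)."""
    state, head, tape = cfg
    if state == "HALT":
        return cfg
    new_state, sym, move = TRANSITIONS[(state, tape.get(head, 0))]
    new_tape = dict(tape)
    new_tape[head] = sym
    return (new_state, head + move, new_tape)

def output_script(prev_cfg):
    """'Spending condition' of a coin carrying prev_cfg: the spending
    transaction must carry exactly the successor configuration."""
    return lambda next_cfg: next_cfg == tm_step(prev_cfg)

# Chain of transactions, each performing one step
cfg = ("A", 0, {0: 0, 1: 1, 2: 0})
for _ in range(10):
    nxt = tm_step(cfg)              # computed off-chain by the spender
    assert output_script(cfg)(nxt)  # bounded-time validation "on-chain"
    cfg = nxt
    if cfg[0] == "HALT":
        break
print(cfg[0])  # -> HALT
```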

CBT Workshop: Second Layer, Off-chain Transactions and Transparency

Frontmatter
Split Payments in Payment Networks

Traditional blockchain systems, such as Bitcoin, focus on transactions in which the entire amount is transferred from one owner to the other in a single, atomic operation. This model has been re-used in the context of payment networks such as the Lightning Network. In this work, we propose and investigate a new payment model, called split payments, in which the total amount to be transferred is split into unit amounts that are transferred independently through the same or different routes. By splitting payments this way, we improve the total liquidity of the payment network, simplify route advertising, reduce the amount of funds that needs to be locked in the channels, and improve the privacy properties.

Dmytro Piatkivskyi, Mariusz Nowostawski
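A minimal sketch of the splitting idea, assuming a hypothetical per-route bottleneck capacity; the paper's routing and privacy mechanisms are not modelled here.

```python
# Sketch: split a payment into unit amounts and greedily spread them over
# candidate routes, bounded by each route's bottleneck channel capacity.

def split_payment(amount, routes):
    """routes: dict route_name -> bottleneck capacity in units.
    Returns units sent per route, or None if total capacity is too small."""
    remaining = amount
    plan = {name: 0 for name in routes}
    capacity = dict(routes)
    while remaining > 0:
        # send the next unit over the route with the most spare capacity
        name = max(capacity, key=capacity.get)
        if capacity[name] == 0:
            return None  # not enough total liquidity
        plan[name] += 1
        capacity[name] -= 1
        remaining -= 1
    return plan

# Usage: 7 units over three routes with capacities 3, 5 and 2
print(split_payment(7, {"r1": 3, "r2": 5, "r3": 2}))
```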
Payment Network Design with Fees

Payment channels are the most prominent solution to the blockchain scalability problem. We introduce the problem of network design with fees for payment channels from the perspective of a Payment Service Provider (PSP). Given a set of transactions, we examine the optimal graph structure and fee assignment to maximize the PSP’s profit. A customer prefers to route transactions through the PSP’s network if the cheapest path from sender to receiver is financially interesting, i.e., if the path costs less than the blockchain fee. When the graph structure is a tree, and the PSP facilitates all transactions, the problem can be formulated as a linear program. For a path graph, we present a polynomial time algorithm to assign optimal fees. We also show that the star network, where the center is an additional node acting as an intermediary, is a near-optimal solution to the network design problem.

Georgia Avarikioti, Gerrit Janssen, Yuyi Wang, Roger Wattenhofer
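The customer's routing decision in this model can be illustrated with a short sketch: route through the PSP network exactly when the cheapest fee path is below the blockchain fee. The graph, fees and helper names below are illustrative assumptions, not the paper's notation.

```python
# Sketch of the customer's choice: Dijkstra over per-channel fees, then
# compare the cheapest path cost against the blockchain fee.
import heapq

def cheapest_fee(graph, src, dst):
    """graph: node -> list of (neighbour, fee). Returns min total fee."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, fee in graph.get(u, []):
            nd = d + fee
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

def uses_psp(graph, src, dst, blockchain_fee):
    return cheapest_fee(graph, src, dst) < blockchain_fee

# Star network with the PSP's hub as intermediary, fee 1 per channel
star = {"a": [("hub", 1)], "b": [("hub", 1)],
        "hub": [("a", 1), ("b", 1)]}
print(uses_psp(star, "a", "b", blockchain_fee=3))  # True: path cost 2 < 3
```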
Atomic Information Disclosure of Off-Chained Computations Using Threshold Encryption

Public Blockchains on their own are, by definition, incapable of keeping data private and disclosing it at a later time. Control over the eventual disclosure of private data must be maintained outside a Blockchain by withholding and later publishing encryption keys, for example. We propose the Atomic Information Disclosure (AID) pattern based on threshold encryption that allows a set of key holders to govern the release of data without having access to it. We motivate this pattern with problems that require independently reproduced solutions. By keeping submissions private until a deadline expires, participants are unable to plagiarise and must therefore generate their own solutions which can then be aggregated and analysed to determine a final answer. We outline the importance of a game-theoretically sound incentive scheme, possible attacks, and other future work.

Oliver Stengele, Hannes Hartenstein
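A minimal sketch of the threshold mechanism underlying such a pattern, using plain Shamir secret sharing over a prime field to split a decryption key among key holders; the paper's full threshold-encryption construction and incentive layer are out of scope here.

```python
# Shamir secret sharing sketch: a key is split among n holders and any t
# of them can reconstruct it at disclosure time.
import secrets

P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte key

def split(secret, n, t):
    """Return n points of a random degree-(t-1) polynomial with f(0)=secret."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = secrets.randbelow(P)
shares = split(key, n=5, t=3)
assert reconstruct(shares[:3]) == key   # any 3 shares suffice
assert reconstruct(shares[2:]) == key
```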
Contour: A Practical System for Binary Transparency

Transparency is crucial in security-critical applications that rely on authoritative information, as it provides a robust mechanism for holding these authorities accountable for their actions. A number of solutions have emerged in recent years that provide transparency in the setting of certificate issuance, and Bitcoin provides an example of how to enforce transparency in a financial setting. In this work we shift to a new setting, the distribution of software package binaries, and present a system for so-called “binary transparency.” Our solution, Contour, uses proactive methods for providing transparency, privacy, and availability, even in the face of persistent man-in-the-middle attacks. We also demonstrate, via benchmarks and a test deployment for the Debian software repository, that Contour is the only system for binary transparency that satisfies the efficiency and coordination requirements that would make it possible to deploy today.

Mustafa Al-Bassam, Sarah Meiklejohn
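As a rough illustration of the kind of verifiable log a binary-transparency system relies on, the sketch below builds a Merkle root over package hashes and checks an inclusion proof; Contour's actual design (Bitcoin anchoring, availability machinery) is not reproduced, and all names are illustrative.

```python
# Merkle root over package hashes plus an inclusion-proof check.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])          # duplicate last node if odd
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    """Sibling hashes from the leaf up to the root."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    acc = h(leaf)
    for sibling, leaf_is_left in proof:
        acc = h(acc + sibling) if leaf_is_left else h(sibling + acc)
    return acc == root

pkgs = [b"pkg-a_1.0.deb", b"pkg-b_2.3.deb", b"pkg-c_0.9.deb"]
root = merkle_root(pkgs)
assert verify(pkgs[1], inclusion_proof(pkgs, 1), root)
```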

CBT Workshop: Consensus, Mining Pools and Performance

Frontmatter
What Blockchain Alternative Do You Need?

With billions of dollars spent on blockchain, there clearly is a need to determine whether this technology should be used, as demonstrated by the many proposals for decision schemes. In this work we rigorously analyze 30 existing schemes. Our analysis demonstrates contradictions between these schemes – so clearly they cannot all be right – and also highlights what we consider a more structural flaw of most of them, namely that they ignore alternatives to blockchain-based solutions. To remedy this, we propose an improved scheme that does take alternatives into account, which we argue is more useful in practice for deciding on an optimal solution for a particular use case.

Tommy Koens, Erik Poll
Valuable Puzzles for Proofs-of-Work

Proof-of-work (PoW) is used as the consensus mechanism in most cryptocurrencies. PoW-based puzzles play an important part in the operation and security of a cryptocurrency, but come at a considerable energy cost. One approach to the problem of energy wastage is to find ways to build PoW schemes from valuable computational problems. This work proposes calibration of public key cryptographic systems as a suitable source of PoW puzzles. We describe the properties needed to adapt public key cryptosystems as PoW functions suitable for decentralised cryptocurrencies and provide a candidate example.

Colin Boyd, Christopher Carr
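For contrast with the proposal, here is a minimal sketch of the conventional hash-based puzzle interface (solve and verify) that any replacement PoW function would have to preserve; it is a baseline illustration, not the candidate construction from the paper.

```python
# Baseline hash-based PoW interface: generate-solve-verify.
import hashlib

def solve(puzzle: bytes, difficulty_bits: int) -> int:
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(puzzle + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(puzzle: bytes, nonce: int, difficulty_bits: int) -> bool:
    digest = hashlib.sha256(puzzle + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = solve(b"block-header", difficulty_bits=16)
assert verify(b"block-header", nonce, 16)
```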
A Poisoning Attack Against Cryptocurrency Mining Pools

This paper discusses a potentially serious attack against public cryptocurrency mining pools. By deliberately introducing errors under benign miners' names, this attack can fool the mining pool administrator into punishing any innocent miner; when the top miners are punished, the attack can significantly slow down the overall production of the mining pool. We show that an attacker needs only a small fraction (e.g., one millionth) of the resources of a victim mining pool, which makes this attack scheme very affordable for a less powerful competing mining pool. We experimentally confirm the effectiveness of this attack scheme against a few well-known mining pools such as Minergate and Slush Pool.

Mohiuddin Ahmed, Jinpeng Wei, Yongge Wang, Ehab Al-Shaer
Using Economic Risk to Model Miner Hash Rate Allocation in Cryptocurrencies

Abrupt changes in the miner hash rate applied to a proof-of-work (PoW) blockchain can adversely affect user experience and security. Because different PoW blockchains often share hashing algorithms, miners face a complex choice in deciding how to allocate their hash power among chains. We present an economic model that leverages Modern Portfolio Theory to predict a miner's allocation over time using price data and inferred risk tolerance. The model matches actual allocations with a mean absolute error within 20% for four out of the top five miners active on both the Bitcoin (BTC) and Bitcoin Cash (BCH) blockchains. A model of aggregate allocation across those four miners shows excellent agreement in magnitude with the actual aggregate, as well as a correlation coefficient of 0.649. The accuracy of the aggregate allocation model is also sufficient to explain major historical changes in inter-block time (IBT) for BCH. Because estimates of miner risk are not time-dependent and our model is otherwise price-driven, we are able to use it to anticipate the effect of a major price shock on hash allocation and IBT in the BCH blockchain. Using a Monte Carlo simulation, we show that, despite mitigation by the new difficulty adjustment algorithm, a price drop of 50% could increase the IBT by 50% for at least a day, with a peak delay of 100%.

George Bissias, Brian N. Levine, David Thibodeau
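A simplified, dependency-free illustration of the mean-variance idea behind such a model: choose the fraction of hash power sent to BTC that maximizes expected return minus a risk penalty. The return statistics and risk-tolerance value below are made up for the example.

```python
# Mean-variance allocation of hash power between two chains that share a
# hashing algorithm; grid search keeps the sketch dependency-free.

def allocate(mu_btc, mu_bch, var_btc, var_bch, cov, lam, steps=1000):
    """Maximise expected return minus lam/2 * variance over the fraction w
    of hash power sent to BTC (the remainder goes to BCH)."""
    best_w, best_u = 0.0, float("-inf")
    for i in range(steps + 1):
        w = i / steps
        mean = w * mu_btc + (1 - w) * mu_bch
        var = w * w * var_btc + (1 - w) ** 2 * var_bch + 2 * w * (1 - w) * cov
        utility = mean - 0.5 * lam * var
        if utility > best_u:
            best_w, best_u = w, utility
    return best_w

# Hypothetical daily return statistics (per unit of hash power)
w_btc = allocate(mu_btc=0.010, mu_bch=0.012,
                 var_btc=0.0004, var_bch=0.0009, cov=0.0002, lam=50)
print(f"fraction of hash power on BTC: {w_btc:.2f}")
```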

CBT Workshop: Deadlocks, Attacks and Privacy

Frontmatter
Avoiding Deadlocks in Payment Channel Networks

Payment channels are one of the main proposed approaches to scaling cryptocurrency payment systems. Recent work by Malavolta et al. [7] has shown that the privacy of the protocol may conflict with its concurrent nature and may lead to deadlocks. In this paper we ask the natural question: can payments in such networks be routed so as to avoid deadlocks altogether? Our results show that it is in general NP-complete to determine whether a deadlock-free routing exists in a given payment graph. On the other hand, given some fixed routing, we propose another way to resolve the problem of deadlocks: we offer a modification of the protocols in the Lightning Network and in Fulgor [7] that pre-locks edges in an order that guarantees progress, while still maintaining the protocol's privacy requirements.

Shira Werman, Aviv Zohar
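The pre-locking idea can be illustrated with the classical resource-ordering argument: if every payment acquires its channel locks in one global order, no cyclic wait can form. The toy sketch below is illustrative only and ignores the privacy machinery of Fulgor and the Lightning Network.

```python
# Deadlock avoidance by acquiring channel locks in a single global order.
import threading

channel_locks = {("a", "b"): threading.Lock(),
                 ("b", "c"): threading.Lock(),
                 ("c", "a"): threading.Lock()}

def lock_route(route):
    """route: list of channel ids (edges). Lock them in sorted order."""
    ordered = sorted(route)
    for edge in ordered:
        channel_locks[edge].acquire()
    return ordered

def unlock_route(ordered):
    for edge in reversed(ordered):
        channel_locks[edge].release()

def pay(route):
    held = lock_route(route)
    # ... update channel balances / run the payment protocol here ...
    unlock_route(held)

# Two concurrent payments that share a channel but lock in the same order
t1 = threading.Thread(target=pay, args=([("a", "b"), ("b", "c")],))
t2 = threading.Thread(target=pay, args=([("b", "c"), ("c", "a")],))
t1.start(); t2.start(); t1.join(); t2.join()
print("both payments completed without deadlock")
```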
Coloured Ring Confidential Transactions

Privacy in blockchains is often considered secondary to functionality, yet it is a vital requirement for many new applications, e.g., in industrial environments. We propose a novel transaction type that enables privacy-preserving trading of independent assets on a common blockchain. This is achieved by extending the ring confidential transaction with an additional commitment to a colour and a publicly verifiable proof of conservation. With our coloured confidential ring signatures, new token types can be introduced and transferred by any participant using the same-sized anonymity set as single-token privacy-aware blockchains. Thereby, our system facilitates tracking assets on an immutable ledger without compromising the confidentiality of transactions.

Felix Engelmann, Frank Kargl, Christoph Bösch
Pitchforks in Cryptocurrencies: Enforcing Rule Changes Through Offensive Forking- and Consensus Techniques (Short Paper)

The increasing number of cryptocurrencies, as well as the rising number of actors within each single cryptocurrency, inevitably leads to tensions between the respective communities. As with open source projects, (protocol) forks are often the result of broad disagreement. Usually, after a permanent fork both communities “mine” their own business and the conflict is resolved. But what if this is not the case? In this paper, we outline the possibility of malicious forking and consensus techniques that aim at destroying the other branch of a protocol fork. Thereby, we illustrate how merged mining can be used as an attack method against a permissionless PoW cryptocurrency, which itself involuntarily serves as the parent chain for an attacking merge mined branch of a hard fork.

Aljosha Judmayer, Nicholas Stifter, Philipp Schindler, Edgar Weippl

DPM Workshop: Privacy Assessment and Trust

Frontmatter
Towards an Effective Privacy Impact and Risk Assessment Methodology: Risk Analysis

Privacy Impact Assessments (PIAs) play a crucial role in providing privacy protection for data subjects and supporting risk management. From an engineering perspective, the core of a PIA is a risk assessment, which typically follows a step-by-step process of risk identification and risk mitigation. In order for a PIA to be holistic and effective, it needs to be complemented by an appropriate privacy risk model that considers legal, organisational, societal and technical aspects. We propose a data-centric approach for identifying and analysing potential privacy risks in a comprehensive manner.

Majed Alshammari, Andrew Simpson
Privacy Risk Assessment: From Art to Science, by Metrics

Privacy risk assessments aim to analyze and quantify the privacy risks associated with new systems. As such, they are critically important in ensuring that adequate privacy protections are built in. However, current methods to quantify privacy risk rely heavily on experienced analysts picking the “correct” risk level on e.g. a five-point scale. In this paper, we argue that a more scientific quantification of privacy risk increases accuracy and reliability and can thus make it easier to build privacy-friendly systems. We discuss how the impact and likelihood of privacy violations can be decomposed and quantified, and stress the importance of meaningful metrics and units of measurement. We suggest a method of quantifying and representing privacy risk that considers a collection of factors as well as a variety of contexts and attacker models. We conclude by identifying some of the major research questions to take this approach further in a variety of application scenarios.

Isabel Wagner, Eerke Boiten
Bootstrapping Online Trust: Timeline Activity Proofs

Establishing initial trust between a new user and an online service is generally facilitated by centralized social media platforms, e.g., Facebook or Google, by allowing users to use their social profiles to prove "trustworthiness" to a new service, which has some verification policy regarding the information it retrieves from the profiles. Typically, only static information, e.g., name, age, contact details, or number of friends, is used to establish the initial trust. However, such information provides only weak trust guarantees, as (malicious) users can trivially create new profiles and quickly populate them with static data to convince the new service. We argue that the way profiles are used over (longer) periods of time should play a more prominent role in the initial trust establishment. Intuitively, verification policies, in addition to static data, could check whether profiles are being used on a regular basis and have a convincing footprint of activities over various periods of time to be perceived as more trustworthy. In this paper, we introduce Timeline Activity Proofs (TAP) as a new trust factor. TAP allows online users to manage their timeline activities in a privacy-preserving way and use them to bootstrap online trust, e.g., as part of registration to a new service. In our model we do not rely on any centralized social media platform. Instead, users are given full control over the activities that they wish to use as part of TAP proofs. A distributed public ledger is used to provide the crucial integrity guarantees, i.e., that activities cannot be tampered with retrospectively. Our TAP construction adopts standard cryptographic techniques to enable authorized access to encrypted activities of a user for the purpose of policy verification and is proven to provide data confidentiality, protecting the privacy of the user's activities, and authenticated policy compliance, protecting verifiers from users who cannot show the required footprint of past activities.

Constantin Cătălin Drăgan, Mark Manulis
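A minimal sketch of the integrity layer only, assuming hypothetical record fields: activities are hash-chained and the latest digest would be anchored on a public ledger, so past entries cannot be rewritten. TAP's encryption, access control and policy proofs are not shown.

```python
# Hash-chained activity timeline: tampering with any past entry breaks
# verification against the anchored head digest.
import hashlib
import json
import time

def append_activity(chain, activity: dict):
    prev = chain[-1]["digest"] if chain else "0" * 64
    record = {"prev": prev, "ts": time.time(), "activity": activity}
    record["digest"] = hashlib.sha256(
        json.dumps({k: record[k] for k in ("prev", "ts", "activity")},
                   sort_keys=True).encode()).hexdigest()
    chain.append(record)
    return record["digest"]          # this head would be anchored on-chain

def verify_chain(chain):
    prev = "0" * 64
    for rec in chain:
        body = {k: rec[k] for k in ("prev", "ts", "activity")}
        if rec["prev"] != prev or rec["digest"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = rec["digest"]
    return True

timeline = []
append_activity(timeline, {"type": "post", "len": 120})
head = append_activity(timeline, {"type": "comment", "len": 34})
assert verify_chain(timeline)
```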

DPM Workshop: Private Data and Searches

Frontmatter
Post-processing Methods for High Quality Privacy-Preserving Record Linkage

Privacy-preserving record linkage (PPRL) supports the integration of person-related data from different sources while protecting the privacy of individuals by encoding sensitive information needed for linkage. The use of encoded data makes it challenging to achieve high linkage quality in particular for dirty data containing errors or inconsistencies. Moreover, person-related data is often dense, e.g., due to frequent names or addresses, leading to high similarities for non-matches. Both effects are hard to deal with in common PPRL approaches that rely on a simple threshold-based classification to decide whether a record pair is considered to match. In particular, dirty or dense data likely lead to many multi-links where persons are wrongly linked to more than one other person. Therefore, we propose the use of post-processing methods for resolving multi-links and outline three possible approaches. In our evaluation using large synthetic and real datasets we compare these approaches with each other and show that applying post-processing is highly beneficial and can significantly increase linkage quality in terms of both precision and F-measure.

Martin Franke, Ziad Sehili, Marcel Gladbach, Erhard Rahm
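One plausible post-processing strategy, shown as a sketch (not necessarily one of the paper's three approaches): resolve multi-links by keeping only symmetric best matches, so each record ends up linked to at most one record from the other source.

```python
# Resolve multi-links by keeping only symmetric best matches.

def symmetric_best_match(similarities):
    """similarities: dict (rec_a, rec_b) -> score above the match threshold.
    Returns pairs where each record is the other's highest-scoring partner."""
    best_a, best_b = {}, {}
    for (a, b), s in similarities.items():
        if s > best_a.get(a, (None, -1))[1]:
            best_a[a] = (b, s)
        if s > best_b.get(b, (None, -1))[1]:
            best_b[b] = (a, s)
    return {(a, b): s for (a, b), s in similarities.items()
            if best_a[a][0] == b and best_b[b][0] == a}

links = {("a1", "b1"): 0.95, ("a1", "b2"): 0.90,   # a1 is multi-linked
         ("a2", "b2"): 0.92, ("a3", "b1"): 0.80}
print(symmetric_best_match(links))  # {('a1','b1'): 0.95, ('a2','b2'): 0.92}
```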
δ-DOCA: Achieving Privacy in Data Streams

Numerous real-world applications continuously publish data streams to benefit people in their daily activities. However, these applications may collect and release sensitive information about individuals and lead to serious risks of privacy breach. Differential Privacy (DP) has emerged as a mathematical model to release sensitive information about users while hindering the process of distinguishing individuals' records in databases. Although DP has been widely used for protecting the privacy of individual users' data, it was not designed, in essence, to provide its guarantees for data streams, since these data are potentially unbounded sequences generated continuously at rapid rates. Consequently, the noise required to mask the effect of sequences of objects in data streams tends to be higher. In this paper, we design a new technique, named δ-DOCA, to publish data streams under differential privacy. Our approach provides a strategy to determine the sensitivity value for DP and reduces the necessary noise. Our experiments show that applying δ-DOCA to anonymize data streams not only significantly reduced the noise necessary to apply differential privacy, but also allowed the output data to preserve the original data distribution.

Bruno C. Leal, Israel C. Vidal, Felipe T. Brito, Juvêncio S. Nobre, Javam C. Machado
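For orientation, the sketch below shows generic per-window differentially private release of a stream aggregate with the Laplace mechanism; δ-DOCA's specific contribution, bounding the sensitivity via online clustering, is not reproduced, and the clipping bound here is an assumed stand-in for that step.

```python
# Per-window differentially private sums over a stream via the Laplace
# mechanism; clipping bounds the sensitivity of each window sum.
import random

def laplace_noise(scale: float) -> float:
    # Difference of two exponentials is Laplace(0, scale)-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_window_sum(values, epsilon, clip):
    """Release a noisy sum over one window; clipping each value to
    [0, clip] bounds the sensitivity of the sum to `clip`."""
    clipped = [min(max(v, 0.0), clip) for v in values]
    return sum(clipped) + laplace_noise(clip / epsilon)

random.seed(0)
stream = [3.2, 1.1, 7.5, 2.4, 0.9, 4.4]
for start in range(0, len(stream), 3):          # non-overlapping windows
    window = stream[start:start + 3]
    print(round(private_window_sum(window, epsilon=1.0, clip=10.0), 2))
```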
Data Oblivious Genome Variants Search on Intel SGX

We show how to build a practical, private, data-oblivious genome variants search using Intel SGX. More precisely, we consider the problem posed in Track 2 of the iDash Privacy and Security Workshop 2017 competition, which was to search for variants with high χ² statistic among certain genetic data over two populations. The winning solution of this iDash competition (developed by Carpov and Tortech) is extremely efficient, but not memory oblivious, which potentially makes it vulnerable to a whole host of memory- and cache-based side channel attacks on SGX. In this paper, we adapt a framework in which we can exactly quantify this leakage. We provide a memory-oblivious implementation with reasonable information leakage at the cost of some efficiency. Our solution is roughly an order of magnitude slower than the non-memory-oblivious implementation, but still practical and much more efficient than naive memory-oblivious solutions: it solves the iDash problem in approximately five minutes. In order to do this, we develop novel definitions and models for oblivious dictionary merging, which may be of independent theoretical interest.

Avradip Mandal, John C. Mitchell, Hart Montgomery, Arnab Roy
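The statistic being searched for can be made concrete with a short sketch: Pearson's χ² on a 2×2 table of allele counts for one variant across two populations, followed by ranking variants. The oblivious-memory machinery that is the paper's actual contribution is not shown.

```python
# Pearson's chi-squared for a 2x2 allele-count table, then rank variants.

def chi_squared_2x2(a, b, c, d):
    """Counts: a, b = allele present/absent in population 1;
               c, d = allele present/absent in population 2."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

# Hypothetical counts; the competition task asked for the top-ranked hits
variants = {"rs001": (120, 80, 60, 140), "rs002": (95, 105, 100, 100)}
ranked = sorted(variants, key=lambda v: chi_squared_2x2(*variants[v]),
                reverse=True)
print(ranked[0], round(chi_squared_2x2(*variants[ranked[0]]), 2))
```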

DPM Workshop: Internet of Things

Frontmatter
Developing GDPR Compliant Apps for the Edge

We present an overview of the Databox application development environment, or SDK, as a means of enabling trusted IoT app development at the network edge. Databox is a dedicated domestic platform that stores IoT, mobile and cloud data and executes local data processing by third-party apps to provide end-user control over data flow. Key challenges for building apps in edge environments concern (i) the complexity of IoT devices and user requirements, and (ii) supporting privacy-preserving features that meet new data protection regulations. We examine how the Databox SDK can ease the burden of regulatory compliance and be used to sensitize developers to privacy-related issues in the very course of building apps.

Tom Lodge, Andy Crabtree, Anthony Brown
YaPPL - A Lightweight Privacy Preference Language for Legally Sufficient and Automated Consent Provision in IoT Scenarios

In this paper, we present YaPPL, a Privacy Preference Language explicitly designed to fulfill consent-related requirements of the GDPR as well as to address the technical characteristics of IoT scenarios. We analyze what criteria consent must meet in order to be legally sufficient and translate these into a formal representation of consent as well as into functional requirements that YaPPL must fulfill. Taking into account further non-functional requirements particularly relevant in the IoT context, we then derive a specification of YaPPL, which we prototypically implemented in a reusable software library and successfully instantiated in a proof-of-concept scenario, paving the way for viable technical implementations of legally sufficient consent mechanisms in the IoT.

Max-R. Ulbricht, Frank Pallas
PrivacyGuard: Enforcing Private Data Usage with Blockchain and Attested Execution

In the upcoming evolution of the Internet of Things (IoT), it is anticipated that billions of devices will be connected to the Internet. Many of these devices are capable of collecting information from individual users and their physical surroundings. They are also capable of taking smart actions, which are usually instructed by a backend cloud server in the IoT system. While IoT promises a more connected and smarter world, this pervasive large-scale data collection, storage, sharing, and analysis raise many privacy concerns. In the current IoT ecosystem, IoT service providers have full control of the collected user data. While the original intended use of such data is primarily smart IoT system and device control, the data is often used for other purposes not explicitly consented to by the users. We propose a novel user privacy protection framework, PrivacyGuard, that aims to empower users with full privacy control over their data. The PrivacyGuard framework seamlessly integrates two new technologies, blockchain and trusted execution environments (TEEs). By encoding data access policy and usage as smart contracts, PrivacyGuard allows data owners to control who can have what access to their data and to maintain a trustworthy record of their data usage. Using remote attestation and TEEs, PrivacyGuard ensures that data is only used for the intended purposes approved by the data owner. Our approach represents a significant departure from traditional privacy protections, which often rely on cryptography and pure software-based secure computation techniques. By addressing the fundamental problem of data usage control, PrivacyGuard will become the cornerstone for a free market of private information.

Ning Zhang, Jin Li, Wenjing Lou, Y. Thomas Hou

DPM Workshop: Privacy and Cryptography

Frontmatter
A Performance and Resource Consumption Assessment of Secret Sharing Based Secure Multiparty Computation

In recent years, Secure Multiparty Computation (SMC) advanced from a theoretical technique to a practically applicable cryptographic technology. Several frameworks have been proposed, of which some are still actively developed. We perform a first comprehensive study of the performance characteristics of SMC protocols using a promising implementation based on secret sharing, a common and state-of-the-art foundation. We analyze its scalability with respect to environmental parameters, namely the number of peers and network properties (transmission rate, packet loss, network latency), measuring execution time, CPU cycles, memory consumption and the amount of transmitted data. Our insights on the resource consumption show that such a solution is practically applicable in intranet environments and, with limitations, in Internet settings.

Marcel von Maltitz, Georg Carle
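A minimal sketch of the secret-sharing foundation whose performance is being measured: additive shares modulo a public modulus, with addition performed locally by each peer. Multiplication, networking and the benchmarked frameworks themselves are out of scope.

```python
# Additive secret sharing: shares sum to the secret modulo Q, and addition
# of two shared values is purely local.
import secrets

Q = 2**64  # public modulus

def share(x, n=3):
    parts = [secrets.randbelow(Q) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % Q)
    return parts

def reconstruct(parts):
    return sum(parts) % Q

def add_shares(xs, ys):
    """Each peer adds its own shares; no communication required."""
    return [(a + b) % Q for a, b in zip(xs, ys)]

a, b = 1234, 5678
assert reconstruct(add_shares(share(a), share(b))) == (a + b) % Q
```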
Privacy-Preserving Trade Chain Detection

In this paper, we present a novel multi-party protocol to facilitate the privacy-preserving detection of trade chains in the context of bartering. Our approach is to transform the parties’ private quotes into a flow network such that a minimum-cost flow in this network encodes a set of simultaneously executable trade chains for which the number of parties that can trade is maximized. At the core of our novel protocol is a newly developed privacy-preserving implementation of the cycle canceling algorithm that can be used to solve the minimum cost flow problem on encrypted flow networks.

Stefan Wüller, Malte Breuer, Ulrike Meyer, Susanne Wetzel
FHE-Compatible Batch Normalization for Privacy Preserving Deep Learning

Deep Learning has recently become very popular thanks to major advances in cloud computing technology. However, pushing Deep Learning computations to the cloud poses a risk to the privacy of the data involved. Recent solutions propose to encrypt data with Fully Homomorphic Encryption (FHE), enabling the execution of operations over encrypted data. Given the serious performance constraints of this technology, recent privacy-preserving deep learning solutions first customize the underlying neural network operations and then apply encryption. While the main neural network layer investigated so far is the activation layer, in this paper we study the Batch Normalization (BN) layer: a modern layer that, by addressing internal covariate shift, has proved very effective in increasing the accuracy of Deep Neural Networks. In order to be compatible with the use of FHE, we propose to reformulate batch normalization, which results in a moderate decrease in the number of operations. Furthermore, we devise a re-parametrization method that allows the absorption of batch normalization by previous layers. We show that whenever these two methods are integrated during the inference phase and executed over FHE-encrypted data, there is a significant performance gain with no loss in accuracy. We also note that this gain is valid both in the encrypted and unencrypted domains.

Alberto Ibarrondo, Melek Önen
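The re-parametrization idea (absorbing batch normalization into the preceding linear layer at inference time) can be sketched directly; this is a generic folding of BN statistics into weights and bias, not the paper's FHE-specific reformulation.

```python
# Fold batch-norm statistics into the preceding linear layer's weights and
# bias, removing BN operations from the inference computation.
import numpy as np

def fold_bn_into_linear(W, b, gamma, beta, mean, var, eps=1e-5):
    """y = gamma * ((Wx + b) - mean) / sqrt(var + eps) + beta
         = W' x + b'   with the returned W', b'."""
    scale = gamma / np.sqrt(var + eps)          # one factor per output unit
    W_folded = W * scale[:, None]
    b_folded = (b - mean) * scale + beta
    return W_folded, b_folded

rng = np.random.default_rng(0)
W, b = rng.normal(size=(4, 3)), rng.normal(size=4)
gamma, beta = rng.normal(size=4), rng.normal(size=4)
mean, var = rng.normal(size=4), rng.uniform(0.5, 2.0, size=4)
x = rng.normal(size=3)

bn_out = gamma * ((W @ x + b) - mean) / np.sqrt(var + 1e-5) + beta
W2, b2 = fold_bn_into_linear(W, b, gamma, beta, mean, var)
assert np.allclose(bn_out, W2 @ x + b2)
```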

DPM Workshop: Future Internet

Frontmatter

Open Access

A General Algorithm for k-anonymity on Dynamic Databases

In this work we present an algorithm for k-anonymization of datasets that are changing over time. It is intended for preventing identity disclosure in dynamic datasets via microaggregation. It supports adding, deleting and updating records in a database, while keeping k-anonymity on each release. We carry out experiments on database anonymization. We expected that the additional constraints for k-anonymization of dynamic databases would entail a larger information loss; however, it stays close to MDAV's information loss for static databases. Finally, we carry out a proof-of-concept experiment with directed degree sequence anonymization, in which the removal or addition of records implies the modification of other records.

Julián Salas, Vicenç Torra
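For reference, a sketch of static microaggregation in the MDAV style used as the baseline: build groups of at least k records and replace each record by its group centroid, so every record becomes indistinguishable from at least k-1 others. The paper's handling of record insertions, deletions and updates across releases is not shown, and the grouping loop here is a simplification of full MDAV.

```python
# Simplified MDAV-style microaggregation: group records (>= k per group)
# around the record farthest from the centroid, then replace each group
# by its mean.
import numpy as np

def microaggregate(records, k):
    data = np.asarray(records, dtype=float)
    unassigned = list(range(len(data)))
    groups = []
    while len(unassigned) >= 2 * k:
        centroid = data[unassigned].mean(axis=0)
        far = max(unassigned, key=lambda i: np.linalg.norm(data[i] - centroid))
        group = sorted(unassigned,
                       key=lambda i: np.linalg.norm(data[i] - data[far]))[:k]
        groups.append(group)
        unassigned = [i for i in unassigned if i not in group]
    groups.append(unassigned)                  # remaining < 2k records
    anonymized = data.copy()
    for g in groups:
        anonymized[g] = data[g].mean(axis=0)   # replace by group centroid
    return anonymized

records = [[25, 50_000], [27, 52_000], [40, 90_000],
           [41, 95_000], [39, 88_000], [26, 51_000]]
print(microaggregate(records, k=3))
```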
On Security of Anonymous Invitation-Based System

In an anonymous invitation-based system, a user can join a group by receiving invitations sent anonymously to a server by current members, i.e., inviters. This kind of system is suitable for social networks, and a formal framework with anonymity of inviters and unforgeability of invitation letters was proposed at DPM 2017. The main concept of this previous system is elegant, but its formal security definitions are insufficient and weak in a realistic application scenario. In this paper, we revise the formal security definitions to capture attacks representing a realistic scenario. In addition, we define a new security notion against an adversary that maliciously generates an invitation letter, i.e., invitation opacity, and a security notion guaranteeing that an invitee with a valid invitation letter can always join the system, i.e., invitation extractability. A secure and useful construction can be expected from satisfying the security definitions described above.

Naoto Yanai, Jason Paul Cruz

Open Access

Probabilistic Metric Spaces for Privacy by Design Machine Learning Algorithms: Modeling Database Changes

Machine learning, data mining and statistics are used to analyze data and to build models from it. Data privacy for big data needs to find a compromise between data analysis and disclosure risk. Privacy-by-design machine learning algorithms need to take into account the space of models and the relationship between the data that generate the models and the models themselves. In this paper we propose the use of probabilistic metric spaces for comparing these models.

Vicenç Torra, Guillermo Navarro-Arribas
Lifelogging Protection Scheme for Internet-Based Personal Assistants

Internet-based personal assistants are promising devices combining voice control and search technologies to pull out relevant information for domestic users. They are expected to assist with household activities in a smart way, such as scheduling meetings, finding locations, reporting cultural events, sending messages, and much more. The information collected by these devices, including personalized lifelogs about their corresponding users, is likely to be stored by well-established Internet players related to web search engines and social media. This can lead to serious privacy risks. The issue of protecting the identity of domestic users and their sensitive data must be tackled at design time in order to promptly mitigate privacy threats. Towards this end, this paper proposes a protection scheme that jointly handles the aforementioned issues by combining log anonymization and sanitizable signatures.

David Pàmies-Estrems, Nesrine Kaaniche, Maryline Laurent, Jordi Castellà-Roca, Joaquin Garcia-Alfaro
Backmatter
Metadata
Title
Data Privacy Management, Cryptocurrencies and Blockchain Technology
Edited by
Joaquin Garcia-Alfaro
Jordi Herrera-Joancomartí
Dr. Giovanni Livraga
Ruben Rios
Copyright year
2018
Electronic ISBN
978-3-030-00305-0
Print ISBN
978-3-030-00304-3
DOI
https://doi.org/10.1007/978-3-030-00305-0
