
2012 | Book

Theory of Security and Applications

Joint Workshop, TOSCA 2011, Saarbrücken, Germany, March 31 - April 1, 2011, Revised Selected Papers

Edited by: Sebastian Mödersheim, Catuscia Palamidessi

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

This book constitutes the thoroughly refereed post-conference proceedings of the Joint Workshop on Theory of Security and Applications (formerly known as ARSPA-WITS), TOSCA 2011, held in Saarbrücken, Germany, in March/April 2011, in association with ETAPS 2011. The 9 revised full papers presented together with 3 invited talks were carefully reviewed and selected from 24 submissions. The papers cover various methods in computer security, including the formal specification, analysis and design of security protocols and their applications; the formal definition of various aspects of security such as access control mechanisms, mobile code security and denial-of-service attacks; and the modeling of information flow and its application.

Table of Contents

Frontmatter
Union and Intersection Types for Secure Protocol Implementations
Abstract
We present a new type system for verifying the security of cryptographic protocol implementations. The type system combines prior work on refinement types with union, intersection, and polymorphic types, and with the novel ability to reason statically about the disjointness of types. The increased expressivity enables the analysis of important protocol classes that were previously out of scope for the type-based analyses of protocol implementations. In particular, our types can statically characterize: (i) more usages of asymmetric cryptography, such as signatures of private data and encryptions of authenticated data; (ii) authenticity and integrity properties achieved by showing knowledge of secret data; (iii) applications based on zero-knowledge proofs. The type system comes with a mechanized proof of correctness and an efficient type-checker.
Michael Backes, Cătălin Hriţcu, Matteo Maffei
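As a rough illustration of how refinement and union types can capture asymmetric cryptography (the notation is illustrative and not taken from the paper), a verification key can be given a union type: a signed message either originates from an honest signer, whose payloads satisfy a refinement predicate, or from the adversary and is therefore untrusted.
\[
\mathsf{payload} \;\triangleq\; x{:}\mathsf{bytes}\{\mathsf{Authenticated}(x)\} \qquad \mathit{vk} : \mathsf{VerKey}(\mathsf{payload} \vee \mathsf{Un})
\]
Signature verification then returns a value of type \(\mathsf{payload} \vee \mathsf{Un}\), and static reasoning about the disjointness of the two cases lets the type-checker recover the refinement whenever the value cannot have come from the attacker.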
Secure Composition of Protocols
Abstract
Security protocols are small distributed programs that are designed to ensure security over untrusted networks such as the Internet. They are notoriously difficult to design, and flaws can be found several years after their publication and even their deployment. In particular, they are not securely composable in general: two protocols may be secure when analyzed separately but may interact with each other in harmful ways. We explore how tagging protocols allows them to be composed securely.
Véronique Cortier
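A sketch of the tagging idea in standard protocol notation (illustrative only, not the paper's formalism): if every protocol prefixes its encrypted payloads with its own constant tag, a ciphertext produced by one protocol can never be confused with one expected by another.
\[
\Pi_1:\ \{\langle c_1, m \rangle\}_{k} \qquad \Pi_2:\ \{\langle c_2, m' \rangle\}_{k} \qquad c_1 \neq c_2
\]
Since the tags differ, a message of \(\Pi_1\) is rejected by \(\Pi_2\) even when both protocols share the key \(k\), which removes the cross-protocol interactions that typically break composition.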
Constructive Cryptography – A New Paradigm for Security Definitions and Proofs
Abstract
Constructive cryptography, an application of abstract cryptography proposed by Maurer and Renner, is a new paradigm for defining the security of cryptographic schemes such as symmetric encryption, message authentication codes, public-key encryption, key-agreement protocols, and digital signature schemes, and for proving the security of protocols making use of such schemes. Such a cryptographic scheme can be seen (and defined) as constructing a certain resource (e.g. a channel or key) with certain security properties from another (weaker) such resource. For example, a secure encryption scheme constructs a secure channel from an authenticated channel and a secret key.
The construction notion, which is defined through the use of a simulator, is composable in the sense that a protocol obtained by the composition of several secure construction steps is itself secure. This is in contrast to both the traditional, game-based security definitions for cryptographic schemes and the attack-based security definitions used in formal-methods based security research, which are generally not composable.
Constructive cryptography makes it possible to take a fresh look at cryptography and the design of cryptographic protocols. One can give explicit meaning to various types of game-based security notions of confidentiality, integrity, and malleability; one can design key agreement, secure communication, certification, and other protocols in a modular and composable manner; and one can separate the understanding of what cryptography achieves from the technical security definitions and proofs, which is useful for didactic purposes and protocol design.
Ueli Maurer
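In the notation commonly associated with constructive cryptography (shown here only as a sketch, with the availability condition omitted), a protocol \(\pi\) constructs resource \(S\) from resource \(R\) if there is a simulator \(\sigma\) making the two systems indistinguishable, and such construction steps compose:
\[
R \xrightarrow{\;\pi\;} S \quad\text{iff}\quad \exists\,\sigma:\ \pi R \approx \sigma S, \qquad\qquad [\mathrm{KEY}, \mathrm{AUT}] \xrightarrow{\;\mathrm{enc}\;} \mathrm{SEC},
\]
\[
R \xrightarrow{\;\pi\;} S \ \wedge\ S \xrightarrow{\;\pi'\;} T \ \Longrightarrow\ R \xrightarrow{\;\pi'\circ\pi\;} T.
\]
The middle statement is the encryption example from the abstract (a shared secret key and an authenticated channel yield a secure channel); the last line is the composition property that game-based and attack-based definitions generally lack.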
G2C: Cryptographic Protocols from Goal-Driven Specifications
Abstract
We present G2C, a goal-driven specification language for distributed applications. This language offers support for the declarative specification of functionality goals and security properties. The former comprise the parties, their inputs, and the goal of the communication protocol. The latter comprise secrecy, access control, and anonymity requirements. A key feature of our language is that it abstracts away from how the intended functionality is achieved and instead lets the system designer concentrate on which functional features and security properties should be achieved. Our framework provides a compilation method for transforming G2C specifications into symbolic cryptographic protocols, which are shown to be optimal. We provide a technique to automatically verify the correctness and security of these protocols using ProVerif, a state-of-the-art automated theorem prover for cryptographic protocols. We have implemented a G2C compiler to demonstrate the feasibility of our approach.
Michael Backes, Matteo Maffei, Kim Pecina, Raphael M. Reischuk
Modeling Long-Term Signature Validation for Resolution of Dispute
Abstract
This paper considers the case where a dispute occurs between a verifier and a signer about the validity of a digital signature. In non-repudiation services, such a dispute may occur long after the signature's creation and approval. We present a security model for digital signature validation that incorporates the notion of dispute. The first contribution of this paper is the definition of the semantics of a Resolution of Dispute Rule (RDR) in the scope of this model. The second contribution is a calculus for reasoning about the validation of digital signatures at a particular date, which may be in the past (so-called long-term signature validation). This calculus is then used to implement the RDR. The usefulness of the calculus is demonstrated by modeling Evidence Record Syntax (ERS), one of the main protocols used in practice for long-term signature validation.
Moez Ben MBarka, Francine Krief, Olivier Ly
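As a simplified illustration (the predicates below are hypothetical and not the paper's calculus), long-term validation asks whether a signature was valid at a past date \(d\), which typically reduces to checking the signature itself together with the status of the signer's certificate at \(d\):
\[
\frac{\mathsf{verify}(\mathit{sig}, m, \mathit{cert}) \qquad \mathsf{validAt}(\mathit{cert}, d) \qquad \neg\,\mathsf{revokedAt}(\mathit{cert}, d)}{\mathsf{validSigAt}(\mathit{sig}, m, d)}
\]
Protocols such as Evidence Record Syntax (ERS) supply the time-stamps and archival evidence needed to establish such facts about dates that may lie far in the past.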
Formal Analysis of Privacy for Anonymous Location Based Services
Abstract
We propose a framework for formal analysis of privacy in location based services such as anonymous electronic toll collection. We give a formal definition of privacy, and apply it to the VPriv scheme for vehicular services. We analyse the resulting model using the ProVerif tool, concluding that our privacy property holds only if certain conditions are met by the implementation. Our analysis includes some novel features such as the formal modelling of privacy for a protocol that relies on interactive zero-knowledge proofs of knowledge and list permutations.
Morten Dahl, Stéphanie Delaune, Graham Steel
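One common way to phrase such a privacy property (sketched here in applied pi-calculus style; not necessarily the exact definition used in the paper) is as an observational equivalence in which two honest drivers swap their trips:
\[
C\big[\,A(\mathit{trip}_1) \mid B(\mathit{trip}_2)\,\big] \;\approx\; C\big[\,A(\mathit{trip}_2) \mid B(\mathit{trip}_1)\,\big]
\]
If no observer interacting with the toll-collection context \(C\) can distinguish the two scenarios, the service cannot link trips to drivers. ProVerif supports checking equivalences of this shape.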
Formal Analysis of the EMV Protocol Suite
Abstract
This paper presents a formal model of the EMV (Europay-MasterCard-Visa) protocol suite in F# and its analysis using the protocol verification tool ProVerif [5] in combination with FS2PV [4].
The formalisation covers all the major options of the EMV protocol suite, including all card authentication mechanisms and both on- and offline transactions. Some configuration parameters have to be fixed to allow any security analysis; here we follow the configuration of Dutch EMV banking cards, but the model could easily be adapted to other configurations.
As far as we know, this is the first comprehensive formal description of EMV. The convenience and expressivity of F# proved to be a crucial advantage in making the formalisation of something as complex as EMV feasible. Even though the EMV specs amount to over 700 pages, our formal model is only 370 lines of code.
Formal analysis of our model with ProVerif is still possible, though this requires some care. Our formal analysis does not reveal any new weaknesses of the EMV protocol suite, but it does reveal all the known weaknesses, as a formal analysis of course should.
Joeri de Ruiter, Erik Poll
Security Goals and Protocol Transformations
Abstract
Cryptographic protocol designers work incrementally. Having achieved some goals for confidentiality and authentication in a protocol Π1, they transform it to a richer Π2 to achieve new goals.
But do the original goals still hold? More precisely, if a goal formula Γ holds whenever Π1 runs against an adversary, does a translation of Γ hold whenever Π2 runs against it?
We prove that a transformation preserves goal formulas if a labeled transition system for analyzing Π1 simulates a portion of an lts for analyzing Π2, while preserving progress in that portion.
Thus, we examine the process of analyzing a protocol Π. We use ltss that describe our activity when analyzing Π, not that of the principals executing Π. Each analysis step considers—for an observed message reception—what earlier transmissions would explain it. The lts then contains a transition from a fragmentary execution containing the reception to a richer one containing an explaining transmission. The strand space protocol analysis tool cpsa generates some of the ltss used.
Joshua D. Guttman
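The preservation statement has roughly the following shape (a paraphrase of the abstract, not the paper's precise theorem):
\[
\big(\mathcal{L}_1 \text{ simulates the portion } \mathcal{P} \text{ of } \mathcal{L}_2, \text{ preserving progress}\big) \ \wedge\ \Pi_1 \models \Gamma \ \Longrightarrow\ \Pi_2 \models \mathit{tr}(\Gamma)
\]
where \(\mathcal{L}_1\) and \(\mathcal{L}_2\) are the labeled transition systems describing the analysis of \(\Pi_1\) and \(\Pi_2\), and \(\mathit{tr}(\Gamma)\) is the translation of the goal formula \(\Gamma\).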
Model-Checking Secure Information Flow for Multi-threaded Programs
Abstract
This paper shows how secure information flow properties of multi-threaded programs can be verified by model checking in a precise and efficient way, by using the idea of self-composition.
It discusses two properties that aim to capture secure information flow for multi-threaded programs, and it shows how these properties can be characterised in the modal μ-calculus. For this characterisation, a self-composed model of the program is constructed. More precisely, this is a model that contains two copies of the labelled transition system induced by the program, so that the program is executed in parallel with itself. The self-composed model makes it possible to compare two program executions in a single temporal formula that characterises a secure information flow property.
Both the formula and model are translated into the input language for the Concurrency Workbench model checker. We discuss this encoding, and use it for some practical experiments on several simple examples.
Marieke Huisman, Henri-Charles Blondeel
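As a sketch of the self-composition idea (generic notation, not the paper's exact encoding): the program \(P\) runs in parallel with a renamed copy \(P'\) of itself, and secure information flow becomes a property of single executions of the composed system, e.g. that equal low (public) inputs always yield equal low observations:
\[
P \parallel P'[\vec{x}'/\vec{x}\,], \qquad \vec{l} = \vec{l}' \ \Longrightarrow\ \nu X.\big(\vec{o} = \vec{o}' \wedge [-]X\big)
\]
Here \(\vec{l}, \vec{l}'\) are the low inputs and \(\vec{o}, \vec{o}'\) the low observations of the two copies, and \(\nu X.(\phi \wedge [-]X)\) is the usual μ-calculus encoding of "\(\phi\) holds in every reachable state".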
Multiple Congruence Relations, First-Order Theories on Terms, and the Frames of the Applied Pi-Calculus
Abstract
We investigate the problem of deciding first-order theories of finite trees with several distinguished congruence relations, each of them given by some equational axioms. We give an automata-based solution for the case where the different equational axiom systems are linear and variable-disjoint (this includes the case where all axioms are ground), and where the logic does not permit expressing tree relations such as \(x = f(y,z)\). We show that the problem is undecidable when these restrictions are relaxed. As motivation and application, we show how to translate the model-checking problem of \(\mathcal{A}\pi \mathcal{L}\), a spatial equational logic for the applied pi-calculus, into the validity of first-order formulas in term algebras with multiple congruence relations.
Florent Jacquemard, Étienne Lozes, Ralf Treinen, Jules Villard
Automated Code Injection Prevention for Web Applications
Abstract
We propose a new technique based on multitier compilation for preventing code injection in web applications. It consists of adding an extra stage to the client code generator which compares the dynamically generated code with the specification obtained from the syntax of the source program. No intervention from the programmer is needed. No plugin or modification of the web browser is required. The soundness and validity of the approach are proved formally by showing that the client compiler can be fully abstract. The practical interest of the approach is demonstrated by an actual implementation in the Hop environment.
Zhengqin Luo, Tamara Rezk, Manuel Serrano
Soundness of Removing Cancellation Identities in Protocol Analysis under Exclusive-OR
Abstract
In [Mil03,LM05], Millen, Lynch, and Meadows proved that, under some restrictions on messages, including identities for canceling an encryption and a decryption within the same term during analysis will be redundant, i.e., they will not lead to any new attacks that would not be found without them. In this paper, we prove that slightly modified restrictions are sufficient to safely remove those identities, even when protocols contain operators such as the notorious Exclusive-OR operator, which break the free algebra assumption with their own identities, in addition to the identities considered by Millen, Lynch, and Meadows.
Sreekanth Malladi
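For concreteness, the identities in question have the following familiar shapes (standard equations, given here only as an illustration):
\[
\mathsf{dec}(\mathsf{enc}(x, y), y) = x \qquad\qquad x \oplus x = 0, \quad x \oplus 0 = x, \quad x \oplus y = y \oplus x, \quad (x \oplus y) \oplus z = x \oplus (y \oplus z)
\]
The first is a cancellation identity of the kind shown to be removable under suitable restrictions; the remaining equations are the Exclusive-OR identities that break the free algebra assumption.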
Backmatter
Metadata
Title
Theory of Security and Applications
Edited by
Sebastian Mödersheim
Catuscia Palamidessi
Copyright year
2012
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-27375-9
Print ISBN
978-3-642-27374-2
DOI
https://doi.org/10.1007/978-3-642-27375-9