2008 | Book

Public Key Cryptography – PKC 2008

11th International Workshop on Practice and Theory in Public-Key Cryptography, Barcelona, Spain, March 9-12, 2008. Proceedings


Table of Contents

Frontmatter

Session I: Algebraic and Number Theoretical Cryptanalysis (I)

Total Break of the ℓ-IC Signature Scheme
Abstract
In this paper, we describe efficient forgery and full-key recovery attacks on the ℓ-IC signature scheme recently proposed at PKC 2007. This cryptosystem is a multivariate scheme based on a new internal quadratic primitive which avoids some drawbacks of previous multivariate schemes: the scheme is extremely fast, since it requires one exponentiation in a finite field of medium size, and the public key is shorter than in many multivariate signature schemes. Our attacks rely on the recent cryptanalytic tool developed by Dubois et al. against the SFLASH signature scheme. However, the final stage of the attacks requires Gröbner basis techniques in order to actually forge a signature (resp. recover the secret key). For the forgery attack, this is because Patarin’s attack is much more difficult to mount against ℓ-IC. The key recovery attack is also very efficient, since recovering equivalent secret keys is faster than forging.
Pierre-Alain Fouque, Gilles Macario-Rat, Ludovic Perret, Jacques Stern
Recovering NTRU Secret Key from Inversion Oracles
Abstract
We consider the NTRU encryption scheme as lately suggested for use, and study the connection between inverting the NTRU primitive (i.e., the one-way function over the message and the blinding information which underlies the NTRU scheme) and recovering the NTRU secret key (universal breaking). We model the inverting algorithms as black-box oracles and do not take any advantage of the internal ways by which the inversion works (namely, it does not have to be done by following the standard decryption algorithm). This allows for secret key recovery directly from the output on several inversion queries even in the absence of decryption failures. Our oracles might be queried on both valid and invalid challenges e; however, they are not required to reply (correctly) when their input is invalid. We show that key recovery can be reduced to inverting the NTRU function. The efficiency of the reduction highly depends on the specific values of the parameters. As a side result, we connect the collisions of the NTRU function with decryption failures, which helps us gain a deeper insight into the NTRU primitive.
Petros Mol, Moti Yung
Solving Systems of Modular Equations in One Variable: How Many RSA-Encrypted Messages Does Eve Need to Know?
Abstract
We address the problem of solving univariate modular equations with mutually co-prime moduli in polynomial time. For a given system of equations we determine up to which size the common roots can be calculated efficiently. We further determine the minimum number of equations which suffice for a recovery of all common roots. The result that we obtain is superior to Håstad’s original RSA broadcast attack, even if Håstad’s method is combined with the best known lattice technique due to Coppersmith. Namely, our reduction uses a slightly different transformation from polynomial systems to a single polynomial. Thus, our improvement is achieved by optimal polynomial modelling rather than improved lattice techniques. Moreover, we show by a counting argument that our results cannot be improved in general. A typical application for our algorithm is an improved attack on RSA with a smaller number of polynomially related messages.
Alexander May, Maike Ritzenhofen
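The baseline that this work improves upon, Håstad’s RSA broadcast attack, can be sketched in a few lines: with public exponent e = 3 and the same message sent to three recipients, the Chinese Remainder Theorem recovers m³ exactly (since m³ is smaller than the product of the moduli), and an integer cube root then yields m. This is a toy illustration of the classical attack only, not the paper’s improved reduction; the helper names `crt` and `icbrt` and the tiny moduli are ours.

```python
from functools import reduce

def crt(residues, moduli):
    # Chinese Remainder Theorem for pairwise coprime moduli:
    # returns the unique x mod prod(moduli) with x = r_i (mod m_i).
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m) = modular inverse
    return x % M

def icbrt(n):
    # floor of the integer cube root, corrected for float error
    x = round(n ** (1 / 3))
    while x ** 3 > n:
        x -= 1
    while (x + 1) ** 3 <= n:
        x += 1
    return x

# Toy moduli (products of distinct primes) and a message m small enough
# that m^3 lies below the product of the moduli, so CRT recovers m^3 exactly.
moduli = [101 * 103, 107 * 109, 113 * 127]
m = 1234
ciphertexts = [pow(m, 3, n) for n in moduli]   # e = 3 broadcast

m_cubed = crt(ciphertexts, moduli)
recovered = icbrt(m_cubed)
```

The paper’s point is that when the messages are only polynomially related (not identical), a more careful transformation to a single polynomial beats this naive combination even with Coppersmith’s lattice step.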

Session II: Theory of Public Key Encryption

Relations Among Notions of Plaintext Awareness
Abstract
We introduce a new simplified notion of plaintext awareness, which we term PA2I, and show that this is equivalent to the standard definition of PA2 plaintext awareness for encryption schemes that satisfy certain weak security and randomness requirements. We also show that PA2 plaintext awareness is equivalent to PA2+ plaintext awareness under similar security and randomness requirements. This proves a conjecture of Dent that, for suitably random public-key encryption schemes, PA2 plaintext awareness implies PA1+ plaintext awareness.
James Birkett, Alexander W. Dent
Completely Non-malleable Encryption Revisited
Abstract
Several security notions for public-key encryption schemes have been proposed so far, in particular considering the powerful adversary that can play a so called “man-in-the-middle” attack.
In this paper we extend the notion of completely non-malleable encryption introduced in [Fischlin, ICALP 05]. This notion immunizes a scheme against adversaries that can generate related ciphertexts under new public keys, and is motivated by its powerful features when encryption schemes are used as subprotocols. While [Fischlin, ICALP 05] gave only a simulation-based notion of completely non-malleable encryption with respect to CCA2 adversaries, we present new game-based definitions for completely non-malleable encryption that follow the standard separations among NM-CPA, NM-CCA1 and NM-CCA2 security given in [Bellare et al., CRYPTO 98]. This is motivated by the fact that the simplest notion we introduce (i.e., NM-CPA*) in several cases suffices for the main application that motivated the introduction of the notion of NM-CCA2* security, i.e., the design of non-malleable commitment schemes. Further, the game-based definition of NM-CPA* security actually implies the simulation-based one.
We then focus on constructing encryption schemes that satisfy these strong security notions and show: 1) an NM-CCA2* secure encryption scheme in the shared random string model; 2) an NM-CCA2* secure encryption scheme in the plain model; for this second result, we use interaction and non-black-box techniques to overcome an impossibility result.
Our results clarify the importance of these stronger notions of encryption schemes and show how to construct them without requiring random oracles.
Carmine Ventre, Ivan Visconti

Invited Talk I

Cryptographic Test Correction
Abstract
Multiple choice questionnaires (mcqs) are a widely-used assessment procedure where examinees are asked to select one or more choices from a list.
This invited talk explores the possibility of transferring a part of the mcq’s correction burden to the examinee when sophisticated technological means (e.g. optical character recognition systems) are unavailable. Evidently, such schemes must make cheating difficult or at least conspicuous.
We did not manage to devise a fully satisfactory solution (cheating strategies do exist) – but our experiments with a first clumsy system encouraged us to develop alternative mcq formats and analyze their performance and security.
Eric Levieil, David Naccache

Session III: Digital Signatures (I)

Off-Line/On-Line Signatures: Theoretical Aspects and Experimental Results
Abstract
This paper presents some theoretical and experimental results about off-line/on-line digital signatures. The goal of this type of scheme is to reduce the time used to compute a signature by means of preprocessing. Such schemes were introduced by Even, Goldreich and Micali and constructed by combining regular digital signatures with efficient one-time signatures. Later, Shamir and Tauman presented an alternative construction (which produces shorter signatures) by combining regular signatures with chameleon hash functions.
We first unify the Shamir-Tauman and Even et al. approaches by showing that they can be considered different instantiations of the same paradigm. We do this by showing that the one-time signatures needed in the Even et al. approach only need to satisfy a weak notion of security. We then show that chameleon hash functions are in effect a type of one-time signature which satisfies this weaker security notion.
In the process we study the relationship between one-time signatures and chameleon hashing, and we prove that a special type of chameleon hashing (which we call two-trapdoor) is a fully secure one-time signature.
Finally, we ran experimental tests using the OpenSSL libraries to measure the difference between the two approaches. In our implementation we make extensive use of the observation that off-line/on-line digital signatures do not require collision-resistant hash functions to compress the message, but can be safely implemented with universal one-way hashing in both the off-line and the on-line step. The main consequence of this observation is that both steps can be applied to shorter digests. This has particular relevance if block-cipher- or hash-function-based one-time signatures are used, since these are very sensitive to the length of the message. Interestingly, we show that (mostly due to the above observation about hashing) the two approaches are comparable in efficiency and signature length.
Dario Catalano, Mario Di Raimondo, Dario Fiore, Rosario Gennaro
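The Even-Goldreich-Micali paradigm discussed above can be illustrated with a minimal sketch: the expensive off-line phase generates and certifies a fresh one-time key, while the on-line phase only performs a fast one-time signature on the actual message. We use a standard Lamport one-time signature; the long-term "regular" signature is represented here by an HMAC placeholder purely for self-containment (any EUF-CMA signature scheme would take its place), and all names are ours, not the paper’s.

```python
import hashlib
import hmac
import os

H = lambda b: hashlib.sha256(b).digest()

def ots_keygen():
    # Lamport one-time key: one secret 32-byte pair per digest bit
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def ots_sign(sk, msg):
    # reveal one preimage per bit of the message digest
    return [sk[i][b] for i, b in enumerate(digest_bits(msg))]

def ots_verify(pk, msg, sig):
    return all(H(s) == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, digest_bits(msg))))

# Off-line phase: certify a fresh one-time public key with the long-term key.
# (The HMAC is a stand-in for a regular signature, for illustration only.)
long_term_key = os.urandom(32)
sk, pk = ots_keygen()
pk_bytes = b"".join(a + b for a, b in pk)
certificate = hmac.new(long_term_key, pk_bytes, hashlib.sha256).digest()

# On-line phase: only the fast one-time signature touches the message.
msg = b"pay 100 to Bob"
online_sig = ots_sign(sk, msg)
assert ots_verify(pk, msg, online_sig)
```

The paper’s unification then observes that the chameleon-hash construction of Shamir-Tauman can be viewed as instantiating exactly this one-time-signature slot with a weakly secure primitive.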
Construction of Universal Designated-Verifier Signatures and Identity-Based Signatures from Standard Signatures
Abstract
We give a generic construction for universal designated-verifier signature schemes from a large class, ℂ, of signature schemes. The resulting schemes are efficient and have two important properties. Firstly, they are provably DV-unforgeable, non-transferable and also non-delegatable. Secondly, the signer and the designated verifier can independently choose their cryptographic settings. We also propose a generic construction for identity-based signature schemes from any signature scheme in ℂ and prove that the construction is secure against adaptive chosen message and identity attacks. We discuss possible extensions of our constructions to universal multi-designated-verifier signatures, hierarchical identity-based signatures, identity-based universal designated verifier signatures, and identity-based ring signatures from any signature in ℂ.
Siamak F. Shahandashti, Reihaneh Safavi-Naini
Proxy Signatures Secure Against Proxy Key Exposure
Abstract
We provide an enhanced security model for proxy signatures that captures a more realistic set of attacks than the previous models of Boldyreva et al. and of Malkin et al. Our model is motivated by concrete attacks on existing schemes in scenarios in which proxy signatures are likely to be used. We provide a generic construction for proxy signatures secure in our enhanced model using sequential aggregate signatures; our construction provides a benchmark by which future specific constructions may be judged. Finally, we consider the extension of our model and constructions to the identity-based setting.
Jacob C. N. Schuldt, Kanta Matsuura, Kenneth G. Paterson

Session IV: Identification, Broadcast and Key Agreement

Lattice-Based Identification Schemes Secure Under Active Attacks
Abstract
There is an inherent difficulty in building 3-move ID schemes based on combinatorial problems without much algebraic structure. A consequence of this is that most standard ID schemes today are based on the hardness of number-theoretic problems. Not having schemes based on alternative assumptions is a cause for concern, since improved number-theoretic algorithms or the realization of quantum computing would make the known schemes insecure. In this work, we examine the possibility of creating identification protocols based on the hardness of lattice problems. We construct a 3-move identification scheme whose security is based on the worst-case hardness of the shortest vector problem in all lattices, and also present a more efficient version based on the hardness of the same problem in ideal lattices.
Vadim Lyubashevsky
Efficient Simultaneous Broadcast
Abstract
We present an efficient simultaneous broadcast protocol ν-SimCast that allows n players to announce independently chosen values, even if up to \(t < \frac{n}{2}\) players are corrupt. Independence is guaranteed in the partially synchronous communication model, where communication is structured into rounds, while each round is asynchronous. The ν-SimCast protocol is more efficient than previous constructions. For repeated executions, we reduce the communication and computation complexity by a factor \(\mathcal{O} (n)\). Combined with a deterministic extractor, ν-SimCast provides a particularly efficient solution for distributed coin-flipping. The protocol does not require any zero-knowledge proofs and is shown to be secure in the standard model under the Decisional Diffie Hellman assumption.
Sebastian Faust, Emilia Käsper, Stefan Lucks
SAS-Based Group Authentication and Key Agreement Protocols
Abstract
New trends in consumer electronics have created a strong demand for fast, reliable and user-friendly key agreement protocols. However, many key agreement protocols are secure only against passive attacks. Therefore, message authentication is often unavoidable in order to achieve security against active adversaries. Pasini and Vaudenay were the first to propose a new compelling methodology for message authentication. Namely, their two-party protocol uses short authenticated strings (SAS) instead of pre-shared secrets or public-key infrastructure that are classical tools to achieve authenticity. In this article, we generalise this methodology for multi-party settings. We give a new group message authentication protocol that utilises only limited authenticated communication and show how to combine this protocol with classical key agreement procedures. More precisely, we describe how to transform any group key agreement protocol that is secure against passive attacks into a new protocol that is secure against active attacks.
Sven Laur, Sylvain Pasini

Session V: Implementation of Fast Arithmetic

An Optimized Hardware Architecture for the Montgomery Multiplication Algorithm
Abstract
Montgomery modular multiplication is one of the fundamental operations used in cryptographic algorithms, such as RSA and Elliptic Curve Cryptosystems. At CHES 1999, Tenca and Koç introduced a now-classical architecture for implementing Montgomery multiplication in hardware. With parameters optimized for minimum latency, this architecture performs a single Montgomery multiplication in approximately 2n clock cycles, where n is the size of the operands in bits. In this paper we propose and discuss an optimized hardware architecture performing the same operation in approximately n clock cycles with almost the same clock period. Our architecture is based on pre-computing partial results using two possible assumptions regarding the most significant bit of the previous word, and is only marginally more demanding in terms of circuit area. The new radix-2 architecture can be extended to the radix-4 case, while preserving a factor-of-two speed-up over the corresponding radix-4 design by Tenca, Todorov, and Koç from CHES 2001. Our architecture has been verified by modeling it in Verilog-HDL, implementing it on a Xilinx Virtex-II 6000 FPGA, and experimentally testing it on the SRC-6 reconfigurable computer.
Miaoqing Huang, Kris Gaj, Soonhak Kwon, Tarek El-Ghazawi
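The operation that the hardware above pipelines is radix-2 (bit-serial) Montgomery multiplication, which replaces the trial division of modular reduction with n conditional additions and right shifts. A minimal software model of the standard algorithm (not the paper’s optimized architecture) looks like this:

```python
def mont_mul(a, b, n, nbits):
    # Radix-2 (bit-serial) Montgomery multiplication for odd modulus n:
    # returns a * b * 2^(-nbits) mod n, one addition/shift per operand bit.
    s = 0
    for i in range(nbits):
        s += ((a >> i) & 1) * b   # conditionally add b for bit i of a
        if s & 1:                 # make s divisible by 2 by adding n
            s += n
        s >>= 1                   # exact division by 2
    return s - n if s >= n else s

# Sanity check against the definition: result = a*b*R^(-1) mod n, R = 2^nbits
a, b, n, nbits = 7, 11, 13, 4
expected = a * b * pow(1 << nbits, -1, n) % n
assert mont_mul(a, b, n, nbits) == expected
```

Each loop iteration corresponds to one clock cycle of a bit-serial datapath; the architectural contribution of the paper is pre-computing both possible partial results so that two such iterations effectively complete per cycle.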
New Composite Operations and Precomputation Scheme for Elliptic Curve Cryptosystems over Prime Fields
Abstract
We present a new methodology to derive faster composite operations of the form dP + Q, where d is a small integer ≥ 2, for generic ECC scalar multiplications over prime fields. In particular, we present an efficient Doubling-Addition (DA) operation that can be exploited to accelerate most scalar multiplication methods, including multiscalar variants. We also present a new precomputation scheme useful for window-based scalar multiplication that is shown to achieve the lowest cost among all known methods using only one inversion. In comparison to the remaining approaches that use none or several inversions, our scheme offers higher performance for most common I/M ratios. By combining the benefits of our precomputation scheme and the new DA operation, we can save up to 6.2% on the scalar multiplication using fractional wNAF.
Patrick Longa, Ali Miri
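For context on the scalar recodings mentioned above, the (fractional) wNAF family builds on the basic non-adjacent form, which rewrites a scalar with signed digits so that no two adjacent digits are nonzero, reducing the number of point additions. A minimal sketch of plain NAF recoding (an assumption-free textbook routine, not the paper’s fractional-wNAF or DA formulas):

```python
def naf(k):
    # Non-adjacent form of k > 0: signed digits in {-1, 0, 1}, least
    # significant first, with no two adjacent nonzero digits.
    digits = []
    while k:
        if k & 1:
            d = 2 - (k % 4)   # +1 if k = 1 (mod 4), -1 if k = 3 (mod 4)
            k -= d            # now k is divisible by 4 or by 2
        else:
            d = 0
        digits.append(d)
        k //= 2
    return digits

# 7 = 8 - 1, so the NAF digits (LSB first) are [-1, 0, 0, 1]
assert sum(d << i for i, d in enumerate(naf(7))) == 7
```

Window-based methods then consume several NAF digits at a time against a table of precomputed multiples, which is exactly where the paper’s one-inversion precomputation scheme and composite dP + Q operations pay off.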

Session VI: Digital Signatures (II)

Online-Untransferable Signatures
Abstract
Non-transferability of digital signatures is an important security concern, traditionally achieved via interactive verification protocols. Such protocols, however, are vulnerable to “online transfer attacks” — i.e., attacks mounted during the protocols’ executions.
In this paper, we show how to guarantee online untransferability of signatures, via a reasonable public-key infrastructure and general assumptions, without random oracles. Our untransferable signatures are as efficient as prior ones that provably provide weaker types of untransferability.
Moses Liskov, Silvio Micali
Security of Digital Signature Schemes in Weakened Random Oracle Models
Abstract
We formalize the notion of several weakened random oracle models in order to capture which property of a hash function is crucial to prove the security of a cryptographic scheme. In particular, we focus on augmenting the random oracle with additional oracles that respectively return collisions, second-preimages, and first-preimages. We study the security of the full domain hash signature scheme, as well as three variants thereof in the weakened random oracle models, leading to a separation result.
Akira Numayama, Toshiyuki Isshiki, Keisuke Tanaka
A Digital Signature Scheme Based on CVP∞
Abstract
In Crypto 1997, Goldreich, Goldwasser and Halevi (GGH) proposed a lattice analogue of the McEliece public key cryptosystem, whose security is related to the hardness of approximating the closest vector problem (CVP) in a lattice. Furthermore, they also described how to use the same principle of their encryption scheme to provide a signature scheme. In practice, this cryptosystem uses the Euclidean norm (ℓ2-norm), which has been used in many algorithms based on lattice theory. Nonetheless, many drawbacks have been identified that could lead to cryptanalysis of the scheme. In this paper, we present a novel method of reducing a vector under the ℓ∞-norm and propose a digital signature scheme based on it. Our scheme takes advantage of the ℓ∞-norm to increase the resistance of the GGH scheme and to decrease the signature length. Furthermore, after some other improvements, we obtain a very efficient signature scheme that trades off security level, speed and space.
Thomas Plantard, Willy Susilo, Khin Than Win

Session VII: Algebraic and Number Theoretical Cryptanalysis (II)

An Analysis of the Vector Decomposition Problem
Abstract
The vector decomposition problem (VDP) has been proposed as a computational problem on which to base the security of public key cryptosystems. We give a generalisation and simplification of the results of Yoshida on the VDP. We then show that, for the supersingular elliptic curves which can be used in practice, the VDP is equivalent to the computational Diffie-Hellman problem (CDH) in a cyclic group. For the broader class of pairing-friendly elliptic curves we relate VDP to various co-CDH problems and also to a generalised discrete logarithm problem 2-DL which in turn is often related to discrete logarithm problems in cyclic groups.
Steven D. Galbraith, Eric R. Verheul
A Parameterized Splitting System and Its Application to the Discrete Logarithm Problem with Low Hamming Weight Product Exponents
Abstract
A low Hamming weight product (LHWP) exponent is used to increase the efficiency of cryptosystems based on the discrete logarithm problem (DLP). In this paper, we introduce a new tool, called a Parameterized Splitting System, to analyze the security of the DLP with LHWP exponents.
We apply a parameterized splitting system to attack the GPS identification scheme modified by Coron, Lefranc and Poupard at CHES ’05 and obtain an algorithm of 2^61.6 time complexity, where the expected security was 2^78. A parameterized splitting system can also be used to solve the DLP with an LHWP exponent proposed by Hoffstein and Silverman in 2^54.51 time complexity, which is smaller than the 2^59 of the recent Cheon-Kim attack.
Sungwook Kim, Jung Hee Cheon
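The splitting idea underlying such attacks can be shown with a toy meet-in-the-middle for a DLP whose exponent has known low Hamming weight: split the bit positions into two halves, tabulate g raised to all low-half patterns, and match against h divided by the high-half candidates. This is only the classical baseline the paper’s parameterized splitting system refines; the function name and tiny parameters are ours.

```python
from itertools import combinations

def low_weight_dlog(g, h, p, nbits, t):
    # Meet-in-the-middle for g^x = h (mod p) with x < 2^nbits of Hamming
    # weight exactly t: tabulate g^x_lo over low-half bit patterns of
    # weight t1, then match h * g^(-x_hi) over high-half patterns.
    lo = range(nbits // 2)
    hi = range(nbits // 2, nbits)
    for t1 in range(t + 1):
        table = {}
        for pos in combinations(lo, t1):
            e = sum(1 << i for i in pos)
            table[pow(g, e, p)] = e
        for pos in combinations(hi, t - t1):
            e = sum(1 << i for i in pos)
            y = h * pow(pow(g, e, p), -1, p) % p
            if y in table:
                return table[y] + e   # disjoint bit positions, so + is OR
    return None

p, g = 1000003, 2                                  # small prime, toy group
x = (1 << 3) | (1 << 7) | (1 << 12) | (1 << 18)    # Hamming weight 4
h = pow(g, x, p)
assert pow(g, low_weight_dlog(g, h, p, 20, 4), p) == h
```

The cost is roughly the square root of exhaustively enumerating all weight-t exponents; the paper’s contribution is a parameterized splitting that tunes this trade-off for product-form (LHWP) exponents.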

Session VIII: Public Key Encryption

Certificateless Encryption Schemes Strongly Secure in the Standard Model
Abstract
This paper presents the first constructions for certificateless encryption (CLE) schemes that are provably secure against strong adversaries in the standard model. It includes both a generic construction for a strongly secure CLE scheme from any passively secure scheme as well as a concrete construction based on the Waters identity-based encryption scheme.
Alexander W. Dent, Benoît Libert, Kenneth G. Paterson
Unidirectional Chosen-Ciphertext Secure Proxy Re-encryption
Abstract
In 1998, Blaze, Bleumer, and Strauss proposed a cryptographic primitive called proxy re-encryption, in which a proxy transforms – without seeing the corresponding plaintext – a ciphertext computed under Alice’s public key into one that can be opened using Bob’s secret key. Recently, an appropriate definition of chosen-ciphertext security and a construction fitting this model were put forth by Canetti and Hohenberger. Their system is bidirectional: the information released to divert ciphertexts from Alice to Bob can also be used to translate ciphertexts in the opposite direction. In this paper, we present the first construction of a unidirectional proxy re-encryption scheme with chosen-ciphertext security in the standard model (i.e. without relying on the random oracle idealization), which solves a problem left open at CCS’07. Our construction is efficient and requires a reasonable complexity assumption in bilinear map groups. Like the Canetti-Hohenberger scheme, it ensures security according to a relaxed definition of chosen-ciphertext security introduced by Canetti, Krawczyk and Nielsen.
Benoît Libert, Damien Vergnaud
Public Key Broadcast Encryption with Low Number of Keys and Constant Decryption Time
Abstract
In this paper we propose three public key BE schemes that have efficient complexity measures. The first scheme, called the BE-PI scheme, has O(r) header size, O(1) public keys and O(log N) private keys per user, where r is the number of revoked users. This is the first public key BE scheme that has both public and private keys under O(log N) while the header size is O(r). These complexity measures match those of efficient secret key BE schemes.
Our second scheme, called the PK-SD-PI scheme, has O(r) header size, O(1) public key and O(log^2 N) private keys per user. These match the complexity measures of the SD scheme. Nevertheless, the decryption time is remarkably O(1). This is the first public key BE scheme that has O(1) decryption time while the other complexity measures are kept low. The third scheme, called the PK-LSD-PI scheme, is constructed in the same way, but based on the LSD method. It has O(r/ε) ciphertext size and O(log^(1+ε) N) private keys per user, where 0 < ε < 1. The decryption time is also O(1).
Our basic schemes are one-way secure against full collusion of revoked users in the random oracle model under the BDH assumption. We can modify our schemes to achieve indistinguishability security against adaptive chosen-ciphertext attacks.
Yi-Ru Liu, Wen-Guey Tzeng
Backmatter
Metadata
Title
Public Key Cryptography – PKC 2008
Edited by
Ronald Cramer
Copyright Year
2008
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-78440-1
Print ISBN
978-3-540-78439-5
DOI
https://doi.org/10.1007/978-3-540-78440-1