
1990 | Book

Advances in Cryptology — CRYPTO’ 88

Proceedings

Edited by: Shafi Goldwasser

Publisher: Springer New York

Book series: Lecture Notes in Computer Science


Table of Contents

Frontmatter

Cryptographic Primitives

Weakening Security Assumptions and Oblivious Transfer
(Abstract)

Our work is motivated by a recent trend in cryptographic research. Protocol problems that have previously been solved subject to intractability assumptions are now being solved without these assumptions. Examples of this trend include a new completeness theorem for multiparty protocols [BGW,CCD], and a protocol for Byzantine agreement using private channels [FM]. These breakthroughs illustrate both the strengths and the weaknesses of using the cryptographic model. Devising first a protocol that uses cryptographic assumptions can give powerful intuition that later allows one to create a protocol that works without assumptions. However, there is a danger that the cryptographic assumptions one uses can become inextricably bound up in the protocol. It may take years before these assumptions can be ironed out of the final protocol.

Claude Crépeau, Joe Kilian
Limits on the Provable Consequences of One-way Permutations

We present strong evidence that the implication, “if one-way permutations exist, then secure secret key agreement is possible”, is not provable by standard techniques. Since both sides of this implication are widely believed true in real life, to show that the implication is false requires a new model. We consider a world where all parties have access to a black box for a randomly selected permutation. Being totally random, this permutation will be strongly one-way in a provable, information-theoretic way. We show that, if P = NP, no protocol for secret key agreement is secure in such a setting. Thus, to prove that a secret key agreement protocol which uses a one-way permutation as a black box is secure is as hard as proving P ≠ NP. We also obtain, as a corollary, that there is an oracle relative to which the implication is false, i.e., there is a one-way permutation, yet secret-exchange is impossible. Thus, no technique which relativizes can prove that secret exchange can be based on any one-way permutation. Our results present a general framework for proving statements of the form, “Cryptographic application X is not likely possible based solely on complexity assumption Y.”

Russell Impagliazzo, Steven Rudich
Generalized Secret Sharing and Monotone Functions

Secret sharing from the perspective of threshold schemes has been well-studied over the past decade. Threshold schemes, however, can only handle a small fraction of the secret sharing functions which we may wish to form. For example, if it is desirable to divide a secret among four participants A, B, C, and D in such a way that either A together with B can reconstruct the secret or C together with D can reconstruct the secret, then threshold schemes (even with weighting) are provably insufficient. This paper will present general methods for constructing secret sharing schemes for any given secret sharing function. There is a natural correspondence between the set of “generalized” secret sharing functions and the set of monotone functions, and tools developed for simplifying the latter set can be applied equally well to the former set.

Josh Benaloh, Jerry Leichter
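The four-participant example above can be realized by a simple additive construction in the monotone spirit of the paper. The following sketch (illustrative modulus and names, not the authors' exact scheme) shares a secret for the access structure (A and B) or (C and D):

```python
import secrets

P = 2**61 - 1  # a prime modulus; an illustrative choice

def share(secret):
    # One additive split per minimal qualified set: {A, B} and {C, D}.
    r1 = secrets.randbelow(P)
    r2 = secrets.randbelow(P)
    return {
        "A": r1, "B": (secret - r1) % P,   # A and B together reconstruct
        "C": r2, "D": (secret - r2) % P,   # C and D together reconstruct
    }

def reconstruct(shares, pair):
    return sum(shares[p] for p in pair) % P

shares = share(123456789)
assert reconstruct(shares, ("A", "B")) == 123456789
assert reconstruct(shares, ("C", "D")) == 123456789
```

A mixed pair such as (A, C) holds two independent random values and learns nothing about the secret, which is the point of matching the scheme to the monotone access function rather than to a threshold.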

Zero-Knowledge

Everything Provable is Provable in Zero-Knowledge

Assuming the existence of a secure probabilistic encryption scheme, we show that every language that admits an interactive proof admits a (computational) zero-knowledge interactive proof. This result extends the result of Goldreich, Micali and Wigderson, that, under the same assumption, all of NP admits zero-knowledge interactive proofs. Assuming envelopes for bit commitment, we show that every language that admits an interactive proof admits a perfect zero-knowledge interactive proof.

Michael Ben-Or, Oded Goldreich, Shafi Goldwasser, Johan Håstad, Joe Kilian, Silvio Micali, Phillip Rogaway
A Perfect Zero-Knowledge Proof for a Problem Equivalent to Discrete Logarithm

An interactive proof is called perfect zero-knowledge if the probability distribution generated by any probabilistic polynomial-time verifier interacting with the prover on input a theorem φ, can be generated by another probabilistic polynomial-time machine which only gets φ as input (and interacts with nobody!). In this paper we present a perfect zero-knowledge proof system for a decision problem which is computationally equivalent to the Discrete Logarithm Problem. In doing so we provide additional evidence for the belief that perfect zero-knowledge proofs exist in a non-trivial manner (i.e., for languages not in BPP). Our results extend to the logarithm problem in any finite Abelian group.

Oded Goldreich, Eyal Kushilevitz
Zero-Knowledge With Finite State Verifiers
(Extended Abstract)

We initiate an investigation of interactive proof systems (IPS’s) and zero knowledge interactive proof systems where the verifier is a 2-way probabilistic finite state automaton (2pfa). Among other results, we show:

1. There is a class of 2pfa verifiers and a language L such that L has a zero knowledge IPS with respect to this class of verifiers, and L cannot be recognized by any verifier in the class on its own;
2. There is a language L such that L has an IPS with 2pfa verifiers but L has no zero knowledge IPS with 2pfa verifiers.

Cynthia Dwork, Larry Stockmeyer

Number Theory

Intractable Problems in Number Theory

This paper surveys computational problems related to integer factorization and the calculation of discrete logarithms in various groups. Its aim is to provide theory sufficient for the derivation of heuristic running time estimates, and at the same time introduce algorithms of practical value.

Eric Bach
A Family of Jacobians Suitable for Discrete Log Cryptosystems

We investigate the Jacobians of the hyperelliptic curves v^2 + v = u^(2g+1) over finite fields, and discuss which are likely to have “almost prime” order.

Neal Koblitz
Computation of Approximate L-th Roots Modulo n and Application to Cryptography

The goal of this paper is to give a unified view of various known results (apparently unrelated) about numbers arising in crypto schemes such as RSA, by considering them as variants of the computation of approximate L-th roots modulo n. Here one may be interested in a number whose L-th power is “close” to a given number, or in finding a number that is “close” to its exact L-th root. The paper collects numerous algorithms which solve problems of this type.

Marc Girault, Philippe Toffin, Brigitte Vallée

Cryptanalysis

On the McEliece Public-Key Cryptosystem

Based on an idea by Hin, the method of obtaining the original message after selecting k of n coordinates at random in the McEliece public-key cryptosystem is improved. The attack, which is more efficient than the attacks previously proposed, is characterized by a systematic method of checking and by a random bit-swapping procedure. An optimization procedure similar to the one proposed by Lee and Brickell is used to improve the attack. The attack is highly suitable for parallel and pipelined implementation. The work factor and the parameter values which yield ‘maximum’ security for the system are given. It is shown that the public key can be reduced to k × (n − k) bits.

Johan van Tilburg
A Constraint Satisfaction Algorithm for the Automated Decryption of Simple Substitution Ciphers

This paper describes a systematic procedure for decrypting simple substitution ciphers with word divisions. The algorithm employs an exhaustive search in a large on-line dictionary for words that satisfy constraints on word length, letter position and letter multiplicity. The method does not rely on statistical or semantical properties of English, nor does it use any language-specific heuristics. The system is, in fact, language independent in the sense that it would work equally well over any language for which a sufficiently large dictionary exists on-line. To reduce the potentially high cost of locating all words that contain specified patterns, the dictionary is compiled into a database from which groups of words that satisfy simple constraints may be accessed simultaneously. The algorithm (using a relatively small dictionary of 19,000 entries) has been implemented in Franz Lisp on a Vax 11/780 computer running 4.3 BSD Unix. The system is frequently successful in a completely automated mode — preliminary testing indicates about a 60% success rate, usually in less than three minutes of CPU time. If it fails, there exist interactive facilities, permitting the user to guide the search manually, that perform very well with minor human intervention.

Michael Lucks

Pseudorandomness

On the Existence of Pseudorandom Generators

Pseudorandom generators (suggested and developed by Blum and Micali and Yao) are efficient deterministic programs that expand a randomly selected k-bit seed into a much longer pseudorandom bit sequence which is indistinguishable in polynomial time from an (equally long) sequence of unbiased coin tosses. Pseudorandom generators are known to exist assuming the existence of functions that cannot be efficiently inverted on the distributions induced by applying the function iteratively polynomially many times. This sufficient condition is also a necessary one, but it seems difficult to check whether particular functions, assumed to be one-way, are also one-way on their iterates. This raises the fundamental question whether the mere existence of one-way functions suffices for the construction of pseudorandom generators. In this paper we present progress towards resolving this question. We consider regular functions, in which every image of a k-bit string has the same number of preimages of length k. We show that if a regular function is one-way then pseudorandom generators do exist. In particular, assuming the intractability of general factoring, we can now prove that pseudorandom generators do exist. Other applications are the construction of pseudorandom generators based on the conjectured intractability of decoding random linear codes, and on the assumed average-case difficulty of combinatorial problems such as subset-sum.

Oded Goldreich, Hugo Krawczyk, Michael Luby
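As a concrete point of reference for generators whose security rests on factoring (one of the assumptions mentioned in the abstract), here is a toy sketch of the Blum-Blum-Shub squaring generator. It is not the paper's construction, and the tiny primes are purely illustrative:

```python
# Illustrative sketch, not the paper's construction: the Blum-Blum-Shub
# generator, whose security rests on the hardness of factoring N = p*q.
# Toy primes (both congruent to 3 mod 4); real use needs large secret primes.
def bbs_bits(seed, n, p=10007, q=1019):
    N = p * q
    x = pow(seed, 2, N)          # square the seed into the quadratic residues
    out = []
    for _ in range(n):
        x = pow(x, 2, N)         # iterate x -> x^2 mod N
        out.append(x & 1)        # output the least significant bit
    return out

print(bbs_bits(123456, 16))
```

The seed must be coprime to N; with the factorization secret, predicting the next bit is as hard as deciding quadratic residuosity modulo N.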
On The Randomness of Legendre and Jacobi Sequences

Most of the work done in cryptography in the last few years depends on the hardness of a few specific number-theoretic problems, such as factoring and discrete log. Since no one has so far been able to prove that these problems are genuinely hard, it is clearly of interest to find new candidates for hard problems. In this paper, we propose such a new candidate problem, namely the problem of predicting a sequence of consecutive Legendre (Jacobi) symbols modulo a prime (composite), when the starting point and possibly also the prime is unknown. Clearly, if this problem turns out to be hard, it can be used directly to construct a cryptographically strong pseudorandom bit generator. Its complexity seems to be unrelated to any of the well-known number-theoretic problems, whence it may be able to survive the discovery of fast factoring or discrete log algorithms. Although the randomness of Legendre sequences has been part of the folklore in number theory at least since the thirties, they have apparently not been considered for use in cryptography before.

Ivan Bjerre Damgård
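A Legendre sequence of the kind proposed above can be computed directly from Euler's criterion. This sketch (with an illustrative prime and starting point) shows one way of mapping the symbols to bits:

```python
# Hypothetical illustration of generating a Legendre sequence mod a prime p.
# The proposal above is that such a sequence, with secret starting point
# (and possibly secret p), is hard to predict.
def legendre(a, p):
    # Euler's criterion: a^((p-1)/2) mod p is 1 (residue), p-1 (non-residue), 0.
    t = pow(a, (p - 1) // 2, p)
    return -1 if t == p - 1 else t

def legendre_bits(p, start, n):
    # Map symbol +1 -> 0 and -1 -> 1; skip multiples of p (symbol 0).
    bits = []
    a = start
    while len(bits) < n:
        s = legendre(a, p)
        if s != 0:
            bits.append(0 if s == 1 else 1)
        a += 1
    return bits

print(legendre_bits(1000003, 12345, 16))
```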
Efficient, Perfect Random Number Generators

We describe a method that transforms every perfect random number generator into one that can be accelerated by parallel evaluation. Our method of parallelization is perfect: m parallel processors speed the generation of pseudo-random bits by a factor of m, and these parallel processors need not communicate. Using sufficiently many parallel processors we can generate pseudo-random bits at nearly any speed. These parallel generators enable fast retrieval of substrings of very long pseudo-random strings. Individual bits of pseudo-random strings of length 10^20 can be accessed within a few seconds. We improve and extend the RSA random number generator to a polynomial generator that is almost as efficient as the linear congruential generator. We question the existence of polynomial random number generators that are perfect and use a prime modulus.

S. Micali, C. P. Schnorr

Signatures and Authentication

How To Sign Given Any Trapdoor Function
(extended abstract)

We present a digital signature scheme based on trapdoor permutations. This scheme is secure against existential forgery under adaptive chosen message attack. The only previous scheme with the same level of security was based on factoring. Although the main thrust of our work is the question of reduced assumptions, we believe that the scheme itself is of some independent interest. We mention improvements on the basic scheme which lead to a memoryless and more efficient version.

Mihir Bellare, Silvio Micali
A “Paradoxical” Identity-Based Signature Scheme Resulting from Zero-Knowledge

At EUROCRYPT’88, we introduced an interactive zero-knowledge protocol (Guillou and Quisquater [13]) fitted to the authentication of tamper-resistant devices (e.g. smart cards, Guillou and Ugon [14]).

Each security device stores its secret authentication number, an RSA-like signature computed by an authority from the device identity. Any transaction between a tamper-resistant security device and a verifier is limited to a unique interaction: the device sends its identity and a random test number; then the verifier sends a random large question; and finally the device answers with a witness number. The transaction is successful when the test number is reconstructed from the witness number, the question and the identity, according to numbers published by the authority and rules of redundancy, possibly standardized.

This protocol allows a cooperation between users in such a way that a group of cooperative users looks like a new entity, having a shadowed identity which is the product of the individual shadowed identities, while each member reveals nothing about its secret.

In another scenario, the secret is partitioned between distinct devices sharing the same identity. A group of cooperative users then looks like a unique user having a larger public exponent, the least common multiple of the individual exponents.

In this paper, additional features are introduced in order to provide: firstly, a mutual interactive authentication of both communicating entities and previously exchanged messages, and, secondly, a digital signature of messages, with a non-interactive zero-knowledge protocol. The problem of multiple signature is solved here in a very smart way thanks to the possibilities of cooperation between users.

The only secret key is the set of factors of the composite number chosen by the authority delivering one authentication number to each smart card. This key is not known by the user. At the user level, such a scheme may be considered as a keyless identity-based integrity scheme. This integrity has a new and important property: it cannot be misused, i.e. derived into a confidentiality scheme.

Louis Claude Guillou, Jean-Jacques Quisquater
A Modification of the Fiat-Shamir Scheme

Fiat-Shamir’s identification and signature scheme is efficient as well as provably secure, but it has a problem in that the transmitted information size and memory size cannot simultaneously be small. This paper proposes an identification and signature scheme which overcomes this problem. Our scheme is based on the difficulty of extracting L-th roots mod n (e.g., L = 2 to 10^20) when the factors of n are unknown. We define some variations of no transferable information and prove that the sequential version of our scheme is a zero-knowledge interactive proof system, and that our parallel version satisfies these variations of no transferable information under some conditions. The speed of our scheme’s typical implementation is at least one order of magnitude faster than that of the RSA scheme and is relatively slow in comparison with that of the Fiat-Shamir scheme.

Kazuo Ohta, Tatsuaki Okamoto
An Improvement of the Fiat-Shamir Identification and Signature Scheme

In 1986 Fiat and Shamir exhibited zero-knowledge based identification and digital signature schemes which require only 10 to 30 modular multiplications per party. In this paper we describe an improvement of this scheme which reduces the verifier’s complexity to less than 2 modular multiplications and leaves the prover’s complexity unchanged. The new variant is particularly useful when a central computer has to verify in real time signed messages from thousands of remote terminals, or when the same signature has to be repeatedly verified.

Silvio Micali, Adi Shamir

On the Theory of Security I

A Basic Theory of Public and Private Cryptosystems

Not since the early work of [DH], [RSA], and [GM] has there been a great deal of work on the basic definition of “normal” cryptography, and on what it means for a cryptosystem to be secure. By normal cryptography, I mean not protocols to accomplish sophisticated goals, but merely the situation where party A wishes to send a message to party B over a line which is being tapped. Existing definitions of such a system, when they aren’t too vague, are overly restrictive; existing definitions of security of such systems, when given rigorously, are usually overly liberal. In this paper I’ll present what seem to me to be the proper definitions, give statements of the basic theorems I know about these definitions, and raise some very fundamental open questions. Most of the definitions and results appeared in [R].

Charles Rackoff
Proving Security Against Chosen Ciphertext Attacks

The relevance of zero knowledge to cryptography has become apparent in recent years. In this paper we advance this theory by showing that interaction in any zero-knowledge proof can be replaced by sharing a common, short, random string. This advance finds immediate application in the construction of the first public-key cryptosystem secure against chosen ciphertext attack. Our solution, though not yet practical, is of theoretical significance, since the existence of cryptosystems secure against chosen ciphertext attack has been a famous long-standing open problem in the field.

Manuel Blum, Paul Feldman, Silvio Micali
Non-Interactive Zero-Knowledge with Preprocessing

Non-Interactive Zero-Knowledge Proof Systems have been proven to exist under a specific complexity assumption; namely, under the Quadratic Residuosity Assumption, which gives rise to a specific secure probabilistic encryption scheme. In this paper we prove that the existence of any secure probabilistic encryption scheme, actually any one-way encryption scheme, is enough for Non-Interactive Zero-Knowledge in a modified model. That is, we show that the ability to prove a randomly chosen theorem allows one to subsequently prove non-interactively and in Zero-Knowledge any smaller-size theorem whose proof is discovered.

Alfredo De Santis, Silvio Micali, Giuseppe Persiano

On the Theory of Security II

The Noisy Oracle Problem

We describe a model in which a computationally bounded verifier consults with a computationally unbounded oracle, in the presence of malicious faults on the communication lines. We require a fairness condition which in essence says that some of the oracle’s messages arrive uncorrupted. We show that a deterministic polynomial-time verifier can test membership in any language in PSPACE, but cannot test membership in languages not in PSPACE, even if he is allowed to toss random coins in private. We discuss the zero knowledge aspects of our model, and demonstrate zero knowledge tests of membership for any language in PSPACE.

U. Feige, A. Shamir, M. Tennenholtz
On Generating Solved Instances of Computational Problems

We consider the efficient generation of solved instances of computational problems. In particular, we consider invulnerable generators. Let S be a subset of {0,1}* and M be a Turing Machine that accepts S; an accepting computation w of M on input x is called a “witness” that x ∈ S. Informally, a program is an α-invulnerable generator if, on input 1^n, it produces instance-witness pairs <x, w>, with |x| = n, according to a distribution under which any polynomial-time adversary who is given x fails to find a witness that x ∈ S, with probability at least α, for infinitely many lengths n.

Martín Abadi, Eric Allender, Andrei Broder, Joan Feigenbaum, Lane A. Hemachandra
Bounds and Constructions for Authentication - Secrecy Codes with Splitting

The aim of this paper is to deal with codes having unconditional security, which means that the security is independent of computing power. Analogously to the theory of unconditional secrecy due to Shannon [12], Simmons developed a theory of unconditional authentication [10]. In this paper we give some new bounds and constructions for authentication/secrecy codes with splitting.

Marijke De Soete

Protocols

Untraceable Electronic Cash
(Extended Abstract)

The use of credit cards today is an act of faith on the part of all concerned. Each party is vulnerable to fraud by the others, and the cardholder in particular has no protection against surveillance.

David Chaum, Amos Fiat, Moni Naor
Payment Systems and Credential Mechanisms with Provable Security Against Abuse by Individuals
(Extended Abstract)

Payment systems and credential mechanisms are protocols allowing individuals to conduct a wide range of financial and social activities while preventing even infinitely powerful and cooperating organizations from monitoring these activities. These concepts were invented and first studied by David Chaum.

Ivan Bjerre Damgård
A Universal Problem in Secure and Verifiable Distributed Computation

A notion of reduction among multi-party distributed computing problems is introduced and formally defined. Here the reduction from one multi-party distributed computing problem to another means, roughly speaking, that a secure and verifiable protocol for the first problem can be constructed solely from a secure and verifiable protocol for the second. A universal or complete multi-party distributed computing problem is defined to be one to which the whole class of multi-party problems is reducible. One is interested in finding a simple and natural multi-party problem which is universal. The distributed sum problem, of summing secret inputs from N parties, is shown to be such a universal problem. The reduction yields an efficient systematic method for the automatic generation of secure and verifiable protocols for all multi-party distributed computing problems. Incorporating the result from [14], it also yields an alternative proof to the completeness theorem of [9] that, assuming honest majority and the existence of a trap-door function, for all multi-party problems there is a secure and verifiable protocol.

Ming-Deh A. Huang, Shang-Hua Teng
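The additive-masking idea behind the distributed sum problem can be sketched in a few lines. This toy version (no secure channels or verifiability; illustrative modulus) shows why the grand total equals the sum of the secret inputs while individual inputs stay masked:

```python
import secrets

M = 2**61 - 1  # working modulus; an illustrative choice

def distributed_sum(inputs):
    n = len(inputs)
    # pieces[i][j] = additive share of party i's secret sent to party j.
    pieces = []
    for s in inputs:
        parts = [secrets.randbelow(M) for _ in range(n - 1)]
        parts.append((s - sum(parts)) % M)   # pieces of each secret sum to it
        pieces.append(parts)
    # Each party j announces only the sum of the shares it received.
    announced = [sum(pieces[i][j] for i in range(n)) % M for j in range(n)]
    # The announced values sum to the sum of all secrets.
    return sum(announced) % M

vals = [5, 17, 42]
assert distributed_sum(vals) == sum(vals) % M
```

Each announced value is a sum of independent random masks, so it reveals nothing about any single input; only the total survives the cancellation.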

Security Concerns

An Abstract Theory of Computer Viruses

In recent years the detection of computer viruses has become common place. It appears that for the most part these viruses have been ‘benign’ or only mildly destructive. However, whether or not computer viruses have the potential to cause major and prolonged disruptions of computing environments is an open question.

Leonard M. Adleman
Abuses in Cryptography and How to Fight Them
(Extended Abstract)

The following seems quite familiar: “Alice and Bob want to flip a coin by telephone. (They have just divorced, live in different countries, want to decide who will have the children during the next holiday.). . .” So they use [Blu82]’s (or an improved) protocol. However, Alice and Bob’s divorce has been set up to cover up their spying activities. When they use [Blu82]’s protocol, they don’t care if the “coinflip” is random, but they want to abuse the protocol to send secret information to each other. The counter-espionage service, however, doesn’t know that the divorce and the use of [Blu82]’s protocol are just cover-ups. In this paper, we demonstrate how several modern cryptosystems can be abused. We generalize [Sim83b]’s subliminal channel and [DGB87]’s abuse of the [FFS87, FS86] identification systems, and demonstrate how one can prevent abuses of cryptosystems.

Yvo Desmedt
How to (Really) Share a Secret

In information based systems, the integrity of the information (from unauthorized scrutiny or disclosure, manipulation or alteration, forgery, false dating, etc.) is commonly provided for by requiring operation(s) on the information that one or more of the participants, who know some private piece(s) of information not known to all of the other participants, can carry out but which (probably) can’t be carried out by anyone who doesn’t know the private information. Encryption/decryption in a single key cryptoalgorithm is a paradigm of such an operation, with the key being the private (secret) piece of information. Although it is implicit, it is almost never stated explicitly that in a single-key cryptographic communications link, the transmitter and the receiver must unconditionally trust each other since either can do anything that the other can.

Gustavus J. Simmons

Linear Complexity

The Strict Avalanche Criterion: Spectral Properties of Boolean Functions and an Extended Definition

A necessary and sufficient condition on the Walsh-spectrum of a boolean function is given, which implies that this function fulfills the Strict Avalanche Criterion. This condition is shown to be fulfilled for a class of functions exhibiting simple spectral symmetries. Finally, an extended definition of the Strict Avalanche Criterion is proposed and the corresponding spectral characterization is derived.

Réjane Forré
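As a companion to the spectral characterization, the Strict Avalanche Criterion can also be checked directly from the definition: flipping any single input bit must change the output for exactly half of all inputs. A small sketch with an illustrative 4-variable function (not taken from the paper):

```python
# Direct (non-spectral) SAC check for an n-variable boolean function f,
# where f takes an integer whose bits are the input variables.
def satisfies_sac(f, n):
    for i in range(n):
        # Count inputs on which flipping bit i changes the output.
        flips = sum(f(x) != f(x ^ (1 << i)) for x in range(2**n))
        if flips != 2**(n - 1):
            return False
    return True

# Illustrative example on 4 bits: f(x1..x4) = x1*x2 XOR x3*x4 (a bent function).
def f(x):
    b = [(x >> k) & 1 for k in range(4)]
    return (b[0] & b[1]) ^ (b[2] & b[3])

print(satisfies_sac(f, 4))  # True
```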
On the Linear Syndrome Method in Cryptanalysis

The linear syndrome (LS) method is elaborated for the purpose of solving problems encountered in cryptanalysis, which can be reduced to the following mathematical setting. Suppose the cryptanalyst has at his hand a sufficiently long segment of the binary sequence B = A + X, where A is a linear sequence with known feedback polynomial f(x) and X is a sequence with unknown or very complicated algebraic structure, but is sparse in the sense that, if we denote its signals by x(i), i ≥ 0, then we shall have s = prob(x(i) = 1) = 1/2 − ε, 0 < ε < 1/2. We call s the error rate of the sequence A in the sequence B, and the job of the cryptanalyst is to recover the former from the captured segment of the latter.

Kencheng Zeng, Minqiang Huang
Aperiodic Linear Complexities of de Bruijn Sequences

Binary de Bruijn sequences of period 2^n bits have the property that all 2^n distinct n-tuples occur once per period. To generate such a sequence with an n-stage shift register requires the use of nonlinear feedback. These properties suggest that de Bruijn sequences may be useful in stream ciphers. However, any binary sequence can be generated using a linear-feedback shift register (LFSR) of sufficient length. Thus, the linear complexity of a sequence, defined as the length of the shortest LFSR which generates it, is often used as a measure of the unpredictability of the sequence. This is a useful measure, since a well-known algorithm [1] can be used to successfully predict all bits of any sequence with linear complexity C from a knowledge of 2C bits. As an example, an m-sequence of period 2^n − 1 has linear complexity C = n, which clearly indicates that m-sequences are highly predictable.

Richard T. C. Kwok, Maurice Beale
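The “well-known algorithm [1]” is the Berlekamp-Massey algorithm; a compact GF(2) sketch is given below, applied to an m-sequence from the primitive polynomial x^4 + x + 1 (an illustrative example, not taken from the paper):

```python
# Berlekamp-Massey over GF(2): returns the linear complexity L of a bit
# sequence, i.e. the length of the shortest LFSR generating it. Given 2L
# consecutive bits, the whole sequence is determined.
def berlekamp_massey(bits):
    n = len(bits)
    c, b = [1] + [0] * n, [1] + [0] * n   # current and previous connection polys
    L, m = 0, -1
    for i in range(n):
        # Discrepancy: does c currently predict bit i?
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m + 1):   # c(x) += x^(i-m) * b(x)
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

# An m-sequence of period 2^4 - 1 from x^4 + x + 1: new bit = s[0] XOR s[1].
reg = [0, 0, 0, 1]
seq = []
for _ in range(30):
    seq.append(reg[0])
    reg = reg[1:] + [reg[0] ^ reg[1]]
print(berlekamp_massey(seq))  # 4
```

The output 4 matches the abstract's claim that an m-sequence from an n-stage LFSR has linear complexity exactly n.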

Systems

The Application of Smart Cards for RSA Digital Signatures in a Network Comprising Both Interactive and Store-and-Forward Facilities

Smart card technology is relatively new but offers an economic and convenient solution to the problems of user-authentication. This paper discusses the requirements for user authentication and digital signature in complex networks and examines the problems of integrating a smart-card sub-system. It proposes some design approaches for providing a useful lifetime for a smart card and for handling the computations required for 512-bit RSA digital signatures.

J. R. Sherwood, V. A. Gallo
Speeding Up Secret Computations with Insecure Auxiliary Devices

This paper deals with and gives some solutions to the problem of how a small device such as a smart card can efficiently execute secret computations using computing power of auxiliary devices like (banking-, telephone-, . . .) terminals which are not necessarily trusted. One of the solutions shows that the RSA signatures can be practically generated by a smart card.

Tsutomu Matsumoto, Koki Kato, Hideki Imai
Developing Ethernet Enhanced-Security System

The Ethernet Enhanced-Security System (EESS) provides encryption of Ethernet frames using the DES algorithm with pairwise keys, and a centralized key distribution center (KDC) using a variation of the Needham and Schroeder key distribution protocol. This paper is a discussion of some practical problems that arose during the development of this system. Section 1 contains an overview of the system and section 2 provides more detail on the system architecture. The remaining sections discuss various problems that were considered during the development and how they were resolved.

B. J. Herbison
A Secure Audio Teleconference System

Users of large communication networks often require a multi-party teleconferencing facility. The most common technique for providing secure audio teleconferencing requires the speech of each participant to be returned to clear form in a bridge circuit where it is combined with the speech of the other participants. The combined signal is then re-encrypted for distribution to the conferees. This introduces a security weakness as the bridge works with the clear speech and the cipher keys for all of the participants. In this paper we describe secure conferencing systems in which some of the bridge functions are distributed among the users so that no additional security weakness is introduced. The network conference bridge knows the addresses of the participants and receives and distributes the encrypted speech without modification. The conference system can be used with a number of encryption algorithms and the system is suitable for deployment on digital networks such as ISDN. High quality and robust secure voice communications can be provided with this technique.

D. G. Steer, L. Strawczynski, W. Diffie, M. Wiener

Short Rump Session Presentations

Diffie-Hellman is as Strong as Discrete Log for Certain Primes

Diffie and Hellman proposed a key exchange scheme in 1976, which afterwards carried their name in the literature. In the same epoch-making paper, they conjectured that breaking their scheme would be as hard as taking discrete logarithms. This problem has remained open for the multiplicative group modulo a prime P that they originally proposed. Here it is proven that both problems are (probabilistically) polynomial-time equivalent if the totient of P − 1 has only small prime factors with respect to a (fixed) polynomial in log P. There is no algorithm known that solves the discrete log problem in probabilistic polynomial time for this case, i.e., where the totient of P − 1 is smooth. Consequently, either there exists a (probabilistic) polynomial algorithm to solve the discrete log problem when the totient of P − 1 is smooth, or there exist primes (satisfying this condition) for which Diffie-Hellman key exchange is secure.

Bert den Boer
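For reference, the Diffie-Hellman exchange in the multiplicative group modulo a prime P reduces to a few lines; the parameters here are toy-sized and the base G is an arbitrary illustrative choice:

```python
import secrets

P = 2**61 - 1                        # a Mersenne prime; far too small for real use
G = 3                                # illustrative base
a = secrets.randbelow(P - 2) + 1     # Alice's secret exponent
b = secrets.randbelow(P - 2) + 1     # Bob's secret exponent
A, B = pow(G, a, P), pow(G, b, P)    # exchanged in the clear
assert pow(B, a, P) == pow(A, b, P)  # both sides derive G^(ab) mod P
```

Breaking the scheme means computing G^(ab) from (G, A, B); the result above ties that task to the discrete log problem itself for the stated class of primes.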
Secret Error-Correcting Codes (SECC)

A secret error-correcting coding (SECC) scheme is one that provides both data secrecy and data reliability in one process, to combat problems in an insecure and unreliable channel. In an SECC scheme, only the authorized user, having secretly held information, can correct channel errors systematically. Two SECC schemes are proposed in this paper. The first is a block encryption using Preparata-based nonlinear codes; the second one is based on a block chaining technique. Conditions under which each scheme can be secure are also discussed.

Tzonelih Hwang, T. R. N. Rao
The Detection of Cheaters in Threshold Schemes

Informally, a (t, w)-threshold scheme is a way of distributing partial information (shadows) to w participants, so that any t of them can easily calculate a key (or secret), but no subset of fewer than t participants can determine the key. In this paper, we present an unconditionally secure threshold scheme in which any cheating participant can be detected and identified with high probability by any honest participant, even if the cheater is in coalition with other participants. We also give a construction that will detect with high probability a dealer who distributes inconsistent shadows (shares) to the honest participants. Our scheme is not perfect; a set of t − 1 participants can rule out at most $1 + \binom{w-t+1}{t-1}$ possible keys, given the information they have. In our scheme, the key will be an element of GF(q) for some prime power q. Hence, q can be chosen large enough so that the amount of information obtained by any t − 1 participants is negligible.

E. F. Brickell, D. R. Stinson
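The abstract does not spell out the underlying sharing mechanism; as background for the (t, w) mechanics, here is a minimal Shamir-style sketch over GF(q) with q prime. The Brickell-Stinson cheater-detection and dealer-consistency layers are deliberately omitted, and all names are illustrative.

```python
import random

def make_shares(key: int, t: int, w: int, q: int):
    """Shamir-style (t, w)-threshold sharing over GF(q), q prime.
    The key is the constant term of a random degree-(t-1) polynomial;
    participant x receives the shadow (x, poly(x))."""
    coeffs = [key] + [random.randrange(q) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, q) for i, c in enumerate(coeffs)) % q
    return [(x, poly(x)) for x in range(1, w + 1)]

def recover(shares, q: int) -> int:
    """Lagrange interpolation at x = 0 reconstructs the key from any
    t shadows; fewer than t shadows leave the key information-theoretically
    undetermined."""
    key = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % q
                den = den * (xi - xj) % q
        key = (key + yi * num * pow(den, -1, q)) % q
    return key
```

Note that plain Shamir sharing is silent about cheaters: a participant who submits a false shadow simply causes `recover` to return a wrong key undetected, which is precisely the gap the paper's scheme closes.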
On the Power of 1-way Functions
(Abstract)

The earliest definition of 1-way function is due to Berman [Ber77], who considered polynomial-time computable, length-increasing, 1-1 functions that do not have a polynomial-time computable inverse. Recently, more powerful notions have been considered, e.g., polynomial-time computable, length-increasing, 1-1 functions f such that the probability that a BPP algorithm can compute x from f(x) for a randomly selected x is superpolynomially small [Yao82]. Whatever definition is used, these functions are necessarily easy to invert on some inputs:

Proposition 1. If f is a polynomial-time computable, length-increasing, 1-1 function, and if p is a polynomial, then there is a polynomial-time algorithm that, for sufficiently large n, inverts f on at least p(n) strings of length less than n.

Therefore, the range of every such function must contain a polynomial-time computable subset of arbitrarily large polynomial census.

Stuart A. Kurtz, Stephen R. Mahaney, James S. Royer
“Practical IP” ⊆ MA

Interactive protocols [GMR] and Arthur-Merlin games [B] have attracted considerable interest since their introduction a few years ago. These notions make it (probably) possible to extend the concept of what is “efficiently” provable to include, for instance, graph non-isomorphism [GMW]. In this short note, we assume that the reader is familiar with interactive protocols, Arthur-Merlin games, and the notion of zero-knowledge [GMR].

Gilles Brassard, Ivan Bjerre Damgaard
Zero-Knowledge Authentication Scheme with Secret Key Exchange
(extended abstract)

In this note we first develop a new computationally zero-knowledge interactive proof system of knowledge, which is then modified into an authentication scheme with secret key exchange for subsequent conventional encryption. Implemented on a standard 32-bit chip or similar, the whole protocol, which involves mutual identification of two users, exchange of a random common secret key, and verification of certificates for the public keys (RSA, 512 bits), takes less than 0.7 seconds.

Jørgen Brandt, Ivan Damgård, Peter Landrock, Torben Pedersen
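The paper's own protocol is RSA-based and is not reproduced here; as background on what a zero-knowledge identification round looks like, here is a toy Fiat-Shamir-style round (a classic pattern, not the Brandt-Damgård-Landrock-Pedersen scheme, with an illustrative tiny modulus).

```python
import random

# Toy Fiat-Shamir identification round. Real use needs a modulus of
# 1024+ bits whose factorization only the key issuer knows.
n = 61 * 53                  # toy RSA-like modulus
s = 1234 % n                 # prover's secret
v = s * s % n                # prover's public key: v = s^2 mod n

def prove_round(challenge_bit: int):
    """Prover commits x = r^2 mod n for fresh random r, then answers
    y = r * s^b mod n to the verifier's challenge bit b."""
    r = random.randrange(1, n)
    x = r * r % n
    y = r * pow(s, challenge_bit, n) % n
    return x, y

def verify(x: int, y: int, challenge_bit: int) -> bool:
    """Verifier checks y^2 == x * v^b mod n; the transcript reveals
    nothing about s, since r masks it."""
    return y * y % n == x * pow(v, challenge_bit, n) % n
```

Each round a cheater (who lacks s) can survive with probability at most 1/2, so k independent rounds drive the cheating probability down to 2^-k; this repetition cost is one reason the sub-second timing claimed in the abstract is notable.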
Backmatter
Metadata
Title
Advances in Cryptology — CRYPTO’ 88
Edited by
Shafi Goldwasser
Copyright year
1990
Publisher
Springer New York
Electronic ISBN
978-0-387-34799-8
Print ISBN
978-0-387-97196-4
DOI
https://doi.org/10.1007/0-387-34799-2