
Recently, there has been a lot of interest in provably "good" pseudo-random number generators [10, 4, 14, 3]. These cryptographically secure generators are "good" in the sense that they pass all probabilistic polynomial time statistical tests. However, despite these nice properties, the secure generators known so far suffer from the handicap of being inefficient; the most efficient of these take n^2 steps (one modular multiplication, n being the length of the seed) to generate one bit. Pseudo-random number generators that are currently used in practice output n bits per multiplication (n^2 steps). An important open problem was to output even two bits on each multiplication in a cryptographically secure way. This problem was stated by Blum, Blum & Shub [3] in the context of their x^2 mod N generator. They further ask: how many bits can be output per multiplication, maintaining cryptographic security? In this paper we state a simple condition, the XOR-Condition, and show that any generator satisfying this condition can output log n bits on each multiplication. We show that the XOR-Condition is satisfied by the log n least significant bits of the x^2 mod N generator. The security of the x^2 mod N generator was based on Quadratic Residuosity [3]. This generator is an example of a Trapdoor Generator [13], and its trapdoor properties have been used in protocol design. We strengthen the security of this generator by proving it as hard as factoring.

A Prototype Encryption System Using Public Key

The use of cryptography to produce a secure method of user authentication and to encipher traffic on data or digital links has been the aim of many of those defining theoretical schemes and techniques. This paper describes an experimental realisation of these aims in hardware, in order to provide a secure and authenticated communications channel.

S C Serpell, C B Brookson, B L Clark

A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms

A new signature scheme is proposed together with an implementation of the Diffie-Hellman key distribution scheme that achieves a public key cryptosystem. The security of both systems relies on the difficulty of computing discrete logarithms over finite fields.

Taher ElGamal

A Public-Key Cryptosystem Based on the Word Problem

The undecidable word problem for groups and semigroups is investigated as a basis for a public-key cryptosystem. A specific approach is discussed along with the results of an experimental implementation. This approach does not give a provably secure or practical system, but shows the type of cryptosystem that could be constructed around the word problem. This cryptosystem is randomized, with infinitely many ciphertexts corresponding to each plaintext.

Neal R. Wagner, Marianne R. Magyarik

Efficient Signature Schemes Based on Polynomial Equations (preliminary version)

Signatures based on polynomial equations modulo n have been introduced by Ong, Schnorr and Shamir [3]. We extend the original binary quadratic OSS-scheme to algebraic integers. So far the generalised scheme is not vulnerable to the recent algorithm of Pollard for solving s1^2 + k s2^2 = m (mod n), which has broken the original scheme.

H. Ong, C. P. Schnorr, A. Shamir

Identity-Based Cryptosystems and Signature Schemes

In this paper we introduce a novel type of cryptographic scheme, which enables any pair of users to communicate securely and to verify each other’s signatures without exchanging private or public keys, without keeping key directories, and without using the services of a third party. The scheme assumes the existence of trusted key generation centers, whose sole purpose is to give each user a personalized smart card when he first joins the network. The information embedded in this card enables the user to sign and encrypt the messages he sends and to decrypt and verify the messages he receives in a totally independent way, regardless of the identity of the other party. Previously issued cards do not have to be updated when new users join the network, and the various centers do not have to coordinate their activities or even to keep a user list. The centers can be closed after all the cards are issued, and the network can continue to function in a completely decentralized way for an indefinite period.

A Knapsack Type Public Key Cryptosystem Based On Arithmetic in Finite Fields (preliminary draft)

We introduce a new knapsack type public key cryptosystem. The system is based on a novel application of arithmetic in finite fields, following a construction by Bose and Chowla. Appropriately choosing the parameters, we can control the density of the resulting knapsack. In particular, the density can be made high enough to foil “low density” attacks against our system. At the moment, we do not know of any attacks capable of “breaking” this system in a reasonable amount of time.

Benny Chor, Ronald L. Rivest

Some Public-Key Crypto-Functions as Intractable as Factorization

It is well known that if one can factor the modulus R = pq (p, q distinct large primes) of the RSA cryptosystem [4], then the system can be broken. However, it is not known whether the problem of breaking an RSA cryptosystem is equivalent in difficulty to factoring R. Rabin [3] has given a public-key encryption method which is as difficult to break as it is to factor R, but the decryption process produces four possible candidates for the correct message, and only one of these is correct; without some redundancy (e.g., a cryptographic key) there is no way for the sender to allow the recipient to identify the correct message being transmitted. Also, in [1] Lipton has pointed out some other weaknesses in the scheme when it is used as a cryptosystem. Indeed, Rabin only advocated its use as a signature system.

H. C. Williams

Computing Logarithms in GF (2n)

Consider the finite field having q elements and denote it by GF(q). Let α be a generator for the nonzero elements of GF(q). Hence, for any element b ≠ 0 there exists an integer x, 0 ≤ x ≤ q−2, such that b = α^x. We call x the discrete logarithm of b to the base α and we denote it by x = log_α b, or more simply by log b when the base is fixed for the discussion. The discrete logarithm problem is stated as follows: find a computationally feasible algorithm to compute log_α b for any b ∈ GF(q), b ≠ 0.

I. F. Blake, R. C. Mullin, S. A. Vanstone
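The definition above can be made concrete with a toy computation. The sketch below uses small illustrative parameters (p = 19, α = 2, not taken from the paper) and finds log_α b in GF(p) by exhaustive search — exactly the approach that becomes infeasible for the large fields the paper targets.

```python
# Illustrative brute-force discrete logarithm in GF(p), p prime.
# Toy parameters only: the problem is hard precisely when q is large.

def discrete_log(alpha: int, b: int, p: int) -> int:
    """Find x with alpha^x = b (mod p), 0 <= x <= p - 2, by exhaustion."""
    value = 1
    for x in range(p - 1):
        if value == b % p:
            return x
        value = (value * alpha) % p
    raise ValueError("b is not in the subgroup generated by alpha")

p = 19       # nonzero elements of GF(19) form a cyclic group of order 18
alpha = 2    # 2 is a primitive root modulo 19
b = 13
x = discrete_log(alpha, b, p)
print(x, pow(alpha, x, p))   # 5 13  (2^5 = 32 = 13 mod 19)
```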

Wyner’s Analog Encryption Scheme: Results of a Simulation

This paper presents the results of a simulation of an analog encryption scheme. The scheme, introduced in 1979 by Aaron Wyner of Bell Telephone Laboratories, provides secure, accurate scrambling of speech waveforms, while conforming to the bandlimitedness of a telephone channel. The simulation confirms the scheme’s theoretical properties, based on numerical measures and on listening to encrypted and decrypted waveforms.

Burt S. Kaliski

On Rotation Group and Encryption of Analog Signals

The problem of generating random elements in groups has direct application to cryptography. For instance, we would like to know whether the DES permutations are random permutations of the 2^64 possible 64-bit words. The whole symmetric group is known to be generated by certain k-functions (7). Another example is the Wyner voice encryption scheme (12), which requires the production of large numbers of random real orthogonal matrices. N. J. A. Sloane has given a survey on this problem (11), which leads to the following questions for a given group G: 1) How does one generate elements of G at random? 2) How can one test if certain given elements of G really are random? 3) Does a given subset H generate the whole group G? 4) If so, how long does it take? In this paper, we consider these questions for the orthogonal group O(d) for any positive integer d. By looking at the Lie algebra o(d) of O(d) and one-parameter subgroups of O(d), we can express the generation of an arbitrary element in terms of one-parameter rotation groups in a uniform fashion. The length of generation can be determined. Random elements are generated using a random number generator on the real parameter space of each one-parameter subgroup.

Su-shing Chen

The History of Book Ciphers

You can do a lot with a book, besides read it! In fact, we know that by 1526 — some 70 years after Gutenberg printed his first Bible — at least one of our forebears, Jacobus Silvestri, was thinking of how a book might be used for cryptographic purposes. Silvestri wrote of a sort of code book, or dictionary, which he recommended as a means to encipher written communications. From Silvestri onward, we can trace the development of book ciphers over a period of at least 400 years.

Albert C. Leighton, Stephen M. Matyas

An Update on Factorization at Sandia National Laboratories

Since Crypto 83 we have had considerably more experience in factoring large integers. Implementation of various modifications to the quadratic sieve algorithm has enabled the factorization of hard 70-digit numbers in times comparable to those for 50-digit numbers one year ago. These modifications include: 1) subsequences with large divisors (special q's); 2) multipliers to improve quadratic properties; 3) increased size of the prime base using segmented Gaussian elimination; 4) optimization of the code with respect to Cray hardware. Using this code in its various stages of development, the 10 most wanted numbers from the Cunningham Project have been factored. Details will be published elsewhere.

J. A. Davis, D. B. Holdridge

An LSI Digital Encryption Processor (DEP)

This paper describes an LSI digital encryption processor (DEP) for data ciphering. The DEP combines a fast hardware implementation of the Data Encryption Standard (DES) published by the National Bureau of Standards (NBS) with a set of multiplexers and registers under the control of a user-programmed sequencer. This architecture enables the user to program any of the DES modes of operation published by NBS. In addition, multiple ciphering operations and multiplexed ciphering operations using up to four different keys may be programmed and internally executed without any external hardware. The DEP is designed as a standard microprocessor peripheral. This LSI device should reduce the current cost and simplify the process of encrypting digital data to a point where it is feasible to include a ciphering function in modems, terminals, and workstations. The ability to internally program cascaded ciphers should substantially increase the security of the DES algorithm and hence the life of the encryption equipment.

R. C. Fairfield, A. Matusevich, J. Plany

Efficient hardware and software implementations for the DES

Importance of DES: NBS and ANSI have adopted DES as a standard, and ISO has it under study. The available devices and programs have some tedious properties for extensive use: hardware is expensive or slow, and limited; software is slow.

Marc Davio, Yvo Desmedt, Jo Goubert, Frank Hoornaert, Jean-Jacques Quisquater

Efficient hardware implementation of the DES

Several improvements to implementations of DES are discussed. One proves that the initial permutation and the inverse initial permutation can be located at the input and the output, respectively, of each mode of DES. A realistic design for an exhaustive key-search machine is presented.

Frank Hoornaert, Jo Goubert, Yvo Desmedt

A Self-Synchronizing Cascaded Cipher System with Dynamic Control of Error Propagation

A cipher system used for secure communication over a noisy channel can automatically synchronize the sender and receiver by computing a stateless function of a key and a limited amount of the recent ciphertext. The more ciphertext feedback is used, the more the errors from the noisy channel are propagated. The less feedback is used, the easier ciphertext-only and chosen-plaintext attacks become. There is a trade-off between security and noise that must be made when a self-synchronizing system is built. This paper presents a self-synchronizing cascaded cipher system that permits most combinations of key and ciphertext feedback lengths and also allows adjustment of the trade-off between security and noise during system operation. At times when maximum security is not needed, the error propagation can be reduced temporarily. As implemented in hardware, the cascaded cipher has a storage register for each stage. The function computed would normally depend on the state of this storage, but different clocks are used at each stage to render the function stateless. The use of a cascade helps to keep the hardware cost down.

Norman Proctor

Efficient and Secure Pseudo-Random Number Generation (Extended Abstract)

Cryptographically secure pseudo-random number generators known so far suffer from the handicap of being inefficient; the most efficient ones can generate only one bit on each modular multiplication (n^2 steps). Blum, Blum and Shub pose the open problem of outputting even two bits securely. We state a simple condition, the XOR-Condition, and show that any generator satisfying this condition can output log n bits on each multiplication. We also show that the log n least significant bits of RSA, Rabin's scheme, and the x^2 mod N generator satisfy this condition. As a corollary, we prove that all Boolean predicates of these bits are secure. Furthermore, we strengthen the security of the x^2 mod N generator, which, being a Trapdoor Generator, has several applications, by proving it as hard as factoring.

Umesh V. Vazirani, Vijay V. Vazirani
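As a rough illustration of the kind of generator this abstract discusses, the sketch below iterates x ← x² mod N and emits a few least significant bits per multiplication. The primes, seed, and number of extracted bits are illustrative assumptions only; real use needs parameters far too large for a toy example, and the abstract's result is what justifies extracting about log n bits per step.

```python
# Toy sketch of the x^2 mod N generator. The primes below are far too
# small for any security; they are assumptions for illustration only.
# N = p*q with p = q = 3 (mod 4) makes squaring a permutation on the
# quadratic residues mod N.

def x_squared_mod_n_bits(seed: int, n: int, bits_per_step: int, steps: int):
    """Collect `bits_per_step` least-significant bits of each successive square."""
    x = (seed * seed) % n            # move the seed onto a quadratic residue
    out = []
    for _ in range(steps):
        x = (x * x) % n
        for i in range(bits_per_step):
            out.append((x >> i) & 1)
    return out

p, q = 499, 547                      # both congruent to 3 mod 4 (toy sizes)
N = p * q
stream = x_squared_mod_n_bits(seed=1234, n=N, bits_per_step=2, steps=8)
print(stream)                        # 16 pseudo-random bits
```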

An LSI Random Number Generator (RNG)

This paper describes a CMOS digital LSI device which generates a random bit stream based on the frequency instability of a free-running oscillator. The output of the device is a truly random binary number, not a pseudo-random one.

R. C. Fairfield, R. L. Mortenson, K. B. Coulthart

Generalized Linear Threshold Scheme

A generalized linear threshold scheme is introduced. The new scheme generalizes the existing linear threshold schemes. The basic principles involved in the construction of linear threshold schemes are laid out and the relationships between the existing schemes are completely established. The generalized linear scheme is used to provide a hierarchical threshold scheme which allows multiple thresholds necessary in a hierarchical environment.

S. C. Kothari

Security of Ramp Schemes

A k out of n p/s/r process [AS81] is a very efficient way to convey information (k words suffice to reclaim k words). But it provides virtually no cryptographic security for the information it deals with.

A Fast Pseudo Random Permutation Generator With Applications to Cryptology

Pseudo random sequences of integers are most commonly generated by linear congruence methods [5] or by linear shift registers [1]. These sequences can be used in cryptology if they are cryptographically secure [9]: A pseudo-random sequence is cryptographically secure if given any segment of the sequence, it is computationally infeasible to compute other segments of the sequence.

Selim G. Akl, Henk Meijer
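The security criterion in this abstract can be illustrated negatively: a plain linear congruential generator fails it, because a short segment of output (with the modulus assumed known) determines every other segment. All constants below are illustrative choices, not from the paper.

```python
# A linear congruential generator is NOT cryptographically secure in the
# abstract's sense: three consecutive outputs, plus knowledge of the
# modulus, reveal the multiplier and increment and hence the whole sequence.

M = 2**31 - 1                 # a prime modulus, so modular inverses exist
A, C = 48271, 12345           # the generator's secret parameters

def lcg(x):
    return (A * x + C) % M

x0 = 42
x1, x2 = lcg(x0), lcg(lcg(x0))

# The attacker sees x0, x1, x2 and knows only M:
a = ((x2 - x1) * pow(x1 - x0, -1, M)) % M   # solve for the multiplier
c = (x1 - a * x0) % M                        # then the increment
x3_predicted = (a * x2 + c) % M              # predict the next output

print(a == A, c == C, x3_predicted == lcg(lcg(lcg(x0))))  # True True True
```

Recovering the parameters reduces to one modular inverse, which is why `pow(x1 - x0, -1, M)` (Python 3.8+) suffices when M is prime.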

On the Cryptographic Applications of Random Functions (Extended Abstract)

Now that "random functions" can be efficiently constructed ([GGM]), we discuss some of their possible applications to cryptography: 1) distributing unforgeable ID numbers which can be locally verified by stations that contain only a small amount of storage; 2) dynamic hashing: even if the adversary can change the key distribution depending on the values the hashing function has assigned to the previous keys, he still cannot force collisions; 3) constructing deterministic, memoryless authentication schemes which are provably secure against chosen message attack; 4) constructing Identity Friend or Foe systems.

Oded Goldreich, Shafi Goldwasser, Silvio Micali

An Efficient Probabilistic Public-Key Encryption Scheme Which Hides All Partial Information

This paper introduces the first probabilistic public-key encryption scheme which combines the following two properties: (1) Perfect secrecy with respect to polynomial time eavesdroppers: for all message spaces, no polynomial time bounded passive adversary who is tapping the lines can compute any partial information about messages from their encodings, unless factoring composite integers is in probabilistic polynomial time. (2) Efficiency: it compares favorably with the deterministic RSA public-key cryptosystem in both encoding and decoding time and bandwidth expansion. The security of the system we propose can also be based on the assumption that the RSA function is intractable, maintaining the same cost for encoding and decoding and the same data expansion. This implementation may have advantages in practice.

Manuel Blum, Shafi Goldwasser

RSA/Rabin least significant bits are secure (Extended Abstract)

We prove that the RSA least significant bit is $$\tfrac{1}{2} + \tfrac{1}{\log^{c} N}$$ secure, for any constant c (where N is the RSA modulus). This means that an adversary, given the ciphertext, cannot guess the least significant bit of the plaintext with probability better than $$\tfrac{1}{2} + \tfrac{1}{\log^{c} N}$$, unless he can break RSA. Our proof technique is strong enough to give, with slight modifications, the following related results: (1) The log log N least significant bits are simultaneously $$\tfrac{1}{2} + \tfrac{1}{\log^{c} N}$$ secure. (2) The above also holds for Rabin's encryption function. Our results imply that Rabin/RSA encryption can be directly used for pseudo-random bit generation, provided that factoring/inverting RSA is hard.

Benny Chor, Oded Goldreich

Information Theory without the Finiteness Assumption, I: Cryptosystems as Group-Theoretic Objects

This paper gives a definition of cryptosystem in terms of confusion, diffusion and replacement. This definition lends itself to infinite, as well as finite, structures, and the notion of group appears to play an essential role in it. We offer three theses for discussion. The first is that all known cryptosystems fit the definition. The second is that (Shannon) confusion amounts to left composition of a cryptographic relation with a message and left action of a cryptographic relation on a message, and that (Shannon) diffusion amounts to left composition of a message with a cryptographic relation and left action of a message on a cryptographic relation. The third is that what Shannon calls mixing cannot occur unless a certain type of "nonassociativity", or at least some lack of adherence to certain algebraic laws, is present in the description of a cryptosystem in accordance with this definition.

G. R. Blakley

The ADFGX cryptographic system, invented by Fritz Nebel, was introduced by Germany during World War I on March 5, 1918. The names ADFGX and ADFGVX for the successor system refer to the use of only five (and later six) letters A, D, F, G, (V,) X in the ciphertext alphabet. Kahn [KA] suggests that these letters were chosen because the differences in their International Morse symbols (A •−, D −••, F ••−•, G −−•, V •••−, X −••−) helped prevent misidentification due to transmission noise.

Alan G. Konheim

Breaking Iterated Knapsacks

This paper presents an outline of an attack that we have used successfully to break iterated knapsacks. Although we do not provide a proof that the attack almost always works, we do provide some heuristic arguments. We also give a detailed description of the examples we have broken.

Ernest F. Brickell

Dependence of output on input in DES: Small avalanche characteristics

New general properties of the S-boxes were found. Techniques and theorems are presented which allow one to evaluate the non-substitution effect in f and the key clustering in DES. Examples are given. The importance of these results for the security of DES is discussed.

Yvo Desmedt, Jean-Jacques Quisquater, Marc Davio

DES Has No Per Round Linear Factors

Interest in the cryptanalysis of the National Bureau of Standards' Data Encryption Standard (DES) has been strong since its announcement. Here we describe an attack on a class of ciphers like DES based on linear factors. If DES had any non-trivial linear factors, these factors would provide an easier attack than one based on complete enumeration. Basically, a factor of order n reduces the cost of a solution from 2^56 to 2^n + 2^(56−n). At worst (n = 1 or 55), this reduces the cost of a Diffie-Hellman search machine from 20 million dollars to 10 million dollars: a 10 million dollar savings. At best (n = 28), even without iteration, the method could reduce the cost from 2^56 to 2^28 + 2^28: a computation well within the reach of a personal computer. Alas, DES has no such linear factors.

J. A. Reeds, J. L. Manferdelli
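The cost trade-off quoted above is easy to tabulate. The short script below (an illustration, not from the paper) evaluates 2^n + 2^(56−n) over all factor orders and confirms that the best case sits at n = 28, where the total work drops to 2^29.

```python
# Cost of attacking DES given a hypothetical linear factor of order n:
# 2^n + 2^(56 - n) encryptions instead of the full 2^56 search.

def cost(n: int) -> int:
    return 2**n + 2**(56 - n)

best = min(range(1, 56), key=cost)
print(best, cost(best))   # 28 536870912  (i.e., 2^29)
```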

A Message Authenticator Algorithm Suitable for a Mainframe Computer

Authenticators are widely used to protect payment messages against active attack. They produce a number, sometimes called a 'MAC', which is a function of the whole message and a secret key. The earlier name for them in banking was 'test-key', but this obsolescent term is confusing to cryptographers.

Donald Watts Davies

Key Management for Secure Electronic Funds Transfer in a Retail Environment

In this paper we consider a system whose function is to enable users to pay for goods or services by direct electronic transfer of funds. The system consists of terminals, located at retail outlets, which can communicate with acquirers representing various financial institutions.

Henry Beker, Michael Walker

Authentication Theory/Coding Theory

We consider a communications scenario in which a transmitter attempts to inform a remote receiver of the state of a source by sending messages through an imperfect communications channel. There are two fundamentally different ways in which the receiver can end up being misinformed. The channel may be noisy so that symbols in the transmitted message can be received in error, or the channel may be under the control of an opponent who can either deliberately modify legitimate messages or else introduce fraudulent ones to deceive the receiver, i.e., what Wyner has called an “active wiretapper” [1]. The device by which the receiver improves his chances of detecting error (deception) is the same in either case: the deliberate introduction of redundant information into the transmitted message. The way in which this redundant information is introduced and used, though, is diametrically opposite in the two cases.

Gustavus J. Simmons

New Secret Codes Can Prevent a Computerized Big Brother

As the use of computers becomes more pervasive, they are capturing increasingly more revealing data about our habits, lifestyles, values, whereabouts, associations, political and religious orientation, etc. The current approach, which requires individuals to identify themselves in relationships with organizations, allows records of all an individual’s relationships to be linked and collected together into a dossier or personal profile. Even though such profiles are too extensive to evaluate manually on a mass basis, automated evaluation is becoming increasingly feasible.

David Chaum

Fair Exchange of Secrets (extended abstract)

We consider two problems which arose in the context of "The Exchange of Secret Keys" (see [1]). (1) In the original protocol, one party may halt the exchange and have a 2-to-1 expected time advantage in computing the other party's secret. To solve this problem, when there is a particular point in the exchange where this time advantage may be critical, we presented at CRYPTO 83 (see [5]) a method for exchanging "fractions" of a single bit. In this paper we extend the method so as to apply it to all bits to be exchanged, and show how it can be used in a more abstract setting (as in [2]). (2) We also present a solution to the problem of how to ensure a fair exchange of secrets when one party in the exchange is "risk seeking" while the other is "risk-averse".

Tom Tedrick

Cryptoprotocols: Subscription to a Public Key, The Secret Blocking and The Multi-Player Mental Poker Game (extended abstract)

Investigating the capabilities of public key and related cryptographic techniques has recently become an important area of cryptographic research. In this paper we present some new algorithms and cryptographic protocols (cryptoprotocols) which enlarge the range of applications of public key systems and enable us to perform certain transactions in communication networks. The basic cryptographic tools used are Rabin's Oblivious Transfer Protocol and an algorithm we developed for Number Embedding which is provably hard to invert. We introduce the protocol Subscription to a Public Key, which gives a way to transfer keys over insecure communication channels and has useful applications to cryptosystems. We develop the Secret Blocking Protocol, specified as follows: 'A transfers a secret to B. B can block the message. If B does not block it, there is a probability P that he might get it (1/2 ≤ P < 1, where we can control the size of P). A does not know if the message was blocked (but he can find out later).' The classic cryptotransaction is the Mental Poker Game. A cryptographically secure solution to the multi-player Mental Poker Game is given. The approach used in constructing the solution provides a general methodology of provable and modular protocol composition.

Mordechai Yung

Poker Protocols

The situation is quite serious. After four years of research, there has been no satisfactory way for a group of card sharks to play poker over the phone. Until now. In this paper, we present a new method for playing ‘mental poker,’ discuss its significance, and mention some of the further questions it raises. Ante up.

Steven Fortune, Michael Merritt

A “Paradoxical” Solution to The Signature Problem

We present a general signature scheme which uses any pair of trap-door permutations (f0, f1) for which it is infeasible to find any x, y with f0(x) = f1(y). The scheme possesses the novel property of being robust against an adaptive chosen message attack: no adversary who first asks for and then receives signatures for messages of his choice (which may depend on previous signatures seen) can later forge the signature of even a single additional message. For a specific instance of our general scheme, we prove that (1) forging signatures is provably equivalent to factoring (i.e., factoring is polynomial-time reducible to forging signatures, and vice versa), while (2) forging an additional signature, even after an adaptive chosen message attack, is still equivalent to factoring. Such a scheme is "paradoxical", since the above two properties were believed (and even "proven" in the folklore) to be contradictory. The new scheme is potentially practical: signing and verifying signatures are reasonably fast, and signatures are not too long.

Shafi Goldwasser, Silvio Micali, Ronald L. Rivest

Sequence Complexity as a Test for Cryptographic Systems

The complexity of a finite sequence as defined by Lempel and Ziv is advocated as the basis of a test for cryptographic algorithms. Assuming binary data and block enciphering, it is claimed that the difference (exclusive OR sum) between the plaintext vector and the corresponding ciphertext vector should have high complexity, with very high probability. We may refer to this as plaintext/ciphertext complexity. Similarly, we can estimate an “avalanche” or ciphertext/ciphertext complexity. This is determined by changing the plaintext by one bit and computing the complexity of the difference of the corresponding ciphertexts. These ciphertext vectors should appear to be statistically independent and thus their difference should have high complexity with very high probability. The distribution of complexity of randomly selected binary blocks of the same length is used as a reference. If the distribution of complexity generated by the cryptographic algorithm matches well with the reference distribution, the algorithm passes the complexity test. For demonstration, the test is applied to modulo multiplication and to successive rounds (iterations) of the DES encryption algorithm. For DES, the plaintext/ciphertext complexity test is satisfied by the second round, but the avalanche complexity test takes four to five rounds before a good fit is obtained.

A. K. Leung, S. E. Tavares
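A minimal version of the complexity measure underlying this test can be sketched as follows. This is a common production-history parsing of Lempel-Ziv (1976) complexity; it is offered as an assumption about the authors' measure, not their exact implementation.

```python
def lz_complexity(s: str) -> int:
    """Lempel-Ziv (1976) complexity: the number of phrases produced when s
    is parsed left to right, each phrase being the shortest substring not
    reproducible from the history scanned so far."""
    i, phrases, n = 0, 0, len(s)
    while i < n:
        length = 1
        # extend the candidate phrase while it still occurs in the prefix
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

# A constant block has very low complexity; a varied block scores higher,
# which is what the test expects of plaintext/ciphertext differences.
flat = "0" * 32
mixed = "01101001100101101001011001101001"
print(lz_complexity(flat), lz_complexity(mixed))
```

In the test described above, this measure would be applied to the XOR of plaintext and ciphertext blocks and compared against the complexity distribution of random blocks of the same length.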

An Update on Quantum Cryptography

Although written about fifteen years ago, Wiesner's seminal paper, to which the origin of quantum cryptography must be traced back, did not appear in print until the spring of 1983 [W83]. The first published account of these ideas thus appeared in the proceedings of the second annual CRYPTO conference [BBBW83]. However, the concepts presented there were mostly of theoretical interest, because the technology involved in implementing them would have been far beyond the reach of our current knowledge. In particular, single polarized photons had to be trapped, bouncing back and forth between perfectly reflecting mirrors, and perfect efficiency in photon detection was required. To make up for this inconvenience, we could prove that no technology whatsoever, as well as no amount of computing power, could break some of our schemes, as long as some of the most fundamental principles of quantum physics hold true.

Charles H. Bennett, Gilles Brassard

How to Keep a Secret Alive

Extensible Partial Key, Key Safeguarding, and Threshold Systems

Partial key, key safeguarding, and threshold techniques appear to be another example of similar good ideas springing up in several places at nearly the same time — each with a different name and associated terminology. The use of partial key techniques actually appeared in print first in a technical report [Chaum 79] before the key safeguarding techniques were presented at a conference [Blakley 79], and before the threshold schemes were submitted for publication [Shamir 79]. (In fact the author received comments on the technical report from Shamir, along with a draft of the threshold scheme.)

David Chaum