
2002 | Book

Computer Logic

Design Principles and Applications

Author: John Y. Hsu

Publisher: Springer New York


About this book

An understanding of modern computer logic - incorporating core knowledge of number systems, number conversions, Boolean algebra, memories, and logic circuits - is fundamental to further study of computer architectures, system software, and computer networks. Computer Logic: Design Principles and Applications introduces and describes the relevant concepts, principles, and applications of modern computer logic design. The book is self-contained, with an introductory chapter that concisely covers the history of computing devices as well as number systems, number conversions, signed and unsigned integers, external code, and digital and digitizing concepts. Dedicated chapters on Boolean algebra, transistor circuits, combinational logic circuits, and sequential logic circuits round off the work. The emphasis is on design and applications.

Table of Contents

Frontmatter
Chapter 1. Introduction
Abstract
In the history of computing, mechanical devices were invented before electronic devices. In 1642 in France, Blaise Pascal invented a gear-driven machine named the “Pascaline”; it was the first calculator to perform arithmetic [Comp96]. In 1833, the Englishman Charles Babbage designed the Analytical Engine, which was intended to calculate and print mathematical tables. In fact, Babbage conceived the idea of a stored-program computer in which all the numbers and instructions were entered before a computation could begin. In 1889, the Electric Tabulating Machine was invented by Herman Hollerith in the United States; it was the very first data-processing machine, and it was used to tabulate the 1890 census.
John Y. Hsu
Chapter 2. Boolean Algebra
Abstract
Boolean algebra was invented by George Boole (1815–64), the eldest son of a shoemaker. He was born in Lincoln, England, on November 2, 1815, and died of pneumonia on December 8, 1864. At an early age, Boole set out to teach himself Latin, Greek, French, German, and Italian. In 1854, he published his masterpiece, “An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities.” This work collected the basic laws and theorems of logic. More than a hundred years later, the algebra he developed has become the theoretical backbone of logic design in digital computers.
John Y. Hsu
Chapter 3. Transistor Circuits
Abstract
The terms logic device and logic circuit are often interchangeable. A logic device or circuit is designed to realize a logic function. Such a device may employ many logic gates, and each gate may employ many transistors. A transistor is the basic element, or switch, in a logic circuit. The transistor was invented in 1947 by three scientists at Bell Labs: John Bardeen, Walter H. Brattain, and William B. Shockley. What is a transistor? A transistor is a current amplifier made of a semiconductor, also called a solid-state material. Several materials are used in semiconductors, including Si (silicon), Ge (germanium), and GaAs (gallium arsenide). Silicon is by far the most popular material because silicon-based transistors are reliable, fast, and cheap. However, due to recent developments, GaAs is attracting attention for its ultrahigh speed. Regardless of the material or internal design, a transistor operates like a switch in a logic circuit.
John Y. Hsu
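To make the switch view of a transistor concrete, here is a minimal Python sketch (an illustration of the idea, not code from the book) that models NMOS and PMOS transistors as ideal voltage-controlled switches and pairs one of each into a CMOS inverter; the function names and the idealized on/off behavior are assumptions made purely for illustration.

# Idealized switch model of transistors (a sketch, not real device physics):
# an NMOS conducts when its gate is high, a PMOS conducts when its gate is low.
# A CMOS inverter pairs one of each between the supply rail and ground.

def nmos_conducts(gate: int) -> bool:
    """Ideal NMOS switch: closed (conducting) when the gate input is 1."""
    return gate == 1

def pmos_conducts(gate: int) -> bool:
    """Ideal PMOS switch: closed (conducting) when the gate input is 0."""
    return gate == 0

def cmos_inverter(a: int) -> int:
    """Output is pulled to 1 through the PMOS or to 0 through the NMOS."""
    if pmos_conducts(a):      # conducting path to the supply rail (logic 1)
        return 1
    if nmos_conducts(a):      # conducting path to ground (logic 0)
        return 0
    raise ValueError("input must be 0 or 1")

for a in (0, 1):
    print(f"NOT {a} = {cmos_inverter(a)}")   # prints NOT 0 = 1, NOT 1 = 0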
Chapter 4. Combinational Logic Circuits
Abstract
In a digital system, the logic circuits can be divided into two classes: combinational and sequential. A combinational circuit uses logic gates only; a sequential circuit uses flip-flops (ffs) as well as logic gates. The discussion of ffs is left to the next chapter. Most flip-flops are synchronous: such a flip-flop has two sets of inputs, the clock and the data, and its output state changes in synchronization with the clock.
John Y. Hsu
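As a concrete illustration of a combinational circuit, the following short Python sketch (illustrative only; the names are not taken from the book) builds a one-bit full adder from AND, OR, and XOR operations. Its outputs depend only on the current inputs, with no stored state, which is exactly what distinguishes a combinational circuit from a sequential one.

# A combinational circuit computes its outputs from its current inputs alone,
# using gates and no storage. A one-bit full adder built from AND, OR, and XOR
# is a small example.

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """Return (sum, carry_out) for one bit position."""
    s = a ^ b ^ cin                          # sum bit: XOR of the three inputs
    cout = (a & b) | (a & cin) | (b & cin)   # carry out when at least two inputs are 1
    return s, cout

# Exhaustive truth table: the outputs depend only on the inputs listed here.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            print(a, b, cin, "->", full_adder(a, b, cin))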
Chapter 5. Sequential Logic Circuits
Abstract
A sequential circuit consists of logic gates and flip-flops. A flip-flop (ff) is a bistable device with two outputs: one carries the true value of the stored variable, and the other carries its complement. Such a device stores one bit of information. In concept, a sequential circuit uses logic gates to provide the control functions and flip-flops to store the digital signals. If the output of an ff changes as soon as its input changes, it is called an asynchronous ff; if the output changes in response to its inputs only under the control of a clock, it is a synchronous ff. An asynchronous ff requires no clock, but a synchronous one does. A synchronous ff has a clock input in addition to its data inputs, so the data inputs and the clock jointly control the timing of the change in its output. By grouping an ordered set of flip-flops, we obtain a register; each ff serves as a one-bit storage cell, so a register stores many bits. The length of a register is the number of bits it can store: an eight-bit register stores eight bits, a 16-bit register 16 bits, and a 32-bit register 32 bits. If the output of register A is connected to the input of register B, a clock pulse at register B transfers the bits from register A to register B. The bits in register A remain unchanged after the operation.
John Y. Hsu
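The following Python sketch (an illustration with assumed class names such as DFlipFlop and Register, which do not come from the book) models a synchronous D flip-flop that stores one bit and updates only on a clock edge, plus a register built from such flip-flops. Clocking register B with register A's outputs copies A's bits into B while A's contents remain unchanged, mirroring the register transfer described above.

# A behavioral sketch of a synchronous D flip-flop and a register built from
# flip-flops; each flip-flop stores one bit and updates only when clocked.

class DFlipFlop:
    def __init__(self) -> None:
        self.q = 0            # stored bit (true output)

    @property
    def q_bar(self) -> int:
        return 1 - self.q     # complemented output

    def clock(self, d: int) -> None:
        """On a clock edge, the data input D becomes the new stored state."""
        self.q = d

class Register:
    def __init__(self, width: int) -> None:
        self.ffs = [DFlipFlop() for _ in range(width)]

    def clock(self, bits: list[int]) -> None:
        """Load all flip-flops from the input bits on one clock edge."""
        for ff, d in zip(self.ffs, bits):
            ff.clock(d)

    def value(self) -> list[int]:
        return [ff.q for ff in self.ffs]

a = Register(8)
b = Register(8)
a.clock([1, 0, 1, 1, 0, 0, 1, 0])   # load register A
b.clock(a.value())                  # a clock edge transfers A's bits into B
print(a.value(), b.value())         # A still holds its bits after the transfer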
Backmatter
Metadata
Title
Computer Logic
Author
John Y. Hsu
Copyright Year
2002
Publisher
Springer New York
Electronic ISBN
978-1-4613-0047-2
Print ISBN
978-1-4612-6542-9
DOI
https://doi.org/10.1007/978-1-4613-0047-2