About this book

Knowledge exists: you only have to find it.

VLSI design has come to an important inflection point with the appearance of large manufacturing variations as semiconductor technology has moved to 45 nm feature sizes and below. If we ignore the random variations in the manufacturing process, simulation-based design essentially becomes useless, since its predictions will be far from the reality of manufactured ICs. On the other hand, using design margins based on some traditional notion of worst-case scenarios can force us to sacrifice too much in terms of power consumption or manufacturing cost, to the extent of making the design goals infeasible. We absolutely need to explicitly account for the statistics of this random variability, so that design margins are accurate and we can find the optimum balance between yield loss and design cost.

This discontinuity in design processes has led many researchers to develop effective methods of statistical design, where the designer can simulate not just the behavior of the nominal design, but the expected statistics of the behavior in manufactured ICs. Memory circuits tend to be the hardest hit by the problem of these random variations because of their high replication count on any single chip, which demands a very high statistical quality from the product. Requirements of 5–6σ (0.

Table of contents

Frontmatter

Chapter 1. Introduction

Traditional VLSI circuit design contends with three general objectives:
- Maximize performance: in the context of an SRAM bitcell, this may be captured by several different metrics, for example the read current supplied by the cell, or the read access time given a specific peripheral circuit.
- Minimize power consumption: for an SRAM bitcell, this could be, for instance, the active current drawn by the cell during a read operation, or the total leakage current drawn by the cell during standby.
- Maximize robustness to noise: several metrics may measure the robustness of the circuit operation to noise; again for an SRAM bitcell, one measure is the static noise margin (SNM) of the cell.
Amith Singhee

Chapter 2. Extreme Statistics in Memories

Memory design specifications typically include yield requirements, apart from performance and power requirements. These yield requirements are usually specified for the entire memory array at some supply voltage and temperature conditions. For example, the designer may be comfortable with an array failure probability of one in a thousand at 100°C and 1 V supply; i.e., $F_{f,\text{array}} \le 10^{-3}$. However, how does this translate to a yield requirement for the memory cell? What is the maximum cell failure probability, $F_{f,\text{cell}}$, allowed so as to satisfy this array failure probability requirement? We will answer these questions and in the process understand the relevance of extreme statistics in memory design.
Amith Singhee
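As a rough illustration of this translation, here is a sketch under simplifying assumptions (a hypothetical array of $N_{\text{cell}}$ statistically independent cells and no redundancy; the chapter treats the realistic case) relating the array and cell failure probabilities:

$$ F_{f,\text{array}} = 1 - \left(1 - F_{f,\text{cell}}\right)^{N_{\text{cell}}} \quad\Longrightarrow\quad F_{f,\text{cell}} \le 1 - \left(1 - F_{f,\text{array}}\right)^{1/N_{\text{cell}}} \approx \frac{F_{f,\text{array}}}{N_{\text{cell}}} $$

For a 1 Mb array ($N_{\text{cell}} \approx 10^{6}$) and $F_{f,\text{array}} \le 10^{-3}$, this pushes $F_{f,\text{cell}}$ down to the order of $10^{-9}$, deep in the tail of the cell's statistical distribution.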

Chapter 3. Statistical Nano CMOS Variability and Its Impact on SRAM

The years of “happy scaling” are over and the fundamental challenges that the semiconductor industry faces at the technology and device level will deeply affect the design of the next generations of integrated circuits and systems. The progressive scaling of CMOS transistors to achieve faster devices and higher circuit density has fueled the phenomenal success of the semiconductor industry – captured by Moore’s famous law [1]. Silicon technology has entered the nano CMOS era with 35-nm MOSFETs in mass production in the 45-nm technology generation. However, it is widely recognised that the increasing variability in device characteristics is among the major challenges to scaling and integration for the present and next generation of nano CMOS transistors and circuits. Variability of transistor characteristics has become a major concern associated with CMOS transistor scaling and integration [2, 3]. It already critically affects SRAM scaling [4], and introduces leakage and timing issues in digital logic circuits [5]. The variability is the main factor restricting the scaling of the supply voltage, which for the last four technology generations has remained virtually constant, adding to the looming power crisis.
Asen Asenov

Chapter 4. Importance Sampling-Based Estimation: Applications to Memory Design

In this chapter, we describe a state-of-the-art simulation-based methodology for statistical SRAM design and analysis. It relies on mixture importance sampling. The method is comprehensive and computationally efficient, and its results are in excellent agreement with standard Monte Carlo methods while delivering speedups of more than 100× over regular Monte Carlo. We will also discuss applications of the methodology to memory design. First, we provide a review of statistical sampling, numerical integration, and Monte Carlo methods.
Rouwaida Kanj, Rajiv Joshi
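To make the core idea concrete, here is a minimal sketch of mixture importance sampling on a toy one-dimensional problem (the standard normal tail probability P(X > 5), a hypothetical stand-in for a rare SRAM failure event; it is not the chapter's actual SRAM methodology). Part of the sampling density is shifted toward the failure region and the samples are reweighted by the ratio of the nominal density to the mixture density:

```python
import numpy as np
from scipy.stats import norm

# Toy problem: estimate the tiny tail probability p = P(X > t) for X ~ N(0, 1),
# with t = 5 (exact value ~2.9e-7). Plain Monte Carlo would need ~1e8 samples.
rng = np.random.default_rng(0)
t = 5.0
n = 20_000

# Mixture proposal: keep some mass on the nominal density and shift the rest
# toward the failure region.
lam = 0.5                                   # mixture weight of the shifted component
shifted = rng.random(n) < lam               # which component each sample comes from
x = np.where(shifted, rng.normal(t, 1.0, n), rng.normal(0.0, 1.0, n))

# Importance weights: nominal density / mixture proposal density.
f = norm.pdf(x)                             # nominal N(0, 1)
g = (1.0 - lam) * norm.pdf(x) + lam * norm.pdf(x, loc=t)
w = f / g

p_hat = np.mean(w * (x > t))
print(p_hat, norm.sf(t))                    # IS estimate vs. exact tail probability
```

With only 20,000 samples the estimate lands close to the exact value, which is the kind of efficiency gain over plain Monte Carlo that the chapter quantifies for realistic SRAM metrics.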

Chapter 5. Direct SRAM Operation Margin Computation with Random Skews of Device Characteristics

SRAM has generally been characterized by a static noise margin (SNM) [12] from voltage–voltage (VV) plots, or by Icrit from current–voltage (IV) plots. These metrics do indicate the robustness of SRAM operation, but they do not provide SRAM designers with sufficient information about the achievable SRAM yield and the redundancy requirements. One way to estimate SRAM yield is based on the expected fail count λ with the Poisson distribution
$$ \mathrm{YIELD} = \sum_{n=0}^{k} \frac{\lambda^{n} e^{-\lambda}}{n!} $$
(5.1)
where k is the maximum number of fails in the chip that can be repaired and λ is the expected fail count per chip.
Robert C. Wong
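For concreteness, a minimal sketch of evaluating Eq. (5.1), with purely hypothetical numbers for the expected fail count λ and the repair budget k:

```python
import math

def repairable_yield(lam: float, k: int) -> float:
    """Eq. (5.1): probability of at most k Poisson-distributed fails per chip,
    where lam is the expected fail count."""
    return sum(lam**n * math.exp(-lam) / math.factorial(n) for n in range(k + 1))

# Hypothetical example: 2 expected fails per chip, up to 4 repairable via redundancy.
print(repairable_yield(lam=2.0, k=4))   # ~0.947
```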

Chapter 6. Yield Estimation by Computing Probabilistic Hypervolumes

Parameter variations are inevitable in any IC process. Process steps such as oxidation, doping, molecular beam epitaxy, etc., are all fundamentally statistical in nature [1]. Design of functioning circuits and systems has traditionally relied heavily on the presumption that the law of large numbers [2] applies and that statistical averaging predominates over random variations – more precisely, that the statistical distributions of important process, geometrical, environmental, and electrical parameters cluster closely about their means. Unfortunately, with feature sizes having shrunk from 90 to 65 nm recently (with further scaling down to 45 and 32 nm predicted by the ITRS roadmap [3]), this assumption is no longer valid – in spite of efforts to control them [4, 5], large variations in process parameters are the norm today. This problem is most severe for circuits that use the minimum transistor size (e.g., memory circuits [6], for which chip area is of high priority). With transistors having become extremely small (e.g., gates are only 10 molecules thick and minority dopants in the channel number in the tens of atoms), small absolute variations in previous processes have become large relative ones. Lithography-related variability at the nanoscale [5], which affects geometrical parameters such as effective length and width, further compounds parameter variation problems.
Chenjie Gu, Jaijeet Roychowdhury

Chapter 7. Most Probable Point-Based Methods

Most Probable Point (MPP) based methods are widely used for engineering reliability analysis and reliability-based design. Their major advantage is a good balance between accuracy and efficiency. The most commonly used MPP-based method is the First-Order Reliability Method (FORM) [14]. We will primarily discuss FORM in this chapter.
Xiaoping Du, Wei Chen, Yu Wang
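In outline (a generic sketch of FORM, not this chapter's specific treatment): the random variables are transformed to independent standard normal variables $\mathbf{u}$, the Most Probable Point $\mathbf{u}^{*}$ is the point on the limit-state surface $g(\mathbf{u}) = 0$ nearest the origin, and the failure probability follows from the reliability index $\beta$:

$$ \mathbf{u}^{*} = \arg\min_{g(\mathbf{u}) = 0} \lVert \mathbf{u} \rVert, \qquad \beta = \lVert \mathbf{u}^{*} \rVert, \qquad P_{f} \approx \Phi(-\beta) $$

where $\Phi$ is the standard normal CDF; the approximation comes from linearizing the limit-state surface at the MPP.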

Chapter 8. Extreme Value Theory: Application to Memory Statistics

Device variability is increasingly important in memory design, and a fundamental question is how much design margin is enough to ensure high quality and robust operation without overconstraining performance. For example, it is very unlikely that the “worst” bit cell is associated with the “worst” sense amplifier, making an absolute “worst-case” margin method overly conservative, but this assertion needs to be formalized and tested. Standard statistical techniques tend to be of limited use for this type of analysis, for two primary reasons: First, worst-case values by definition involve the tails of distributions, where data is limited and normal approximations can be poor. Second, the worst-case function itself does not lend itself to simple computation (the worst case of a sum is not the sum of worst cases, for example). These concepts are elaborated later in this chapter.
Robert C. Aitken, Amith Singhee, Rob A. Rutenbar
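To make the tail-modeling idea concrete, here is a minimal peaks-over-threshold sketch on synthetic data (a hypothetical stand-in for a simulated cell metric, not the chapter's memory data): exceedances over a high threshold are fitted with a Generalized Pareto Distribution, which extreme value theory motivates as the limiting tail model, and the fit is used to extrapolate a probability far beyond the bulk of the data:

```python
import numpy as np
from scipy.stats import genpareto, norm

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)           # synthetic stand-in for a simulated cell metric
u = np.quantile(x, 0.99)               # high threshold (99th percentile)
exceedances = x[x > u] - u             # peaks over the threshold

# Fit a Generalized Pareto Distribution to the exceedances (location fixed at 0).
c, loc, scale = genpareto.fit(exceedances, floc=0.0)

# Extrapolate P(X > t) for a point t far in the tail, where almost no data exists:
# P(X > t) = P(X > u) * P(X - u > t - u | X > u).
t = 4.5
p_tail = np.mean(x > u) * genpareto.sf(t - u, c, loc=loc, scale=scale)
print(p_tail, norm.sf(t))              # EVT extrapolation vs. exact value
```

The extrapolation uses only the roughly 2,000 threshold exceedances, whereas directly observing events at t = 4.5 (probability about 3×10⁻⁶) would require millions of samples.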

Backmatter
