Open Access
The Impact of Statistical Leakage Models on Design Yield Estimation
Author(s) -
Rouwaida Kanj,
Rajiv Joshi,
Sani Nassif
Publication year - 2011
Publication title -
VLSI Design
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.123
H-Index - 24
eISSN - 1065-514X
pISSN - 1026-7123
DOI - 10.1155/2011/471903
Subject(s) - curse of dimensionality , leakage (economics) , matching (statistics) , yield (engineering) , process variation , computer science , key (lock) , process (computing) , design of experiments , algorithm , mathematics , statistics , machine learning , materials science , computer security , metallurgy , economics , macroeconomics , operating system
Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics, such as the average leakage current or the average read delay, are often of interest. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable such events to be modeled accurately. Extremely leaky devices can cause functional fails, and the plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals; the effect of such approximations on tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective for accurate statistical leakage modeling.
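To illustrate the kind of closed-form approximation the abstract refers to, the sketch below applies Fenton-Wilkinson moment matching, one well-known way to approximate a sum of lognormals by a single lognormal, to the total leakage of devices on a bitline, and compares its tail probability against a brute-force Monte Carlo estimate. The device count, distribution parameters, and threshold are illustrative assumptions, not values from the paper, and this is not the paper's CDF matching method.

```python
# Sketch: Fenton-Wilkinson (moment-matching) approximation for the sum of
# independent lognormal leakage currents on a bitline, compared against a
# Monte Carlo estimate of the same tail probability.
# All numeric parameters below are illustrative assumptions.
import math
import random

def fenton_wilkinson(mus, sigmas):
    """Match the first two moments of sum_i exp(N(mu_i, sigma_i^2))
    to a single lognormal exp(N(mu_z, sigma_z^2))."""
    mean = sum(math.exp(mu + 0.5 * s * s) for mu, s in zip(mus, sigmas))
    # Variance of an independent sum = sum of per-device variances.
    var = sum(math.exp(2 * mu + s * s) * (math.exp(s * s) - 1.0)
              for mu, s in zip(mus, sigmas))
    sigma_z2 = math.log(1.0 + var / (mean * mean))
    mu_z = math.log(mean) - 0.5 * sigma_z2
    return mu_z, math.sqrt(sigma_z2)

def lognormal_tail(x, mu, sigma):
    """P(X > x) for X ~ Lognormal(mu, sigma)."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Assumed example: 64 cells on a bitline, i.i.d. lognormal leakage.
random.seed(1)
n = 64
mus, sigmas = [0.0] * n, [0.8] * n
mu_z, sigma_z = fenton_wilkinson(mus, sigmas)

# Monte Carlo estimate of the tail probability at an assumed threshold.
trials, thresh = 50_000, 120.0
hits = sum(1 for _ in range(trials)
           if sum(random.lognormvariate(m, s)
                  for m, s in zip(mus, sigmas)) > thresh)
p_fw = lognormal_tail(thresh, mu_z, sigma_z)
print("Fenton-Wilkinson P(sum > %.0f) = %.4g" % (thresh, p_fw))
print("Monte Carlo      P(sum > %.0f) = %.4g" % (thresh, hits / trials))
```

The gap between the two printed probabilities grows as the threshold moves further into the tail, which is the kind of approximation-induced bias in yield estimates that the abstract discusses.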
