On the decline of testing efficiency as fault coverage approaches 100%
Main authors:
Format: Conference paper
Language: English
Keywords:
Abstract: Testing is an indispensable process for weeding out defective parts coming out of the manufacturing process. Traditionally, test generation targets a specific fault model, usually the single stuck-at fault model, to produce tests that are expected to identify defects such as unintended shorts and opens. With this approach, test quality relies on fortuitous detection of the non-target defects. As quality demands and circuit sizes increase, the feasibility of test generation based on a single fault model becomes questionable. In this paper, we present empirical data from experiments on ISCAS benchmark circuits to demonstrate that, using traditional methods, the probability of detecting non-target defects drops rapidly as the fault coverage approaches 100%. By assuming surrogates, we explain the mechanism that produces this effect and describe a new test pattern generation approach with better testing efficiency.
ISSN: 1093-0167, 2375-1053
DOI: 10.1109/VTEST.1995.512620