Abstract

This thesis shows that, for any Quasi-Delay-Insensitive (QDI) asynchronous circuit, rigorous verification of an analog implementation is possible. That is, we show that in an accurate analog model, every behavior adheres to the digital computation's specification under any possible noise and environment timing. Unlike traditional simulation, our analysis covers all of the infinitely many possible analog behaviors, in time linear in the circuit size.

A problem that arises in asynchronous circuit design is that analog implementations of digital computations do not, in general, exhibit all of the properties demanded by the digital model assumed in circuit construction. For example, the digital model is atomic, in a sense we define; analog models, by contrast, are non-atomic, and as a result we can give examples of real circuits with operational failures. Other attributes of analog models can also cause failures, and no complete classification of them exists. Ultimately there is only one way to solve this problem: we must show that all possible analog behaviors obey the atomic model.

We focus on CMOS implementations and the associated, widely accepted bulk-scale model. Given any canonically generated implementation of a general computation, we can rigorously verify it. The only exception to this rule is that restoring delay elements must be inserted into some implementations; fortunately, by definition this change has no semantic effect on QDI circuits. Our theorem guarantees that when any possible analog behavior is properly observed, we obtain a valid, atomic digital execution. Several rigorous verifications have been produced, including one for an asynchronous pipeline circuit with dual-rail data.