Abstract

Powerful sequential optimization techniques can drastically change the Integrated Circuit (IC) design paradigm. Because sequential verification tools remain limited, however, aggressive sequential optimization is shunned today: there is no efficient way to prove that equivalence is preserved after optimization. Since the number of transistors fitting on a single fixed-size die grows with Moore's law, the problem becomes harder over time at an exponential rate. It is no surprise that functional verification has become a major bottleneck in the time-to-market of a product; the literature reports that 70% of design time is spent on making sure a design is bug-free and operates correctly. One of the core verification tasks in achieving high-quality products is equivalence checking, which ensures that the optimized product preserves the functionality of the unoptimized model. This matters to industry because products are modified constantly to meet different goals such as low power and high performance. The mainstream approaches to equivalence checking are simulation and formal verification. In the simulation approach, the golden design and the design under verification (DUV) are fed the same input stimuli, and their outputs are expected to produce identical responses. In case of a discrepancy, traces are generated and the DUV undergoes modification. As the number of input pins and state elements in designs grows, exhaustive simulation becomes infeasible; the approach is therefore incomplete and must be accompanied by notions of coverage. Formal verification, on the other hand, relies on mathematical proofs and guarantees completeness over the search space. However, formal verification has problems of its own: it is usually resource intensive, and not every design can be verified after optimization. When the golden model and the DUV differ vastly in structure, modern checkers often return inconclusive results. For these reasons, this thesis focuses on improving the strength and the efficiency of sequential equivalence checking (SEC) using formal approaches.
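The simulation flow described above can be sketched in a few lines. This is a minimal toy illustration, not the thesis's tooling: the two "circuits" and the counter behavior are hypothetical, and each design is modeled as a function from (state, input) to (next state, output).

```python
import random

# Hypothetical golden model: a 2-bit counter incremented when the input bit is 1.
def golden(state, inp):
    nxt = (state + inp) % 4
    return nxt, nxt & 1          # output: low bit of the counter

# Hypothetical DUV: an "optimized" rewrite that should behave identically.
def duv(state, inp):
    nxt = (state + inp) & 3      # same counter, rewritten with a bit-mask
    return nxt, nxt % 2

def simulate_check(golden, duv, n_vectors=1000, seed=0):
    """Feed both designs the same random stimuli from a common reset state;
    on an output discrepancy, return the stimulus trace as a counterexample."""
    rng = random.Random(seed)
    g_state, d_state = 0, 0      # common reset state
    trace = []
    for _ in range(n_vectors):
        inp = rng.randint(0, 1)
        trace.append(inp)
        g_state, g_out = golden(g_state, inp)
        d_state, d_out = duv(d_state, inp)
        if g_out != d_out:
            return trace         # counterexample: the DUV must be modified
    return None                  # no mismatch found -- NOT a proof of equivalence

print(simulate_check(golden, duv))   # None: no discrepancy in 1000 vectors
```

The final comment is the point of the abstract's critique: a `None` result only means no mismatch was observed, which is why coverage notions or a formal proof must accompany simulation.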
While great strides have been made in verification for combinational circuits, SEC remains rather rudimentary. Without a powerful SEC backbone, aggressive sequential synthesis and optimization are often avoided because the optimized design cannot be proved equivalent to the original one. To take on the challenges of SEC, we propose two frameworks that successfully determine equivalence for hard-to-verify circuits. The first framework utilizes arbitrary relations between any two nodes of the two sequential circuits in question; the two nodes can reside in the same circuit or across the circuits, and likewise can belong to the same time-frame or to different time-frames. The merit of this approach is that it uses the global structure of the circuits to speed up the verification process. The second framework introduces techniques to identify a small yet powerful subset of multi-node relations (involving more than two nodes), which then help prune a large don't-care search space and yield a successful SEC framework. In contrast with previous approaches, in which an exponential number of multi-node relations are mined and learned, we reduce the computational cost by selecting far fewer invariants to reach the desired conclusion. Although independent, the two frameworks can be used in sequence to complement each other. Experimental results demonstrate that our frameworks can take on many hard-to-verify cases and show a significant speedup over previous approaches.
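The mining-then-selecting idea can be illustrated at toy scale. In this common scheme (a rough sketch, not the thesis's algorithm), candidate relations among node values are first falsified against simulation traces, and only the few survivors are handed to a formal engine for proof. All node names and trace values below are hypothetical.

```python
from itertools import combinations

# Hypothetical simulation traces: for each node, the value it took in each
# of several simulated cycles (real traces would come from circuit simulation).
traces = {
    "a": [0, 1, 0, 1, 1, 0],
    "b": [0, 1, 0, 1, 1, 0],   # matches "a" in every sampled cycle
    "c": [1, 0, 1, 0, 0, 1],   # complements "a" in every sampled cycle
    "d": [0, 0, 1, 1, 0, 1],
}

def mine_pairwise_candidates(traces):
    """Keep only two-node relations that hold on every simulated cycle.
    Simulation can only falsify a relation; the survivors are still just
    candidates and must be proved as invariants by the formal checker."""
    cands = []
    for x, y in combinations(sorted(traces), 2):
        vx, vy = traces[x], traces[y]
        if all(a == b for a, b in zip(vx, vy)):
            cands.append((x, "==", y))       # candidate equivalence
        elif all(a != b for a, b in zip(vx, vy)):
            cands.append((x, "!=", y))       # candidate antivalence
    return cands

print(mine_pairwise_candidates(traces))
# [('a', '==', 'b'), ('a', '!=', 'c'), ('b', '!=', 'c')]
```

Here only 3 of the 6 possible pairs survive simulation, shrinking the set passed to the prover; the thesis's contribution extends this pruning to carefully chosen multi-node relations rather than an exponential enumeration.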