
These Five Principles Define Formal Verification

After 20 years of researching formal-verification algorithms, developing formal-verification tools, and applying formal-verification technology to solve real-world verification challenges, I could say endless things about formal verification. If you have heard me give a talk at Oski Technology's quarterly "Decoding Formal Club" events, you know that keeping me to my allocated time slot has been a challenge for the organizing team!

However, in this article, I will limit myself to the five most important things to remember about formal verification.

1. There are many types of formal verification. All are useful.

When I talk about formal verification, I am referring to a particular kind of formal, model checking, and more specifically to a subset of it called "end-to-end formal verification." Depending on an engineer's experience, he or she might think of other types of formal.

Formal verification can include a range of technologies, such as equivalence checking and model checking (Fig. 1). Model checking (MC) can be further classified into different categories based on its application, complexity, and benefits, with the most complete end-to-end formal verification at the top (Fig. 2).

Equivalence checking compares two versions of the design to make sure they are functionally equivalent. Model checking verifies whether a particular set of properties holds true for the design. The pyramid above shows further classification of formal in model checking.
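To make the model-checking idea concrete, here is a minimal sketch, not any particular tool's algorithm: an explicit-state checker that exhaustively explores every reachable state of a toy design and verifies a safety property on each one. All names (`model_check`, the modulo-4 counter) are hypothetical illustrations.

```python
from collections import deque

def model_check(init, next_states, prop):
    """Explicit-state model checking: breadth-first search over all
    reachable states, checking a safety property on each one.
    Returns (True, None) if prop holds everywhere, else (False, trace)."""
    frontier = deque([(init, [init])])
    seen = {init}
    while frontier:
        state, trace = frontier.popleft()
        if not prop(state):
            return False, trace          # counterexample trace to a violation
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return True, None                    # property exhaustively proven

# Toy design: a modulo-4 counter that wraps from 3 back to 0.
ok, trace = model_check(
    init=0,
    next_states=lambda s: [(s + 1) % 4],
    prop=lambda s: s < 4,                # checker: counter stays in range
)
print(ok)  # True: the property holds on every reachable state
```

Because the search covers every reachable state, a "pass" here is an exhaustive proof over the model, which is what distinguishes model checking from simulating a sample of input scenarios.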

2. Formal verification has four layers of complexity. End-to-End formal offers the greatest benefits because it's used to verify the complete functionality of a design under test.

Automatic formal is built into many lint tools to detect simple design issues, such as pragma violations or dead code. Formal apps are used to solve a specific problem in the verification space, such as low power, connectivity checking, or clock-domain crossing. Assertion-based verification (ABV) enables users to write targeted assertions about the design to verify simple register-transfer-level (RTL) functionality. End-to-End formal is used to verify the complete functionality of a design under test (DUT) and replace simulation at that level of hierarchy, using End-to-End formal checkers.

All forms of formal technology are useful in the verification process. To achieve formal sign-off, where formal is the sole verification means used to verify the functionality of a DUT, End-to-End formal is a must-have. When I talk about formal, I usually focus on End-to-End formal because it offers the greatest impact to an engineer’s verification success.
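The core of an End-to-End checker is comparing the DUT's complete behavior against an intent-level reference model, for every state and input rather than a sampled subset. The following is a hedged sketch of that idea in Python; the saturating counter, `golden_next`, and `dut_next` are invented examples, not Oski's methodology.

```python
GOLDEN_MAX = 7  # 3-bit saturating counter specification

def golden_next(count, inc):
    """Reference model: intent-level spec of a 3-bit saturating counter."""
    return min(count + 1, GOLDEN_MAX) if inc else count

def dut_next(count, inc):
    """Hypothetical optimized implementation under test."""
    if inc and count != GOLDEN_MAX:
        return count + 1
    return count

def end_to_end_check():
    """End-to-End checker: exhaustively compare the DUT against the
    reference model over every state and every input."""
    for count in range(GOLDEN_MAX + 1):
        for inc in (False, True):
            if dut_next(count, inc) != golden_next(count, inc):
                return (count, inc)      # counterexample found
    return None                          # complete functionality verified

print(end_to_end_check())  # None: DUT matches the spec in all cases
```

An ABV-style targeted assertion would check one local fact (say, "count never exceeds 7"); the End-to-End checker instead pins down the full next-state function, which is why it can replace simulation at that level of hierarchy.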

2. Formal verification is a must-have in the verification flow.

No matter the type of formal, if a verification flow doesn’t contain formal technology, then it’s not complete. In system-on-chip (SoC) design, using simulation alone will leave large holes for bugs to sneak through to silicon. Benefits of formal technology are numerous, and include the ability to rely on its exhaustiveness to cover all input scenarios so that corner-case bugs can be found.

Oski Technology is often called upon to do formal-verification service projects because of post-silicon bugs. A better strategy would have been to deploy formal from the start of the project to catch bugs early and shorten verification cycles and, ultimately, the project cycle.

Often, the barrier to adopting formal technology is a company's lack of formal resources to deploy it in a meaningful way. Formal requires different thinking compared to simulation, so engineering managers can't expect a simulation engineer to take on formal and run with it from day one. The best strategy is to learn from experts, whether by growing an internal team slowly or by working with formal service providers such as Oski Technology. Once formal is deployed, verification engineers realize it is a must-have in the verification flow.

3. Bounded results are as meaningful as unbounded results.

The question verification engineers most often ask us is what to do when a checker doesn't reach a conclusive result. A formal tool can return three different statuses. Pass means all is good: the checker is exhaustively proven on the design. Fail means the formal tool found a failure, and users can debug it using the counterexamples provided. Both are conclusive results. The third category has different names depending on the formal tool used: explored, undetermined, inconclusive, or terminated. All sound negative, though these results are meaningful.

This class of results says that the formal tool could not find a bug that takes fewer cycles than the reported proof bound. If the tool comes back with an inconclusive status at a bounded proof of 20 cycles, then the user knows there is no bug in the design shallower than 20 cycles. That's plenty of confidence in the design, certainly more than the category's name suggests. The question becomes, "How good is 20 cycles?"
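A bounded result can be pictured as exhaustively exploring every trace up to the bound. Here is a minimal sketch, assuming a toy design whose only bug takes 30 cycles to reach; `bounded_check` and the counter are hypothetical, not a real tool's engine.

```python
def bounded_check(init, next_states, prop, bound):
    """Bounded model checking sketch: check the property on every
    state reachable in fewer than `bound` cycles from reset.
    Returns ('fail', trace) on a violation, else a bounded 'pass'."""
    frontier = [(init, [init])]
    for _ in range(bound):
        next_frontier = []
        for state, trace in frontier:
            if not prop(state):
                return ("fail", trace)   # counterexample within the bound
            for nxt in next_states(state):
                next_frontier.append((nxt, trace + [nxt]))
        frontier = next_frontier
    # The tool reports "inconclusive," but the result is meaningful:
    # no bug shallower than `bound` cycles exists.
    return ("no bug within bound", bound)

# Toy design: a free-running counter whose checker fails only at 30.
status = bounded_check(0, lambda s: [s + 1], lambda s: s != 30, bound=20)
print(status)  # ('no bug within bound', 20)
```

With a bound of 20, the 30-cycle bug is out of reach and the result is "inconclusive," yet it still proves that no shallower bug exists, which is exactly the confidence the bounded status conveys.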

An important step in formal verification is to qualify the required proof depth for a given DUT. There's a well-defined methodology for calculating the required proof bound, which was published at DVCON 2014. Once the user calculates the required proof depth, all inconclusive results fall into two groups. The first is checkers that reached the required proof depth; their status is as good as passing. The other is checkers that haven't reached the required proof depth. For example, the required proof depth is 30, but the formal tool only reached 20 cycles in its analysis. At that point, the user looks at what techniques can be used to reach the required proof depth. Most often, there's something he or she can do, such as using Abstraction Models.

Bounded results can be as good as unbounded results if the checker reached the required proof depth. If the user doesn't get unbounded proofs, it doesn't mean the results aren't meaningful.

4. Achieving formal sign-off requires a sound methodology.

Formal sign-off means formal technology is given the same weight as simulation in the verification flow. On blocks where formal is the sole technology used to verify the DUT, formal engineers are given the same accountability and responsibility as other design and verification engineers to ensure that no bugs are in the design.

Achieving formal sign-off requires a sound methodology. This means the formal testbench needs to have all of the End-to-End checkers required to cover complete design functionality. Constraints need to be valid and not over-constraining. Abstraction Models and other complexity-solving techniques may be needed to make sure all checkers reach the required proof depth. Formal coverage is required to qualify the formal-verification results. Lacking any one of these components renders the formal-verification work incomplete and unable to achieve sign-off.

Before taking on a formal sign-off project, formal engineers should understand what it takes to achieve formal sign-off. They also must learn the methodology supporting it in relation to the four Cs: checkers, constraints, complexity, and coverage.

5. Becoming a formal expert requires dedication.

Learning formal and using it for sign-off is no small task. We like to say, "Formal favors the brave." Not only must one be brave when diving into the depths of formal; it also takes dedication and commitment. This means taking the time to learn formal well and using it on a full-time basis.

Sometimes, due to project resource limitations, formal engineers work on formal only part-time. In our experience, two part-time formal engineers are often less productive than one full-time formal engineer. Formal requires orthogonal thinking, and switching between the different worlds of simulation and formal during the same week usually doesn't work well.

If an engineering organization wants to add formal to its mix of verification technologies, it should consider dedicating at least one full-time resource and investing in training.

Vigyan Singhal, CEO of Oski Technology, founded the formal-verification services firm in 2005. Vigyan is the host of Oski’s “Decoding Formal” video tutorials on formal verification, offering a look at the technology and how it can be applied to chip design. Under his leadership, the company launched the Decoding Formal Club, a forum to foster knowledge-sharing and networking among formal-verification enthusiasts. He graduated from the University of California, Berkeley.
