Designers test-drive formal verification tools

SAN JOSE, Calif. - The RTL verification startups that have sprung up within the past few years promise huge reductions in the time it takes to do verification, a job that has grown to encompass the lion's share of today's chip design cycle. Exclusive interviews with designers who are beginning to use some of the formal verification technology touted by these companies show that the new tools are very promising, albeit often rough around the edges.

To break through the design bottlenecks, some startups in this field are applying formal technology outside its traditional niche of equivalency checking. Others offer new types of hardware-assisted verification.

Sometimes the gains can be dramatic. At video-compression chip maker Vweb here, Axis Systems' Xcite-1000 accelerator slashed four months off the verification cycle for an MPEG-2 encoder chip, said Sho Long Chen, Vweb's president and chief executive officer. That's money in the bank for a small company in a competitive marketplace.

At graphics-accelerator provider Nvidia Corp. (Santa Clara, Calif.), tools from Real Intent, Averant and Innologic are starting to make a difference. "I'm interested in any of the tools that use formal techniques," said Chris Malachowsky, vice president of engineering at Nvidia. "All of these have the promise of making an order-of-magnitude improvement. The design community needs to invest in these things and give direction to EDA developers."

While all of the designers interviewed gave the new tools positive reviews, all found room for improvement. Many designers want more performance or capacity. "More performance - always more, more, more - for the dollar," said Keith Rieken, director of VLSI architecture at telecom vendor Morphics Technology (Campbell, Calif.), a user of the Hammer accelerator from Tharas Systems.

Another frequent request is to make the tools easier to use, particularly with respect to debugging and tracing errors to their original source. Some of the new tools still need work on user interfaces, designers say.

Significantly, none of the verification-tool users interviewed plan to completely replace logic simulation with new techniques. The new tools are generally seen as valuable, time-saving additions to an existing verification flow, rather than harbingers of a revolutionary methodology that will mothball traditional tools.

Turning 'semiformal'
Formal verification is already well-established in the form of equivalency checkers from companies like Avanti, Synopsys and Verplex. Avanti and Cadence Design Systems are offering model checking as well. What's new, however, is startups applying formal techniques to other areas, producing what are sometimes called "formal analysis" or "semiformal" tools.

0-In Design Automation (San Jose, Calif.) was one of the first to bring formal techniques into new realms. Its 0-In Check is a so-called "white box" verification tool that helps designers insert checkers, or monitors, into HDL code. And its 0-In Search uses formal techniques to expand the stimulus combinations that might cause the checkers to fire during simulation.

John Andrews, a technical-staff member at Tensilica Corp. (Santa Clara), a provider of configurable microprocessor cores, is using both 0-In Check and 0-In Search to complement a fairly traditional verification flow that includes the Cadence NC-Verilog and Synopsys VCS simulators. But these simulators alone weren't quite enough, he said.

"It was basically becoming very difficult to flush out those last few bugs, and it will become more difficult in the future as our cores become more complex," said Andrews. "We wanted to try to shrink the time to find those last few bugs."

Andrews starts using 0-In Check around the "midpoint" of verification, after the easy bugs are out of the way and it's becoming more difficult to find new ones. His group uses a script to launch the tool, and mainly selects checkers from 0-In's library, so the use is fairly automatic. The checkers are then used during regression testing with VCS.

The advantage of 0-In Check, Andrews said, is that it finds bugs early in the simulation. "As soon as the error happens the checker will fire, whereas without it, you might simulate for 100 or 1,000 cycles before you actually see the error." 0-In Search, he said, is more likely to find bugs that simulation might not find at all - bugs that require a lot of unusual conditions to occur.
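The value of an embedded checker is easiest to see in a toy model. The sketch below (plain Python, not 0-In's actual tool flow; the FIFO and its invariant are hypothetical) shows the idea Andrews describes: a monitor fires on the very cycle an invariant is violated, rather than waiting for corrupted data to propagate to an output hundreds of cycles later.

```python
# Minimal sketch of an embedded checker (illustrative only, not 0-In's
# actual flow): the monitor fires on the cycle the invariant is broken,
# instead of waiting for the corruption to surface at the outputs later.

class FifoWithChecker:
    def __init__(self, depth):
        self.depth = depth
        self.data = []
        self.violations = []   # cycle numbers on which the checker fired
        self.cycle = 0

    def step(self, push=None, pop=False):
        """Advance one 'clock cycle', applying an optional push and pop."""
        self.cycle += 1
        if push is not None:
            self.data.append(push)
        if pop and self.data:
            self.data.pop(0)
        # Checker: occupancy must never exceed depth. Fires immediately.
        if len(self.data) > self.depth:
            self.violations.append(self.cycle)
            self.data = self.data[:self.depth]  # model the silent data loss

fifo = FifoWithChecker(depth=2)
for push, pop in [("a", False), ("b", False), ("c", False)]:  # 3 pushes, no pops
    fifo.step(push=push, pop=pop)
print(fifo.violations)  # overflow is flagged on cycle 3, as it happens
```

Without the checker, the lost element would only be noticed when (or if) the testbench later popped the FIFO and compared values, which is the 100-to-1,000-cycle lag Andrews describes.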

Andrews said that the 0-In tools require some training to use, and he noted that 0-In Search can significantly slow simulation. He'd like to see the search capability sped up. "But generally, the tools are in pretty good shape right now," he said.

Another new verification technology with formal roots, symbolic simulation, is provided by the ESP-VX product from Innologic Systems Inc. (San Jose). This technique uses symbols or variables, rather than ones and zeros, as simulation input. The simulator then propagates Boolean expressions rather than binary values. The approach promises huge gains in coverage with a small number of input vectors.
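The mechanism can be illustrated with a toy symbolic engine (plain Python; this is a pedagogical sketch, not Innologic's implementation). Here each symbolic value carries its full truth table over the input variables, so gates become set operations and a single symbolic "vector" exercises every binary input combination at once:

```python
from itertools import product

# Toy symbolic simulator (illustrative only). A symbolic value is the
# frozenset of input assignments for which it evaluates to 1; logic
# gates then become set operations, so one symbolic input covers all
# 2^n binary combinations in a single pass.

VARS = ("a", "b", "cin")
ALL = frozenset(product((0, 1), repeat=len(VARS)))

def sym(name):                      # a symbolic input variable
    i = VARS.index(name)
    return frozenset(t for t in ALL if t[i])

def AND(x, y): return x & y
def OR(x, y):  return x | y
def NOT(x):    return ALL - x
def XOR(x, y): return (x | y) - (x & y)

a, b, cin = sym("a"), sym("b"), sym("cin")

# "Design under test": a gate-level full adder.
s_impl    = XOR(XOR(a, b), cin)
cout_impl = OR(AND(a, b), AND(cin, XOR(a, b)))

# Behavioral reference: sum is the parity of the inputs, carry-out is
# their majority, written directly as truth tables.
s_ref    = frozenset(t for t in ALL if sum(t) % 2 == 1)
cout_ref = frozenset(t for t in ALL if sum(t) >= 2)

# Equivalence over all 8 input combinations, checked in one pass.
print(s_impl == s_ref and cout_impl == cout_ref)  # True
```

A binary simulator would need eight directed vectors (or many more random ones) to reach the same conclusion; the symbolic run reaches it with one. Production tools use compact canonical forms such as BDDs rather than explicit truth tables, which is why capacity limits of the kind Snider describes eventually appear.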

Centaur Technology (Austin, Texas), a designer of x86-compatible microprocessors, is taping out a chip this summer using ESP-VX for block-level verification. Brian Snider, senior circuit designer, said the Innologic ESP-VX simulator has pretty much replaced Cadence's Verilog-XL for verification of small blocks.
The big win with symbolic simulation, said Snider, is that one can verify the entire range of logic for a block with minimal input. "Instead of relying on 10,000 random vectors you can use just one symbolic vector," he said. And where random testing with Verilog-XL might require 100,000 cycles to verify a block, symbolic simulation might take only three, he said.

One shortcoming, however, is that the errors reported by symbolic simulation are "vague," Snider said. ESP-VX has a binary simulation mode, and Snider runs that mode as a second step to obtain test vectors that can be loaded into a waveform viewer. But the symbolic mode is a much faster way of finding the errors in the first place, he said.

Snider said it's easy to set up a symbolic simulation, and that Innologic provides a testbench generator. But he noted that the symbolic testbench probably won't be portable to other Verilog simulators.

Symbolic simulation today is appropriate for relatively small blocks, such as adders, arrays, programmable-logic arrays and custom blocks. Snider had no problems verifying a 30,000-transistor arithmetic logic unit with ESP-VX, but the symbolic simulation couldn't handle a 40,000-transistor instruction translator that had several levels of shifting. Thus, Centaur uses traditional, binary simulation tools for large blocks and for chip-level verification.

Formal model checking is an attractive concept because it checks properties derived from an original specification. But model-checking tools have generally been difficult to use. Averant Inc. (Sunnyvale, Calif.), formerly known as HDAC, claims to have overcome that hurdle with Solidify, a tool that functions much like a model checker but is said to be faster and easier to use. Averant classifies Solidify as a "static functional verification" tool.

Compaq Computer Corp. (Houston) has been evaluating Solidify and is targeting it for a production flow within the next two months. Chandra Moturu, electrical hardware engineer in Compaq's design automation group, said the company is seeking to "tighten the screws" around module-level verification with Solidify.

"The goal is to eliminate stimulus writing in the traditional flow and to think in a little more abstract manner than detailed bits and vectors," Moturu said. For example, he noted, a Solidify property could verify that no invalid state transitions occurred in a state machine.

Solidify has a Verilog-like language for specifying properties, and Moturu noted that it's fairly easy to use, with five predefined macros and seven operators. But, he added, "to write a good-quality property will take a little getting used to. Depending on the way you write them, run-times will be affected."

Problem finder
Simple properties may run in just a few seconds, while complex ones take far longer, Moturu noted. He thinks the tool works best for blocks in the 20,000- to 40,000-gate range.

In Compaq's experience, Moturu said, Solidify does not find problems that simulation wouldn't eventually uncover. The tool's real advantage, he said, is that it can find many types of problems with less effort and better performance than simulation.

Moturu would like to see a better user interface for debugging, however. "The navigation of the information is not very easy right now, and we have suggested some enhancements to Averant," he said.

While Solidify can replace block-level simulation to a "major extent," Moturu isn't ready to pull the plug on Compaq's use of Verilog-XL and VCS. "The coverage [Solidify] guarantees is only as good as the properties we write," said Moturu. "I feel that some randomly generated functional stimulus should be applied as well, in conjunction with Solidify."

While not necessarily positioned as a "formal" tool, the Verix product from startup Real Intent (Santa Clara) is another type of static functional checker.
Compaq, in fact, is planning to use Verix as an intent checker that will run prior to Solidify.

Verix reads Verilog RTL code, deciphers the designer's intent and automatically checks for eight types of violations - including nets with conflicting assignments and block-enable conditions that can't be met. A follow-on tool, Verix Pro, promises a broader range of checks. While Nvidia's Malachowsky looks forward to the expanded checks, he's already seeing some good results with Verix.

Malachowsky said Nvidia is using Verix on submodules for a 50 million-transistor chip. Like lint tools, Malachowsky noted, Verix requires no user input other than the register-transfer-level Verilog file.

"But having a basis in formal methods convinced me Verix could do a more thorough job than just syntax checking," he said. "I believe in the promise of what it calls intent-driven verification."

The reason for checking intent, Malachowsky said, is that designers inevitably have a "bias" when they do verification. They'll find the bugs they're looking for  but those might not be the ones that are really important. "Checking your designs statically for things that might go wrong really has an appeal to me," he said.

Malachowsky said that Nvidia has found some dead code, and some expressions that had no way of firing, using Verix. He said it's easy to use and can pinpoint a problem in just a few minutes. Malachowsky views Verix as a "good stepping stone" to more sophisticated static verification products.

Hardware approach
While some vendors are focusing on semiformal techniques, others are adopting new approaches to hardware-assisted verification. These include both simulation acceleration and emulation, and some new products, such as Axis' Xtreme, that bridge the gap between the two.

Established EDA vendors such as Cadence's Quickturn division, Mentor Graphics, Aptix and Ikos still hold most of the hardware-assisted verification market. But many newcomers have popped up in the past few years, including Axis, Tharas, Simutech, Simpod and the Korean vendor Dynalith.

Tharas Systems Inc. (Santa Clara) rolled out Hammer, an RTL Verilog accelerator, at this year's Design Automation Conference. Hammer claims to use a new parallel-processing approach to simulation acceleration, gleaned from the founders' experience designing networking products. Morphics is using Hammer to accelerate the Synopsys VCS simulator.

Morphics' Rieken said his company evaluated several types of hardware-assisted verification and decided that desktop acceleration products offered the best trade-offs in terms of price, performance, compile times and debugging visibility.

"We try to employ Hammer where it benefits us the most  high-activity event simulations that would slow our software event-based simulation," Rieken said. He noted that Hammer runs around 1.5 kHz for Morphics' most complicated test cases, and that the speed doesn't vary much. Software-only simulation with VCS, on the other hand, can vary from 400 Hz for "light" test cases down to 20 Hz for event-heavy test cases.

A primary advantage of Hammer, Rieken said, is that it fits easily into Morphics' existing verification flow. Little preparation of netlists is necessary, and the debugging capabilities are comparable to those of VCS. Hammer can generally run any synthesizable code, but it doesn't support all of the constructs that might appear in testbenches, which Morphics does not accelerate.

Still, Rieken continues to use some software-only simulation, because front-end design requires a highly interactive environment. Hammer's compilation times can range from tens of minutes to a couple of hours before a simulation run can begin.

Axis Systems Inc. (Sunnyvale), meanwhile, is making inroads with its Xcite-1000 accelerator, also positioned as an easy-to-use, moderately priced alternative to more expensive acceleration and emulation systems. Xcite-1000 comes with a proprietary Verilog simulator.

Vweb turned to Xcite-1000 on a 500,000-gate MPEG-2 video compression chip, said Vweb's Chen, because the company needed to find something faster than software simulation to verify video frames.
"To run a sequence of 150 frames would take 10 days, or two weeks, on NC Verilog," she said. "With Axis it only took us eight hours to run through one sequence of 150 frames."

Chen said she called in several vendors with hardware-assisted solutions, but Axis was the only one that could set up a system and verify a chip-level module within two weeks. Compared with NC-Verilog on a 400-MHz Sun Solaris workstation, she said, Xcite-1000's RTL speed is 28 times faster and its gate-level speed is 70 times faster.

However, compilation times can be long: with eight licenses, a 500,000-gate module took an hour to compile. A new incremental-compilation feature helps, and Chen would like to see it made more robust.

"There's no timing information," Chen added. "It's only functional simulation. If they could take care of timing, that would be great."

The vast majority of the new verification startups are focusing on RTL functional verification, where the problem is perceived to be greatest. But timing is important too, and that may be a place for some other aspiring startups to look.