Emulation 2010

By Ann Steffora Mutschler
In an industry that was once fraught with patent infringement lawsuits, hostile takeovers and other exciting corporate warfare, the hardware-assisted emulation market has quieted down considerably. That doesn’t mean it has lost its luster, though. It still plays an integral, and indeed expanding, role in the verification efforts of most semiconductor companies.

EDA Consortium statistics point to an emulation market that has been hovering around $150 million to $160 million for three years now, down from a high of about $250 million at the turn of the millennium. Contributing to the contraction was the bursting of the dot-com bubble, which caused hardware emulation system prices to tank. Industry players remain optimistic that once the semiconductor industry recovers further, the emulation market will follow.

Where It’s Used
Emulation fits squarely into the scenario of the massive SoC, and the fact that today’s SoCs include more and more embedded software. Traditional verification methods such as HDL simulation can’t completely verify these designs, and they take too long. This is where emulation comes into play. It can be used for hardware debugging at the block and system levels, and even for embedded software validation.

Specifically, emulation addresses increasing hardware complexity in SoCs, particularly with multi-processor designs, as well as software content that is rising faster than hardware content. Both of these issues require hours of design validation and create a productivity gap that eventually translates into a profitability gap. And with power now an intensely critical issue, especially for mobile devices and data centers, power consumption must be analyzed both for different use cases and to verify the SoC after low-power techniques have been implemented. In addition, power analysis and power verification need to be done for the full SoC at RTL.

Depending on which vendor you speak with – and there are three main players (Cadence, EVE and Mentor Graphics) – hardware-assisted emulation promises to improve hardware/software integration productivity through faster bring-up times; accelerate the verification environment by factors of 100 to 10,000; enable earlier software integration; increase program predictability while maximizing system quality before silicon; allow hardware/software co-verification; deploy system-level metrics and coverage; and confirm architectural power and performance through high-performance, cycle-accurate analysis and verification.

Who Does What
At the heart of today’s hardware-based emulation systems reside either FPGAs or full-custom ICs to run the system. This is where things get interesting. EVE, the relative newcomer to the emulation market (now 10 years old), is FPGA-based. Lauro Rizzatti, vice president of marketing and general manager of EVE-USA, said the company set out from day one to take a different approach from its competitors. EVE’s systems are based on commercial Xilinx FPGAs, a choice for which it has come under fire.

“On paper your tradeoffs are quite big when you do that which is essentially what Cadence and Mentor are saying when they are attacking us—that we are using commercial FPGAs and therefore compilation time is very slow and debugging is very limited,” Rizzatti said. “It might have been true in the early days but we have been working very, very hard in software to alleviate those known weaknesses.”

In terms of slow compilation times, Rizzatti admits that placing and routing an FPGA is inherently slow, and EVE does not control the place and route; it uses Xilinx’s tools. However, he pointed out, Xilinx has made considerable improvements, especially in runtime. To improve debugging capabilities, he said EVE has added features for controlling the internals of the FPGA, in part through instrumentation built into the FPGA to provide visibility, plus additional software outside the FPGA to support it.

Mentor’s Veloce emulation product contains a full-custom IC with an FPGA fabric, as well as some additional capabilities to give the user full visibility. Specifically, on every clock, at every node in the design, Veloce records the results so that when the emulation completes, users have a simulation-like waveform display with all the information they need to debug.
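The full-visibility idea can be sketched in a few lines: on every clock, record the value of every node so a complete, simulation-style waveform is available for post-run debug. This is a toy illustration of the concept only, with hypothetical names, not Mentor’s implementation:

```python
# Toy sketch of full-visibility capture: each clock, record every
# node's value so a complete waveform exists after the run.
# All names here (capture_run, toy_design) are illustrative.

def capture_run(step_fn, nodes, cycles):
    """step_fn(cycle) returns a dict mapping node name -> value."""
    waveform = {n: [] for n in nodes}
    for cycle in range(cycles):
        values = step_fn(cycle)
        for n in nodes:
            waveform[n].append(values[n])  # record all nodes, every clock
    return waveform

# A trivial stand-in "design": a 2-bit counter and its carry-out bit.
def toy_design(cycle):
    count = cycle % 4
    return {"count": count, "carry": 1 if count == 3 else 0}

wave = capture_run(toy_design, ["count", "carry"], cycles=8)
print(wave["carry"])  # [0, 0, 0, 1, 0, 0, 0, 1]
```

With every node captured on every cycle, debugging needs no re-run: the waveform can be inspected after the fact, just like a simulator trace.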

“Cadence and Mentor invest lots of money in these full custom chips and we do it for good reason. One is that we can compile significantly faster and more predictably, more successfully than an FPGA-based approach and two, we have full unrestricted debug capability at speed,” said Jim Kenney, director of marketing for Mentor’s Emulation Division.

Moving forward, EVE believes that, given the size of the emulation market, developing custom silicon for an emulator is too expensive even if one vendor controlled the entire market. That path was the right one in the past, but not for the future, Rizzatti said.

Kenney, meanwhile, said the solution is to help grow the size of the emulation market. That includes hardware-assisted emulation for transaction-based test benches.

“The biggest change that has happened recently in emulation is the swing toward transaction-based,” Kenney said. “Previously, design teams did a lot of in-circuit emulation that involved external hardware being physically connected to the emulator in order to provide interfaces and stimulus for their design. People are still doing that, of course, but lately we’ve seen customers put a lot more focus on transaction-based, especially with the advent of OVM and VMM, so they are doing transaction-based test benches for their simulators now.”

Ran Avinun, marketing group director for the system design and verification segment at Cadence Design Systems, agrees. “There is a new opportunity to start to address the traditional RTL advanced verification market that was not addressed by most of the EDA vendors. Vendors of advanced verification—meaning HVL-based languages such as e, SystemVerilog, OVM—scratched the surface in the past but didn’t really address it. We are getting to the point that we can take off, and this is a $400 million or $500 million market that is mostly being addressed by simulation.”

Cadence defines the market as metric-driven verification. “We are starting to address it using OVM acceleration—actually taking a lot of the expertise and methodology experience we have with OVM, both through SystemVerilog and e, and expanding this into acceleration,” Avinun explained.

This push to transaction-based is something of a swing back to the late 1990s, when there were simulation accelerators from IKOS and Zycad. Those ran out of steam because they couldn’t maintain an advantage over rapidly improving workstation performance, while vendors like Cadence, Mentor and Synopsys tuned up their software simulators, Kenney recalled. But now, with the advent of transaction-based test benches, simulation acceleration is coming back.

Mentor never actually left the space, he pointed out. IKOS (which Mentor acquired in 2002) had been developing a transaction capability because it was seen as the only way to get decent acceleration on an emulator. Fast forward to the present, and Mentor has invested about 150 staff-years in a product that grew out of that IKOS technology: TestBench XPress, a transaction-based environment for developing and running test programs, with the test bench on the workstation and the DUT running in the emulator.

“The key comes in helping to accelerate the testbenches, and this is where transaction-based comes in. It allows you to accelerate a significant amount of the testbench, primarily the transactors for the verification IP (such as a transactor for AXI, AMBA, PCI Express or Ethernet). If we can move some of that into the box and accelerate it, we start seeing some really good speedups,” Kenney said.
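The transactor concept Kenney describes can be sketched abstractly: the test bench deals only in whole transactions, and a transactor (the piece that would run inside the emulator) expands each one into cycle-by-cycle bus activity, so only compact transactions cross the workstation/emulator boundary. This is a hypothetical toy protocol for illustration, not a real AXI or AMBA transactor, and all names are invented:

```python
from dataclasses import dataclass

# Hypothetical sketch of transaction-based acceleration: the test
# bench issues whole transactions; the transactor expands each one
# into per-cycle bus signal activity. Not any vendor's API.

@dataclass
class WriteTxn:
    addr: int
    data: int

class BusTransactor:
    """Expands a transaction into cycle-level signal values."""
    def expand(self, txn):
        # A toy two-cycle protocol: address phase, then data phase.
        yield {"valid": 1, "phase": "addr", "bus": txn.addr}
        yield {"valid": 1, "phase": "data", "bus": txn.data}

def run_testbench(txns):
    transactor = BusTransactor()
    trace = []  # cycle-accurate activity, as the emulator side sees it
    for txn in txns:
        trace.extend(transactor.expand(txn))
    return trace

trace = run_testbench([WriteTxn(0x1000, 0xAB), WriteTxn(0x1004, 0xCD)])
print(len(trace))  # 2 transactions expand to 4 bus cycles
```

The payoff is in the ratio: one compact transaction replaces many cycles of signal-level traffic, so moving the expansion "into the box" cuts the workstation-to-emulator communication that otherwise dominates runtime.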

At the same time, customers also want to leverage their pricey emulation purchases by allowing the system to be used as a general-purpose resource. That means putting emulators in their data centers as opposed to their engineering labs, which would allow jobs to be queued up rather than dedicating the box to a single project, as happens when external in-circuit emulation connections are required. This is contributing to the push toward transaction-based verification, he noted.

Very closely related to emulation are FPGA-based prototyping and virtual prototyping. In fact, many semiconductor companies utilize two of the three in their verification efforts, Mentor and Cadence confirmed. While there are tradeoffs among the platforms, all three are expected to co-exist in the market given the specific advantages and disadvantages of each approach.

Interestingly, running these systems in conjunction with each other, in what could be called a “hybrid” mode, is garnering increasing interest among customers, with usage models including emulation with FPGA-based prototyping; acceleration with virtual prototyping (SystemC, C/C++); and acceleration with emulation.

With more and more customers moving into simulation acceleration/transaction-based acceleration, the market for emulation in these areas could propel the revenue expansion vendors have been waiting for.

Editor’s Note: It has been suggested that Synopsys may have intentions to become an emulation provider as well, but the company disagrees. Synopsys does play in the FPGA prototyping space with the HARDI Electronics technology it gained when it purchased Synplicity in 2008 (Synplicity bought HARDI in 2007) and the CHIPit business unit of ProDesign, which it also acquired in 2008. And in terms of virtual prototyping, Synopsys pretty much cornered the market with its acquisition of both CoWare and VaST last month. The company stressed that it has no public intentions at this time to get into general-purpose emulation; right now the focus is on HAPS and virtual prototyping.