
by Peggy Aycinena, Contributing Editor

Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances, and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!

More profoundly, however, this past week was the week that Cadence pulled out of DAC (see excerpt from the press release below). Though the news was not completely unanticipated, it still came as a shock – even after 4 weeks of shock and awe. Others may not agree, but I believe this decision will have far-reaching ramifications for the company, the conference, and the industry.

Meanwhile, and not coincidentally, this past week was also the week I finally learned the truth about design automation. The truth is that the big IDMs would (probably) get along just fine without the EDA industry, and that’s because they do so much friggin’ tool development in-house.

The truth is that the chip designers in the big IDMs not only have easy access to their own internal CAD-specific IT support, they also have easy access to the internal CAD tool developers themselves. Hence, the internal tool developers at the big IDMs learn from their in-house chip designers, the chip designers learn from their in-house tool developers, and the living is easy. The truth is that the big IDMs are often (always?) way ahead of the third-party CAD tool providers in technology, methodology, and implementation, because it behooves the IDMs to have it that way. Their existence and survival depend on it. The truth is that the IDMs don’t even think to turn to third-party CAD tool providers until their own CAD tools have reached such a level of sophistication that the IDMs can be comfortable letting third-party vendors provide the “commodity” subset of their total requirements. That the third-party tool vendors then undercut each other (to death?) on increasingly anorexic margins for those commoditized tools is of little concern to the IDMs. In fact, it’s to their benefit that the third-party vendors tear each other apart in attempts to get the attention and business of the IDMs. The truth is, if the price and service and accommodations aren’t just right, the IDMs can just go internal again and continue on with their own in-house tool development and support. They’ve got the money, the knowledge, and the manpower to do it.

And that’s the truth about design automation and the context for everything that follows here.


**************************

First there was Shock & Awe

*Four weeks ago: Cadence released sharply disappointing Q4’07 revenue numbers and predicted Q1’08 revenues would be closer to $280 million than the hoped-for $390 million. Cadence stock promptly plummeted 33%, the company’s market cap fell to its lowest level in 10 years, $2.9 billion (in contrast to its $6.4 billion market valuation at the height of last June’s unconfirmed Blackstone/KKR buyout rumors), and the financial press went gaga with headlines like, “Cadence: What Just Happened in the EDA Business?” The entire industry was implicated; SNPS, MENT, and LAVA tanked, as well.

* Three weeks ago: The silicon headlines included the world’s first 2-billion transistor chip from Intel, a 65-nanometer 16-core 32-thread processor from Sun, novel techniques for on-chip gain and filtering functions from Sony, power-down architectures for increased on-chip power efficiencies from IMEC, an implantable amplifier for extraction of micro-Volt “brain biomarkers” from Medtronic, an ultra-low power 45-nanometer multimedia processor from TI, a 56-nanometer 2-bit-per-cell NAND Flash device with 34Mbits/s program throughput from SanDisk & Toshiba, a 45-nanometer 4Gbit 3D double-stacked multi-level NAND flash device from Samsung, a 1-Volt wearable SOC for a wireless body-area-network monitor from Toumaz Technology, and a full GSM SIP complete with RF, baseband, and power management from Infineon.

* Three weeks ago, as well: EDAC hosted its annual CEO Forecast Panel in Silicon Valley. Despite plucky, niche-specific, bravado-laden claims from some – Jasper’s Kathryn Kranen: “Design-based verification will take off!” Denali’s Sanjay Srivastava: “2007 was the Year of the SoC!” MIPS’ John Bourgoin: “Long-term trends play into the IP market!” Synopsys’ Aart de Geus: “No doubt, we’ll do very well!” – only Mentor Graphics’ Wally Rhines had the courage (and regression charts) to posit an actual growth number for the industry for 2008, an astonishingly anemic 2%.

Of course, Cadence’s long-suffering Mike Fister, beaten into submission by The Street the previous week, had to admit: “Our customers are in recession and we’re worried.” But, he also noted that many continue to undervalue the worth of the tools his industry provides: “We’ve created value, [but some customers see] an opportunity to destroy that value, which is criminal.”

Sitting next to Fister, Kranen sharply retorted: The all-you-can-eat deals that characterize the business practices of the large EDA vendors exacerbate pricing pressures for the smaller vendors. Fister responded, “It’s a cop-out to say all-you-can-eat deals are prevalent.” And later, in apparent defiance of The Street as well, he added, “It doesn’t matter what the market says!”

Lest the report of the EDAC event be incomplete, the evening’s speakers also offered concrete suggestions. From Kathryn Kranen: Customers are seeking the little ROI from less expensive tools, but they should be looking for the Big ROI from the massive improvements that come with tools that are worth breaking their flows for. From John Bourgoin: The high growth rate [ahead] is in analog. From Sanjay Srivastava (after running over prophecies of ESL with a freight train): We need to be bolder with the adjacencies. From Aart de Geus (just returned from the annual pilgrimage to the World Economic Forum at Davos): We must learn to sell better, [because] we are an industry that adds value. From Mike Fister: Automation is the key, [as well as] educating the customer. And from Wally Rhines: Throughout the history of EDA, we’ve grown by solving new problems. There will always be profits from advancing the technology!

*Three weeks ago, also: IEC’s DesignCon took over the Santa Clara Convention Center. With more than 130 exhibitors on the floor, the exhibition hall was large, lively, and well attended. There was definitely energy there. [But, don’t take my word for it. Enjoy Graham Bell’s video interviews with many of the vendors on the show floor at DesignCon.] Eight companies at the conference were honored with IEC DesignVision Awards, including Cadence (design tools), Mentor & Cadence together (for OVM), Amphenol TCS (interconnect), Altera (ICs), Rambus (IP), Lattice (FPGAs), Future Plus Systems (system-level design tools), and Agilent (test and measurement). Various panels at the conference were rich in content and candor, including a discussion of the current ecosystem for semiconductor startups which was shockingly blunt in its evaluation of the shortage of both venture capital and profits in the industry today.

But the flip side of the emotional schizophrenia within the semiconductor industry was also on display at DesignCon, where a panel moderated by Forbes’ Elizabeth Corcoran discussed (read “showcased!”) exuberant employment and market opportunities in India.

Freescale’s Ganesh Guruswamy said they’ve got thousands of employees working at geometries ranging from 250-to-45 nanometers in the company’s centers of excellence for SOC and IP design in India. Cadence’s Jaswinder Ahuja said Cadence has been operating in India for 20 years, and an operation that Joe Costello once termed a “hidden jewel” is now a huge contributor to the company’s R&D. Wipro’s Vasudevan Aghoramoorthy cited current statistics for his company: “We’ve grown significantly [over the last several years] and now have 18,000 engineers working for us in India.”

And he reiterated the power of the consumer in India: “India as a market has exploded since the late 90’s. Ten years ago, no one would have believed that there could be 10 million new cell phone subscribers per month in India, but that’s today’s reality. And that doesn’t include replacement handsets!” Ahuja added: “It’s an acknowledged fact that the BRIC economies [Brazil Russia India China] are now driving the global markets!”

* Two weeks ago: News spread like wildfire that EDN had laid off distinguished Senior Editor Mike Santarini, as well as Editorial Director Maury Wright. Even those who thought they understood the depths of the financial despair into which EDA ad dollars, and journalism, had fallen were caught by surprise.

* One week ago: Mentor announced estimates of $860 million in revenue for 2008, “stepped up cost controls,” and 5%-to-10% growth for 2009. Mentor stock rose 21% on the news. Synopsys announced Q1’08 revenues of $315 million, higher than the $313 million anticipated and a 2x increase in profits compared to the same period last year. At first blush, Synopsys stock rose 6% on the news. Cadence announced a $500 million stock repurchase plan. Cadence stock went up 5% on the news.

* Late last week: Cadence announced it will no longer exhibit at the Design Automation Conference. Per the Press Release: “DAC provides the whole industry with an opportunity to come together to discuss technical challenges and solutions. We appreciate and will continue to fully participate in this aspect of the show. We do not, however, see similar value or ROI on the sales booth aspect. Our CDNLive! series of global events provides us with a stronger, sounder platform upon which to engage more deeply with customers and partners.”

And then there was DVCon

********************

DVCon – Verification, Low Power, and Parallelism in the Real World

DVCon is about real-world engineering. Period. It’s not about fancy, schmancy academics. It’s not about huge, over-produced marketing hype. It’s about designers learning what their tool vendors are doing, and vice versa. There’s just one real keynote, everybody gets lunch courtesy of the sponsors, and everything’s either on the first floor of the DoubleTree Hotel in San Jose, or just up the stairs on the second.

Nobody feels like they need to be in two places at once at DVCon. There are, at most, only 2 technical tracks underway at any one time, and presenters have a full 30 minutes to talk and then answer questions. The exhibit hall is only open from 4:30 PM to 7:30 PM, after the last sessions of the day wrap up, and is always served up with beer and wine. Better yet, the exhibit hall is on a human scale – more science fair than vendor fair – and full of people who either already know each other, or want to get to know each other and/or what the vendors’ tools can do.

The bottom line is, people actually learn things at DVCon – in particular, this year, about low-power design and formal verification – and then take that learning back to their place of work to share with others. It’s hard not to like DVCon.

* The Role of DVCon: As the sun rose over San Jose on the first day of DVCon, I shared breakfast with Conference Chair Steve Bailey from Mentor Graphics, Program Chair Tom Fitzpatrick, also from Mentor, Program Vice Chair Ambar Sarkar from Paradigm Works, Past DVCon Chair Gabe Moretti of EDA Design Line and DACeZine fame, and Publicity Co-Chair Wendy Truax from HighPointe Communications.

The committee expressed their gratitude to conference organizers MP Associates, and to all of the sponsors: Accellera, Cadence, Jasper, Mentor Graphics, Synopsys, nSys, Certess, OVM, and OSCI & Friends (including ARM, Cadence, CoWare, Doulos, ESLX, Forte, JEDA, Mentor, and Synopsys). They also spoke to the role DVCon plays in the annual cycle of technical conferences for the design automation industry.

Steve said DVCon’s known as a gathering place for working engineers, that over 700 people were expected at the conference this year, and the number of exhibitors was up (35 booths, by my count). Tom said the value of the conference is proven by word of mouth as much as by anything, because the quality of the papers is consistently good and engineers know DVCon is a place to learn and network. Ambar said the selection of those papers is taken very seriously, and always done with the tool users in mind. Steve reminded me that the authors retain the copyright for their papers, which is particularly appealing to presenters. Ambar, Tom, and Steve all commented on the success of the sponsored tutorials on opening day, and said the session on new research drivers was a welcome addition to the conference. (Speakers at that session came from DEIS Università di Bologna and the Technical University of Braunschweig, putting to rest the idea that academics weren’t welcome at DVCon.)

Before the breakfast group rushed off to the Opening Session, I asked for advice on what to look for at DVCon 2008. Tom said to anticipate many discussions on different aspects of verification, assertion-based verification in particular: “It’s no longer ‘Can you do assertion-based verification?’ but ‘What is constrained random, and why is it good?’” Ambar said there would be lots of talk about low-power design and verification, and Steve said it was interesting to note the number of new EDA startups in the area of verification. Hinting that standards and consortia are always a hot topic, Gabe Moretti said, “What I find remarkable is the acceptance by the industry of OVM!”
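For readers who don’t spend their days in testbenches, here’s the flavor of what Tom was talking about. The sketch below is a toy in Python – not any vendor’s SystemVerilog implementation, and every name in it is made up – but it shows the core idea of constrained random: stimulus is drawn at random, yet only from the subset of the input space that satisfies user-written constraints, so simulation cycles are spent on legal, interesting cases.

```python
import random

def constrained_random(constraints, fields, tries=1000, seed=None):
    """Draw a random transaction whose fields satisfy every constraint.

    fields: dict mapping field name -> iterable of legal raw values
    constraints: list of predicates over a candidate transaction
    """
    rng = random.Random(seed)
    for _ in range(tries):
        candidate = {name: rng.choice(list(values)) for name, values in fields.items()}
        if all(check(candidate) for check in constraints):
            return candidate
    raise RuntimeError("no legal transaction found")

# A hypothetical bus-write transaction: word-aligned address, short bursts only.
fields = {"addr": range(0, 256), "burst_len": range(1, 17)}
constraints = [
    lambda t: t["addr"] % 4 == 0,                      # word alignment
    lambda t: t["burst_len"] <= 8,                     # cap burst length
    lambda t: t["addr"] + 4 * t["burst_len"] <= 256,   # stay inside the address map
]

txn = constrained_random(constraints, fields, seed=42)
print(txn)
```

Every transaction that comes out is legal by construction; the randomness is where the coverage comes from.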

* OVM & VMM at DVCon: So, what is OVM? It had a clear presence at DVCon – sponsoring a half-day tutorial on Tuesday, a booth on the exhibit hall floor, and a lunchtime panel discussion moderated by Electronic Design’s Dave Maliniak on Thursday.

Well, per the organization’s own description, “OVM is the industry’s first, open, interoperable SystemVerilog verification methodology. Developed jointly by Mentor Graphics and Cadence [an effort that earned an IEC DesignVision award, as mentioned earlier], OVM provides a SystemVerilog class library, examples, development guidelines, and other collateral that incorporate many years of verification experience from both companies to assist users in developing modular, reusable, transaction-level, coverage-driven testbenches.”
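OVM itself is a SystemVerilog class library, but the shape of the idea – reusable driver, monitor, and scoreboard components exchanging transactions rather than pin wiggles – can be sketched in a few lines of Python. The toy below is mine, not OVM’s; the class names and the dictionary-as-bus are hypothetical simplifications.

```python
class Transaction:
    """An abstract unit of stimulus or observation, not a pin-level event."""
    def __init__(self, addr, data):
        self.addr, self.data = addr, data

class Driver:
    """Turns transactions into 'bus' activity; here the bus is just a dict."""
    def __init__(self, bus):
        self.bus = bus
    def drive(self, txn):
        self.bus[txn.addr] = txn.data

class Monitor:
    """Observes the bus and forwards what it sees to any subscribers."""
    def __init__(self, bus, subscribers):
        self.bus, self.subscribers = bus, subscribers
    def sample(self, addr):
        for s in self.subscribers:
            s.observe(Transaction(addr, self.bus.get(addr)))

class Scoreboard:
    """Compares observed transactions against expected values."""
    def __init__(self, expected):
        self.expected, self.mismatches = expected, []
    def observe(self, txn):
        if self.expected.get(txn.addr) != txn.data:
            self.mismatches.append(txn.addr)

# Wire the reusable pieces together for one hypothetical test.
bus = {}
sb = Scoreboard(expected={0x10: 0xAB})
Driver(bus).drive(Transaction(0x10, 0xAB))
Monitor(bus, [sb]).sample(0x10)
print("mismatches:", sb.mismatches)
```

The point of the methodology is that the same driver, monitor, and scoreboard classes get reused test after test, chip after chip; only the stimulus and expectations change.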

So, what’s not to like? It turns out that Synopsys donated a library of SystemVerilog assertion checkers to Accellera in 2006, checkers that come out of the Verification Methodology Manual (VMM) the company published jointly with ARM. Synopsys would like to see Accellera continue to pursue verification standards efforts along those lines, and so the beat goes on.

Given the recent dust-up in the low-power world – the Cadence-sponsored CPF (Common Power Format) versus the Mentor/Magma/Synopsys-sponsored UPF (Unified Power Format) – it’s not surprising that a collective groan has been rattling around the industry for some time over the specter of more Us-versus-Them, now in the verification world. Hence it was noteworthy that in the midst of DVCon, Synopsys Director of Quality and Interoperability Karen Bartleson, who has served for many years on the Accellera Board, issued this challenge in her blog:

More and more, users (customers) are demanding one standard verification library. As a result of this growing interest, Accellera will start investigating the feasibility of creating a single, SystemVerilog verification library. My company, Synopsys, is committed to support our customers’ interoperability requirements. We are ready to fully support an Accellera initiative, contributing our technology and expertise towards a single standard. Our initial technical analysis indicates that this can be accomplished in a reasonable amount of time, with a reasonable amount of effort.

As Accellera investigates the viability of a single standard, we welcome Cadence and Mentor to support our VMM. We will be happy to provide them access to our VMM base class library source code and resolve any licensing issues and objections they may have had in the past. I sincerely hope that Cadence and Mentor will cooperate in Accellera, preventing the infamous standards wars from recurring. This is a golden opportunity for EDA companies to work together, diminish the “war” that has been brewing, and serve our customers at large.

Despite this call for peace between factions, it’s worth noting that the DVCon conference bag came fully loaded with two recent publications: Synopsys’ Verification Avenue, which touts VMM, and Mentor’s Verification Horizons, which touts OVM. Other than keeping editors and printers in business, a merger of efforts across the industry doesn’t look like it will come anytime soon if the promises detailed in each pub are both pursued.

* Accellera at DVCon: Meanwhile, Accellera took advantage of DVCon (its spiritual home) to announce its Board of Directors has approved the VHDL 4.0 standard specification, which will be released to IEEE for balloting this year: “VHDL 4.0 addresses over 90 issues that were discovered during the trial implementation period for the VHDL 3.0 version. These encompass enhancements to major new areas introduced by VHDL 3.0 including generic types, IP protection, PSL integration, VHDL API integration, and the introduction of fixed and floating point types.”

Does this announcement answer John Cooley’s challenge to his DVCon panelists (see below) to prove that VHDL isn’t dead? I don’t know, but somebody’s using VHDL or I don’t think Accellera would be going to all of this effort for nothing.

* OSCI & NASCUG at DVCon: Co-located with DVCon this year, the North American SystemC Users Group (NASCUG) meeting on Tuesday had 70+ people in attendance. Also on Tuesday, the Open SystemC Initiative (OSCI) hosted a tutorial detailing the OSCI TLM-2 draft standard, released in November 2007, which “addresses the interoperability of memory-mapped bus models at the transaction level, as well as providing a foundation and framework for the transaction level modeling of other protocols.”
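TLM-2 is defined in C++, and the quote above is the official goal; what follows is only my toy Python paraphrase of the central idea – a memory-mapped target that services one transport call per transaction, with no pin-level or cycle-level detail. The field and method names here are loose approximations, not the real TLM-2 API.

```python
class Payload:
    """Loosely modeled on a TLM-2-style generic payload (hypothetical fields)."""
    def __init__(self, command, address, data=None):
        self.command, self.address, self.data = command, address, data
        self.response = None

class MemoryTarget:
    """A memory-mapped target: one transport call per transaction."""
    def __init__(self, size):
        self.mem = [0] * size
    def transport(self, p):
        if not 0 <= p.address < len(self.mem):
            p.response = "ADDRESS_ERROR"
        elif p.command == "WRITE":
            self.mem[p.address] = p.data
            p.response = "OK"
        else:  # READ
            p.data = self.mem[p.address]
            p.response = "OK"
        return p

target = MemoryTarget(size=64)
target.transport(Payload("WRITE", 5, 0x2A))
rd = target.transport(Payload("READ", 5))
print(hex(rd.data), rd.response)
```

Because the interface is a single call carrying a whole transaction, an initiator model from one vendor can, in principle, talk to a target model from another – which is exactly the interoperability the draft standard is chasing.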

Over coffee on Wednesday with Jack Donovan, ESLX co-founder and NASCUG president, and Mike Meredith, Forte VP of technical marketing and OSCI president, I learned that OSCI is looking for feedback from any interested parties with respect to the TLM-2 standard.

Mike said, “Developing the standard has been challenging, in my view, because it really is multiple standards being built at the same time. It’s meeting the requirements of people who want to do detailed architectural and performance analysis, plus also those who want to do virtual platform development. Yes, we’re targeting more than just a single goal, but we think this will prove to be the strength of the standard [in the long run]. There will be interoperable models that can be exchanged across the industry, and you’ll be able to mix and match depending on what you’re trying to accomplish.” Mike also noted that various OSCI events would take place at both DATE and DAC, where additional opportunities to give feedback on TLM-2 will be available.

Meanwhile, Jack said, “There is a whole class of engineers out there who are looking at working at the ESL level, and a whole lot of people using the OSCI simulator and ModelSim. But [in general], those users are still under the radar of the EDA companies. The question [for many users interested in ESL] is how do you grow adoption of SystemC and ESL within the company without shutting down completely for a number of months.”

Jack added that the business models for companies in Europe and Asia provide a better chance to see the opportunities associated with SystemC, versus the fabless business models more common in the U.S., where it’s only about getting the chip out as fast as possible. Companies outside of North America will often give an employee several years to come up to speed on system-level languages and technologies, to essentially become an internal evangelist, and then give that same employee additional time to educate their co-workers in the technology. Jack and Mike said that’s why we’re seeing a different pattern of adoption of ESL and SystemC in North America versus elsewhere in the world. Nonetheless, they remain extremely optimistic that the move to higher levels of abstraction worldwide is inevitable.

Denali’s freight train (see above) and Cooley’s doubts (see below) notwithstanding, I’d have to agree with Jack Donovan and Mike Meredith. It’s really not over til it’s truly over – and the fight for SystemC and ESL ain’t anywhere near being over yet.

* Formal Verification at DVCon: In a complex hour of conversation positioned at the heart of DVCon’s topic material, a panel that included Intel’s Limor Fix, IBM’s Avi Ziv, Jasper Design’s Rajeev Ranjan, Mentor Graphics’s Harry Foster, and Cadence’s Axel Scherer attempted to create some order out of one of the thorniest questions in verification. Is formal verification a reality or is it not?

Although there appeared to be agreement among the speakers with regard to the need for standards to establish structure among the different verification methods, there was a fair amount of disagreement in other areas of the discussion. In the end, after what seemed to me a confusing array of positions and counter-positions from the various speakers, the panel ended with one clarifying question from discussion moderator Richard Ho: “Has formal verification come of age?”

The answers from the three EDA vendors were inevitable. Harry Foster said, “Yes.” Axel Scherer said, “Yes.” And Rajeev Ranjan said, “Absolutely!” The answers from the EDA customers were not so predictable. IBM’s Avi Ziv said, “I wouldn’t go so far as to say that formal verification’s come of age.” Intel’s Limor Fix got a round of applause: “Formal verification has finished high school, but not yet started university!”
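What separates formal verification from the simulation everyone already does is exhaustiveness: instead of sampling behaviors, a formal tool explores every reachable state of a model and either proves a property or hands back a counterexample. Here’s a toy illustration in Python – a naive explicit-state checker, nothing like a production formal engine, with all names my own:

```python
def check_property(initial, step, prop, max_states=10_000):
    """Exhaustively explore reachable states; return a counterexample state or None."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        if len(seen) > max_states:
            raise RuntimeError("state space too large for this toy checker")
        if not prop(state):
            return state          # property violated: counterexample found
        frontier.extend(step(state))
    return None                   # property holds on every reachable state

# A mod-10 counter that wraps; safety property: the count never reaches 10.
violation = check_property(0, lambda s: [(s + 1) % 10], prop=lambda s: s < 10)
print("counterexample:", violation)
```

The catch, of course, is that real designs have state spaces no explicit enumeration can touch, which is why the “has it come of age?” question got such split answers from vendors and users.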

* Low-Power Design & Verification at DVCon: There were essentially three sessions on low power at DVCon: “Verification of Low-Power Designs,” featuring speakers from STMicro, Mentor, and Cadence; “Assertion-Based Verification of Low-Power Design,” featuring speakers from Mentor and Cadence; and “Trends in Low-Power Verification,” featuring Synopsys Fellow Tom Williams. If you conclude from this list that Mentor, Cadence, and Synopsys are concerned about the verification of low-power designs, I think you’d be right. I attended all or part of all 3 sessions and came away with the impression that each and every vendor laid claim to far more progress in the technology than the engineers in the audience were willing to acknowledge. The idea of verifying designs that can have more than 25 power islands on-chip is so daunting, it’s no surprise that the technologies and tools suggested by the vendors are being greeted with skepticism by the users.
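To see why 25-plus power islands are so daunting, consider even a toy version of the checks involved. The Python sketch below is my own invention – the state names, the legal-transition table, and the isolation rule are all hypothetical – but it flags two classic low-power bugs: an unlisted power-state transition, and powering off an island before isolating its outputs.

```python
# Whitelist of legal power-state transitions for a hypothetical island.
LEGAL = {("ON", "RETENTION"), ("RETENTION", "ON"), ("ON", "OFF"), ("OFF", "ON")}

class Island:
    """One power island: tracks state, isolation, and any rule violations."""
    def __init__(self, name):
        self.name, self.state, self.isolated = name, "ON", False
        self.violations = []
    def set_state(self, new):
        if (self.state, new) not in LEGAL:
            self.violations.append(f"illegal transition {self.state}->{new}")
        if new == "OFF" and not self.isolated:
            self.violations.append("powered off without isolation")
        self.state = new

dsp = Island("dsp")
dsp.set_state("RETENTION")
dsp.set_state("OFF")   # RETENTION->OFF is not whitelisted, and outputs aren't isolated
print(dsp.violations)
```

Now multiply that by 25 islands, each with its own transition table, isolation cells, level shifters, and retention registers, all interacting, and the audience’s skepticism starts to look reasonable.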

After speaking about static and formal verification of power-aware design using UPF, Mentor’s Amit Srivastava was stymied by a question from the audience: “So, this tool will generate assertions? Does it actually exist yet?” Harry Foster, session chair, answered: “This is a proof-of-concept talk!”

After speaking about power assertions and coverage for low-power verification in that same session, Cadence’s Bill Winkeler was equally stymied by a question: “You’re turning power on and off on a bus as specified, but how can we be sure it’s all covered?” Winkeler’s response: “We measure whether or not it’s a domain, a mode, or a transition. But other than that, there’s no automatic way to do what you want.”

Synopsys’ Tom Williams gave a dynamic early morning keynote on Thursday on trends in low-power verification, one in which he dramatized on stage the difficulties electrons are having these days making their way efficiently through narrow Cu interconnects (average width 600 Å) versus the much roomier Al interconnects of yore (average width 1000 Å). Although Williams made a terrific electron, he too was hit up with questions from skeptics of Synopsys’ strategy of including dynamic analysis in low-power design verification.

Question: “Even if you’re working on a mix of voltage domains, aren’t there clearly defined boundaries between voltage domains like there are with clock domains [making dynamic analysis unnecessary]?” Williams replied, “There should be, but there can be errors. And yes, one would hope for a global solution [that might arise] if you could shove everything into the static portion of the design reliably, but that’s just not possible.”

* Ending Endless Verification: Wally Rhines’ Wednesday afternoon keynote was addressed directly to a packed ballroom full of real engineers. He talked. They listened. He promised to talk about verification, but begged permission to start with DFT. He said on-chip complexity forced folks to search for better ways to test over the years. Bigger and faster computers had helped, as had testing for stuck-at faults, but the number of transition faults still got bigger. Test engineers beat that stuff back, Rhines said, by shifting to scan-based test, by introducing ATPG, BIST, and ordered-test patterns, and had increased test efficiencies by up to 10x. But it wasn’t enough, because even though the cost of components came down, the cost of test did not.

Then in 2001, Rhines said, Mentor’s DFT guru Janusz Rajski came up with a new approach based on the theory that folks should stop testing what they’ve already tested. Rhines said implementation of the algorithms behind Rajski’s theory, in combination with test-data compression, has increased test capability 100x and will increase that capability 1000x in the next 5 years. He concluded, “We moved from a mode where we added more cycles of test, to a mode where we added more test per cycle.”
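The “more test per cycle” idea can be made concrete with a toy. In scan patterns, most bits are don’t-cares (shown as X below), and patterns that never disagree on a specified bit can be merged so that one test cycle exercises the faults of several. The greedy compaction below is my illustration of the principle, not Mentor’s actual algorithm:

```python
def compatible(a, b):
    """Two scan patterns can share a cycle if they never disagree on a specified bit."""
    return all(x == y or "X" in (x, y) for x, y in zip(a, b))

def merge(a, b):
    """Combine two compatible patterns, keeping every specified bit."""
    return "".join(y if x == "X" else x for x, y in zip(a, b))

def compact(patterns):
    """Greedily merge don't-care-heavy patterns so each cycle tests more faults."""
    out = []
    for p in patterns:
        for i, q in enumerate(out):
            if compatible(p, q):
                out[i] = merge(p, q)
                break
        else:
            out.append(p)
    return out

# Four hypothetical 4-bit scan patterns, mostly don't-cares.
patterns = ["1XX0", "X1X0", "0XXX", "XX11"]
print(compact(patterns))
```

Four patterns collapse into two, and the don’t-care bits are exactly the redundancy that test-data compression exploits on top of this.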

Rhines moved on to verification and reiterated what everyone knows – except those who’ve been on a different planet for the last 10 years – verification costs more than design, and that’s way, way too much. And it’s getting worse. Now there are added complexities due to low-power design, clock-domain crossing, mixed-signal and RF on-chip, and other such sorrows. Chips are missing schedules, going out late – no matter how ‘late’ is defined – and server farms for simulation have exploded to include tens of thousands of CPUs: “Ultimately, you’ll need a nuclear power plant to keep up with the trend of expanding server farms.”

However, Rhines said, verification engineers are just as smart as test engineers and have developed various strategies designed to end “endless verification” with things like FPGA-based prototyping (although there can be partitioning problems), full-scale emulation (even if it takes lots of verification engineers), getting more clever with the software (problems there speak for themselves), using assertions to speed up debug (but how to write them?), constrained random testing, and coverage-based verification. These strategies, Rhines said, have increased verification efficiencies from 2x to 10x, but it’s not enough. Like the old conundrum in test, Rhines insisted that verification efficiencies must increase by at least 100x if the world (of semiconductors) is to believe there is a tomorrow.

“I believe that there’s something out there that will do for verification what test-data compression did for test,” Rhines said. “Something that will allow us to stop the pain of endless verification. We believe there are three possible candidates: intelligent testbench automation, formal methods, and transaction-level modeling. I’m not sure if we will take any one of them all the way, but we do have three strong candidates that will take us to more verification per cycle, rather than more cycles per verification.”

Rhines spent the remainder of his allotted time detailing the progress, as he sees it, on each of the three verification fronts. He noted that if intelligent testbenches become a portable part of a design, it would prompt the move into dynamic simulation. He gave a nod to Mentor’s formal verification guru Harry Foster and said, “If there’s anything you can verify statically, rather than dynamically, you should do so, because eliminating the testbench [will yield] near-infinite benefits,” and added that, more realistically, a combination of simulation and formal verification may provide a great deal of success. Finally, with a nod to his long-time peer and/or mentor Gary Smith, Rhines talked about the campaign to move design up to higher levels of abstraction: “Mixed abstraction level models are definitely needed, models that can be portable across environments and languages.” He said ultimately such models will become available and will go a long way to alleviate the current bottleneck in verification.

Rhines ended with a word of advice to the several hundred working engineers in his audience: “You should allow for 4 levels of change in abstraction over the course of your career.” Then he laughed and noted that even though he’s been in the industry for some decades, if one does the counting correctly, he’s got at least one more change in abstraction level owed to him before his career ends. Nobody in his audience disagreed.

Following the applause, Rhines fielded two crucial questions: “What do you mean by intelligent testbench?” and, “So we’re still going to have to buy licenses for EDA tools even after all of this is implemented?” He answered, “I’m talking about doing things deterministically using techniques that have evolved in the compiler world, and generating test sequences automatically.” And he got a big laugh with: “Are you guys still worried about the measly cost of software?”

* Why People Actually Come to DVCon: Yes, I’m the one who John Cooley branded “The Prudish Church Lady” last year, the one who Cooley said wanted to shut down free speech because I roundly criticized his 2007 panel for leveraging racism, xenophobia, and homophobia for the sake of a few cheap chuckles. Yes, I’m the one who said the Powers that Be at DVCon would clamp a Cease & Desist order on Cooley, that his 2007 panel would be his last. And yes, I’m the one who was proven wrong.

Playing to the same packed house that Wally Rhines’ keynote had played to the previous hour, Cooley’s 2008 panel was surprisingly, even remarkably, different from his 2007 panel. There was actually enough substantive conversation at Cooley’s panel this year to make it useful, and everybody remarked on it.

But was it funny? Not really. Was it riotous and hilarious? Not really. Was it a laugh a minute? Absolutely not. And everybody remarked on it. But, had Santarini just been laid off a few days before? Yep. Was Cadence on the verge of announcing their pullout from DAC? Apparently so. Did Cadence even show up for the panel? Nope. Is the population of designers in North America aging, per Cooley’s own words (“Fewer young faces doing chip design now”)? Yep. Is the industry in a funk? Yep. Am I going to take the time to type up the entire script of the 2008 panel? Not on your life.

So what did Cooley’s Band of Brothers talk about last week at DVCon? Here’s the list. Let me know if any of the following is news, or even accurate for that matter:

- Nobody knows why Cadence didn’t show up for the panel.

- Nobody knows if Cadence ever agreed to show up for the panel in the first place.

- Nobody knows why Cadence had to have an empty chair on stage, nonetheless.

- Editors are unbiased by ad dollars, but paid by ad dollars; it’s a conundrum.

- If EDAC ever extends an invite to Magma’s CEO to appear on the CEO panel, he will gladly accept.

- The Cadence acquisitions of Clear Shape and Invarium were brilliant.

- IC Manage provides data management software.

- NuSym has great verification technology; it just hasn’t launched product yet.

- NVIDIA likes Magma best.

- Broadcom is using Mentor’s verification technology.

- Synopsys’ IC Compiler had 100 tapeouts last year.

- Forte is seeing continued growth.

- ESL has already emerged.

- ESL is emerging.

- ESL will emerge.

- The cost of doing design is going up; it’s in the range of $100 million.

- Only big companies can afford to do designs these days.

- ESL is the thing that will bring productivity to 100 million gates without requiring $100 million.

- SystemC is the language of ESL hardware.

- C is the language of ESL software.

- ANSI-C is still hanging in there.

- SystemVerilog is catching up with SystemC.

- SystemC is better than SystemVerilog.

- SystemVerilog is better than SystemC.

- Everything’s better than ANSI-C++.

- VHDL is not dead.

- The Synopsys/Avanti merger was not painful; it was successful.

- Cadence’s Days of Dominance in the analog space are numbered.

- Synopsys will be releasing a tool in the custom-design analog implementation space.

- Magma thinks analog is important, but the ability to migrate analog to lower nodes is key.

- Gary Smith also thinks analog translators are important and they’ll be here by DAC.

- Companies lose productivity when they don’t use best-in-class tools.

- Companies cannot figure this out for themselves and blanch when others tell them.

- Nobody’s talking about DFM because VCs have stopped funding it.

- VCs have stopped funding DFM because it’s in the consolidation phase.

- Nobody is selling EDA software at a discount in India.

- It’s one universal price for all, no matter where you are on the globe.

- Huge numbers of engineers are coming out of colleges in India.

- Shrinking numbers of engineers are coming out of colleges in the U.S.

- EDA companies don’t hire U.S. grads because they can’t find them.

- It’s not a good idea to become a realtor in California.

- It is a good idea to become an engineer.

- Pick a career when you’re young that you’ll still like when you’re old.

These were the highlights of the 2008 DVCon Troublemaker’s Panel. There was only one lowlight. That was when Brett Cline whipped out a life-size cardboard head of the still long-suffering Mike Fister and waved it around, proving two things simultaneously: 1) Cline is an abysmal ventriloquist and should not give up his day job, and 2) tacky, sophomoric humor will still get you a seat on the Troublemaker’s Panel.

* The Final Word at DVCon goes to Gary Smith: “Cadence crashed and burned for two reasons. RTL is a commodity item and going to stay flat. It’s over, guys! The other issue is a Cadence issue. If you’re doing business with companies that did reverse buyouts, you’re in trouble because those people are getting cut to pieces. NXP went private, and so did Freescale. I don’t even balance my own checkbook, but I know the financial guys created a bubble and Cadence is paying the price.”

DVCon Troublemaker's Panel
February 26, 2008 — Reviewed by 'Cedric Iwashina'

Hi Peggy,

At the DVCon Troublemaker's Panel, I thought that some of Gary Smith's observations/predictions were the most interesting.

A few weeks ago at the EDAC CEO panel, you asked Mike Fister about the rumor last summer that Kohlberg Kravis Roberts and the Blackstone Group might possibly take Cadence private.

On this year's Troublemaker's panel, Gary said that a large part of Cadence's recent financial shortfall was due to private equity (PE). Specifically, some of Cadence's largest customers were taken private over the last two years, e.g., NXP and Freescale. Now, the PE firms realize that they made a mistake and are losing money hand over fist. So, they've cut costs to the bone. As a result, they're not spending as much on EDA software as expected. Gary also said that, due to these financial debacles, PE firms are no longer interested in anything even remotely related to semiconductors, including EDA.

Gary also said that in physical implementation, Cadence needed to buy Magma, but probably couldn't afford it any longer after their precipitous drop in market cap. He said that Mentor/Sierra would be able to sustain a viable physical implementation business, and that AtopTech would most likely be acquired, probably by Synopsys.

One last thing he said, which I found a bit strange, is that ESL continues to grow but hasn't yet reached the "knee of the curve" in terms of market acceptance. I think he said it would reach that "knee" around 2012. That can't be good news for ESL companies.


What the *&!^?
February 25, 2008 — Reviewed by 'Bob Smith'

Peggy -

This is a great article and dissection of the state of the EDA industry. No doubt the industry is going through very tough times, and your article provides several different perspectives on why this is happening. Your observation about the role the IDMs play in the food chain is very good, for example. While EDA may be on the "outs" now, it is clear that if semiconductor technology is going to continue to evolve, it will require new technical innovations to solve the tough problems in design, verification, and implementation. Some of this will come from within the large IDMs / semicos, but based on history we also know that a good part of the fundamental innovation will come from startups and academia. The question is ... can the fundamental business models that drive the industry change and evolve as well?