To finish off our series of predictions, I would like to point you to another series of interesting and informative prophecies. Click on the following topics to see these predictions collected by Brian Bailey, Editor of EDA DesignLine.

In 2012, we’ll see tablets and smartphones changing the world. That’s another way of saying Apple’s moves will have huge implications in semiconductors, foundries and EDA.

Apple’s use of the Samsung foundry has started an arms race between Samsung, TSMC and Global Foundries. Samsung is ramping up to meet the capabilities and capacity of TSMC. Intel is being pushed to stay ahead technologically and to consider new business models. Global Foundries continues to work to ramp its yields.

This situation will be good for semiconductor equipment and EDA vendors as well. Their tools will facilitate the new processes and the link between design and manufacturing.

Another element: in 2012, we’ll see the supply chain continue to consolidate. Why? The cost to design a complex SoC requires a big budget and a big market opportunity. Only the largest of semiconductor companies can tackle these designs. This increasing cost helps the FPGA vendors.

The foundries face increasing technology and capital requirements to move to new process nodes. Only a few will make it.

The public markets have been closed to EDA companies for a number of years making acquisition the most likely exit for EDA startups. Apache chose to be acquired by Ansys in 2011. It has been difficult for a new, large EDA competitor to emerge. This bodes well for Big EDA in its negotiations with Big Foundry and Big Semiconductor. In 2012 I believe there are several EDA companies poised to go public.

Who will be the beneficiary of these changes in 2012? Apple. Consumers should also benefit as new, leading edge fab capacity will be used to make exciting new devices.

The main technical breakthroughs we can expect this year will probably revolve around double patterning in lithography, as EDA companies try to optimize the technique for density and performance. It will probably have knock-on effects far up the design flow, forcing designers to adopt much more regular designs. But unless EUV sees a major breakthrough, double and higher levels of multiple patterning are something people will need to get used to.

Regularity is likely to become a feature of low-power design as well. Although it hurts effective density, the drive to cut power consumption will see much more use made of on-chip redundancy; we’ve already seen some of that in the Nvidia Tegra 3 and the ARM big.LITTLE initiative. We could see those techniques begin to extend into ultra-low-power circuits using near- or subthreshold devices as engineers discover how to model circuits effectively and recover lost performance at very low voltages. Some of these techniques will also help reinvigorate older processes, using better EDA to trim power consumption instead of relying primarily on process changes to deliver better energy efficiency.

On the ASIC/SoC side of the fence: Reducing power consumption is becoming increasingly important, and I anticipate that this is the year power will finally come to the forefront of EDA tools. Tools do optimize for power now, but largely as an afterthought; synthesis, for example, optimizes first for area and timing and only then for power. I think we’ll see a move to optimize for power as a primary consideration.
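The difference between power as an afterthought and power as a primary objective can be sketched as a simple change in objective ordering. The following toy Python sketch uses invented candidate implementations and numbers purely for illustration; real synthesis engines are vastly more sophisticated:

```python
# Toy illustration (not a real synthesis engine): how the ordering of
# objectives changes which implementation is selected. All names and
# numbers below are hypothetical.

candidates = [
    # (name, area in arbitrary units, worst slack in ns, power in mW)
    ("fast_wide", 100, 0.40, 9.0),
    ("balanced",   95, 0.25, 6.5),
    ("low_power", 110, 0.10, 4.2),
]

def timing_first(c):
    # Classic ordering: timing must be met, then minimize area, power last.
    name, area, slack, power = c
    return (slack < 0, area, power)

def power_first(c):
    # Power-first ordering: timing still a hard constraint, but power
    # becomes the primary minimization target ahead of area.
    name, area, slack, power = c
    return (slack < 0, power, area)

print(min(candidates, key=timing_first)[0])  # prints "balanced"
print(min(candidates, key=power_first)[0])   # prints "low_power"
```

Both orderings keep timing as a hard constraint (the boolean leads the tuple); only the tie-breaking priority between power and area changes, yet a different implementation wins.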

On the FPGA side of the fence: As we move to the 28nm node and below, radiation is increasingly a concern for electronic devices. It’s no longer of interest only for aerospace applications; at these small device geometries, radiation can affect chips in terrestrial applications. FPGAs are particularly susceptible because, in addition to their normal logic, registers, and memory cells, they also have configuration cells. In the past, the only radiation-tolerant FPGAs were antifuse-based, but these are one-time-programmable (OTP) and trail the leading-edge technology node by one or two generations. SRAM-based FPGAs offer many advantages in terms of reconfigurability and being at the leading edge of technology, but they are more susceptible to radiation events in their configuration cells. My prediction is that we will see more and more effort from FPGA chip vendors and EDA tool vendors toward creating radiation-tolerant designs.
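One classic mitigation in this space is triple modular redundancy (TMR): keep three copies of a value and majority-vote the output, so a single upset copy cannot corrupt the result. A minimal Python sketch of the idea (the value and the simulated upset are invented for illustration):

```python
# Minimal sketch of triple modular redundancy (TMR), a common mitigation
# for radiation-induced single-event upsets (SEUs).

def majority(a, b, c):
    # Bitwise 2-of-3 vote, as a hardware TMR voter would compute it.
    return (a & b) | (a & c) | (b & c)

value = 0b1011
copies = [value, value, value]

# Simulate a single-event upset: flip bit 2 of one copy.
copies[1] ^= 0b0100

voted = majority(*copies)
print(bin(voted))  # prints 0b1011: the upset copy is outvoted
```

The cost, of course, is more than triple the area and power for the protected logic, which is why tool support for applying such redundancy selectively matters.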

On the personal side of the fence: I predict that people will come to realize that what the world needs is a book about creating radiation-tolerant electronic designs that can be read and understood by folks who DO NOT have a PhD in nuclear physics: a book that is of interest to the people who design silicon chips (both analog and digital), the people who create EDA tools, the companies that manufacture the chips, and even software engineers (have you heard of “radiation-tolerant software”?). I further predict that someone will finally realize that I am the best person to write this book and will approach me with a really great sponsorship deal that will bring tears of delight to my eyes 🙂

A number of 2.5D IC designs will hit the market and demonstrate both the value of 2.5/3D technology and the importance of powerful, user-friendly tools for “pathfinding”, to quickly identify the best (lowest-cost) implementation alternative.

I believe that 2012 will be a challenging but very interesting year. The pressure on the big EDA companies will definitely increase: pressure from shareholders, and from users, who can play vendors against one another thanks to comparable offerings. From a technical point of view, the challenges keep rising at an even faster pace while fewer solutions are provided. I expect that the Synopsys/Magma merger will go through, which will take away an important piece of variety in the community. Consequently, this will increase the consideration of alternative solutions. This in turn will help improve or establish collaboration between EDA companies, leaders and smaller ones alike, and we might see team-ups of some of the smaller ones to assemble packaged solutions to well-defined problems instead of proposing point tools. Despite this optimism, I expect that a lot of the smaller EDA tool providers will need to think outside the box to survive.

Industry pressure is growing to deliver more mainstream 2.5D and 3D stacked-die semiconductor products within the next 1-2 years, driven by the need to improve I/O bandwidth, reduce power consumption, and optimize the choice of process technologies for different portions of a complex SoC. It is therefore quite possible that 2012 will see one of the large mainline EDA vendors broadly announce a full “platform” product suite targeting the design of 2.5D and 3D stacked die using through-silicon vias (TSVs). This design platform would likely incorporate tools from value-added niche vendors and be endorsed in a large foundry reference flow. Open standards will expand the range of choice and interoperability over time.

One lingering question for 2012 is what will become of the Magma back-end platform. I predict that Synopsys will phase out the Magma Talus platform in favor of ICC. Why? It makes no sense for Synopsys to field and support two different systems, although some Talus technology will likely be transferred into ICC. Converting the existing Talus user base over to ICC is no small task; it will likely take several years to complete and require incentives and utilities to move the existing base over to the Synopsys platform.

Timing verification is another story. Synopsys will capitalize on the acquisitions of Extreme DA and Magma, leveraging the technologies in those products to develop and deliver the next-generation PrimeTime platform. Once that is complete, Synopsys will have re-solidified its position as the industry’s gold standard in static timing verification.

It will be very interesting to see how the consumer-driven SoC market evolves. SoCs used to consist of a processor, memory, various IP blocks, and the on-chip infrastructure needed to support them, such as clock, power, and communications channels. Now SoCs have multiple processors, large numbers of IP blocks, multiple on-chip communications channels, and multiple memories. In essence, today’s SoCs comprise multiple SoCs as we used to define them.

The 2012 SoC will beget big challenges in design and even more so in verification. IP will become more important. And even though hardware performance and power will matter, system design and software will become the differentiating items.

SoC system design and verification will be especially active, because it is what the system does that really counts. (After all, the point of building an SoC is to deliver a winning end product.) To a great extent that will require a huge software and verification effort — under the schedule pressures that come from a hugely competitive consumer products market.

In 2012, we will see a bigger presence from companies like Ansys and Dassault, posing real competition to the big three. Cases in point are the Ansoft and Apache acquisitions by Ansys. We might thus see further consolidation among the top EDA companies. To handle some of the pressure, I believe the big three will once again recognize the need for new ideas and begin acquiring new and promising technology earlier.

I think there will continue to be consolidation in the EDA industry. At each process node, fewer and fewer designs ship in high enough volume to recover the enormous investment in bringing them to market, which is a bad trend for EDA. Several companies in the ecosystem will go public if market conditions remain favorable, among them eSilicon, Tensilica, and Atrenta, though, as with Apache, they may get acquired at the last minute (at high valuations). Mentor may get acquired, or sell off some business lines.