Archive for February, 2010

Thanks to all of the folks who responded to my original critique concerning Dr. Freeman Dyson's December '09 lecture in Portland, OR [Freeman Dyson Talks About Biotech vs Nanotech]. Perhaps the most engaging response was from Russ Baker, which I've posted below.

But before reading my discussion with Russ, here's a short clip from a Nov '09 interview with Sci-Fi Editor-in-Chief Lou Anders that provides a slight twist on the comparison between genetic and semiconductor engineering.

Regarding your article and comments about Dr. Dyson’s statement on “computer domestication”, I respectfully submit that his statement was in fact intentional and accurate, and somewhat nuanced. I think the point of using the term “domestication” is to imply “the taming of computers for the service of humanity”. Viewing them as a commodity is primarily an economic judgment, implying access to all (and also a complete lack of differentiation and direct substitution – which doesn’t actually apply to computers. Just ask Apple.). But I think domestication was specifically chosen to imply that this complex technology has now been refined and packaged (i.e. “tamed”) to the point that the layman can use the tool in their own home without any real understanding of how or why it works – but simply to get a job done. I also think you might be reflecting a lack of appreciation for the hundreds of years of R&D and testing that were involved in domesticating animals and plants. One could probably argue that humanity’s animal/plant project was far more ambitious and costly relative to the resources and knowledge of the people doing it than the development of the computer. It certainly took far longer, no doubt with many more mistakes and dead-ends.

I also think his use of “domestication” differs from yours, since he was comparing it to the domestication of biotechnology. That is a very simple comparison to me, again one which I agree with him. As a layman, I have very limited access to biotechnology today, certainly not in the home (which is what domestication is getting at). Now, one could get into another semantics argument that the domestication of animals and plants is also “biotechnology”, which changes the whole argument. But I agree with you that biotech here is mainly tied to modern genetic engineering (rather than ancient genetic manipulations through breeding).

All to say that I thought your judgment of Dr. Dyson’s statement missed the point, and I suspect his words were actually very carefully chosen and bang-on. I think this was an unfair critique of his technical literacy, though there are definitely other examples to make your broader point. -Russ

Using social media (SM) apps like Twitter and Facebook really does dumb down the conversation!

Here's but one example. Today, I tried to post a simple discussion on Twitter, but it required three separate Tweets. Twitter has a 140-character limit per message.

Next, I decided to post the same three Tweets on Facebook, only to run into a 420-character limit. My short message was 685 characters long, a tome in today's SM world.
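The splitting exercise above can be sketched in a few lines of code. This is a minimal illustration, assuming plain character counting with breaks at word boundaries (no URL shortening or thread numbering, which real tools would add):

```python
def chunk_message(text, limit=140):
    """Split text into pieces no longer than `limit` characters,
    breaking at word boundaries where possible."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word[:limit]  # crude truncation for overlong words
    if current:
        chunks.append(current)
    return chunks

# A 685-character message fits in one blog post, but needs
# several 140-character Tweets:
pieces = chunk_message("word " * 137)  # roughly 685 characters
print(len(pieces), [len(p) for p in pieces])
```

The same function with `limit=420` shows why the message still would not fit in a single Facebook post of that era.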

The only mechanism left was my blog, which effectively has no character limit. But this instructive exercise highlighted how much SM tools limit our ability to communicate while defocusing our attention and ultimately stealing our most precious resource: time! No wonder my engineering brethren do so little of their work on social media platforms.

Yet social media tools like Twitter, Facebook, MySpace, Google, LinkedIn, Plaxo, and all the rest are terribly invasive. Once you start using them, you're hooked. So instead of accomplishing meaningful achievements, we Twitter and Facebook our time away. Social media sites are like the Isle of the Lotus Eaters in ancient Greek mythology: anyone who eats of the lotus becomes forgetful and happily indolent while time slips away.

Where is the Odysseus of old to free us from the grip of these time robbers? When some of Odysseus’s crew had eaten of the lotus, they forgot about their friends, homes, and duties. In the end, Odysseus had to physically drag them back to the ships.

Want to know what started this rant of mine? It began this morning while I was perusing the headlines and came across the following articles, which I twittered as shown:

List this among the dumbest “duh” polls: “85 Percent of People Worldwide Want Content to Be Free (NielsenWire)” http://bit.ly/9lAV6a

Google doesn’t help by giving the work of others away for free: Google Tightens FT.com’s Free-Article Loophole http://bit.ly/b56dDX

Content isn’t free. It comes at a price. Why would any good writer create meaningful content on a continuing basis for free?

From new 64-core processors and memory acquisitions to software applications, the world of electronics continues to change. Here are a few of this week's tidbits that caught my attention:

Processors
IBM launched its Power7 chipsets, aimed at the midrange server market. The higher end of the Power7 product family spans 32 to 64 cores, collectively using far less power while boosting per-core performance over the Power6 predecessor. As with its competitors, IBM's computing platforms support advanced virtualization management and power controls. Additionally, the platforms support computing methods and analytic capabilities targeted at data-intensive applications, from sensors in electric grids to supply chain management. More info can be found at the Power.org website.

Memory
Micron announced its intention to acquire Numonyx Holdings, a spinoff from Intel and STMicro. The acquisition will broaden Micron's NAND flash memory offerings to include Numonyx's line of NOR flash and phase-change memory (PCM), a potential rival to both NAND and NOR flash. NAND flash is used by most computers and data storage devices, while NOR is typically used to store software applications in mobile phones and related products.

Software Apps
Apple's recent announcement that the iPad will be available in March '10 has highlighted ongoing tensions between Apple and Adobe over Flash video technology. The iPad doesn't support Flash, which Apple claims is too buggy. The problem is that Adobe's Flash is used in more than three-quarters of all web videos and interactive advertisements, according to the Wall Street Journal.

Apple is backing a standard called HTML5, which it hopes will replace Flash. The HTML5 consortium includes both Apple and Google.

While all of this is technically interesting, what really grabs my attention about Apple's iPad are the deals that Apple is making with book publishers and TV networks. The iPad business model may well reshape both eBooks and mobile TV in the same way that the iPod and iTunes reshaped the music industry. Those trends could have far-reaching ramifications for both the business and chip-board design communities.

Design for the Consumer Era is seen as the next iteration of the infamous Design-for-X paradigm, according to the keynote presenter at DesignCon 2010.

One seldom hears anything new or earthshaking at keynote presentations. Instead, good keynote addresses are like filters and amplifiers that simplify complex messages while refreshing their meaning. This is how I would characterize the message delivered by Dr. Alex Shubat, CEO and co-founder of Virage Logic, at Wednesday's lunchtime keynote at DesignCon 2010.

His keynote focused on the technology and business trends that are pushing SoC designers and companies alike to move beyond the theme of reusability. Design reuse (DFR, Design for Reusability) was a big driver in the '90s and '00s. Reuse was part of the productivity era that started with the creation of design automation in the '80s.

Shubat reminded his audience that this productivity push was overlapped by today's ongoing focus on manufacturability, highlighted by such well-used acronyms as DFM, DFT, and DFY, all of which led to the latest Design-for-X terminology for the new decade: Design for the Consumer Era (DFC). Interestingly, this seems very similar to the Department of Defense's Design-to-Cost (DTC) realization during the military cost-cutting era of the '80s and '90s, without the emphasis on consumerism.

Still, many would argue that the Design-for-Consumer approach is very similar to the Design-to-Cost method in its recognition that cost, or rather shrinking profit margins, is a key driver in design architectures.

A slight spin on this latest "design" iteration came from a quick chat after the keynote with Brani Buric, executive VP of marketing and sales at Virage Logic. Buric suggested that Design-for-Profitability (DFP) might be an even better phrase to capture the latest reality adjustment for EDA design tool vendors and semiconductor companies.

Regardless of the "D-word" terminology, the SoC design challenges remain frustratingly the same, summed up by increasing complexity, shrinking time-to-market, and (now) lower profit margins. Shubat concluded his presentation by noting the trend of shrinking size in electronics: yesterday's printed-circuit boards are now today's complex chips that will become tomorrow's reusable IP.

Some will note that this ending suggests a return to reusability, in contrast to the keynote's title, "Going beyond Reusability." But as Shubat explained during his talk, reusability in the '90s was intended to handle complexity. Today, reusability is seen as the best way to handle complexity as well as cost. In the growing world of electronic consumerism, volumes are high, profit margins are low, and cost (or profitability?) becomes the next "X" for which we need to design.