Looks like Samsung is going to be more nimble than GF at 28nm. Given that they are converting memory fabs to LSI, I wonder if they will try to pick up ProMOS, which was rumored to be of interest to GF. I don't recall seeing any conclusion to that supposed sale. The rumored price was well under $1B, which for a 12" fab is a pretty good deal if it can be converted to leading-edge logic for not too much more.

TSVs are somewhat reminiscent of the "local interconnects" that AMD may have pioneered: drilling through the die rather than wrapping around the edge to connect different layers. Stacking should play well with the ASICs and add-ons from smaller designers.

But as forward-thinking as the company is, perhaps Apple hasn’t created a new path at all. Through a technique of observe, perfect and discard, Apple has been heading for some time now in one direction — along the pre-defined path into the era of ubiquitous computing.

Ubiquitous computing defined

In some ways, this path is as logical as Moore’s Law. Look at the history of computing — from the mainframe era, where there was one computer for many consumers, to the personal computing era, where there was one computer for each consumer, to this new era, where there are many computers for each consumer — and compare the number of computer chips to the number of consumers using those chips. At its foundation, ubiquitous computing can be summed up by this simple principle of ratios.

The modern concept of ubiquitous computing originated with Mark Weiser in 1988 at the Computer Science Lab at Xerox PARC (sound familiar?). The theory proposed a seamless, almost invisible connection between consumers and computers that would help shift the ratio from one computer for many people to many computers for one person.

Gesturing tabs

Mobile technology already had small chips, powerful batteries, geolocation services and wireless networking. But that was not enough to win over the masses and drive us all to purchase multiple computing devices. It was the way consumers interacted with these smaller devices that needed to change. For a long time, it was thought that voice recognition was going to propel us into the next era of computing, but that never happened.

Leveraging the fact that there were approximately 100 million iPod users, Apple was able to use convergence to its advantage as it introduced those users to a series of simple touch-based gestures on a nearly buttonless device. In the early years of the iPod, we were all trained on the scroll wheel. Touch-based gestures on a wide-open screen took this paradigm one step further. Just as the mouse accompanied the transition from the terminal-based mainframe era to the PC era, the post-PC era was ushered in by a new way of interacting with computers: touch.

One human relation-chip

Making each device “aware” of how consumers use all of the other devices they own is the key to accelerating the adoption of more than one computing device. While Apple may in fact be the only company in the world to have constructed a homogeneous synergy between its personal and its ubiquitous computing platforms, it is certainly not the only company trying to forge the relationship between the user and the computer chip. For the relationship between consumers and computing devices to become truly invisible, these new smart devices will need to know more and more about the consumers who own them: everything consumers have done in the past, what they are doing now, and even what they plan on doing later.

Perhaps this is the reason Tim Cook stated that Apple’s “best years lie ahead of us.” With technologies like iCloud and Siri, Apple will likely play a larger and larger role in forging the relationship between consumers and the growing number of computing devices in our daily lives. It is not about selling more individual devices; it is about enabling the relationship between an individual and a collection of specialized devices. And Apple knows this.

According to reports from various industry sources, the Chinese government has begun the process of picking a national computer chip instruction set architecture (ISA). This ISA would have to be used for any projects backed with government money — which, in a communist country such as China, is a fairly long list of public and private enterprises and institutions, including China Mobile, the largest wireless carrier in the world. The primary reason for this move is to lessen China’s reliance on western intellectual property.

There are at least five existing ISAs on the table for consideration — MIPS, Alpha, ARM, Power, and the homegrown UPU — but the Chinese leadership has also mooted the idea of defining an entirely new architecture. The first meeting to decide on a nationwide ISA, attended by government officials and representatives from academic groups and companies such as Huawei and ZTE, was held in March. According to MIPS vice president Robert Bismuth, a final decision will be made in “a matter of months.”

China has a long history with MIPS and Alpha. Loongson processors, which power millions of Chinese school computers, use MIPS — and the ShenWei processors found in China’s first homegrown supercomputer, the Sunway Bluelight MPP, are based on the Alpha ISA. MIPS Technologies (the company) hasn’t been doing very well recently, and it’s rumored that the Sunnyvale-based company could be up for sale — a purchase I’m sure the Chinese government could afford.

According to EE Times, there are some 34 ARM licensees in China, but at $5 million for a single Cortex-A9 core license, it’s unlikely that ARM will be China’s choice. The Power ISA is cheaper, but lacks the software ecosystems that ARM and MIPS enjoy. ShenWei/Alpha is also a possibility, but again it cannot compete with MIPS’ installed base.

The other option, of course, is developing a brand new ISA — a daunting task, considering you have to create an entire software (compilers, developer tools, apps) and hardware (CPU, chipset, motherboard) ecosystem from scratch. But there are benefits to building your own CPU architecture. China, for example, could design an ISA (or microarchitecture) with silicon-level monitoring and censorship — and, of course, a ubiquitous, always-open backdoor that can be used by Chinese intelligence agencies. The Great Firewall of China is fairly easy to circumvent — but what if China built a DNS and IP address blacklist into the hardware itself?
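To make that hypothetical concrete: a silicon-level blacklist could be imagined as little more than a hardwired lookup table consulted on every outbound packet. The sketch below is purely illustrative — the addresses are made up, and the function name `ip_blacklisted` is an assumption, not any real chip's interface.

```c
#include <stdint.h>
#include <stddef.h>
#include <assert.h>

/* Hypothetical sketch only: a fixed blacklist baked into silicon,
   modeled here as a read-only table of IPv4 addresses checked on
   every outbound packet. The entries are placeholders. */
static const uint32_t blocked_ips[] = {
    0xC0A80001u, /* 192.168.0.1 (placeholder entry) */
    0x08080808u, /* 8.8.8.8     (placeholder entry) */
};

/* Returns 1 if the destination IPv4 address is on the hardwired list,
   0 otherwise. A real hardware version would be a parallel match
   against all table entries in one cycle, not a loop. */
int ip_blacklisted(uint32_t dest_ip)
{
    for (size_t i = 0; i < sizeof blocked_ips / sizeof blocked_ips[0]; i++)
        if (blocked_ips[i] == dest_ip)
            return 1;
    return 0;
}
```

The point of the sketch is that, unlike the Great Firewall's router-level filtering, a table burned into the CPU or chipset could not be routed around in software — which is exactly what would make it so troubling.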

Taking a leaf out of South Korea’s hardcore gaming scene, what if the Chinese government decided to implement a hardware-level 10pm curfew for video games? Or some code that automatically turns negative mentions of Hu Jintao (the Chinese president) into positives, and inserts a few honorifics at the same time. Or a latent botnet of hundreds of millions of computers that can be activated upon the commencement of World War III. Or, or, or…

Two research groups have invented two new materials that may compete with graphene as the solution for faster, more powerful electronic devices of the future.

MIT researchers have created a thin film of bismuth-antimony that allows electrons to “travel like a beam of light” — hundreds of times faster than in conventional silicon chips. In thermoelectric generators and coolers, the faster electron flow (and the material’s ability to function as an insulator) might lead to much more efficient power production. The new material could even allow electronic devices to be built from a single material with varying properties, deposited one layer atop another, rather than from layers of different materials.

Researchers at Technical University in Germany and Aix-Marseille University in France created silicene by condensing silicon vapor onto a silver plate to form a single layer of atoms, New Scientist reports. The new material may lead to smaller, cheaper electronic devices than graphene because it can be integrated more easily into silicon chip production lines.

Meanwhile, for solar cells, materials scientists at Michigan Technological University have discovered that adding graphene to the titanium dioxide in dye-sensitized solar cells increases the current by more than 50%. Dye-sensitized solar cells don’t rely on rare or expensive materials, so they could be more cost-effective than cells based on silicon and thin-film technologies.

Ref.: Patrick Vogt et al., “Silicene: Compelling Experimental Evidence for Graphenelike Two-Dimensional Silicon,” Physical Review Letters, 2012, DOI: 10.1103/PhysRevLett.108.155501