Archive for Hardware

A new solid-state battery based on a glass electrolyte has been developed by researchers at UT Austin. Led by Li-ion battery pioneer John Goodenough, the team demonstrated that their battery outperforms Li-ion: it holds almost three times the charge, survives more charging cycles, supports fast charging, and isn’t prone to catching fire.

For more than a decade, engineers have been eyeing the finish line in the race to shrink the size of components in integrated circuits. They knew that the laws of physics had set a 5-nanometer threshold on the size of transistor gates in conventional semiconductors, about one-quarter the size of the high-end 20-nanometer-gate transistors now on the market.

Some laws are made to be broken, or at least challenged.

A research team led by faculty scientist Ali Javey at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has done just that by creating a transistor with a working 1-nanometer gate. For comparison, a strand of human hair is about 50,000 nanometers thick.

“We made the smallest transistor reported to date,” said Javey, a lead principal investigator of the Electronic Materials program in Berkeley Lab’s Materials Science Division. “The gate length is considered a defining dimension of the transistor. We demonstrated a 1-nanometer-gate transistor, showing that with the choice of proper materials, there is a lot more room to shrink our electronics.”

The key was to use carbon nanotubes and molybdenum disulfide (MoS2), an engine lubricant commonly sold in auto parts shops. MoS2 is part of a family of materials with immense potential for applications in LEDs, lasers, nanoscale transistors, solar cells, and more.

The development could be key to keeping alive Intel co-founder Gordon Moore’s prediction that the density of transistors on integrated circuits would double every two years, enabling the increased performance of our laptops, mobile phones, televisions, and other electronics.
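Moore’s prediction is simple compound growth: density doubles every two years. As a quick sketch (the starting density of 1.0 is an arbitrary normalization):

```python
def projected_density(initial_density: float, years: float) -> float:
    """Project transistor density under Moore's observation:
    density doubles every two years."""
    return initial_density * 2 ** (years / 2)

# A decade of doubling every two years yields a 32x increase (2**5).
print(projected_density(1.0, 10))  # 32.0
```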

“The semiconductor industry has long assumed that any gate below 5 nanometers wouldn’t work, so anything below that was not even considered,” said study lead author Sujay Desai, a graduate student in Javey’s lab. “This research shows that sub-5-nanometer gates should not be discounted. Industry has been squeezing every last bit of capability out of silicon. By changing the material from silicon to MoS2, we can make a transistor with a gate that is just 1 nanometer in length, and operate it like a switch.”

To chat with Andrew Ng I almost have to tackle him. He was getting off stage at Re:Work’s Deep Learning Summit in San Francisco when a mob of adoring computer scientists descended on (clears throat) the Stanford deep learning professor, former “Google Brain” leader, Coursera founder and now chief scientist at Chinese web giant Baidu.

[snipped]

Um, can you elaborate on studying time?

By moving your head, you see objects in parallax. (The idea being that you’re viewing the relationship between objects over time.) Some move in the foreground, some move in the background. We have no idea: Do children learn to segment out objects, learn to recognize distances between objects because of parallax? I have no idea. I don’t think anyone does.
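[The geometry behind motion parallax is standard: under a pinhole camera model with focal length f, a sideways head movement of b shifts the image of a point at depth Z by f·b/Z, so nearby objects sweep across the visual field faster than distant ones. A minimal sketch, with assumed numbers:]

```python
def image_shift(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Pixel shift of a point at depth `depth_m` when the camera
    translates sideways by `baseline_m` (pinhole camera model)."""
    return focal_px * baseline_m / depth_m

f = 800.0   # assumed focal length in pixels
b = 0.1     # a 10 cm sideways head movement

print(image_shift(f, b, 1.0))   # foreground object at 1 m shifts ~80 px
print(image_shift(f, b, 10.0))  # background object at 10 m shifts ~8 px
```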

There have been ideas dancing around some of the properties of video that feel fundamental but there just hasn’t yet been that result. My belief is that none of us have come up with the right idea yet, the right way to think about time.

Animals see a video of the world. If an animal were only to see still images, how would its vision develop? Neuroscientists have run experiments on cats raised in a dark environment with a strobe light, so they can only see still images—and those cats’ visual systems actually underdevelop. So motion is important, but what is the algorithm? And how does [a visual system] take advantage of that?

I think time is super important but none of us have figured out the right algorithms for exploring it.

[That was all we had time for at the Deep Learning Summit. But I did get to ask Ng a followup via email.]

Do you see AI as a potential threat?

I’m optimistic about the potential of AI to make lives better for hundreds of millions of people. I wouldn’t work on it if I didn’t fundamentally believe that to be true. Imagine if we could just talk to our computers and have them understand “please schedule a meeting with Bob for next week.” Or if each child could have a personalized tutor. Or if self-driving cars could save all of us hours of driving.

I think the fears about “evil killer robots” are overblown. There’s a big difference between intelligence and sentience. Our software is becoming more intelligent, but that does not imply it is about to become sentient.

The biggest problem that technology has posed for centuries is the challenge to labor. For example, there are 3.5 million truck drivers in the US, whose jobs may be affected if we ever manage to develop self-driving cars. I think we need government and business leaders to have a serious conversation about that, and think the hype about “evil killer robots” is an unnecessary distraction.

Anyone who has ever taken the term “laptop” seriously can attest to the extraordinary amount of heat they produce when the processor is cranking away. Despite years of advances in processor design, there is still a lot of heat produced as a by-product of running a CPU. This is all wasted energy that could be used for more productive purposes, but first we need a new approach to microprocessor design. A team of UCLA engineers might have figured out a way to make integrated circuits far more efficient by using a class of magnetic materials called multiferroics.

The standard processors in your computer, phone, and even your TV rely on millions or billions of transistors packaged as an integrated circuit. A transistor is essentially a tiny electronic switch; chained together, transistors act as logic gates (AND, OR, etc.). Directing current through a transistor involves a certain amount of inefficiency, resulting in heat generation and electron loss. There’s really no way around that as long as you’re moving electrons from one place to another, and the problem only gets worse as more transistors are packed into smaller spaces. A multiferroic material sidesteps the issue using a phenomenon known as spin waves.
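As an illustrative toy model (ideal switches rather than real CMOS transistor pairs), gates chained from switches look like this:

```python
# Toy model: a transistor as an ideal switch, and gates as
# compositions of switches. Real CMOS gates pair complementary
# transistor networks; this sketch only captures the logic.
def and_gate(a: bool, b: bool) -> bool:
    # Two switches in series: current flows only if both are closed.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # Two switches in parallel: current flows if either is closed.
    return a or b

def nand_gate(a: bool, b: bool) -> bool:
    # NAND is functionally complete: any logic circuit can be
    # built from NAND gates alone.
    return not (a and b)

def xor_gate(a: bool, b: bool) -> bool:
    # XOR composed entirely from four NAND gates.
    n = nand_gate(a, b)
    return nand_gate(nand_gate(a, n), nand_gate(b, n))
```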

A multiferroic material can be switched on and off at will simply by applying alternating voltage. Doing so allows it to carry power from one point to another through the cascading spins of electrons rather than by actually moving them. This complex magnetic effect is called a spin wave bus, but you can think of it a bit like an ocean wave. The energy of the wave moves in toward shore, but individual water molecules don’t have to go anywhere — they just move up and down as the wave passes.

It may seem like an early April Fools’ joke, but the image above shows serious research in action. [Ben Lang] recently had the chance to interview the director of a program that wants to make the Holodeck a reality. The core goal of the research — called Project Holodeck — is to develop an affordable multi-player virtual reality experience outside of the laboratory. We’ve heard speculation that Sony and Microsoft will release their next-gen systems in 2013; we’d rather wait for this to hit the market.

[Nathan Burba] is the director of the program. It’s part of the University of Southern California Games Institute and brings together students of Interactive Media, Cinema Arts, and Engineering. The hardware worn by each player is shown off at the beginning of the video after the break. Most of the components are commercially available (a Lenovo laptop worn in the backpack, PlayStation controllers, etc.) but the stereoscopic display which gives each eye its own 90-degree view was developed specifically for the project.

After seeing the in-game rendered footage we can’t help but think of playing some Minecraft with this equipment. We just need some type of omni-directional treadmill because our living room floor space is very limited.

Kinda old, but forgot to post it. Now this is a Linux machine I really drool over.

NASA has selected an SGI Altix supercomputer to help it meet future high-performance computing requirements. The new system will be the first supercomputer to operate 2,048 processor cores and 4TB of memory under control of one Linux kernel, creating the world’s largest single-kernel Linux system, NASA and SGI announced this week.

Driven by 1,024 dual-core Intel Itanium 2 processors, the new system will generate 13.1 TFLOPs (teraflops, or trillions of calculations per second) of compute power. Based on hundreds of computer “blades” that each sport a pair of dual-core processors, the system provides an extremely high density of compute power per square foot, enabling NASA to pack more computing power into its supercomputing center. NASA also acquired an ultra-dense 240TB SGI InfiniteStorage 10000 system to efficiently handle the massive data storage requirements.
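The quoted 13.1 TFLOPs is consistent with Itanium 2’s peak of four floating-point operations per clock, assuming 1.6 GHz parts (the clock speed is our assumption; the announcement doesn’t state it):

```python
cores = 1024 * 2            # 1,024 dual-core Itanium 2 chips
flops_per_cycle = 4         # Itanium 2 peak: 4 FP ops per clock
clock_hz = 1.6e9            # assumed 1.6 GHz parts (not stated in the article)

peak_tflops = cores * flops_per_cycle * clock_hz / 1e12
print(peak_tflops)  # 13.1072, matching the quoted 13.1 TFLOPs
```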

IMAGINE ONE THOUSAND thousand thousand thousand bytes. A terabyte, if you will. But more than just that—a milestone in storage capacity that hard drive manufacturers have been chasing for years. After more than a decade of living in a world of gigabytes, the bar has finally been raised by Hitachi’s terabyte-capacity Deskstar 7K1000.

Being first to the terabyte mark gives Hitachi bragging rights, and more importantly, the ability to offer single-drive storage capacity 33% greater than that of its competitors. Hitachi isn’t banking on capacity alone, though. The 7K1000 is also outfitted with a whopping 32MB of cache—double what you get with other 3.5″ hard drives. Couple that extra cache with 200GB platters that have the highest areal density of any drive on the market, and the 7K1000’s performance could impress as much as its capacity.
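The arithmetic behind the milestone: drive makers count a terabyte as 10^12 bytes, while operating systems of the era counted in binary, and the quoted 33% lead implies competing drives topped out at 750GB. A quick check:

```python
# Decimal vs. binary accounting for the 7K1000's capacity.
tb_decimal = 10 ** 12        # drive makers: 1TB = 10^12 bytes
gib = 2 ** 30                # operating systems count in binary

# A "1TB" drive reports roughly 931 GiB once the OS counts in binary.
print(tb_decimal / gib)      # ~931.3

# The 33% capacity lead implies competitors topped out at 750GB.
print(tb_decimal / (750 * 10 ** 9))  # ~1.33, i.e. 33% more capacity
```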

The DOD is developing a parallel to Planet Earth, with billions of individual “nodes” to reflect every man, woman, and child this side of the dividing line between reality and AR.

Called the Sentient World Simulation (SWS), it will be a “synthetic mirror of the real world with automated continuous calibration with respect to current real-world information”, according to a concept paper for the project.

“SWS provides an environment for testing Psychological Operations (PSYOP),” the paper reads, so that military leaders can “develop and test multiple courses of action to anticipate and shape behaviors of adversaries, neutrals, and partners”.

SWS also replicates financial institutions, utilities, media outlets, and street corner shops. By applying theories of economics and human psychology, its developers believe they can predict how individuals and mobs will respond to various stressors.

SEAS can display regional results for public opinion polls, the distribution of retail outlets in urban areas, and the level of disorganization in local economies, which may point to potential areas of civil unrest.

Yank a country’s water supply. Stage a military coup. SWS will tell you what happens next.

“The idea is to generate alternative futures with outcomes based on interactions between multiple sides,” said Purdue University professor Alok Chaturvedi, co-author of the SWS concept paper.

LAS VEGAS – Comcast Corp. Chief Executive Brian Roberts dazzled a cable industry audience Tuesday, showing off for the first time in public new technology that enabled a data download speed of 150 megabits per second, or roughly 25 times faster than today’s standard cable modems.

The cost of modems that would support the technology, called “channel bonding,” is “not that dissimilar to modems today,” he told The Associated Press after a demonstration at The Cable Show.

It could be available “within less than a couple years,” he said.

The new cable technology is crucial because the industry is competing with a speedy new offering called FiOS, a TV and Internet service that Verizon Communications Inc. is selling over a new fiber-optic network. The top speed currently available through FiOS is 50 megabits per second, but the network already is capable of providing 100 mbps, and the fiber lines offer nearly unlimited potential.

The technology, called DOCSIS 3.0, was developed by the cable industry’s research arm, Cable Television Laboratories. It bonds together four cable channels but is capable of allowing much more capacity. The lab said last month it expected manufacturers to begin submitting modems for certification by year’s end.
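For a rough sense of the arithmetic: a standard North American DOCSIS downstream channel carries 42.88 Mbps raw (256-QAM in a 6 MHz channel), or roughly 38 Mbps after overhead (our approximate figure), so bonding four channels lands right around the demonstrated 150 Mbps:

```python
# Back-of-the-envelope for DOCSIS 3.0 downstream channel bonding
# (North American channel plan).
raw_mbps_per_channel = 42.88     # 256-QAM in a 6 MHz downstream channel
usable_mbps_per_channel = 38.0   # rough figure after framing/FEC overhead
bonded_channels = 4

print(bonded_channels * raw_mbps_per_channel)     # ~171.5 raw
print(bonded_channels * usable_mbps_per_channel)  # 152.0, in line with the demo
```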

SAN FRANCISCO (Reuters)—IBM has developed a way to make microchips run up to one-third faster or use 15 percent less power by using an exotic material that “self-assembles” in a similar way to a seashell or snowflake.

The computer services and technology company said the new process allows the wiring on a chip to be insulated with vacuum, replacing the glass-like substances used for decades but which have become less effective as chips steadily shrink.

It is the latest achievement for IBM researchers, who have announced a number of advances in recent months allowing chips to get smaller despite challenges posed by physical laws at those tiny dimensions.

“This is one of the biggest breakthroughs I’ve seen in the last decade,” said John Kelly, International Business Machines Corp.’s senior vice president of technology and intellectual property.

“The holy grail of insulators is to use vacuum … and we’ve broken the code on how to do this,” Kelly said.

The technique works by coating a silicon wafer with a layer of a special polymer that, when baked, naturally forms trillions of uniformly tiny holes just 20 nanometers (20 millionths of a millimeter) across.

The resulting pattern is used to create the copper wiring on top of a chip and the insulating gaps that let electricity flow smoothly. A similar process is seen in nature during the formation of snowflakes, tooth enamel and seashells, IBM said.

Technology from NeuroSky and other startups could make video games more mentally stimulating and realistic. It could even enable players to control video game characters or avatars in virtual worlds with nothing but their thoughts.

Adding biofeedback to “Tiger Woods PGA Tour,” for instance, could mean that only those players who muster Zen-like concentration could nail a putt. In the popular action game “Grand Theft Auto,” players who become nervous or frightened would have worse aim than those who remain relaxed and focused.

NeuroSky’s prototype measures a person’s baseline brain-wave activity, including signals that relate to concentration, relaxation and anxiety. The technology ranks performance in each category on a scale of 1 to 100, and the numbers change as a person thinks about relaxing images, focuses intently, or gets kicked, interrupted or otherwise distracted.
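NeuroSky’s actual algorithm is proprietary and not described here, but a hypothetical sketch of mapping a band-power reading onto a 1-to-100 scale against a per-user baseline might look like this:

```python
def score(band_power: float, baseline: float, spread: float) -> int:
    """Map a raw EEG band-power reading onto a 1-100 scale relative
    to a per-user baseline. Hypothetical: NeuroSky's actual scoring
    algorithm is not described in the article."""
    normalized = 50 + 50 * (band_power - baseline) / spread
    return max(1, min(100, round(normalized)))

# A reading at baseline scores 50; extreme readings clamp at 1 and 100.
print(score(12.0, 12.0, 4.0))   # 50
print(score(20.0, 12.0, 4.0))   # 100 (clamped)
print(score(2.0, 12.0, 4.0))    # 1 (clamped)
```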

The technology is similar to more sensitive, expensive equipment that athletes use to achieve peak performance. Koo Hyoung Lee, a NeuroSky co-founder from South Korea, used biofeedback to improve concentration and relaxation techniques for members of his country’s Olympic archery team.

“Most physical games are really mental games,” said Lee, also chief technology officer at San Jose-based NeuroSky, a 12-employee company founded in 1999. “You must maintain attention at very high levels to succeed. This technology makes toys and video games more lifelike.”

Boosters say toys with even the most basic brain wave-reading technology — scheduled to debut later this year — could boost mental focus and help kids with attention deficit hyperactivity disorder, autism and mood disorders.

The basis of many brain wave-reading games is electroencephalography, or EEG, the measurement of the brain’s electrical activity through electrodes placed on the scalp. EEG has been a mainstay of psychiatry for decades.

An EEG headset in a research hospital may have 100 or more electrodes that attach to the scalp with a conductive gel. It could cost tens of thousands of dollars.

But the price and size of EEG hardware is shrinking. NeuroSky’s “dry-active” sensors don’t require gel, are the size of a thumbnail, and could be put into a headset that retails for as little as $20, said NeuroSky CEO Stanley Yang.

By the second half of this year, Intel plans on producing the first of its Penryn family of 45-nanometer processors and the company also plans to move ahead with its next-generation architecture in 2008.

On March 28, Intel executives delved into some additional details of its Penryn line of processors and also offered a glimpse at its Nehalem architecture, which the company said could offer up to eight cores per chip, as well as integrated graphics and memory controllers.

The Santa Clara, Calif., chip maker has 15 different 45-nanometer processors—which use the company’s Hi-k processor technology—in various stages of design and will have two fabs dedicated to manufacturing these chips by year’s end. By the second half of 2008, Intel plans to have four fabs dedicated to 45-nanometer chip manufacturing.

Steve Kleynhans, an analyst at Gartner, said that in the past few years, Intel has been able to deliver on its promises of processor innovations, including reducing chips from 90-nanometer to 65-nanometer. This latest announcement will put additional pressure on its main rival, Advanced Micro Devices—as long as Intel can continue to deliver new processors on time.

“It’s good for Intel as a company and it puts more pressure on AMD,” Kleynhans said. “For several years, Intel was at a disadvantage in the market and that allowed AMD to capture market share. Now, with Penryn, Intel looks like they are coming back strong and they seem intent to stay on top.” [read on]

With little fanfare, Advanced Micro Devices released two low-watt Athlon desktop processors on Feb. 20, along with a new dual-core chip model, the Athlon 64 X2 6000, the company said in a statement.

The single-core processors, the Athlon 64 3500 and the 3800, while not new models, will now be produced using the company’s 65-nanometer manufacturing process and have 45-watt thermal envelopes.

The Athlon 64 X2 6000 is the latest desktop model from the Sunnyvale, Calif., company. Unlike the other two models that were released on Tuesday, this chip, which is clocked at 3.0GHz, uses the older 90-nanometer manufacturing process and has a 125-watt thermal envelope.

The new chip releases come after AMD made several announcements about changes to its processor lineup. On Feb. 12, the company announced that it would lower the prices on several of its desktop processors. When the new prices were announced, company officials said the lower prices reflected the demands of its customers, not the ongoing price war involving its main rival, Intel. [More]