A Texas Instruments SoC and (probably) 1GB of RAM power the device.

Earlier this month, Google announced some of the key specs for its Google Glass headset: 16GB of flash memory, a 5MP camera, a "high resolution display," 802.11g Wi-Fi and Bluetooth, and a battery that lasts all day. What was missing was any information about Glass' processor and memory, which will in no small part affect the kinds of things that Glass can do.

Because Glass runs Android 4.0.4, with a little work it can be manipulated via the same Android development tools used on phones and tablets. Using the Android Debug Bridge (ADB) tool, Glass developer Jay Lee has discovered two new things about the hardware: it uses a dual-core OMAP 4430 SoC from Texas Instruments running at an undetermined clock speed, and it includes 682MB of RAM. Lee speculates that the total amount is 1GB, of which 682MB is available to developers; this makes sense to us, since 682MB would otherwise be a strange amount of memory to include.
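We don't know the exact commands Lee ran, but pulling this kind of information over ADB is straightforward on any debuggable Android device. Here's a minimal sketch (assuming the adb tool is on your PATH and debugging is enabled on the headset); the property and /proc files shown are standard Android/Linux fare, not Glass-specific:

```python
# A rough sketch of querying SoC and memory details from an ADB-connected device.
import subprocess

def adb_shell(command):
    """Run a shell command on the attached device and return its output."""
    return subprocess.check_output(["adb", "shell", command], text=True)

# SoC family as reported by the build properties (e.g. "omap4" on OMAP 4 parts).
print(adb_shell("getprop ro.board.platform").strip())

# CPU details: core count, architecture, and (on some devices) clock information.
print(adb_shell("cat /proc/cpuinfo"))

# MemTotal is what the kernel sees after any carve-outs, which is why a
# "1GB" device can report an odd figure like 682MB.
for line in adb_shell("cat /proc/meminfo").splitlines():
    if line.startswith("MemTotal"):
        print(line)
```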

These specifications put Google Glass roughly in line with a high-end phone from early 2011 or so—Samsung's Galaxy S II uses the same SoC and amount of RAM. Though by no means high-end, it's not surprising that Glass uses relatively old hardware, since it has much less space in which to cram high-end guts (and the battery these internals would require). As Lee notes, what Google Glass is used for ultimately matters more than what's inside it, but nevertheless we agree with him on one important point: "I'm a geek and it's still awesome to nerd out on the guts."

Promoted Comments

Though by no means high-end, it's not surprising that Glass uses relatively old hardware, since it has much less space in which to cram high-end guts (and the battery these internals would require)

Why? Older hardware doesn't mean lower power or smaller; normally it's the opposite, unless TI started making the OMAP 4 series on a 28nm process like the OMAP 5 (kind of like what Apple has done with its chips, creating a smaller version of the A5 later in its life).

There are four revs of the OMAP4430, and this is rev 3. Had they shrunk it, it would have a new rev. It runs at up to 1GHz, but they could easily have underclocked it. This chip is used in the Kindle Fire, Galaxy Tab 2 10.1, and a bunch of other 2012-era tablets.

Though by no means high-end, it's not surprising that Glass uses relatively old hardware, since it has much less space in which to cram high-end guts (and the battery these internals would require)

Why? Older hardware doesn't mean lower power or smaller; normally it's the opposite, unless TI started making the OMAP 4 series on a 28nm process like the OMAP 5 (kind of like what Apple has done with its chips, creating a smaller version of the A5 later in its life).

The obsession that most people have with having the "latest gadget" components in everything is truly bizarre. These things, like them or viscerally despise them, have a wholly different use case and requirements than an enormous-screened smartphone. Expecting them to have a power-hungry, top-of-the-line, just-released CPU and associated parts is like all those forum posters wondering why the new Mars rover didn't have Canon 5D Mark IIIs driven by Intel i7s.

The hardware (and operating system!) will be a bit old because Glass has been in development for a while, it's virgin territory, and redesigning whenever the new sexy comes out will just introduce delays when they really want to get the dev kits out.

Settle down, kids, the consumer version will probably have a faster CPU.

There are four revs of the OMAP4430, and this is rev 3. Had they shrunk it, it would have a new rev. It runs at up to 1GHz, but they could easily have underclocked it. This chip is used in the Kindle Fire, Galaxy Tab 2 10.1, and a bunch of other 2012-era tablets.

How do TI's newer chips perform energy-wise? Considering Glass is currently expected to get a 'day of use,' perhaps a newer chip could get even longer usage out of the consumer model? I wouldn't imagine they'd keep this chip in the consumer version, since it's probably a year or more out...

It may not be the latest, but think about it: you now have a computing device with as much power as an early-2000s desktop, in the form of glasses. Moore's law at its best.

So when do we get implants?

Who here wants to be technologically implanted?

+1 = Yes / -1 = No

Qualified yes: not by any company I don't fully trust, which includes any company with something to sell me besides the primary product. Advertising based on my PC browsing habits: annoying and intrusive. Advertising based on my day-to-day activities IRL: completely unacceptable.

Though by no means high-end, it's not surprising that Glass uses relatively old hardware, since it has much less space in which to cram high-end guts (and the battery these internals would require)

Why? Older hardware doesn't mean lower power or smaller; normally it's the opposite, unless TI started making the OMAP 4 series on a 28nm process like the OMAP 5 (kind of like what Apple has done with its chips, creating a smaller version of the A5 later in its life).

In many cases it is easier to shrink them to a smaller process. It happens a lot with Nvidia, ATI/AMD, and Intel. It's a good way to get a new manufacturing process going without the usual yield issues that come with new processor architectures. Not to mention some architectures are incredibly easy to make and develop for (Pentium) and still make money. The problem is, if Intel didn't reduce the size, the old parts would become more expensive to produce, because you'd have to maintain the facilities for them on top of the newer facilities you use for newer processors.

On top of that, the reduced complexity of the older processors, when compared to newer ones, tends to make them easier to shrink.

Cool! I worked a contract for TI porting Android to their OMAP chipset... so I've got some familiarity with it that I can put to good use hacking on Google Glass. I've been dragging my feet about becoming a Glass developer, but now I might have to make the jump.

How do TI's newer chips perform energy-wise? Considering Glass is currently expected to get a 'day of use,' perhaps a newer chip could get even longer usage out of the consumer model? I wouldn't imagine they'd keep this chip in the consumer version, since it's probably a year or more out...

AFAIK, TI has always been quite good in this regard. My original Nook Color has TI guts, and with the right software support, I can leave it in suspend for weeks without it losing more than a few percent of charge. And on a full battery, it lasts roughly a day of reading. (Though it has older, single-core stuff with 512MB of RAM in it.)

Plus, my very, VERY old HP iPAQ also had wondrous battery life. Again, TI.

It may not be the latest, but think about it: you now have a computing device with as much power as an early-2000s desktop, in the form of glasses. Moore's law at its best.

So when do we get implants?

Who here wants to be technologically implanted?

+1 = Yes / -1 = No

Qualified yes: not by any company I don't fully trust, which includes any company with something to sell me besides the primary product. Advertising based on my PC browsing habits: annoying and intrusive. Advertising based on my day-to-day activities IRL: completely unacceptable.

Well, we got Tom replying to Peter, now we just need Dan Rather to show up....

As a point of reference, this Galaxy Nexus uses an OMAP 4460, which is faster than the 4430. The biggest difference between the Nexus 4 and the Galaxy Nexus is that the processor in the Galaxy Nexus is lacking.

However, Glass has fewer pixels, no cell radio, and the underlying software is different (yes, it's Android, but it's been customized).

I'm guessing, but since these are dev devices (and early in the product line), Google may have used off-the-shelf hardware that's a little larger and more power-hungry, but with the minimum computing/graphics power they want. The version we eventually get our hands on might be rev.5/rev.GG: smaller, maybe faster, and less power-hungry.

As long as the device does what it's supposed to, reliably and efficiently, I couldn't care less about its internals. Besides, I couldn't update those if I wanted to. My desktops, on the other hand, are built by me, and they have the latest, best hardware I can afford. Or I upgrade or build a new one. An ongoing process.

A TI OMAP 4430 generally prefers 48-64MB to be reserved for the GPU, and we know Android likes to reserve about 20MB for its own purposes.

682 + 64 + 20 = 766. That's close enough to a nice round 768MB that it's almost certainly what's in the hardware.
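The arithmetic, spelled out (keeping in mind that the 64MB GPU carve-out and ~20MB Android reservation are this commenter's estimates, not confirmed figures):

```python
# Back-of-envelope check of the 768MB theory: the RAM reported to developers
# plus the assumed carve-outs should land near a standard module size.
reported_mb = 682         # RAM reported as available on Glass
gpu_carveout_mb = 64      # commenter's estimate for the OMAP 4430 GPU reservation
android_reserved_mb = 20  # commenter's estimate for Android's own reservation

inferred_physical_mb = reported_mb + gpu_carveout_mb + android_reserved_mb
print(inferred_physical_mb)        # 766, just shy of a round 768
print(768 - inferred_physical_mb)  # only 2MB of slack in the estimate
```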

But wouldn't a single chip make more sense than two or three? I'll admit I know less than it appears you do, but I would guess these go the same way as your average RAM chips. You could do a single 1GB die, or 3x256MB, or 1x512MB + 1x256MB. For the space, it seems like a single die would be preferable.

Or does RAM work like processors? That is, the dev hardware got some... less-than-top-shelf components, and Google got a great deal on irregular RAM: 1GB dies with 128MB disabled. Much like AMD and Intel do for lower-end processors: bad core on a 980? Disable it and sell it as a triple-core 720. Bad L3 cache on an i5? Disable it and sell it as a Pentium.

Anyone else out there a bit concerned about sticking something that gets as hot as a modern phone with a latest-spec processor 3cm from your eyes? Curious anyway... that is surely a concern, and I haven't seen it mentioned. I don't know how things will pan out technologically, but I'm sure you could palm off heavy lifting onto a nearby smartphone one way or another. Definitely not socially awkward here, and keen to have a look and a play with this beast...

But wouldn't a single chip make more sense than two or three? I'll admit I know less than it appears you do, but I would guess these go the same way as your average RAM chips. You could do a single 1GB die, or 3x256MB, or 1x512MB + 1x256MB. For the space, it seems like a single die would be preferable.

Honestly, I don't know. The OMAP may have some on board (256MB or 512MB would be my guess), and Google could have added another chip.

It may not be the latest, but think about it: you now have a computing device with as much power as an early-2000s desktop, in the form of glasses. Moore's law at its best.

So when do we get implants?

I'm eager to see some subvocal recognition, a la Jane, between now and implants.

It will be interesting to see if the Glass concept eventually gets studded with brain sensors along the frame. Even a simple (and silent) 'activate what I'm looking at' function would be spectacularly useful.

'Long-press' activation is just about the worst possible addition to user experience since Windows ME was released; sadly, my guess is that the Glass interface is going to rely on something like 'long-view' activation. Rate-limiting a user interface is bad, mm'kay?
