Exynos 5 Dual supports resolutions of up to 2560x1600, as well as SATA and USB 3.0.

Samsung has just released details about its new Exynos 5 Dual (5250) SoC for mobile devices. This dual-core, 1.7GHz chip is the first announced to feature ARM's new Cortex A15 CPU architecture, which should provide substantially improved performance over the Cortex A9-based chips used in most of today's smartphones and tablets.

The chip also includes ARM's new Mali-T604 GPU, designed to power Retina-class displays, and supports high-performance connectivity options like SATA and USB 3.0. These improvements make it a substantial upgrade over current-generation products like NVIDIA's Tegra 3 or Samsung's own Exynos 4. We'll look at different aspects of the chip to see not just how the Exynos 5 and other Cortex A15 SoCs will benefit current tablets, but also how those improvements could lead to more viable laptop replacements.

The CPU: ARM's Cortex A15

Most ARM processors in today's devices, including the NVIDIA Tegra 3 in the Nexus 7 tablet and all variations of the Apple A5 used in newer iPads and iPhones, use Cortex A9-based designs. The A9 excels at low power usage but is more limited when it comes to performance.

The A15, on the other hand, is designed for devices that need higher performance, and is expected to outperform competing designs like Qualcomm's Krait architecture (which powers, among other things, the US versions of the Samsung Galaxy S III). As we discussed when we first took a look at the A15 architecture, it isn't really intended to replace the Cortex A9, which will still have a place in the middle and lower ends of the market, where power draw and price are more important than high performance (and in those applications, the A9 will probably be replaced by the Cortex A7 later on). Rather, it's intended to compete with Intel and AMD in performance and features as those companies look to expand into the burgeoning smartphone and tablet markets. We've already seen Intel processors show up in devices like the Xolo X900 phone and the recently announced Lenovo ThinkPad Tablet 2; Cortex A15 is intended to stem that tide.

A downside to the A15 is that it does increase power usage over the Cortex A9, but Samsung's 32nm process should help to mitigate that issue somewhat. When Apple's A5 processor was moved from 45nm to 32nm using this same high-k metal gate (HKMG) process technology for the $399 iPad 2, it was enough to reduce the chip's power usage and increase battery life by 20 to 30 percent.

The GPU: ARM's Mali-T604

While the A15 will increase the Exynos 5's CPU power, its GPU might be more important as Android and Windows tablet manufacturers begin shipping Retina-esque displays to compete with the most recent iPad. The Mali-T604 is purpose-built for such devices: its maximum supported resolution is 2560x1600 (1280x800 doubled in each dimension), and its 800MHz LPDDR3 RAM provides 12.8GB/s of theoretical memory bandwidth (the same as the A5X in the 2012 iPad), enough to draw an image that large.
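As a quick sanity check on those numbers, here's a back-of-the-envelope Python sketch of how much bandwidth a panel that size actually consumes. It assumes a 32-bit framebuffer and a 60Hz refresh, and ignores composition, overlays, and rendering traffic, so treat it as a floor rather than a real figure:

```python
# Back-of-the-envelope scanout bandwidth for a 2560x1600 panel.
# Assumes a 32-bit (4-byte) framebuffer at a 60Hz refresh rate; real
# display pipelines (composition, overlays, actual rendering) need more.

def scanout_bandwidth_gbps(width, height, bytes_per_pixel=4, refresh_hz=60):
    """Raw bandwidth (GB/s) needed just to scan out one framebuffer."""
    return width * height * bytes_per_pixel * refresh_hz / 1e9

panel = scanout_bandwidth_gbps(2560, 1600)  # ~0.98 GB/s
peak = 12.8                                 # Exynos 5's theoretical peak, GB/s

print(f"Scanout alone: {panel:.2f}GB/s ({panel / peak:.0%} of the 12.8GB/s peak)")
```

Scanout alone eats less than a tenth of the theoretical peak; the rest of that headroom is what lets the GPU actually render a scene at that resolution rather than merely display one.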

In addition to driving high-resolution panels, this graphics power can be used to encode and decode HD video at 60FPS and to push an image to displays wirelessly, though whether this latter feature takes advantage of the Miracast standard (as NVIDIA's Tegra 3 does) remains to be seen. It also supports stereoscopic 3D.

Just as impressive is the Mali-T604's list of supported APIs: DirectX 11, OpenCL 1.1, OpenVG 1.1, and Renderscript are all here, as well as OpenGL ES 1.1, 2.0, and 3.0 support. OpenGL ES 3.0, which brings some features from the standard OpenGL 3.x and 4.x specifications to mobile devices, was just released earlier this week. Full Scene Anti-Aliasing (FSAA, at both 4x and 16x) is also supported.

Aside from the feature list, however, we don't know much about the GPU's actual performance beyond its theoretical memory bandwidth. The only clue we have is a promise from Samsung that the Exynos 5 features twice the 3D performance of the old Exynos 4. Using average benchmarks from the GLBenchmark Web site for the Exynos 4 version of the Samsung Galaxy S III and the 2012 iPad, we can try to extrapolate performance from there.

All scores from the GLBenchmark Web site.

So, going from these numbers, a GPU with double the performance of the Exynos 4 would be playing in the same field as the GPU in the A5X, and could possibly outperform it by a decent margin—our theoretical Exynos 5 would win big in the Egypt test, but break just about even in the Pro test. Again, this is all speculation, but it gives us at least a vague idea of how the Mali-T604 is going to perform.
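The extrapolation itself is simple enough to reproduce. Here's a minimal Python sketch of the method; note that the scores below are hypothetical placeholders rather than the actual GLBenchmark averages, so plug in the real numbers before drawing any conclusions:

```python
# Sketch of the extrapolation above: double the Exynos 4's GLBenchmark
# scores (per Samsung's "2x 3D performance" claim) and compare with the
# A5X. These scores are hypothetical placeholders, not real results.

exynos4_fps = {"Egypt Offscreen": 100.0, "Pro Offscreen": 140.0}  # placeholders
a5x_fps = {"Egypt Offscreen": 140.0, "Pro Offscreen": 250.0}      # placeholders

for test, score in exynos4_fps.items():
    projected = score * 2.0  # Samsung's claimed doubling over the Exynos 4
    print(f"{test}: projected Exynos 5 {projected:.0f}fps "
          f"vs. A5X {a5x_fps[test]:.0f}fps ({projected / a5x_fps[test]:.2f}x)")
```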

Going beyond tablets

In addition to impressive-looking performance, the Exynos 5 also supports some interesting connectivity options—SATA, UART, USB 3.0, and eMMC 4.5 are all listed as bootable devices on Samsung's site.

This says a lot about the kind of devices we might see the processor crop up in. Asus's Transformer Prime and Microsoft's forthcoming Surface RT are two ARM-equipped tablets that attempt to replicate the experience of using a laptop with some sort of keyboard and pointing device (a keyboard dock for the Transformer, and Microsoft's promising but as-yet-untested screen covers for the Surface). The addition of support for high-performance, high-capacity storage devices that use USB 3.0 and SATA could make tablets equipped with the Exynos 5 and chips like it even more plausible replacements for low- or mid-tier laptops.

SATA and USB 3.0 support are also features that would be useful in many low-end servers or network attached storage devices—NASes, in particular, have been accumulating more and more server features as they've evolved over the years, and some (like the Iomega px12-450r we looked at a few months back) even include full-fledged Intel processors to power these different services. The Exynos 5's GPU is certainly overkill for this kind of device (and Samsung's product page definitely focuses on the more lucrative phone and tablet use cases), but both the Cortex A15 and the Mali-T600 series are pretty scalable—each can go from one to four cores—so as the Exynos 5 lineup expands, seeing it or a processor like it in some sort of server doesn't seem completely out of the question. The presence of hardware virtualization support in the Cortex A15 architecture also paves the way toward potential server usage.

Conclusions

On paper, the CPU and GPU specifications of the Exynos 5 Dual certainly look to be greater than or equal to Apple's powerful A5X in the most important ways. The chip's A15 architecture and 1.7GHz clock speed (compared to the A5X's 1.0GHz) should definitely give it a CPU advantage, while the Mali-T604 looks more than capable of driving a Retina-class display (and, at least according to our educated guessing, should be a bit faster than the quad-core PowerVR SGX543MP4 found in the A5X). Added support for SATA and USB 3.0, on the other hand, gives the chip the kind of connectivity it will need to worm its way into the sort of laptop-replacement tablets ARM and its licensees would love to see storm the market.

Promoted Comments

Despite those being two completely opposed statements, the fact that it's a marketing term is precisely the point of using it.

malor wrote:

And it's not pixel-doubling anyway, it's usually pixel-quadrupling, but not always. And then it's not always addressable by software... the software may or may not be driving quarter-res, and then the OS is driving it at some other, much higher resolution, and then sub-sampling back down to the final output level.

Exactly the reason for using a marketing term to describe the end-user behavior. Shall we call them high-density displays instead, when that's not quite accurate either? The point is that the marketing term is a well-understood abstraction of a set of technical characteristics used to handle a single behavior. It's describing the behavior, not the technicalities.

Quote:

It has no meaning from a technical standpoint. It is a hopelessly confused term that doesn't apply to any one technology or approach. It should not be used in a tech-focused article, except as a nominative for Apple-branded displays.

I think it would be generous to say that it's picking nits to take exception to its use. It's a "hopelessly confused term" if you begin to pick apart its technical meaning; if you take it for what it is -- for what it was intended -- the meaning should be quite clear. You understood what it was describing, right?

I believe your dismissal is precisely why it was put in quotes -- it's not an Apple display, but it could power a display that has the same behavior as Apple's. And since Apple's use is one of the more mainstream and popular uses of the technology, it is fitting that more people would immediately comprehend what behavior "Retina" is meant to describe.

Either way, it's just one aspect of the article. It really doesn't seem like a topic that should merit as much disgust as people seem to be directing at it.

A little disappointing that we're not seeing big.LITTLE just yet. Particularly after a process shrink, heterogeneous cores look extremely promising when it comes to enabling both very high performance and great battery life potential in the same package, by properly optimizing for widely divergent uses at different times. I guess integrating the A7 is going to take longer, though.

Also, bit of a lightweight article, not quite sure how you reached conclusions like this:

Quote:

On paper, the CPU and GPU specifications of the Exynos 5 Dual certainly look to be greater than or equal to Apple's powerful A5X in the most important ways.

Eh? As far as CPU goes, yeah, the A15 should cream everything before it (though you don't actually do much analysis of why; you have to go to a place like AnandTech for that nowadays). But the GPU? It might very well, but you don't even mention PowerVR at all or make the slightest comparison beyond noting they're matching bandwidth at last. What's the theoretical fill rate? Triangles? Even the most basic synthetic predicted FLOPs? Come on. Mobile GPUs really are racing along at an exciting pace, and we all want to see it continue to push forward hard, but you skip out on the whole thing.

I want more information too, but we can't say much before we actually have one in our hands - both ARM's and Samsung's product pages are filled with vague statements like "5x the performance of previous Mali processors" (which?) and bar graphs that show performance relative to the Exynos 3 and 4, but with no labels on either the X or the Y axes (or even a label to tell us what "performance" is supposed to be measuring). It's all pretty laughable marketing-speak.

You make good points about the amount that we don't know, though. I'll tweak that sentence a little bit to introduce more uncertainty. :-)

Edit: I've updated the article with some extrapolation based on Samsung's promise that the Exynos 5 will double the 3D performance of the Exynos 4. It shows that, indeed, the Exynos 5 *should* be playing in the same field as the A5X. Keep in mind, of course, that this is all guesswork at this point since we don't have hardware in hand, but it's better than nothing.

You must really hate Xerox machines and Kleenex, then. And Band Aids, Dumpsters, and Frisbees. Sucks when marketing terms enter the common vernacular, right?

Those are brand names, not marketing terms. And they became popular when they entered the language of normal people, but I seriously doubt any normal person would have the slightest idea what you meant by "retina display" (and would probably be scared by it).

There are better names for this stuff. I like Android's LDPI, MDPI, HDPI, XHDPI naming since they actually mean something that isn't dependent on how far the device maker anticipates you'll hold the device from your face, but if we went with that, the iPhone would just be HDPI, which wouldn't make for cool enough of a title.

You must really hate Xerox machines and Kleenex, then. And Band Aids, Dumpsters, and Frisbees. Sucks when marketing terms enter the common vernacular, right?

It somewhat does, but this case is far more egregious. For tablets there are already high-density displays, such as the Transformer Pad Infinity's (a little shy of the iPad 3's), which makes the comment in the article very annoying, as if it's a big deal when it really isn't. It's just another incremental improvement. Also, the original iPhone 4's display density was overtaken not long after its debut, hardly justifying it becoming a common term. And in this case, the display is not even manufactured by the company that uses that marketing term. Xerox at least invented and made the damn thing itself (as far as I know), and didn't take credit for other people's work.

Bottom line, I find the use of the term unprofessional and distasteful, but that's just my opinion.

One of the key metrics (perhaps *the* key metric) for these chips is power usage. Aside from a casual mention that it's higher than the Cortex A9's, I don't see anything more.

I honestly think that any discussion about the chip really needs to hinge on that. If a tablet bearing this chip gets, say, 3 hours max, then it won't matter how much beef it has.

You are somewhat overstating the importance of the chip's efficiency to the overall battery life. A huge percentage is consumed by the display (for most use cases), which is unaffected by the chip itself. Not saying this isn't important, but it requires some perspective.
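To put some illustrative arithmetic behind that point, here's a quick Python sketch. Every figure in it (battery capacity, display draw, SoC draw) is an assumption chosen for illustration, not a measurement of any real device:

```python
# Illustrative only: shows how a power-hungry display dampens the effect
# of SoC efficiency on battery life. All figures below are assumptions.

battery_wh = 25.0  # assumed 10" tablet battery capacity (Wh)
display_w = 4.0    # assumed display draw during active use (W)
soc_w = 2.0        # assumed SoC draw under load (W)

def runtime_hours(battery_wh, total_draw_w):
    return battery_wh / total_draw_w

base = runtime_hours(battery_wh, display_w + soc_w)
worse = runtime_hours(battery_wh, display_w + soc_w * 1.5)  # SoC draws 50% more

print(f"{base:.1f}h -> {worse:.1f}h even if the SoC's draw grows by 50%")
```

With numbers like these, a 50 percent jump in SoC draw costs only about 15 percent of the runtime, because the display dominates the power budget.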

High ppi screens are something I've been excited about for years! The sooner my computer/tablet/phone screen looks like a piece of 300+ dpi printed paper, the better. I commend Apple for being big movers in this area since the ip4. Hopefully by the time I'm ready to upgrade my monitor I can afford a retina cinema display ^^

This is a tech site; I just don't think it needs to use a marketing term to dumb down what a high-dpi tablet means, that's all.

Dude, give it a rest. If it was anyone else's marketing term but Apple's you wouldn't give a shit, and don't pretend otherwise. Ars uses it because everyone understands what it means. End of story.

There's no need to strawman me. I'm not a fan of marketing terms in general. And it's funny that you say everyone understands it when average people usually assume it means the pixels aren't visible at any distance.

I don't really think that mention of Apple should be peppered throughout this article nearly as much as it is, especially considering the numerous players in the space. This isn't about Apple or their licensed technology. From the technological standpoint that seems to be the focus of this article, this chip simply DOES NOT have a public equal in terms of performance in the metrics that matter, and I'd bet that it has superior performance-per-watt as well.

There are better names for this stuff. I like Android's LDPI, MDPI, HDPI, XHDPI naming since they actually mean something [....]

Not to the average person. And using seven or eight digits with an "x" in the middle isn't much of an improvement.

By eschewing creeping incremental improvements, when Apple doubled resolution (and yes, that's four times the pixel count) in one jump, they could give it that marketing name and say, "You know what (our) screens looked like before? They're now so much obviously better, there's no point making them sharper." And as a bonus, it simplified adoption from developers, and produced accurate results (as opposed to the previous attempts at resolution independence).
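A tiny Python sketch of why that exact 2x jump was so painless for developers: every logical point maps to a whole number of physical pixels, so existing layouts scale without fractional coordinates. (The numbers are just the original iPhone's logical screen size; the function is purely illustrative.)

```python
# Illustrative: integer scale factors keep logical "points" aligned to
# whole physical pixels, so pre-Retina layouts render crisply unchanged.

def to_pixels(points, scale=2):  # scale=1 on older displays, 2 on Retina
    return points * scale

logical = (320, 480)  # original iPhone logical dimensions, in points
print([to_pixels(p) for p in logical])     # [640, 960] on a Retina panel
print([to_pixels(p, 1) for p in logical])  # [320, 480] on older hardware
```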

I don't think anyone has a problem with Apple calling their high dpi monitors retina. It was a great idea to double the resolution across their lineup. Everyone else still has a lot of catching up to do outside of phones. It's just not a very good technical term to be used like some kind of industry standard class of screens.

And on top of that, we've got SATA and USB 3 support on that same chip, and I have to blink and remind myself that we're suddenly in a completely new era of computing, where universal ARM chips are finally unseating x86. Now, instead of everything being dictated by Intel/AMD, we can have independent chip companies all over the world designing their own processors for different applications--including Windows!

Actually the door has already been opened by at least some Tegra 3 models, which also support 2560×1600.

You're right that the resolution is supported, but the Tegra 3 can't even outbench the iPad 2 in most cases. You probably don't want it pushing a 2560x1600 display. :-)

Most of the popular tablet games aren't very demanding anyways. Tegra 3 is fine for the web and the angry birds of the world at that resolution.

Sure, but it does go beyond gaming - you've got to be able to play videos at that resolution, render the UI smoothly at that resolution, etc. etc. I'm not saying the Tegra 3 couldn't do it (I don't know, honestly) but as far as I'm concerned you can't have too much GPU when you're talking about such large resolutions.

How helpful was being the manufacturer of Apple's A5X in the iPad 3 in the development of this chip? Any chip experts around to shed some light? Don't mean to troll, but I'm sure it pointed development in a particular direction if nothing else.

Sure, but it does go beyond gaming - you've got to be able to play videos at that resolution, render the UI smoothly at that resolution, etc. etc. I'm not saying the Tegra 3 couldn't do it (I don't know, honestly) but as far as I'm concerned you can't have too much GPU when you're talking about such large resolutions.

Is it any more demanding to play YouTube videos at that resolution? There aren't 2560x1600 videos all over the place, so it's just scaling 720p video.

It is, though it's of dubious usefulness in the tablets the Exynos 5 is mostly targeting.

It would allow one to run two OSes simultaneously, such as Android and Ubuntu, or perhaps iOS and OS X if Apple were to engage in an OS X port to ARM. Throw in a wifi display option and a couple of bluetooth peripherals and there's potential there.

You must really hate Xerox machines and Kleenex, then. And Band Aids, Dumpsters, and Frisbees. Sucks when marketing terms enter the common vernacular, right?

Those are brand names, not marketing terms. And they became popular when they entered the language of normal people, but I seriously doubt any normal person would have the slightest idea what you meant by "retina display" (and would probably be scared by it).

There are better names for this stuff. I like Android's LDPI, MDPI, HDPI, XHDPI naming since they actually mean something that isn't dependent on how far the device maker anticipates you'll hold the device from your face, but if we went with that, the iPhone would just be HDPI, which wouldn't make for cool enough of a title.

There's a reason the average consumer says "Do you have a WiFi connection?" rather than "Do you have an 802.11n connection?"

"Retina Display" seems to have fallen into a similar terminology role.

There are many WiFi products, but only Apple screens will be labeled as Retina. WiFi is basically an industry standard.

You are somewhat overstating the importance of the chip's efficiency to the overall battery life. A huge percentage is consumed by the display (for most use cases), which is unaffected by the chip itself. Not saying this isn't important, but it requires some perspective.

You're probably right.

My "press release BS detector" went off because of the somewhat hard numbers being presented about performance combined with the lack of detail about power-per-watt. I can't help but speculate it's being swept under the rug because it's worse than one might expect. (I did some googling just now and can't find any details about power-per-watt for the Cortex A15.)

Naturally, I could be completely off-base. :-)

Yeah it's definitely not going to be competitive with Cortex A9. It's another situation where we won't actually know until we see A15s in an actual shipping product, but I think it's telling that ARM has designed a whole other architecture (A7) with all of the same features (virtualization, etc. etc.), but for use either in situations where low power usage is desired or as a sort of "companion core" to the A15, to be used when doing tasks that don't require all of that power (that's what the big.LITTLE thing is about http://www.arm.com/products/processors/ ... essing.php).
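For the unfamiliar, here's a toy Python sketch of that migration idea: run on the small core until demand crosses a threshold, then hand off to the big core. The threshold and sample loads are invented for illustration; real big.LITTLE schedulers are considerably more sophisticated:

```python
# Toy model of big.LITTLE core migration: light work stays on the
# low-power Cortex A7, heavy work moves to the Cortex A15. The 60%
# threshold and the sample loads below are invented for illustration.

LITTLE, BIG = "Cortex A7", "Cortex A15"

def pick_cluster(cpu_load, threshold=0.6):
    """Choose a core cluster for the current load (0.0 to 1.0)."""
    return BIG if cpu_load > threshold else LITTLE

for load in (0.10, 0.40, 0.70, 0.95):  # e.g. idle, browsing, gaming, benchmark
    print(f"load {load:.0%}: run on {pick_cluster(load)}")
```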

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.