Posted
by
timothy
on Thursday November 10, 2011 @11:28AM
from the vapor-tastes-like-raspberry-foam dept.

l_bratch writes, quoting the BBC: "'British computer chip designer ARM has unveiled its latest graphics processing unit (GPU) for mobile devices. The Mali-T658 offers up to ten times the performance of its predecessor.' ARM claims that its latest GPU, which will be ready in around two years, will have graphics performance akin to the PlayStation 3's. If this has acceptable power consumption for a mobile device, could we be seeing ultra-low-power hardware in high-end PCs and consoles soon?"

Even if we'd have better graphics elsewhere in two years, just imagine playing PS3-like graphics on something that barely consumes 1W (or however much a mobile device is supposed to draw), and I'd bet it wouldn't cost that much either.

Except there is NO WAY it can be done at 1W even at the best rate of computing improvements. Remember, they did not mention power usage in their press release; only the submitter did. While they are taking power into consideration, it reads to me more like a scaling design where idle usage is extremely low with the cores shut down. This is great news for mobile devices that don't expect full usage most of the time (assuming the scaling really is extreme enough that idle power usage gets that low).

Remember, ARM has been slowly scaling up in speed while x86 has been scaling down in power usage. It wouldn't be surprising if this new GPU uses more power than is traditionally known for ARM. That said, a lot remains to be seen. Press releases and actual performance can be worlds apart. How many times has a company promised something-like performance only for it to not deliver? Hopefully it's true, though.

Consider that the PowerVR SGX543MP products support up to 16 cores, but nobody has shipped one with more than 2 (Sony's PS Vita will be the first with 4). I believe the Mali-400 in the SGS2 is a 4-core part.

PS3 (2006) -> 2013 (when the ARM GPU is supposed to come out) is seven years, so we should see ~4 doublings, or ~16x the performance we saw in 2006 when the PS3 came out.
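The "~4 doublings => ~16x" figure above can be sanity-checked with a quick back-of-envelope sketch. The 21-month doubling cadence below is an assumption chosen to match the parent's numbers; the classic 18-month Moore's-law cadence would give a somewhat bigger multiplier over the same seven years.

```python
# Back-of-envelope performance scaling, 2006 (PS3 launch) -> 2013 (Mali-T658).
# months_per_doubling=21 is a hedged assumption matching "~4 doublings"; 18 is
# the more aggressive classic cadence.
def doublings(years, months_per_doubling=21):
    return years * 12 / months_per_doubling

def speedup(years, months_per_doubling=21):
    return 2 ** doublings(years, months_per_doubling)

print(doublings(7))           # 4.0 doublings over seven years
print(round(speedup(7)))      # 16x at a 21-month cadence
print(round(speedup(7, 18)))  # ~25x at an 18-month cadence
```

Either cadence puts a 2013 mobile part in the right ballpark to match 2006 console hardware, which is the thread's whole premise.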

If we make the out-of-my-ass assumption that a 4-core Mali-400 uses 2W of power at full load, and a 16-core T658 will

Think about it. When the PSX came out, your average homeowner's TV screen was a scant 20 inches (4:3 ratio) diagonal. When the PS2 came out, that was a "whopping" 24".

When the PS3 came out? Yeah. 37" or larger 16:9 widescreens. A lot of them, given the initial price tag, went well above 40".

Now play a PS2 or PSX game on that humongous screen. Looks like shit, doesn't it? Load that PSX game up instead in the Popstation version on your PSP, or in an emulator on a 13" or even 15" laptop playing with a USB controller from a few feet away. Suddenly it looks a whole hell of a lot better.

"PS3-level graphics" can be fudged quite a bit when you're dealing in "mobile" devices of a tiny screen and not trying to push massive amounts of AA to get rid of "jaggies" on a bigger screen with bigger pixels.

Even if we'd have better graphics elsewhere in two years, just imagine playing PS3-like graphics on something that barely consumes 1W (or however much a mobile device is supposed to draw), and I'd bet it wouldn't cost that much either.

I still believe that PS3 graphics will be severely dated in two years and is probably dated now. However, if this chip is truly low power and cool running, why not put 10+ of them on a single card?

I don't really think GPU hardware is limited by legacy architecture design to the extent that CPUs are, which means I think current-generation desktop GPUs are already quite efficient (despite there being graphics cards out there that require secondary power connectors). Even x86 is being made more efficient all the time, it seems.

I think you misunderstand ARM's market. ARM is not in the desktop market, or even in the laptop market except at the low end. They do, however, completely own the embedded market right up to the top end of the smartphone and tablet markets. This kind of core will end up in smartphones and tablets. You will be able to run PS2-era graphics on something that fits in your pocket and works from batteries (and probably has Thunderbolt or HDMI output for connecting it up to a big screen). It isn't competing wit

You do realize that the desk isn't why PCs use so much power, right? You also realize that people will still use desks whether they have an x86 PC or not, just as they did before the x86 was invented, right? ARM is absolutely working towards competing with x86. In what way is trying to get people to buy an ARM computing device instead of an x86 computing device not competing?

The ARM was just as much a desktop CPU as the x86 was. The difference is that ARM got crushed in the desktop market. At the time, the desktop market demanded computing power at any energy cost, and ARM simply couldn't keep up with Intel. With Intel's focus on the desktop, ARM proceeded to pick up Intel's scraps: all of the little markets that Intel decided were too small to worry about.

Fast forward to the 2000s, and desktop speeds start outpacing most users' needs. The last 5-6 years of desktop speed improvements have basically been a CPU bubble. CPU speeds have increased faster than most people have any use for. We are currently seeing a state where people are realizing that they are vastly overpaying in energy for the CPU processing power they actually use. The bubble is bursting. ARM is way behind in ramping up the processing power of their CPUs, and correspondingly way behind in ramping up their CPUs' power usage. Like many other bubbles, suddenly people realize that what they were chasing isn't worth it, and they would rather have what was available 10 years ago.

In today's post-CPU-bubble environment, we are seeing a situation where Intel's CPUs are not low-power enough to cover the entire market, and ARM's CPUs are not fast enough. They are both racing to hit the sweet spot that gives them market dominance, but don't be fooled into thinking that they are not racing to the same goal. They are just at opposite ends of the field.

The question is who will reach the goal first. On the Intel side, you have dominance in the traditional computing environment as well as market mindshare. Most people know who Intel is and that their PC uses an Intel processor, but most could not tell you what kind of processor their ARM device uses. On ARM's side, you have a new market that did not care about Intel compatibility, and settled on ARM.

Intel will continue to push downward to smaller devices, while ARM will continue to push upward with larger ones.

Anyone who hasn't tried to run a modern browser with modern dynamic web pages on one.

That's sarcasm, right?

Yes. I was implying that anyone who thinks a 1.3GHz Athlon is still usable as a desktop clearly hasn't tried to access any complex web pages with one, giving my laptop as an example of why trying to do so is painful at best.

Although in Minecraft, you can get some high res textures that make the game look a little more modern, and there are also modded shaders which can do some neat stuff as well. Even stuff like bump mapping.

I was playing with the default 16x16 for a long time, but I've finally got a little sick of it and made the switch up to 32x32.

Just did 256x256 just for shits and giggles. It looks pretty damn amazing with all that nice stuff in the texture pack I found. I also installed a mod that allows the HD stuff and a bunch of other performance options. :)

one thing i've noticed since returning to slashdot is that all the kids are claiming supercomputer specs are running slow on their tablets etc. seriously, there was a time when 3 MHz was cutting edge. what happened to break all the 3 MHz codebase? gone cause some idiot flagged it as obsolete?

the only reason why hardware with low specs isn't running right is some form of virus like a rabbit program, or some form of power management that is crippling the hardware's specs. i know slashdot has always been a honey

Changes in CPU architecture, for one thing. Z80 bytecode won't run natively on an Athlon. Also an increase in standards for graphic design and internationalization. It's a lot slower to render bidirectional or ideographic text with stacked diacritics in dozens of writing systems using antialiased scalable fonts with color and shadow than to render monospace fonts from a 7- or 8-bit character set in one size and in black and white.

Well, the point is that the PS3 is old in computer terms. While this is an advancement for mobile computing, it still fits the pattern that, in terms of performance, mobile computing is about 5-10 years behind desktop computing.

And it always will be, unless somebody devises a way to provide 15A of power to a mobile device, and a way to dissipate that sort of heat.

Now, we may eventually reach a state where it just doesn't matter - everybody will have enough computing power on their phone to raytrace a 4K HD stream in realtime and they will reach a natural equilibrium where it just doesn't make sense to make faster chips for desktop computers. Or, we might see such great Internet pervasiveness that everybody just has thin-clients and computes on a CPU farm, but until either of those things happen, desktops will be faster than mobile devices.

I am thinking about advancements in PC displays. I remember 110x100 graphics, then 320x200, then 640x480, 800x600, then 1024x768 (on 14" screens). After that they just made displays larger to handle more pixels; the iPhone 4 was one of the first devices I have seen that offered a higher DPI. The same thing happened with color depth: monochrome, CGA (4 color), EGA (16 color), VGA (256 color), and SVGA that now offers the common 16-bit/24-bit/32-bit colors.

Umm, look at the Tegra 3. ARM graphics are catching up to consoles quite easily (consoles were always behind). Remember, it's been 3 years where we went from "ARM can barely handle Nintendo emulation (single core/500MHz/125MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5GHz, 300+MHz multicore GPU)". In *3* years. All with devices that are more power-efficient than anything Intel can offer. So what do you see for the next 12 months, let alone 3-4 years? Even if the increases slow down, they're basically going to make x86 processors irrelevant.

we went from "ARM can barely handle Nintendo emulation (single core/500MHz/125MHz GPU)" to "ARM is competing with the PS3 (4 cores, 1.5GHz, 300+MHz multicore GPU)". In *3* years.

Are you comparing emulating an NES to running native games? An emulator has to contend with the entire game engine being written in bytecode, and it has to process graphics a scanline at a time so that games' raster effects (parallax scrolling, fixed position status bars, Pole Position/Street Fighter/NBA Jam floor warping, etc.) still work. A native game running on a 3D GPU doesn't need the bytecode interpreter overhead, and it can render one object at a time because it doesn't need raster effects.

Not necessarily. Compatibility demands have increased since the Nesticle days and even since the FCE Ultra 0.98 days, and users are less willing to put up with known emulation glitches in specific games than they used to be. The "new PPU" engine in FCEUX is slower, but its behavior is more accurate to that of the NES than the old PPU, and some games demand this accuracy. For example, the Final Fantasy orb effect, text boxes in Marble Madness, and certain things in Sid Meier's Pirates are all done with cycle-timed mid-scanline writes to the PPU's I/O ports. The English version of Castlevania 3 and later Koei games use an IC called "MMC5" that's almost as complex as the coprocessors used in some Super NES games.

Compatibility demands have increased since the Nesticle days and even since the FCE Ultra 0.98 days, and users are less willing to put up with known emulation glitches in specific games than they used to be.

Maybe I'm misremembering, but NesDS seems a lot less compatible than FCEU did back in the day.

For one thing, I wasn't aware nesDS was still being maintained. For another, the DS has a 67 MHz CPU and therefore can't run the whole PPU in software, so it emulates NES video using DS's tiled video mode. This doesn't work for mid-scanline effects.

SNES emulation may itself be a poor example. Like most old systems, programmers used various low-level tricks to increase performance. Worse still, the cartridge-based nature of the console allowed extra game-specific hardware coprocessors to be shipped with different games! These kinds of tricks are much less used now due to better hardware and compatibility concerns, so programmers tend to stick to published APIs. This makes new hardware more amenable to emulation at the API level, e.g. OpenGL - we can now e

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

Never underestimate the low-end. Imagine a dongle with an HDMI plug on one end that just plugs into a TV set, but inside it has a chip that can do PS3-level graphics, WiFi for downloading games, Bluetooth for controllers, and enough flash to cache them.

Most HDMI ports can provide 150mA at 5V, which is minimal for this sort of application, but within sight in the next several years.
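The power budget behind that claim is easy to check. The calculation below takes the parent's 150 mA figure at face value (the HDMI spec itself only guarantees a smaller minimum current from the source, so 150 mA is an assumption about typical ports, not a spec guarantee):

```python
# Power available on an HDMI +5V pin at the parent's assumed 150 mA.
def power_w(volts, milliamps):
    return volts * milliamps / 1000

print(power_w(5, 150))  # 0.75 W for the hypothetical plug-in game dongle
```

Three-quarters of a watt is indeed minimal for a PS3-class SoC today, which is why the dongle idea is "within sight" rather than buildable now.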

this reminds me of Dragon Ball, where they eventually become so strong that they would blow up the earth trying to fight there. the numbers are being faked by someone, and nobody here seems to care that somehow we went from a 3 MHz CPU with a 3 MHz GPU all the way to 64-bit 6-core 2.2 GHz CPUs and 512-bit 800 MHz GPUs with 1408 stream processing units, all in what, 20-25 years?

it's beyond absurd and frankly i don't like it. this is why i have a 40 watt (60 watt while 3D gaming, 3.5A) computer and a 70 watt TV set. if they cras

In two years, PS3-like graphics will be insufficient for the desktop and console market, and we will be in the same situation.

PS3 graphics are a bit dated already. Consoles (and console ports) are seriously limiting the graphics in current run games. It's a pity, really. Good that cell phones will have circa 2006 GPU capabilities soon, though.

I kind of hope for more stagnation in the graphics quality market. Let's just hang out where we are for a while and hopefully the game makers will start competing on interesting story lines, game mechanics, etc. rather than ripples in water in puddles.

I kind of hope for more stagnation in the graphics quality market. Let's just hang out where we are for a while and hopefully the game makers will start competing on interesting story lines, game mechanics, etc. rather than ripples in water in puddles.

Improved CPU and GPU capabilities and better gameplay are not mutually exclusive. There are physical limitations to, for instance, rendering a huge number of characters on the screen at once. Or the memory is simply not there to utilize all the interesting animations you need to support that interesting storyline you need.

Look at it this way... better CG technology hasn't necessarily made movies better, but it really expanded the range of what really good filmmakers could do with realistic budgets. Impro

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels, and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.
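The "20-40% of the pixels" estimate above checks out if you assume a typical 2011-era phone panel; 800x480 (WVGA) below is my assumption for illustration, not a figure from the comment:

```python
# Pixel count of a phone panel relative to the PS3's common output resolutions.
def pixel_ratio(w1, h1, w2, h2):
    return (w1 * h1) / (w2 * h2)

print(pixel_ratio(800, 480, 1280, 720))   # ~0.42, i.e. ~42% of 720p
print(pixel_ratio(800, 480, 1920, 1080))  # ~0.19, i.e. ~19% of 1080p
```

So a chip that only has to fill a WVGA screen is doing roughly a fifth to two-fifths of the PS3's fill work, which is the gap the parent is pointing at.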

I think you're looking at the wrong side of the street. This isn't about the top-end computing power; it's about the efficiencies on the bottom end. So, now you can start churning out laptops and cheap PCs with pedestrian graphics cards that use low power and provide significant performance. No need to take the truck nuts off your Dell, sir.

could we be seeing ultra-low power hardware in high-end PCs and consoles soon?

I thought that was the entire point of the Wii. Because the "sensor bar" (IR position reference emitter banks) needed to sit by the TV, the console's case needed to be small. This meant Nintendo couldn't use a CPU and GPU with a high TDP, so it stuck with what is essentially a 50% overclocked GameCube. I guess Nintendo is trying the same tactic with the Wii U: take a roughly Xbox 360-class CPU and GPU and take advantage of six years of process shrinks to get the TDP down so it'll fit in the same size case.

Sure we will, at least on PCs, although it won't cost a lot, and it'll be from AMD and Intel, like with Brazos and Ivy Bridge CULV. I mean, when I can pick up a netbook for $350 with 8GB of RAM that will play L4D and TF2, gets 6 hours on a battery while watching HD movies, outputs 1080p over HDMI, and all in a machine that only weighs 3 pounds and costs less than my craptastic Celeron laptop did 5 years ago? Now THAT is nice!

I think the next advance will be just how far what was once considered "gamer only"

"could we be seeing ultra-low power hardware in high-end PCs and consoles soon?"

Not soon, but inevitably. The equation is: better power efficiency equates to more stream cores. The number of stream cores tends to increase to compensate, so discrete graphics card power consumption stays about the same, near the maximum of what typical cooling systems can accommodate. This somewhat obscures the ongoing trend toward lower-power designs. However, power consumption per stream unit governs the maximum practical throughput (aka heat dissipation) of high-end discrete cards. Therefore it is only a

Sure, PS3-like graphics... except the PS3 is doing it at 1280x720 or 1920x1080. This will be pushing probably 20-40% of the pixels, and doing so in 2 years, while the PS3 hardware is 5 years old (to the day).

So, no, I don't think that a chipset that will, in 2013, do 20% of the job that 2006 hardware does will be making its way into high-end PCs and consoles soon.

Except most phones released today have 1080p output via HDMI. So now what?

The Galaxy Nexus' built-in display is 720p. (That it's PenTile is irrelevant to this issue.) If it follows a similar arc to the original Nexus, those screens will be showing up in low-end phones within a couple of years.

There will still be some hard-core graphics intensive games that will require whatever the cutting edge in graphics is at that point.

However, as old as PS3 may be- the fact is, that, for most of us non-hard-core gamers PS3 quality graphics is more than enough (and will be still in another 5 years time) for the vast majority of games we'd want to play.

We're beginning to hit a point of diminishing returns on graphics anyway- you're always going to be limited by what the eye can process, and the ability of the

Graphics card companies always try to outperform their competitors. You could compete on price alone, but no one is going to buy a new card that's exactly as powerful as the one they already have, only cheaper. For that reason I suspect the current trend will continue.

In 2 years a phone with a 1080p display is a likely reality. We already have phones/tablets running at or near 1280x720, which is about 44% of the 1080p pixel count. But to say that it would be acceptable on the high-end PC side is a stretch; in 2 years we will probably have desktop expectations beyond 1080p. Entry-level to mid-market could see a benefit, though; that market has been underserved by horrible attempts at "integrated" graphics for years. It will be interesting to see if this GPU compares to the beef

The PS3 is 5 years old and based on even older graphics tech. Beating that on mobile is cool, but not surprising. The PS3 never was impressive, graphically, to PC users, who had better-than-HD resolutions for years. Some console games are still limited to 720p. Oh, and people had 3D on PC like 8 years ago (or more). Sucked then, sucks now.

NVIDIA is committed to releasing a new Tegra chip every year. The Tegra 3, which is already out, is 5x faster than the Tegra 2 (which beats the Mali-400, which is at 1/10th the speed of the GPU ARM announced). So basically, by the time this ARM GPU is released, Tegra 5 will be out, and going by the roadmap of how fast Tegra 5 will be, it will run at least 5x faster than ARM's chip.

I hope ARM prices this dirt cheap, so that sub-$200 (off contract) phones can have it.

Both Sony and Nintendo considered using it for their new consoles but the heat and power usage apparently made them turn away from it.

And Nintendo ended up using something just as hot and power-hungry for the 3DS. As I understand it, the reason Nintendo ditched Tegra for the 3DS had everything to do with the fact that Tegra wouldn't work with an ARM9 core (ARMv5), and Nintendo needed something cycle-accurate to the ARM946E in order to play DS and DSi games without glitches.

The more important limitation is not human perception, it's cost. Remember the models in Quake? Remember the mods? A fairly competent 3D artist could knock out something like the Quake guy in a day or two. Now compare that to a modern game. A single tree in a modern FPS has more complexity than every model on a Quake level combined. That all translates to vastly more artist time, which translates to greater expense. For a Pixar film, you can spend a huge amount developing and texturing every model, but for a game the upper limit is a lot lower.

Because we are getting to the point in technology that us humans won't be able to perceive the difference in graphics. You can only make something so lifelike, after that you might as well aim at efficiency.

Is there a single game out there that's so lifelike that you can't perceive the difference between it and a real video?

There's plenty more room for improvement, we're not getting anywhere close to that point.

High-end was a dumb thing to add. PCs in general, yes. If it can pump out 1080p, it will be good enough for 99.7% of current PC users. Are people going to run CAD or high-end video games on it? Probably not. Gamers just don't seem to get just how small a percentage of PC users they are. For a good long time PCs will probably be stuck at 1080p for the majority of monitors, since TVs will keep the cost of the panels low for a good while.

Not really. The Wii was not state of the art when it came out and did very well. I don't hear people screaming for better graphics than the PS3 or the 360. Combine that with the rise of casual games, and yes, it could run a console well enough for many users. The high-end market could shrink, and frankly is shrinking. You can get good video cards, and I do mean good cards, for around $120 now that will run games very well on the average monitor. You only need the high-end cards for 27" high-resolution monitors like t

There was a story on CNN a few weeks ago that said that while PC sales are slowly increasing worldwide, the growth is very tilted: sales are falling dramatically in the US, Canada, and Europe. The increase is coming from the developing world being able to afford computers as they fall in price.

The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do. OK, you'll always get a few neckbeards to say "But the cell phone can't run AutoSuperCadShop 390322.55!" But that misses the point. That's not what 99.9% of consumers DO with their computers. They play some games, including 3D games, they check their facebooks, they look at some news headlines, and so on. All that works fine on a device that they can fit in their pocket. For those times a keyboard is needed, a bluetooth keyboard will do just fine. And for those times a larger screen is needed, a wireless connection to the screen will work fine.

I don't know why people can't see this shift happening right in front of their eyes. Even the sales data bears it out now: mobile computing is on the upswing, and in the western world, PC sales are falling. It's a nice world: instead of needing to lug around a desktop or even a netbook, you'll have the capability of a (say) 2009-vintage desktop in your shirt pocket in 2014. A 2009 desktop is nothing to sneeze at, and meets the needs of 99% of the population just fine. The rest will become a more expensive niche, but will still exist in some way.

The culture shift from desktop computing to mobile is happening in part because mobiles are becoming powerful enough to do most of the tasks that desktops used to do.

No, they're not. They're becoming powerful enough to check your email and play Farmville, which is all that many people used to do with their PCs; they're not much good for actual productive work.

Meanwhile PC gaming has stagnated due to Microsoft concentrating on pushing console games, so there's little reason for the average home user to upgrade. Word won't let you write stuff ten times faster just because you switched from a Pentium-4 to an i7, and when games are limited by being designed for an Xbox and

I think his point which you missed is that checking email and playing FarmVille is what the majority of consumers do. Most of them are not playing leet games or rendering animation. In businesses, they might write in Word or crunch a few numbers in Excel. It doesn't take a quad-core Core i7 to do that. The stagnation comes from the fact that a desktop made 5 years ago will handle the majority of their tasks and mobile computing is approaching the point where they handle a good majority. Also mobile com

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130659 [newegg.com]

can you find 1024 stream processors for a console yet? in sli mode?

and yet in a way i long for simple fun games like i used to play on whatever console was popular at the time. no, i don't like 'social gaming'. it's too whiny and spammy and it seems to exist merely as a reason to go to facebook.

the story isn't about the handhelds matching desktops; it's about the handhelds getting some very powerful graphics. besides, the reference was to consoles, not desktops. just because consoles in a few years might be doing holographic displays doesn't mean handhelds doing pretty nice 3D graphics on battery power isn't nice too.

holographic displays tanked when sega tried them in arcades. they did worse than laserdisc games. it was a load of special glass and a small playing field. i only ever saw one once, but i know they were invented... oh wait, it was a parabolic screen: http://www.definiteanswers.com/q/What-is-SEGA-s-HOLOGRAPHIC-game-4c120518a7a5b [definiteanswers.com] - but as a kid it seemed like a holographic game!

the thing is, i don't like games like i used to, especially since some hardware is really flaky like cheap Chinese knockoffs. that and i h

Five years ago tomorrow the PS3 made its debut; did you think that in the meantime everyone just sat back and basked in the glory of its infinite capabilities? Two years from now (if that pans out) will be 7 years since the commercialization of the Cell chip, so seeing a miniature version that uses dramatically less power is pretty much par for the course. Desktop chips that have similar (or more specific) capabilities are already available in many products. Remember, the first PS3 drew an amazing 200

good points, but we are also talking about things in the single digits for power consumption. I agree, die shrinkage and advances in designs give lots of power savings. Still, having PS3-like graphics on a handheld will be nice.

I think this will make a huge difference in mobile gaming because of screen size. Assuming that this thing outputs 720p like the Galaxy Nexus, I think this will be a big thing.

While the PS3 graphics are old and crappy compared to what a modern PC can do, don't forget about screen size. Seeing 720p on a 40 inch screen is a lot different than seeing 720p on a 5 inch screen. The best example of this is fonts that look fine at 5 inches will look like crap expanded to 40 inches. Artifacts and jaggedness on