angry tapir writes "Development around the original Cell processor hasn't stalled, and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said. IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. 'We want to stay in the business, we intend to stay in the business,' he said. IBM confirmed in a statement that it continues to manufacture the Cell processor for use by Sony in its PlayStation 3. IBM also will continue to invest in Cell as part of its hybrid and multicore chip strategy, Menon said."

There are going to be some 120-140 million PS3s sold over its 10-11 year lifetime. Yeah, IBM is making 'next to nothing with Cell'.

IBM was so happy to land PPC/Cell contracts for all three consoles that, upon doing so, it immediately dumped Apple as a customer, having been fed up for years with the nightmare of dealing with Apple and Jobs.

Hmm. I'd never looked into the Atari Transputer much. I figured it was a lot like an Amiga 2000/3000, but overhyped, and with GEM:) Turns out it was quite a machine, with a lot of innovation that's only catching on in PCs now. If it wasn't for the lack of an MMU, I might have liked to see it replace both Amigas and PCs:) Also, a lot of the stuff here:

There is an emulator under active development; see posts on the newsgroup comp.sys.transputer.

The 260 megaflops must be for some kind of array - they were designed to be used in arrays. Individual transputers never clocked faster than 25MHz, though the FPU on the T800 was relatively fast for the time. Each transputer had four bidirectional links connected to DMA engines wired directly into the hardware scheduler, so inter-processor communication was very low cost.

And of course there's Pixar's little foray into the world of computer hardware, the RM1. Not quite a ray-traced renderer as such (REYES), but it was actually used for commercial films very briefly (e.g. Tin Toy, Star Trek II).

These days you fire off renders by invoking the prman executable... the 'p' being short for 'prototype'. (It quickly became apparent to Pixar that SGI's development and performance curve was outpacing their own hardware division's. Rather than try to compete, they simply shut the hardware division down.)

The story is not that IBM continues to manufacture chips, but that the Cell design is not dead. This contradicts earlier stories to some degree. In all fairness, it contradicts them only on the surface, as IBM only stated in the older story that Cell as a separate design would end, and that its coprocessor-heavy design would merge with future POWER iterations.

There were also rumors that IBM would no longer manufacture PS3 Cell CPUs, leaving it to contractors.

At the very least, you should acknowledge that the continued development of gaming devices (and associated technology) is spreading out into improvements in many other fields of technology, some of which you may find more interesting/relevant to your everyday life.

I'll acknowledge it if you like. But I fail to see how the Cell chip in particular has achieved this: of all the applications of the technology, only Mercury Computers [wikipedia.org] is not related to gaming. Yes, until some time ago one could run Linux on the PS3 (thus making use of the Cell chip outside the entertainment area)... but rumor has it that's no longer possible. Do you know otherwise?

It is actually one branch of what appears to be a fork in gaming machines: ultra-high-performance renderers like the PS3, and peripheral-driven, lower-performance systems like the Wii. Some people have said that the Wii is the way of the future - current-generation renderers do all the graphics you need, and gaming development will be in the UI, not in graphics. This is a step down the opposite path: we can and should get better graphics.

I have been wondering just how long it will take for the "ooohhh shiny!" factor to wear thin. Hell, I fire up Far Cry 1 or Wolfenstein on my $36 HD4650 and people stand around and go "oooohhh". You really don't need anything higher to have decent immersion in a game, and especially with an FPS, if the game is worth a damn you are too busy dodging fire to stand around and look at the shiny. Then add in the spiraling costs and delays to market that piling on the "ooohhh shiny" brings, and it quickly becomes "get a hit, on time, or we're all out of business" - and that simply isn't sustainable long term.

That is why I wouldn't be surprised if the next-gen gaming consoles do something similar to the original Xbox, which I thought was a damned good idea at the time. You could take a cheap ULV Phenom II quad, add a 5xxx Radeon GPU and some decent controllers, and have the average Joe drooling at the "ooohhh shiny" for a long time; the combination of cheap hardware, the ability for developers to easily code with tools they already have, and the quick time to market would probably make it a hit.

I just don't see how the incredible amounts required to bring out a new generation of consoles wouldn't seriously hurt a company's bottom line. With a more off-the-shelf approach, all they have to do is cook up the DRM and a close-to-bare-metal OS, and let economies of scale keep the price low out of the gate and drive it even lower as time goes on. While MSFT could blow the cash simply because they have twin cash cows in Office and Windows, I doubt Sony will be able to afford the needed capital, and Nintendo has made it pretty clear they aren't gonna play the "ooohhh shiny!" game at all, targeting the Wii at casual gamers instead.

I just don't see a never-ending "ooohhh shiny" arms race being good for anybody. Just look at how ATI is using Eyefinity to push new GPUs and Nvidia is chasing HPC with CUDA; even they know the "ooohhh shiny" can only go so far. Hell, I figured when I got the HD4650 it would just be a stopgap until I could get a $150+ GPU, but now? It plays Bioshock 2 and everything else I throw at it with plenty of "ooohh shiny" and doesn't turn my apartment into a sauna, so why bother? I used to be a serious graphics whore, but even I got tired of the "ooohh shiny" and now prefer games that are actually... what's the word?... oh yeah, FUN. I'm starting to wonder if the whole graphics race is hitting a dead end.

The reason your HD4650 is sufficient is that you're playing essentially console games on a graphics card that's slightly more powerful than the consoles themselves. Also, you're only running at a low resolution; try gaming at 1920x1080 on your 4650 and it won't get you very far.

That is why I wouldn't be surprised if the next-gen gaming consoles do something similar to the original Xbox

There's a reason they didn't do that again: it's more expensive than a dedicated console...

Except if you look at TigerDirect and Newegg, the most popular models are 1600x900, NOT 1920x1080. As an article on /. pointed out not too long ago, we are losing screen height, as the new monitors are nothing but LCD TVs without a tuner. I personally have a 1600x900 22in monitor, given to me brand new in the box as a thank-you gift from a customer, and the "oohhh shiny" looks just fine; any bigger and I would need to get a new desk.

So I have no doubt if you buy the biggest monitor you can find you're gonna need a big

Yeah, until you see Modern Warfare 2 at 1920x1200, highest settings, on an LCD monitor. It looks so sharp, and the specular and bump mapping and huge texture resolution really blew me away. The hundreds of particles from various fires in the game - for instance, the tree on fire in the suburb map - are an amazing sight I've never seen before. At 120Hz, for a more solid experience when you look around.

But yes, gameplay is just as important. Monsters in Doom 1 and 2 that left pixelated blood splatters on the walls,

I wish I could buy a consumer-priced system with one of these CPUs. It would be a very interesting system to develop for. After all, we are all going to end up using some kind of system with a separate memory model, much like this one, once we reach the end of scalability for the currently dominant multicore CPUs with a common memory space.

I hope that the PS4 (or another console using it) will be Linux-friendly, as the PS3 was until Sony blew it. Alas, however slim that chance is, there seems to be no better one.

Why the insults? I'm perfectly aware that circumvention is only necessary because of Sony's asshattery, but as of *now* you can get a machine with firmware up to 3.41 and jailbreak it. Initial booting of Linux has happened, and I'd expect it to be made relatively simple within a few weeks or a couple of months.

Game over: more cores with less heat.
Great, but where is the software expertise going to come from?
It seems to take years for any third party to work out how to optimise "anything" HD for these systems.
With a push for more cores, how about a push for more developer support, versus "cloud-based" and p2p servers?

The basic problem with the Cell processor is that it has 256KB (not MB, KB) per processor, plus a bulk transfer mechanism to main memory. Given that model, it has to be programmed like a DSP - very little state, processing works on data streams. For games, this sucks. No CPU has enough memory for a full frame, or for the geometry, or a level map. Trying to hammer programs into that model is painful. (Except for audio. It's great for audio.)
In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O. And, of course, Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering.
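
To make the "program it like a DSP" point concrete, here is a minimal sketch of the double-buffered streaming style the SPEs force on you, written against the SPU side of IBM's Cell SDK (spu_mfcio.h). The chunk size and process_chunk() are made-up placeholders for this sketch, not code from any real game:

/* Hypothetical SPU-side streaming kernel: double-buffered DMA into the
 * 256KB local store, DSP style. Assumes the Cell SDK's spu_mfcio.h;
 * CHUNK and process_chunk() are invented for this sketch. */
#include <spu_mfcio.h>

#define CHUNK 16384                 /* 16KB per buffer, well under 256KB */
static char buf[2][CHUNK] __attribute__((aligned(128)));

extern void process_chunk(char *data, int len);   /* your actual kernel */

void stream(unsigned long long ea, int nchunks)
{
    int cur = 0;
    mfc_get(buf[0], ea, CHUNK, 0, 0, 0);          /* prime the pipeline */

    for (int i = 0; i < nchunks; i++) {
        int nxt = cur ^ 1;
        /* Kick off the DMA for the next chunk before touching this one. */
        if (i + 1 < nchunks)
            mfc_get(buf[nxt], ea + (unsigned long long)(i + 1) * CHUNK,
                    CHUNK, nxt, 0, 0);

        mfc_write_tag_mask(1 << cur);             /* wait only on our tag */
        mfc_read_tag_status_all();

        process_chunk(buf[cur], CHUNK);           /* compute overlaps DMA */
        cur = nxt;
    }
}

Every byte has to be explicitly staged in and out like this, which is exactly why fixed-size streaming work (audio) fits the model and large, pointer-chasing working sets (geometry, level maps) fight it.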

But if each Cell CPU had, say, 16MB, the Cell machines could be treated more like a cluster. Programming for clusters is well understood, and not too tough.

It's probably too late, though. Multi-core shared memory cache-consistent machines are now too good. It's not necessary to use an architecture as painful as the Cell. It's probably destined for the graveyard of weird architectures, along with data flow machines, hypercubes, SIMD machines, systolic processors, semi-shared-memory multiprocessors, and similar hardware that's straightforward to build but tough to program.

Well, IIRC its intended purpose was embedded devices. They were talking about smart fridges, security systems, etc. - basically networking your home with smart devices running on Cell, and then being able to use that processor juice distributed across the devices, since the Cell scales very well (which is why the PS3 makes a great supercomputer farm).

No; no you don't recall correctly, not even a little bit. Not a jot, not a tittle. Cell was designed specifically for the PS3, and maybe for other kinds of (repetitive streaming type) work that is mostly done by GPUs and/or CUDA in this day and age.

Yep, Cell is being used far outside its original design spec. Of course, if gaming consoles are its current largest market, the next generation will probably look much more like a standard POWER6 or 7 in its architecture - more emphasis on more powerful support cores, more memory per core, and all the other things that have made their way into every other currently popular CPU family.

Back in the early PS2 days we would talk about what a next-generation PS2 would look like. Those whiteboard diagrams looked almost identical to what Sony and IBM came up with.

The parallels between the PS2/EE/GS and PS3/Cell/RSX are almost identical:

- Execution starts on the EE/PPU
- Heavy/parallel computation tasks are spawned off to the VUs/SPUs
- Light control code runs in parallel on the EE/PPU
- As graphical elements become ready to be rasterized, they are spawned off to the GS/RSX

I found this article interesting. It describes Valve's approach to multi-core CPUs and game engines.

The programmers at Valve considered three different models to solve their problem. The first, called "coarse threading", was the easiest to implement. Many companies are already using coarse threading to improve their games on multi-core systems. The idea is to put whole subsystems on separate cores; for example, graphics rendering on one, AI on another, sound on a third, and so on. The problem
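
For what it's worth, coarse threading is easy to sketch. This is not Valve's actual code (the article doesn't show any), just a minimal POSIX-threads illustration where whole subsystems each get their own thread; render_frame(), update_ai() and mix_audio() are hypothetical stand-ins:

/* Coarse threading, sketched: one thread per subsystem. The three
 * subsystem functions are stand-ins, not a real engine API. */
#include <pthread.h>
#include <stdatomic.h>
#include <stdbool.h>

static atomic_bool running = true;

extern void render_frame(void);   /* graphics subsystem */
extern void update_ai(void);      /* AI subsystem */
extern void mix_audio(void);      /* sound subsystem */

static void *render_thread(void *arg) {
    (void)arg;
    while (atomic_load(&running))
        render_frame();
    return NULL;
}

static void *ai_thread(void *arg) {
    (void)arg;
    while (atomic_load(&running))
        update_ai();
    return NULL;
}

static void *audio_thread(void *arg) {
    (void)arg;
    while (atomic_load(&running))
        mix_audio();
    return NULL;
}

int main(void) {
    pthread_t t[3];
    pthread_create(&t[0], NULL, render_thread, NULL);
    pthread_create(&t[1], NULL, ai_thread, NULL);
    pthread_create(&t[2], NULL, audio_thread, NULL);
    /* ... input handling and game logic would live here ... */
    atomic_store(&running, false);
    for (int i = 0; i < 3; i++)
        pthread_join(t[i], NULL);
    return 0;
}

The catch the article goes on to describe is visible even in the sketch: the subsystems are rarely equally loaded, so one core saturates while the others sit idle.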

...processor is that the company selling its flagship product decided to lock out people wanting to experiment with it.

Fail. The flagship Cell processor is a more capable unit that IBM will sell you for exorbitant amounts of money. The Cell in the PS3 is a toy version, and even mentioning that it is based on Cell is just marketing for IBM's real thing.

I bet you didn't like that I used the word "Fail" in the modern vernacular sense. But in case you thought I was being non-factual, here is information on the real Cell processor [wikipedia.org], which IBM sells for truly incredible amounts of money. I've looked up the pricing, and it is scary.

That's gotta sting Xbox 360 developers - having fanboys call the chip that beat the shit out of you this gen nothing but a 'toy version'.

The Xbox 360 is also powered by a 'toy version' of PowerPC which is a 'toy version' of POWER.

Also, I think Sony, Microsoft, and Nintendo are all evil, and I do my best not to give any of them money any more. That means buying everything used and not paying for Live Gold. If that makes me a fanboy, then your comment makes you my bitch. But we knew that already because you're an anonymous pussy.

And nothing of value was lost to them. The only thing related to the PS3 that interests Sony is the selling of games, Blu-Rays and stuff from PSN. A bunch of basement dwellers installing Linux on their PS3 was an afterthought at best.

"Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering."

My god. You are repeating that Beyond3d forum lie in late 2010???

"For games, this sucks""Trying to hammer programs into that model is painful. (Except for audio. It's great for audio.""In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O.""It's not necessary to use an arch

The bandwidth in and out of those tiny SPU memories is great - much better than between main memory and cache on an x86 processor, or generally between cache and processor on a GPU. I don't know what anyone needs that for, though.

OK, seriously, what's the deal with this? Sure, there are plenty of 360 fanboys out there who say nonsensical stuff about the PS3 (and PS3 fanboys saying nonsensical stuff back), but why lump PC gamers in with them? I'm not sure PC gamers generally care one way or another which console does what graphically these days, considering that with the 4-5 years of hardware advancements since they came out, you can now buy a video card that'll run any half-assedly (un)optimized console port at 1920x1200 at 60f

Please don't confuse the SPUs (the eight coprocessors on the Cell die) with the PPU (the main CPU core). The PPU is also part of the Cell, so don't call the SPUs "Cell CPUs". There is also no MIPS core -- the PPU is a 3.2GHz PPC core with two hardware threads. The SPUs also run at 3.2GHz, but are not considered "real" CPUs since they can't bootstrap themselves, they have to be given tasks from the PPU.
SPU programming forces a model on you as a developer -- modularize your tasks with as few synchronizat
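
For context, the PPU-side half of that task-handoff model looks roughly like this under libspe2, the SPE runtime library from IBM's Cell SDK. The spu_task handle and the empty argument block are placeholders for this sketch:

/* Hypothetical PPU-side dispatch: hand one task to an SPE via libspe2.
 * spu_task is the embedded SPU ELF image produced by the toolchain;
 * a real dispatcher would pass the effective address of a task
 * descriptor through argp for the SPU to DMA into local store. */
#include <libspe2.h>
#include <stdio.h>

extern spe_program_handle_t spu_task;   /* linked-in SPU program */

int main(void)
{
    spe_context_ptr_t spe;
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_stop_info_t stop_info;

    spe = spe_context_create(0, NULL);
    if (!spe) { perror("spe_context_create"); return 1; }

    if (spe_program_load(spe, &spu_task)) {
        perror("spe_program_load");
        return 1;
    }

    /* Runs on the calling thread until the SPU program stops; real code
     * would spawn one pthread per SPE context to keep every available
     * SPE busy. */
    if (spe_context_run(spe, &entry, 0, NULL, NULL, &stop_info) < 0) {
        perror("spe_context_run");
        return 1;
    }

    spe_context_destroy(spe);
    return 0;
}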

I would not want to be betting against IBM in this marketspace. Their Cell chip, an asymmetric multi-core CPU architecture, seemed bizarre when announced, but has proven to be quite good for these workloads. If IBM is looking to leverage their regular POWER chipset for the console market, they will probably build some screamers with them.
Cell and POWER both have Unix and Linux adaptations running on them, so having the capability seems trivial. Whether vendors will want you using their hardwa

If IBM is looking to leverage their regular POWER chipset for the console market, they will probably build some screamers with them

All of the current-generation consoles use IBM chips. The GameCube and Wii both use PowerPC 750-derived chips - IBM's low-end 32-bit PowerPC line. The Xbox 360 uses a custom 3-core in-order PowerPC chip. The PS3 uses Cell (a PowerPC core + 7 SPUs - the PS3 gets the dies where one of the 8 SPUs failed testing; the ones where all 8 work go into blades and supercomputers).

A while back I was looking for one or two Cell-based machines as development boxes for in-house geophysical software - basically to see if it was worth moving onto that platform. The three-week process between contacting what appeared to be the only vendor of Cell-based workstations and getting a price for an entry-level machine was frustrating. It involved daily calls to a slimy bastard who appeared to just want to waste time trying to become my friend while he carefully finished weighing my company's wallet. In the end the time window had come and gone (the developers got bored or gave up on the idea of using the Cell) before I could get even a hint at the price, but I kept going for the sake of future projects. The price for one workstation with one processor was fairly similar to that of six of our cluster nodes. You would need some sort of black-ops budget, where any accountants coming close are shot on sight, before paying that sort of price. An entry-level machine not much different from a PlayStation with more memory cost a truly insane and unjustifiable price.

Really? Well, here you go:
http://beagleboard.org/hardware [beagleboard.org]
http://gumstix.com/ [gumstix.com]
There are a lot more, but the BeagleBoard is the closest I have seen to a mini-ITX board. Just plug in a keyboard, mouse and monitor and you are good to go.

That is true. I don't know about now, but a few years ago you couldn't even get a pinout for the Cell processor. You had to show both IBM and Sony your business plan, and your market could not impact Sony's. IBM has some cookie-cutter circuit boards with a Cell on them that they want to sell for big bucks, along with a big down payment and minimum quantities. The reality is that the Cell processor is not THAT great - good, but not great, and it requires a big change in the way you factor your software design.

Odd - when we were working with Cell we went straight to Matrix Vision and they LOANED us the hardware for about a year. Nothing sleazy at all. IBM also loaned us a server, as did Sony (a beautiful rack-mount job which will never see the light of day).

Bottom line - the PPC part of the Cell is rubbish, with terrible IO, and generally 'weak' by today's standards; the SPEs are great, but there's not enough memory on them (256k) for the algorithms + tables we needed to pro

I can go one better: I do signal processing for a living - chewing on multi-hundred-megasample-per-second streams of data in real time. The Cell looked like a perfect fit. We were looking at thousands per year. Contacted IBM - sorry, not enough zeros on that number for us to sell you the chips. OK, are there any vendors targeting the uTCA form factor (which the telecoms folks are all over, so they would not have been targeting just us)? Nope, just large blades for mainframes.

I assert that IBM doesn't want to be in the chip business - at least, not "selling chips to anybody else". They don't mind making chips for their own use, but they really don't have the infrastructure to sell to anybody else.

Sony and Toshiba don't want to be in the high-end CPU market, they want to be in the mass-market stuff.

Had IBM licensed the Cell design to somebody like Freescale, it might have gone somewhere.

Sorry, but I RTFA - and what I came away with was "We will continue to support Sony for as long as Sony wants to make PS3s". I saw nothing that really said "We are going to be going someplace else with this."

It's more of a triangle. In one corner, you have general-purpose CPUs, optimised for branch-heavy code with lots of locality of reference. In another, you have streaming, often SIMD, processors optimised for non-branching code, with high throughput, such as GPUs and DSPs. In the third corner, you have specialised silicon dedicated to specific algorithms (e.g. building blocks for encryption algorithms or video CODECs).

Cell is along one side of this. It isn't particularly throughput-focussed, and it is

I know Slashdot is the enemy of good writing practices, so this post will be modded down to hell, but I feel I must point out that lately the capitalization of Slashdot submission titles has gotten completely out of hand. The rule is simple: if you want to capitalize your headlines, you capitalize every word except:
- prepositions ("of", "to", "in", "for", "with" and "on")
- articles ("the", "a" and "an")
- and some other obvious exceptions.

On Slashdot, the editors are so ignorant that they usually capitalize each

A pretty cool chip would be an 8-core part with four x86_64 cores, two graphics cores, and two Cell cores (perhaps IBM + AMD working together).

After that, build a custom Linux with MeeGo as the front end / launcher... It would be cool if game console makers embraced open source for everything up to launching the games - and if they don't want their SDK open-sourced, that's fine; just make the operating system able to launch the games, then get out of the way. Run it on two cores (for better functionality with multimedia capabilities, ebook reading, etc.) and use the rest of the cores (two x86_64, two graphics and two Cell) for gaming.

As for the other hardware: composite, component, HDMI, VGA, WiFi, Ethernet, and a headphone jack (maybe Bluetooth for wireless controllers and the ability to use Bluetooth headsets), plus Blu-ray, a card reader, and USB.

This is all off the top of my head, and would be a pretty cool gaming console, which would truly capture the home entertainment medium and make most people looking for gadgets, consoles, or HTPCs drool appropriately.

A pretty cool chip would be an 8-core part with four x86_64 cores, two graphics cores, and two Cell cores (perhaps IBM + AMD working together).

This is a bad idea because it's precisely the kind of ignorant crap that HyperTransport is supposed to eliminate. Instead of cramming a bunch of crap into one package, you sell multiple packages so that people can customize their layout. Ideally you'd have the x86_64, Cell, and graphics cores all communicating via HT links, and then it wouldn't matter where they were physically located... but trying to put all that into one package would be a TDP nightmare at this point.

It's a new processor architecture; IBM and Sony (and possibly others) had a hand in it. Effectively a "Power" core (with two hardware threads) plus a bunch of vector processing units. It's supposed to be very, very good at vector operations. For a while (a few years back now) the world's most powerful supercomputer was a machine composed of nodes containing two Cell processors and an Opteron each.

It's different from other parallelisation strategies in that the vector units (SPUs/SPEs) let you parallelise things at the operation level, unlike just stuffing more cores into the box, which is the Intel/PC strategy. For games and graphics this is thought to be good, hence its inclusion in the PlayStation 3. It's also supposed to be good for scientific computing.

I guess you could think of it as somewhere between a CPU and a GPU, or a hybrid of the two approaches.
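
To make "parallelise things at the operation level" concrete: the SPEs work on 128-bit registers, so one instruction operates on four floats at once. A tiny sketch using the Cell SDK's spu_intrinsics.h (saxpy is just a stock textbook example, not anything from a real engine):

/* Hypothetical SPU SIMD kernel: y = a*x + y, four floats per instruction.
 * Assumes the Cell SDK's spu_intrinsics.h and data padded to a multiple
 * of four floats. */
#include <spu_intrinsics.h>

void saxpy(float a, const vector float *x, vector float *y, int nvec)
{
    vector float va = spu_splats(a);        /* broadcast a to all 4 lanes */
    for (int i = 0; i < nvec; i++)
        y[i] = spu_madd(va, x[i], y[i]);    /* fused multiply-add per lane */
}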

For games and graphics this is thought to be good, hence its inclusion in the PlayStation 3.

Of course, game developers tend to be a bit more sceptical. The Cell requires a very specific way of programming (if you don't align your data flow to the processor's capabilities, performance nose-dives), which doesn't go over well with people who have limited time to make their game/engine work on several different platforms, most of which work roughly the same.
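
In practice, "align your data flow to the processor's capabilities" mostly means data layout. A common illustration (not from any particular engine; the field names are made up) is switching from array-of-structures to structure-of-arrays, so each field becomes a contiguous stream the SPEs can DMA and vectorize:

/* Array-of-structures: natural to write, hostile to DMA and SIMD -
 * each x coordinate is strided through memory. */
struct particle_aos {
    float x, y, z;
    float vx, vy, vz;
};

/* Structure-of-arrays: every field is a contiguous, DMA-friendly,
 * vectorizable stream. */
struct particles_soa {
    float *x, *y, *z;
    float *vx, *vy, *vz;
};

void integrate_soa(struct particles_soa *p, int n, float dt)
{
    /* Straight-line, branch-free loops over contiguous data - exactly
     * the shape the SPEs (and GPUs) are built for. */
    for (int i = 0; i < n; i++) {
        p->x[i] += p->vx[i] * dt;
        p->y[i] += p->vy[i] * dt;
        p->z[i] += p->vz[i] * dt;
    }
}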

Look at the PS2. Now look at the 1st gen games for it versus some of the latest ones. The differences are huge, and they are due purely to better programming techniques (same hardware.) I've no doubt that the PS3/Cell will have a similar lifespan.

Also, I know this gets discussed in almost every console tech generation, but this time it might be true: is the hardware finally good enough? This may be directly influenced by the popularity of Flash-

I don't particularly care about the XBox. The last non-portable console I actually was interested in was the PS2.

This comes from the point of view of a casual gamer who is not concerned with having the latest and greatest but has a brother who is. I've seen the X360 perform on a large HDTV set and I've seen the PS3 perform on the same set. Both look good. The Cell may outperform the X360 by a large margin if given enough time but that remains to be seen. Right now I'd put them as reasonably close (= to so

Actually, since console graphics are meaningless next to sales numbers (this is business, after all) the winner is Nintendo, with Microsoft and Sony being also-rans. Using a souped-up Gamecube (which sold as often as the PS3 and the X360 combined) and a portable system with four megabytes of RAM (which sold as often as the PS3, X360 and PSP combined) Nintendo has outpaced them.

It doesn't matter whether Microsoft's promo videos are pre-rendered and Sony's are not; Nintendo's look like they're from ten year

IMO this was one of the main failures of the architecture. Xbox 360 developers just have to worry about parallelizing their code; Cell developers, on top of that, have to worry about writing code that can make use of the SPEs at all, let alone make efficient use of them.

The Cell was designed back when Sony needed hardware that could decode their high definition blu-ray streams. I think this is why the SPEs are useful for decoding operations and little else in the gaming world.

Interestingly, Cell was tolerant of losing SPUs in manufacturing. A lot of "bad" chips could have been used as lower-end Cells for cheaper devices while remaining essentially the same platform as far as developers were concerned. I don't think much came of that, though - one laptop with a 4-SPU Cell, and talk of a 2-SPU Cell as a video processor in a high-end HDTV. A shame, really, as they had a lot of half-dead Cells rolling off the line when they were trying to crank them out for the PS3 launch. Wonder what happened t

But seriously, if this person has no idea what a Cell processor is, I'm pretty sure the concept of CPU optimization will be lost on them. You could say it was a new type of chip made by elves to regrow tissue and they would probably believe it. Just how out of touch would someone have to be to miss the Cell, and not bother to Google it before posting?