Posted by timothy on Friday May 27, 2011 @09:33PM from the ungrateful-wretches dept.

donniebaseball23 writes "Sony CFO Masaru Kato told investors this week that the company won't be looking to put the same kind of massive R&D into PS4 as they did with PS3. PS3's costs were astronomical because of Blu-ray and the Cell chip, but Sony's bottom line can't take another similar hit. Analysts are speculating that this will leave the door open for competitors like Microsoft. 'PS4's hardware could be less impressive than the PS3 at its launch. I think Microsoft will really be able to put the screws to Sony in the next console war,' Panoptic analyst Asif Khan commented to IndustryGamers."

All console makers use OpenGL - except Microsoft, of course. If Microsoft gains further ground in the console arena, it'll mean less developer mindshare for open standards and more for MS's proprietary APIs. Fewer GL developers on consoles could translate to fewer GL developers on desktops as well - and that's one of the main barriers to companies writing games for Linux and other non-MS platforms.

I guess anyone could give their take on which company is less evil, but it seems to me that MS dominating the console arena would be a pretty bad turn for every other gaming platform. Sure, Nintendo is still around, but its scope is somewhat different from the other two's.

Why not reuse the Cell design: take the exact same chip, but manufacture it with current lithography technology - smaller structures, higher clock rate, more SPUs. It might do the trick, and there's no new learning curve for devs. I have programmed SPUs, and they can do wonders if used correctly.

Define "successful"? Sure, the Wii has moved a lot of units. But in terms of games sold, hours played, or money made for developers (not necessarily manufacturers), it's way behind. Good for Nintendo does not necessarily equate to success as a platform.

The consoles make it more like gaming was in the early days. Tweak the shit out of what you have, because you can't just make people buy a new machine to play your "super game". Consider the C-64... its lifespan showed that developers could make some seriously awesome games if they got to know the architecture.

What PC gaming did was make it easy for companies to write something that took more horsepower, and because of the architecture of PCs, developers could just require more of this or more of that. (Believe me, it wasn't a conscious decision to make the architecture open... IBM was just in a rush.)

I like the idea of game companies working on one architecture and squeezing it dry. Why should we go back to the model that allows developers to be lazy and code for the "latest and greatest" because they can't be bothered to learn the architecture? One of the primary reasons I don't game on the PC anymore is the upgrade loop I couldn't get out of. Now that my computers aren't for gaming, I get MANY more years of life out of them.

Only LAZY developers make inferior games.... great games come from great programmers, not from great hardware.

I don't think phones will replace consoles, either. But setting that aside, which OS will the consoles run five years from now?

Some food for thought: Android 3.1 supports Xbox 360 and Wii controllers (among other things). It's up to the apps to make use of that, of course... but the support alone makes me think Google is looking at more form factors than just phones and tablets.