Software rather than hardware developers largely dictated the new console's design.

We've already taken a reasonably detailed deep dive into the PlayStation 4's internals based on Sony's specifications for the console, and we know it looks more like a PC than anything in the current generation of consoles, right down to the eight-core AMD-supplied x86 CPU. In a discussion with Gamasutra, PS4 Lead Architect Mark Cerny emphasized the extent to which these decisions were driven by game software and middleware developers.

"[C]learly we had had some issues with PlayStation 3, in that a very developer-centric approach to the design of the PlayStation 4 would just make things go more smoothly overall," Cerny said, a reference to the PlayStation 3's complicated Cell CPU. The developer outreach process began as early as 2007, when Cerny was put in charge of the new console's hardware, and was done before any work had commenced on the actual hardware design.

"The biggest thing was that we didn't want the hardware to be a puzzle that programmers would be needing to solve to make quality titles," said Cerny. Feedback from developers directly influenced several major specifications that we'll see in the finished console: a large pool of unified memory shared by both the CPU and GPU was a common request, and developers also asked that the hardware use no more than eight CPU cores because "the consensus was that any more than eight, and special techniques would be needed to use them, to get efficiency."
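The eight-core guidance tracks with Amdahl's law: once the serial fraction of a frame's work dominates, extra cores stop helping unless "special techniques" shrink that fraction. A minimal sketch (my own illustration, not from Cerny's remarks; the 90% parallel fraction is an assumed figure):

```python
# Hypothetical illustration: Amdahl's law says speedup is capped by the
# serial fraction of the work, which is why piling on cores beyond a point
# yields diminishing returns.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup for a workload with the given parallel fraction."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A workload that is 90% parallelizable: speedup flattens as cores grow.
for cores in (2, 4, 8, 16, 32):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

Even with 90% of the work parallelized, doubling from 16 to 32 cores buys comparatively little, while the serial 10% caps the speedup at 10x no matter how many cores are added.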

This developer-centric approach is quite a turnaround for Sony; Sony Corporation President Kaz Hirai once infamously said of the PlayStation 3 that Sony didn't aim to "provide the 'easy to program for' console that (developers) want" because the company wanted developers to be able to squeeze more out of the hardware as time went on and their familiarity with it increased. That viewpoint is more difficult to defend in an age of cross-platform games that must be ported to as many platforms as possible in the shortest time possible, and this cross-platform porting is exactly what Cerny wanted to account for when designing the new console's hardware.

"It definitely was very helpful to have gone out and have done the outreach before sitting down to design the hardware," he said.

Just IMO, but this is why every PS3 exclusive looked gorgeous, and basically almost all cross-platform titles looked like complete dog shit on PS3 in comparison to 360 and PC. Subject to opinion I know.

So basically they wanted to see better and better looking games on the PS3 as time went on so it didn't seem as much like an aging console. Instead they got inferior looking graphics (in general) on ported titles up until the last couple of years. I'm glad they're not putting these artificial barriers in front of the developers this time.

Quote:

Just IMO, but this is why every PS3 exclusive looked gorgeous, and basically almost all cross-platform titles looked like complete dog shit on PS3 in comparison to 360 and PC. Subject to opinion I know.

No dissenting opinions here, that dedicated RAM for XMB really threw Bethesda for a loop and was the sole reason why the expansions for Skyrim took such a long time

Quote:

Software rather than hardware developers largely dictated the new console's design.

It is really hard to fathom that a company was making a product that should have been supported by software developers while not even asking them what they want. /s

Also, too little, too late -- Sony Playstation 4 is a PC which is not even trying to hide that fact.

Doesn't that tell them that software developers wanted a PC to begin with instead of a console?

What is sad is how much money and how many hours were wasted developing for limited and perpetually dying platforms, not to mention how much damage was done to PC gaming in the process.

How is this too little too late? It's not like any of us devs are going to stop developing for Sony hardware because the PS3 was a pain in the ass. We did it anyway before, now we're more likely to do it.

Quote:

No dissenting opinions here, that dedicated RAM for XMB really threw Bethesda for a loop and was the sole reason why the expansions for Skyrim took such a long time

That may be a contributing factor to why PS3 DLC took longer, but it's not the only reason. Microsoft also had a timed exclusivity deal for the 360. Don't believe me? PC DLC was delayed as well. In the case of Dragonborn, it was delayed just as long as on the PS3.

Quote:

So basically they wanted to see better and better looking games on the PS3 as time went on so it didn't seem as much like an aging console. Instead they got inferior looking graphics (in general) on ported titles up until the last couple of years. I'm glad they're not putting these artificial barriers in front of the developers this time.

They wanted an architecture that could squeeze more out of the hardware of the time, essentially leading to this. However, it makes it much harder to develop for the system in order to squeeze out that extra performance over what could otherwise be had. Much like how the PS2 went.

Quote:

How is this too little too late? It's not like any of us devs are going to stop developing for Sony hardware because the PS3 was a pain in the ass. We did it anyway before, now we're more likely to do it.

Why would you bother developing for a PC with an identity crisis?

You will get a title which you can port to real PC but it will be optimized for AMD graphics pipeline and is bound to work poorly on NVIDIA hardware.

Also, why pay the licensing fees to develop games for a fake PC, when you can develop games for a real PC for free or at much lower cost?

I honestly fail to see the point.

Consoles need to die, if someone wants to make consoles, then why not make specialized PC hardware which runs standard OS and which can run all existing PC games?

Why develop for a console? Because people will always buy them, because they are a *known* target, because you don't have to worry if they have a new CPU but are three generations back on video, or have a mobile GPU. Because you can buy a console for, say, $400, whereas a similar gaming rig will cost more, be bulkier, not have a TV-oriented UI, etc. Sure, I have my PC hooked up to my TV, but not everyone will.

You have poor understanding of the economics of game development if you think this isn't already the case. These changes will actually be *better* for PC games.

Quote:

So basically they wanted to see better and better looking games on the PS3 as time went on so it didn't seem as much like an aging console. Instead they got inferior looking graphics (in general) on ported titles up until the last couple of years. I'm glad they're not putting these artificial barriers in front of the developers this time.

Quote:

They wanted an architecture that could squeeze more out of the hardware of the time, essentially leading to this. However, it makes it much harder to develop for the system in order to squeeze out that extra performance over what could otherwise be had. Much like how the PS2 went.

At the time of the PS3 launch, the Cell processor's design (with 1 full and 7 partial cores) was highly buzzword-compliant in the high-performance computing world. Two major things have happened since then, rendering its design obsolete: increasing transistor density has allowed large numbers of full cores on a single CPU die, allowing a much simpler programming model for applications that can benefit from a few extra threads, and GPGPU capabilities allow much higher performance for applications that can scale across a huge number of threads.
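That simpler programming model is easy to illustrate: on a homogeneous multicore chip, fanning work out across a few threads is a library call, whereas the Cell's SPEs required hand-written per-core programs and explicit DMA transfers. A rough sketch (my own, with a made-up `update_entity` stand-in for real game logic):

```python
# Illustrative only: on a homogeneous multicore CPU, spreading independent
# work across "a few extra threads" is trivial with a thread pool.
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id: int) -> int:
    # Stand-in for per-entity game logic (physics, AI, etc.)
    return entity_id * entity_id

# Eight workers, matching the core count developers asked Sony for.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(update_entity, range(16)))

print(results)
```

`pool.map` preserves input order, so the results line up with the entity IDs; on the Cell, the equivalent required partitioning the data, staging it into each SPE's local store, and collecting it back by hand.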

Because it will be tens of millions of PCs with the exact same identity crisis owned by people who bought the things specifically to play games.

This has been hashed out over and over since consoles existed. You geeks need to start paying attention.

igor.levicki wrote:

Also, why pay the licensing fees to develop games for a fake PC, when you can develop games for a real PC for free or at much lower cost?

Ah, the "time has no value" argument. Classic.

igor.levicki wrote:

I honestly fail to see the point.

Who cares?

igor.levicki wrote:

Consoles need to die, if someone wants to make consoles, then why not make neat and compact PC hardware which runs Windows (or Linux now that Valve ported Steam) and which can (or will be able to) run all existing PC games?

It must just drive you nuts that other people exist who like things you don't like and have different lives that call for different devices with different use cases. How can you stand to live in this world of diverse individuals?

Quote:

It is really hard to fathom that a company was making a product that should have been supported by software developers while not even asking them what they want. /s

Not if your previous effort was the highest selling console of all time, also with a non-PC architecture, and you are a hardware company, not a software company.

igor.levicki wrote:

Doesn't that tell them that software developers wanted a PC to begin with instead of a console?

Game devs would like to develop for a machine with infinite RAM and infinite clock speed and infinite storage. So we should give them that, right? Oh no, wait. The money console manufacturers earn is from software sales, not from the developers. And so they have to sell something to the public. And if you want to hit a 155M install base like the PS2, you will need to sell something to the "come home, press power button, pick up pad, start playing COD" demographic.

igor.levicki wrote:

What is sad is how much money and how many hours were wasted developing for limited and perpetually dying platforms, not to mention how much damage was done to PC gaming in the process.

Many developers like consoles as a target, because they are static. No worrying about the vagaries of the user's PC config etc. A static hardware target forces developers to work hard on optimization. That often feeds back to better performance on more forgiving platforms.

Quote:

How is this too little too late? It's not like any of us devs are going to stop developing for Sony hardware because the PS3 was a pain in the ass. We did it anyway before, now we're more likely to do it.

Quote:

Why would you bother developing for a PC with an identity crisis?

You will get a title which you can port to real PC but it will be optimized for AMD graphics pipeline and is bound to work poorly on NVIDIA hardware.

Also, why pay the licensing fees to develop games for a fake PC, when you can develop games for a real PC for free or at much lower cost?

I honestly fail to see the point.

Consoles need to die, if someone wants to make consoles, then why not make specialized PC hardware which runs standard OS and which can run all existing PC games?

(1) Hardware baseline uniformity - no need to worry about corner case hardware, you have a consistent target.

(2) Other running software relative uniformity - no need to worry about what else is loaded and running on the platform.

(3) Streamlined OS - because it's not a general purpose OS, overhead is reduced.

(4) It's a living room device for the general consumer. Enthusiasts may have HTPCs, some of which may even be capable of running higher end games, but the average consumer does not.

(5) buy-in: once someone has paid multiple hundreds of dollars for a console, they want games for it to "justify" their hardware purchase.

I'm far more of a PC gamer than anything else, when it comes to my own gaming, but... seriously? Consoles are hugely favorable targets for game development. And once you've dev'd something with a console target, it's often relatively easy to re-target for PC, especially if you don't mind having a delayed (and possibly still somewhat buggy/limited-options) PC release as an also-ran. I could keep going, but this is already ridiculous enough.

I know this is not strictly PS4 related, but I'd like to take this chance to bring this (plainly obvious) subject to the table that pertains gaming in general and consoles in particular.

I have multiple computers (including an HTPC) at my house and over eighty games in my Steam account. Some I bought for myself, others for my GF, etc. The issue becomes quite clear: she wants to play some casual game while I play DOTA or whatever on my other PC. With only a few games, it's easy to just go into offline mode on one PC, but games these days are increasingly "social", so offline mode is just inadequate for today's use cases.

It's pretty simple actually. I want to do what every other service out there allows me to do, including the much-beleaguered consoles. With one PS3/Wii/XBox360/any-console, I can play one game at a time; with two, I can play two; with three, three; and so on. It's pretty obvious that game playing scales linearly with the number of devices (no matter which one, as long as it's not a PC with Steam), so why can't we do that with Steam? Beats me... Even the iDevices permit this scenario.

Now that there's also software being sold on Steam, it should be plainly obvious that I should be able to have many devices using the same account (impose a hard limit, I don't care as long as it's not 1) and each should be able to open a different game.

Steam has been great over the years, but this point has been a big gripe. I always credit Steam for allowing me to not fall into piracy and for the most part, it's been as competitive as you can get. But times have changed.

This is not something new and people complain about it a lot, but efforts are usually pretty scattered. It would be great if Ars did a piece on this. Gaming services have become much more important than the devices they run on, and the most popular service for PCs should catch up with times.

Long story short, if you agree with the image, +1 this. It's way overdue. Hopefully it'll catch on.

Cheers!

PS: The image is not mine and despite the best of my efforts, I couldn't find the source. So I just give it to the internet.

One thing I am curious about is how much better PCs will be in 5 years when compared to the PS4/XBOX720? The substantial improvements in PCs over the last 7 years made the consoles seem fairly long in the tooth. But PCs have started to substantially slow down from a performance improvement standpoint on a year over year basis. New PCs have slightly better performance but more of the improvements seem to be directed toward power consumption.

If that trend continues then you could easily see a console game looking pretty darn good after 5 years (and it is going to take a while for developers to really use 8 cores - most current games don't have a huge benefit after two cores). So this could be the first time that PC gamers will not have anything to crow about after 3 or 4 years.

Sure, their 200 watt GPUs and faster cores will be able to push more polygons, handle higher resolutions and be easier to get good performance out of for the first year or two. Over time, however, this generation of consoles may age really well and in the end if a game looks good at 1080p I won't care if a PC can handle higher resolutions - from my couch I won't be able to perceive the difference.

The PS4 looks fine and I would never begrudge someone for wanting to own one. 8GB of GDDR5 is expensive though. It will give the console the necessary "legs" to carry it through the 10 year dev cycle, but will still probably launch at around $499. The lack of horsepower/TFLOPS compared to even current sub-$300 graphics cards is probably not a big issue for now.

With that being said, this next console generation really seems like a time for the PC to flourish. For those of us that already own very capable gaming PCs, buying another console seems unjustifiable -- at least being a first adopter of one. Here's hoping that devs focus on PC first while the x86 architecture helps PS4 owners get a seamless port.

Quote:

It must just drive you nuts that other people exist who like things you don't like and have different lives that call for different devices with different use cases. How can you stand to live in this world of diverse individuals?

I've done my fair share of bashing Sony over their "reveal". In fairness though, THIS is excellent news. This is something that is much more revealing about the direction Sony is taking the Playstation experience than their PS4 dog-n-pony show. There was nothing particularly earthshattering about the hardware "specs" they discussed deploying or any of their "social network" crud.

This, knowing the base philosophy, the idea that they reached out to devs to develop a platform that devs would see as being the "next step forward"..... I think that speaks volumes about Sony's commitment to making the system that gamers would want, and that devs would WANT to work with.

Now, if they can convince me that gameplay will not be hindered by internet connectivity, they'll be another step closer to convincing me this is a "must have".

Quote:

How is this too little too late? It's not like any of us devs are going to stop developing for Sony hardware because the PS3 was a pain in the ass. We did it anyway before, now we're more likely to do it.

Quote:

Why would you bother developing for a PC with an identity crisis?

What makes you say there is an identity crisis?

This is just, as you put it, a 'neat and compact PC hardware', except it doesn't run Windows or Linux. Why do you think Windows or Linux is some kind of advantage when playing a game?

Quote:

You will get a title which you can port to real PC but it will be optimized for AMD graphics pipeline and shared memory and is bound to work poorly on NVIDIA hardware or even AMD PC hardware unless you spend additional time and resources to tune the port for real PC.

Let us imagine instead that Sony announced the PS4 as exactly the same as it is today, but would boot Linux by default and have Windows available for download from the App Store for an additional $30.

Now you have a full PC, except it still has all the problems you described. No identity crisis.

Quote:

Also, why pay the licensing fees to develop games for a fake PC, when you can develop games for a real PC for free or at much lower cost?

Because with the fake PC you don't have to worry as much about OS updates, different versions of OSes or HW, different FW, or security.

Quote:

I honestly fail to see the point.

The point is simple. The PS4 is standardized. It's a reference platform and will be a relatively fixed target.

Quote:

Consoles need to die, if someone wants to make consoles, then why not make neat and compact PC hardware which runs Windows (or Linux now that Valve ported Steam) and which can (or will be able to) run all existing PC games?

Again, why do you think Windows or Linux is some kind of advantage?

It would make more sense to have Sony pay Valve to port Steam to the PS4 than for Sony to actually release a PC that has no benefit except it can run Office, connect to a domain, or something like that.

So they're reiterating what was said at the reveal. I hope this helps quash all the timed exclusives we saw with this generation, at least greasing the wheels for SONY. I don't want SONY to get them; just would rather see timed exclusives DIAF.

Also: Please, for the love of gaming, will someone address igor.levicki's gripes?

- "This viewpoint is more difficult to defend in an age of cross-platform games that must be ported to as many platforms as possible in the shortest amount of time possible. "

A: The PS3 was not designed and launched in 2012; it was designed almost 10 years ago and launched at a time when high-performance computing was the buzz and a console could sell on specs alone. The mobile market was nothing like it is now, the world economy was different, and Sony CE took a different approach to developers.

The only thing that worries me is that the layer cake of SDKs used for cross-platform development will make all the games look the same and end up dulling the visuals, because the other platforms can't handle as much detail as many PS3 titles had.

Quote:

So they're reiterating what was said at the reveal. I hope this helps quash all the timed exclusives we saw with this generation, at least greasing the wheels for SONY. I don't want SONY to get them; just would rather see timed exclusives DIAF.

Also: Please, for the love of gaming, will someone address igor.levicki's gripes?

Exclusives are always part of the business, and a way for the console mfgr's to differentiate themselves.

What about his gripes have we not already shown to be mindless blathering?

Quote:

So they're reiterating what was said at the reveal. I hope this helps quash all the timed exclusives we saw with this generation, at least greasing the wheels for SONY. I don't want SONY to get them; just would rather see timed exclusives DIAF.

Also: Please, for the love of gaming, will someone address igor.levicki's gripes?

Quote:

Exclusives are always part of the business, and a way for the console mfgr's to differentiate themselves.

What about his gripes have we not already shown to be mindless blathering?

Are there any major 3rd party exclusives anymore? Heck, the reason why there were exclusives in the past was because one console each generation had a CLEAR lead over the other two. For most games, they weren't exclusive because Microsoft, Sony, Nintendo, or SEGA paid them, but because either a) the developer really liked a platform (Ninja Gaiden on Xbox) or because developers didn't want to bother making games for a system with only 20 million units (most PSOne and PS2 games).

Quote:

So they're reiterating what was said at the reveal. I hope this helps quash all the timed exclusives we saw with this generation, at least greasing the wheels for SONY. I don't want SONY to get them; just would rather see timed exclusives DIAF.

Also: Please, for the love of gaming, will someone address igor.levicki's gripes?

Quote:

Exclusives are always part of the business, and a way for the console mfgr's to differentiate themselves.

I get it. Still, DIAF.

Quote:

What about his gripes have we not already shown to be mindless blathering?

Quote:

One thing I am curious about is how much better PCs will be in 5 years when compared to the PS4/XBOX720? The substantial improvements in PCs over the last 7 years made the consoles seem fairly long in the tooth. But PCs have started to substantially slow down from a performance improvement standpoint on a year over year basis.[...] Over time, however, this generation of consoles may age really well and in the end if a game looks good at 1080p I won't care if a PC can handle higher resolutions - from my couch I won't be able to perceive the difference.

General computing performance most certainly has not stagnated. The amount of computing power you can get for $1000* has doubled every two years since the 1950s. This is a historical fact. The forward projection of this theory is known as Moore's law. This trend has continued and likely won't stop for another decade or more.
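For concreteness, the claimed doubling every two years compounds to roughly 2^(n/2) after n years. A quick check of that arithmetic (illustrative only, not a measured benchmark):

```python
# Illustrative arithmetic for the "doubles every two years" claim:
# after n years, compute per dollar grows by a factor of 2**(n / period).
def doubling_factor(years: float, period_years: float = 2.0) -> float:
    return 2.0 ** (years / period_years)

print(doubling_factor(4))   # over one console generation's early years
print(doubling_factor(10))  # ~32x over a decade
```

So even on this commenter's own optimistic curve, a fixed console spec falls roughly 32x behind the price-equivalent PC hardware over a ten-year lifecycle.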


What you're actually describing is a diminishing return on polygon count. The same idea holds true for textures and other art resources.


It won't be like growing up and seeing the evolution from 2D to 3D. However more resources ultimately allow better lighting, more advanced AI, realistic physics, particles, etc. Not to mention techniques we haven't even invented with current technology.

Doesn't this, to an extent, provide Sony with the cover it needs when they inevitably announce that "yeah, about that whole allowing used games thing... we aren't going to be allowing them"? After all, they were working with their software developers when building the system, and this was a big thing for the developers. And we were just trying to do whatever it takes to bring the best-quality games to the PS4, and, well, we didn't really have a choice. I mean, we wanted to allow used games, but you do what you have to do to give your customers the best.

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.