I see a major problem here, but it's not so much about hardware unless you have some developers who will want to bump up texture and model resolution. How will quad meshes be handled? It's one thing to use triangles in your modelling, and I get the spiel about it being more familiar, but why would any experienced modeller keep using triangle meshes when quad meshes are much better and cleaner to work with? It seems to me that this next generation will still have a lot of the problems associated with every preceding generation: higher-res models and textures will be used for cinematic experiences, and lower-res textures and models will be used in-game.

One thing that actually pisses me off is the sacrifice of the huge amount of bandwidth the PS4 could have had. 176GB/s is rather paltry compared to the 1TB/s developers could have had to work with. Yes, it would have required a little more thinking and work on the part of the developers, but at some point the massive gain is worth that little bit more that would have been asked of them. I swear, it's like every generation people are willing to think less and less, and in shorter and shorter cycles. It's all "give me what you've got right NOW", and not "I'm willing to work for the much larger piece that will be so much more awesome in a year compared to what I could get now". It's a rather sad state of affairs. And that's not to mention the rather slow Blu-ray drive it will ship with compared to what's on the market now. The more I look, the more corners I see being cut to deliver an over-hyped experience.
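(For context, as I understand the memory setup, that 176GB/s figure falls straight out of the design: a 256-bit GDDR5 bus at an effective 5.5Gb/s per pin gives 256/8 bytes × 5.5GT/s = 176GB/s. The 1TB/s number is Cerny's own remark about an eSRAM option they considered and rejected.)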

Don't get me wrong, I'm sure the system will do some pretty sweet things, but all of that will be paltry compared to what it could do with a little more work and sacrifice. I'm actually rather glad I haven't been pulled into the hype. As things stand now, I'm gearing up for the next generation, and to be honest, my current PC, which is going on 3 years old, still spanks the system. I'm rocking 16GB of system memory and 4GB of video memory, as well as a 10x Blu-ray drive. As far as I'm concerned, I could spend a couple of hundred dollars upgrading my GPU setup and keep spanking the console market. I can even hook my PC up to the rather large TV in my living room through HDMI to the 7.2-channel receiver and completely and utterly rock out.

I see a major problem here, but it's not so much about hardware unless you have some developers who will want to bump up texture and model resolution. How will quad meshes be handled? It's one thing to use triangles in your modelling, and I get the spiel about it being more familiar, but why would any experienced modeller keep using triangle meshes when quad meshes are much better and cleaner to work with?

You can work with quads, subdivision surfaces, splines, Catmull-Clark, or whatever you want.
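The GPU ends up rasterising triangles either way; the exporter just splits each quad into two triangles on the way out, so the modeller never has to touch a triangle mesh. A minimal sketch of that step (the function name is mine, for illustration):

```cpp
#include <array>
#include <cstdint>
#include <vector>

// Split each quad (v0, v1, v2, v3) into two triangles sharing the v0-v2
// diagonal. Vertex positions are untouched; only the index list changes.
std::vector<std::array<uint32_t, 3>>
triangulate_quads(const std::vector<std::array<uint32_t, 4>>& quads) {
    std::vector<std::array<uint32_t, 3>> tris;
    tris.reserve(quads.size() * 2);
    for (const auto& q : quads) {
        tris.push_back({q[0], q[1], q[2]});
        tris.push_back({q[0], q[2], q[3]});
    }
    return tris;
}
```

The conversion is a mechanical export step, which is why "how will quad meshes be handled" isn't really a next-gen problem.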

I'm skeptical of the TB/s claim; even for eSRAM that's seriously fast. I've not heard of any such figures being possible up until this point. How much eSRAM would they have been able to afford at those kinds of speeds? Further, doesn't that take up significant die space?
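Back-of-the-envelope on the die space (my numbers, so treat them as a ballpark): a standard SRAM cell is six transistors, so even 32MB of eSRAM is 32 × 2^20 × 8 ≈ 268 million bits, or roughly 1.6 billion transistors before any addressing and routing overhead. That's a huge slice of any SoC's transistor budget.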

Cerny is a software developer, and it shows. He sometimes makes statements that are a bit left-field.

Doubting someone who's put years into designing the hardware and sifting through multiple builds, are we?

And that's not to mention the rather slow Blu-ray drive it will ship with compared to what's on the market now. The more I look, the more corners I see being cut to deliver an over-hyped experience.

PlayGo should get around that issue.

“So, what we do as the game accesses the Blu-ray disc is we take any data that was accessed and we put it on the hard drive. And then, if there is idle time, we go ahead and copy the remaining data to the hard drive. And what that means is after an hour or two, the game is on the hard drive, and you have access, you have dramatically quicker loading… And you have the ability to do some truly high-speed streaming.”
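In other words, it's a read-through cache that migrates the whole disc in the background. A rough sketch of how I understand the scheme (the names here are mine, not Sony's API):

```cpp
#include <cstdint>
#include <unordered_set>

// Hypothetical model of the PlayGo idea described above: blocks the game
// reads from disc get copied to the HDD, and idle drive time is used to
// migrate the remaining blocks in the background.
class PlayGoCache {
public:
    explicit PlayGoCache(uint32_t total_blocks) : total_blocks_(total_blocks) {}

    // Called whenever the game reads a block from the disc: keep a copy on
    // the HDD so later reads of the same block never touch the drive.
    void on_disc_read(uint32_t block) { on_hdd_.insert(block); }

    // Called when the optical drive is idle: migrate one more block.
    void idle_tick() {
        for (uint32_t b = 0; b < total_blocks_; ++b) {
            if (on_hdd_.insert(b).second) return;  // copied one new block
        }
    }

    // After enough idle ticks the whole game lives on the HDD.
    bool fully_installed() const { return on_hdd_.size() == total_blocks_; }

private:
    uint32_t total_blocks_;
    std::unordered_set<uint32_t> on_hdd_;
};
```

Once the whole game is on the hard drive, the optical drive's read speed stops mattering entirely.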

Given that consoles need to stay relevant for a number of years, I'm surprised they passed on 'stacking' completely.

With the bandwidth it offers by comparison, it looks bad for consoles going forward. Granted, I know nothing of 'stacking' other than the few tidbits I've gleaned here on the forums.

Does manufacturing stacked RAM increase cost? It's new tech, so surely there must be some reason why Sony and Microsoft seemingly haven't gone that route. Also, how does it fit into the current PC scene? Does it require a new motherboard design to hold the stacks?

Irrespective, I have fun on my PS3, and regardless of the tech, I'll have fun on the PS4 too, so c'est la vie.

I'm sort of surprised Sony didn't go for a new custom stacked CPU/SoC, though it's understandable given the issues this gen with the Cell. It seems odd for Sony not to pioneer new hardware tech in the PS4.

Yes, they usually do lead with some advanced form of manufacturing, but perhaps they are going with a more dependable technique, one that will still be competitive for the foreseeable future, rather than risk stock problems trying to build enough PS4s.

Don't get me wrong, I'm sure the system will do some pretty sweet things, but all of that will be paltry compared to what it could do with a little more work and sacrifice. I'm actually rather glad I haven't been pulled into the hype. As things stand now, I'm gearing up for the next generation, and to be honest, my current PC, which is going on 3 years old, still spanks the system. I'm rocking 16GB of system memory and 4GB of video memory, as well as a 10x Blu-ray drive. As far as I'm concerned, I could spend a couple of hundred dollars upgrading my GPU setup and keep spanking the console market. I can even hook my PC up to the rather large TV in my living room through HDMI to the 7.2-channel receiver and completely and utterly rock out.

Nice... nice... how much did that cost? Yeah, thought so.

Originally Posted by mynd

Cerny is a software developer, and it shows. He sometimes makes statements that are a bit left-field.

And you are? Mr Anonymous "look at me, I'm a dev" on a forum. It's a bit rich of you to belittle Mark Cerny, who is RENOWNED and well respected among developers across the industry, and a lead programmer on Sony's ICE team. I'm very interested to know some of your achievements; what qualifies you?

I still remember your "what does he know?" attacks on Timothy Lottes, a guy who works on graphics cards day in, day out and is now working at Epic.

And you are? Mr Anonymous "look at me, I'm a dev" on a forum. It's a bit rich of you to belittle Mark Cerny, who is RENOWNED and well respected among developers across the industry, and a lead programmer on Sony's ICE team. I'm very interested to know some of your achievements; what qualifies you?

Not much more than what would qualify Cerny as an SoC engineer or CPU architect. Please, I never cast doubt on Cerny's qualifications, but he is a software designer, and he led the engineers from a software point of view. That is no bad thing, but even Cerny himself will tell you he didn't create the system specifics. He certainly would have led them in the direction he wanted them to take, however.

I still remember your "what does he know?" attacks on Timothy Lottes, a guy who works on graphics cards day in, day out and is now working at Epic.

Oh please, Lottes then retracted it and admitted what he got wrong (it was all based on presumed specs). So take that spoonful of "I don't know what I'm talking about" and swallow it.

As regards Cerny, he knows the hardware inside out, most likely a lot better than you. So I ask again: what have you achieved apart from your post count?

Originally Posted by mynd

Oh please, Lottes then retracted it and admitted what he got wrong (it was all based on presumed specs). So take that spoonful of "I don't know what I'm talking about" and swallow it.

He retracted it due to all the flaming he got from Microsoft and PC fanboys, as well as pressure from Nvidia's management [we've seen their saltiness flourish in the last few weeks]. And those spec presumptions based on the leaks turned out to be correct; in fact, Sony delivered more RAM than expected. He had qualified his original blog post by saying his opinion was based on the leaked specs.

You were not protesting that the specs were wrong. You were doubting that he had knowledge of programming and utilising hardware, as though he were a nobody, not worthy to comment. I've been lurking on this forum for almost 9 years, and this is the type of behaviour I've seen you repeat.

As regards Cerny, he knows the hardware inside out, most likely a lot better than you. So I ask again: what have you achieved apart from your post count?

You make little sense. We aren't talking about what he went with; we are talking about the options they didn't choose. And of course he knows the architecture better than me, or you for that matter. That was not what we were discussing.

So, why did you get your knickers in a twist?

He retracted it due to all the flaming he got from Microsoft and PC fanboys, as well as pressure from Nvidia's management [we've seen their saltiness flourish in the last few weeks]. And those spec presumptions based on the leaks turned out to be correct; in fact, Sony delivered more RAM than expected. He had qualified his original blog post by saying his opinion was based on the leaked specs.

You were not protesting that the specs were wrong. You were doubting that he had knowledge of programming and utilising hardware,

Of PS4 and Xbox 720 specific hardware? Of course I was. He assumed that the DirectX you work with on PC is the DirectX that is supplied to devs on the Xbox 360. He also assumed that the PS4 was GCN but the 720 was not. Also, I never once questioned what he was saying about the PS4; I questioned why he would give it the benefit of the doubt whilst giving the 720 none.

as though he were a nobody, not worthy to comment.

No, I ripped him apart for the same reason others did: making half-assed assumptions and being blatantly anti-MS.

I've been lurking on this forum for almost 9 years, and this is the type of behaviour I've seen you repeat.

PMSL. If you've been lurking here for so long, you tell me what I've done, because I've posted what I've done many times before.

And just for the record, I love the PS4 specs, but talk of 1TB/s of bandwidth is something that would have required a massive amount of investment, and matching CPU and GPU grunt to make use of it.

The "Cube" stacking of RAM so far has a working standard benchmark of like 160-ish GB\s with DDR3. Which is bananas, but a far cry from 1TB\s.

Look, I'm not pissing on Cerny, but when I first heard/read that remark about "1TB/s" my jaw hit the floor. I mean, sure, maybe it's possible(?), but what would you have to sacrifice just to get there? To me it sounded like "Well, at one point we thought of putting 32GB of DDR3 in the box but decided against it". Yeah, sure, it could be done, if the box cost a small fortune and/or had literally nothing else in it, lol, but a tiny little ARM processor.
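Back-of-the-envelope again, just to put the number in scale (my arithmetic, nothing official): 1TB/s ≈ 1024GB/s, so at a 1GHz memory clock you'd have to move 1024 bytes every cycle, which means an 8192-bit-wide interface. That's the kind of monster you'd be paying for.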

Look at my posting history. This is really the only time I've come out and talked smack about Cerny; overall I think he's the best thing to happen to PlayStation since... uh, the PlayStation. But this claim just sounds so outlandish to me. If somebody can show me some production-grade eSRAM that has bandwidth near or above 1TB/s, I will gladly eat crow and apologize for second-guessing the man's claims.

As for the 720, I am dubious that it uses 3D cube stacking. But it's possible; the rumored bandwidth matches the claims of cube memory.