Anyway, the Wii U is rumored to have a significantly more powerful GPU than the A10 APU. Is Sony truly going to release something less capable?

They said a custom chip based on the A10. We don't actually know what the specs are, so it's very possible that it could sport a larger iGPU than the one in the A10-5800K. We will have to wait and see what Sony churns out.

After having to clean things up once again, frankly, I'm fed up with this thread. So, if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instruction, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.

Why do people keep insisting that the last-gen consoles had high-end hardware at launch? They had derivatives of high-end PC hardware, but that's not the same thing at all. The power consumption of the consoles as a whole is lower than that of the high-end video cards of that era, which really should give that fact away.

Consoles never launch with high-end PC hardware. This launch is no exception, and it's generations faster than what we have now.

Let's all look on the bright side. Console gamers can finally enjoy what PC gamers have been spoiled with for a long time. I can't wait to see what developers can come up with using a DirectX 11-capable GPU when this generation of consoles only had DX9-capable cards. It will no doubt be exciting for everyone. I'm tired of half-assed ports looking like shit and having all sorts of bugs because a developer didn't care enough to spend the hours needed for a fantastic gameplay experience. There's plenty of them out there, unfortunately.

Why do people keep insisting that the last-gen consoles had high-end hardware at launch? They had derivatives of high-end PC hardware, but that's not the same thing at all. The power consumption of the consoles as a whole is lower than that of the high-end video cards of that era, which really should give that fact away.

Consoles never launch with high-end PC hardware. This launch is no exception, and it's generations faster than what we have now.

Eh, the PlayStation 3 was pretty high-end tech gear when it launched.

Its own HDD, a built-in Blu-ray player, multiple outputs, four USB 2.0 ports, a memory card reader, etc.
And the STI-produced Cell CPU was pretty high-end for the time.

Why do people keep insisting that the last-gen consoles had high-end hardware at launch? They had derivatives of high-end PC hardware, but that's not the same thing at all. The power consumption of the consoles as a whole is lower than that of the high-end video cards of that era, which really should give that fact away.

Consoles never launch with high-end PC hardware. This launch is no exception, and it's generations faster than what we have now.

Yeah, but in order to be the same it would be something like an underclocked GTX 660 Ti or HD7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just one or two generations faster than what the Xbox 360 had, for example, and it's not launching now but at the end of 2013 or in 2014. That's why I'm sure it will have something like an underclocked HD7850. And just like they did last time, once a smaller process is available they will include it in SoC fashion.

Its own HDD, a built-in Blu-ray player, multiple outputs, four USB 2.0 ports, a memory card reader, etc.
And the STI-produced Cell CPU was pretty high-end for the time.

Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer: people assumed when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because the per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, BTW.

Yeah, but in order to be the same it would be something like an underclocked GTX 660 Ti or HD7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just one or two generations faster than what the Xbox 360 had, for example, and it's not launching now but at the end of 2013 or in 2014. That's why I'm sure it will have something like an underclocked HD7850. And just like they did last time, once a smaller process is available they will include it in SoC fashion.

This is true. This generation's consoles launched with one-generation-old GPUs that had features of their successors. The Xenos was basically a streamlined X1950XT with some of the features that appeared in the 2900 series, and similarly the PS3's GPU was 7800-based with some features from the 8800 series. The saving grace for the Xenos was its incredible usage of memory and its implementation of eDRAM. In order for this upcoming generation of consoles to impress, it would have to use, as he said, an HD7800-series or GTX 660 GPU, which is nearly impossible.

Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer: people assumed when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because the per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, BTW.

For a product that was all-in-one, I say it was pretty damn impressive for its time.

The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, BTW.

They all learned from experience: Sony, Toshiba, and IBM. A CPU built with strong ties to supercomputing doesn't exactly translate to killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. But I have to say, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits, e.g. God of War 3, Uncharted 2 & 3, etc.

They all learned from experience: Sony, Toshiba, and IBM. A CPU built with strong ties to supercomputing doesn't exactly translate to killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. But I have to say, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits, e.g. God of War 3, Uncharted 2 & 3, etc.

Just like Nintendo has a hold on gamers' hearts, with nostalgic names being the reason to purchase a Nintendo product, Sony has been on the offensive when it comes to first-party offerings.

I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.

I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.

That, and the fact that I don't have to pay $50 a year just to play my games online. Sony did a good job when it decided not to charge its customers to play online, which is great and is another reason why I own a PS3 and not an Xbox.

After having to clean things up once again, frankly, I'm fed up with this thread. So, if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instruction, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.

Big meanie!!!!!!

My first A8-5600 build is together, and based on the tiny cooler they included and its very cool running temps, I am almost scared of what it is actually going to be able to do.

Yes, this is the thread that never ends... yes, it goes on and on, my friends... some people started trolling it not knowing a freaking thing, and they'll continue trolling it because... I guess it's an e-penis win? Yes, this is the thread that never...

I'd like to thank the folks who have dabbled in game development for lending their comments despite the overwhelming wave of blatherskite. That is all.

Hello everybody! This is my first time posting on TPU; I've been a lurker guest for years. I decided today that I'm going to start adding my bit to the community here, and this seems like the perfect place to start.

2 - The A10 is today's top Trinity APU, but it won't be the only one. Think about an A12, A14, or A16; I mean, we don't really know how AMD will refresh its line of APUs, we only know the line will be compatible.

AMD won't refresh the entire lineup until the generation after Steamroller hits, possibly unifying the entire family under one socket, all with GPU and ARM components. But for now, let's assume that Sony giving AMD a large wad of cash and telling it what is needed drives AMD to accelerate production of Piledriver cores on a 22nm process with VLIW4 GPUs on a 32nm process. They've arguably had enough time to get something like that taped out, because the Sony deal has been in the works for years. But since it doesn't look like Steamroller will be on 22nm, it wouldn't be a train smash if the components were stuck on 32nm and 40nm respectively.

I don't think they'll change from the existing lineup of A4, A6, A8, and A10 CPUs, because that would introduce more complexity. But that's on the desktop, which, for this thread, doesn't fit into the equation. What will be in the PS4 is going to be a different beast to what we're used to, and it could be based on the A10-5700 and an HD6670 GPU. So let's leave the desktop out of this for now because, as everyone in this thread is eager to point out, they're not directly comparable. Similar in terms of hardware, yes, but in software they are very different performers.

3 - Developers had to optimize multithreading on the PS3's very complex architecture. So adopting the A10 should make things easier, and dual GPUs would be a breeze.

I also think that coding for an x86-64 architecture and a more modern instruction set will make developers' lives easier, but I'm not sold on the dual-GPU portion just yet (even on the desktop I'm hesitant to recommend SLI or Xfire to anyone). I haven't seen results from an APU-and-GPU combo that show frame rates over time, and from what I've seen with SLI and Xfire, the stuttering issues are enough to put some people off dual GPUs completely. However, I do know that with a strict hardware configuration the Xfire rendering could be tweaked so that instead of rendering alternating frames, the devs could choose to divide each frame between the two GPUs, as in the sketch below. Unfortunately, only developers with access to these machines can answer our questions, and until then, everything else is just assumption.
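
For anyone unfamiliar with the two modes, here's a minimal sketch of the difference, assuming nothing about Sony's actual hardware or drivers (the function names and the split ratio are invented for illustration):

```python
# Illustrative sketch only: not any real console or driver API. It shows
# the scheduling difference between alternate-frame rendering (AFR) and
# split-frame rendering (SFR).

FRAME_HEIGHT = 720  # scanlines per frame

def afr_assign(frame_number):
    """AFR: each GPU renders whole frames, alternating. If one GPU runs
    long, frames complete out of rhythm, which is the stutter people see."""
    gpu = frame_number % 2
    return {gpu: range(0, FRAME_HEIGHT)}

def sfr_assign(split=0.5):
    """SFR: both GPUs work on the same frame. With fixed hardware the split
    can be tuned per game, e.g. give the weaker iGPU the cheap top of the
    screen and the dGPU the geometry-heavy bottom."""
    cut = int(FRAME_HEIGHT * split)
    return {0: range(0, cut), 1: range(cut, FRAME_HEIGHT)}

print(afr_assign(7))    # {1: range(0, 720)}
print(sfr_assign(0.4))  # {0: range(0, 288), 1: range(288, 720)}
```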

Why did Sony choose an APU? Sony definitely knew something that we don't. Consoles are consoles, targeted at the mostly casual gamers who don't even bother about upscaling.

TDP requirements and less complexity, mostly (along with cost, which is going to be a big factor). You can build an APU setup into a thin-client-like/ITX chassis without having to worry too much about cooling, and your GPU requirements are mostly catered for already. When I was working at a computer repair shop a year ago, I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim, I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid, and I often found with other units that the cooling wasn't always up to scratch. I had to re-flow the boards, clean out the cooling systems, and lap the heatsinks so that the older units wouldn't overheat.

With desktop-class APUs, the stock cooler is perfectly fine. In fact, the cooling requirement for the A10-5700 peaks at only a 65W TDP, which means there's much less work required in designing the console's cooling system. I think we might even see a launch version that is as slim as the PS3 Slim (not the recent swanky one which, IMO, looks fugly) and could in future become as small as the PS2 Slim. Considering that the APU in question might even be the mobile A10-4600M, it's plausible.

My next post is a camera capture showing 1080p/60Hz in the TV info, to counter your opinion regarding the PS3's lack of 1080p capabilities. If you're suggesting a proper method for comparing the two, you may quote any of my last posts; did I mention "rendered"?

No one's ever said that the PS3 can't run games at 1080p. In fact, there's a handful that can, Prince of Persia being the only recent one I can remember. It's down to the developers, who have to figure out what they want to sacrifice most: visual fidelity and potentially higher performance, or more stuff on the screen but potentially lower performance.

To those of you who say that a game running at 30fps at 1080p is crap, I'd have to agree with you initially. If I notice it, it becomes a problem until I play the game enough times to stop noticing it, and then it's smooth sailing from there on. Even Forza Horizon, which I got to play recently, runs at 720p and a minimum of 30fps. Technically the developers could run the game at 1080p and get similar performance, but they'd have to sacrifice some visual fidelity and the beautiful world the game is rendered in. Personally, I don't have a problem with the speed at which the game is rendered, only that it looks good and doesn't suffer hiccups.
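
The back-of-the-envelope numbers behind that tradeoff, as a quick sketch (rendering cost isn't purely per-pixel, so this is only a rough intuition):

```python
# Rough pixel-throughput math behind the 720p-vs-1080p tradeoff. Real
# rendering cost is not purely per-pixel, so treat this as intuition only.

res_720p = 1280 * 720     #   921,600 pixels
res_1080p = 1920 * 1080   # 2,073,600 pixels
print(f"1080p is {res_1080p / res_720p:.2f}x the pixels of 720p")  # 2.25x

# At 30fps the renderer gets 1000/30 = 33.3 ms per frame; at 60fps only
# 16.7 ms. Spending 2.25x the shading work on resolution alone eats the
# budget that would otherwise go to effects and world detail.
print(f"frame budget: {1000 / 30:.1f} ms at 30fps, {1000 / 60:.1f} ms at 60fps")
```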

Likewise, you can't directly compare today's consoles with desktops. Well, at least not the PS3, because the RSX GPU lacks some components and instruction sets that would make it comparable to a desktop-class GPU. The Xbox 360 is closer to a proper desktop setup, but again can't be compared directly because it can't render anything in DX10. A good deal of games today include a DX10/DX11 render path, which makes the comparison even more moot. 720p30 DX9 and 720p30 DX11 at their highest settings will look different and behave differently due to the rendering mode. With the PS4 and the Xbox 720 being based on modern hardware, at least we'll have consoles and computers on the same footing again in terms of graphical ability, if not in performance.

And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that six years on, a new standard is introduced. The PS3 has had an incredibly long run and it's time for something new.

When I was working at a computer repair shop a year ago, I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim, I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid, and I often found with other units that the cooling wasn't always up to scratch.

Yes, I remember around January 2009 I bought a 40GB PS3 (CECHJ, NTSC-J), which had been enhanced to a 65nm process from the previous 90nm. It still produced a large amount of heat; after 30 minutes of playing, the rear blower fan sounded like a jet ready to take off.

Likewise, you can't directly compare today's consoles with desktops. And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that six years on, a new standard is introduced. The PS3 has had an incredibly long run and it's time for something new.

And yet fellow TPU members here say they need such-and-such a GPU to do 1080p, mumble about so-called rendering, or give mumbo-jumbo calculations fit for a kindergarten exam.
Have they forgotten? This is a console, a piece of consumer electronics, a set-top box attached to a TV set. Sony had to make their console compatible with a wide range of TVs, so they included legacy A/V RCA output for analog TVs while sporting HDMI for SD/HD/Full HD TVs. They didn't have to do 60fps, because doing so would violate the NTSC standard (29.97fps), the PAL standard (25fps), or the less popular SECAM.
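
For reference, a quick derivation of those broadcast numbers (just the standards math, nothing console-specific):

```python
# Where the odd 29.97 figure comes from: NTSC color runs at 60000/1001
# fields per second, and two interlaced fields make one frame. PAL gets a
# clean 25 frames per second from its 50 Hz field rate.
ntsc_field_rate = 60000 / 1001             # ~59.94 fields/s
ntsc_frame_rate = ntsc_field_rate / 2      # ~29.97 frames/s
print(f"NTSC: {ntsc_frame_rate:.2f} fps")  # NTSC: 29.97 fps
print(f"PAL:  {50 / 2:.2f} fps")           # PAL:  25.00 fps
```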

The current PS3 uses Linux kernel 2.4; I bet Sony will use Linux 3.0 with some HSA optimizations.

I am wondering how much confirmable information on the Wii U's GPU is available; most of what I have seen are rumors. Does anyone know any facts? All we have are some pictures showing that the GPU and CPU are on the same package, plus unconfirmed sources, just like when the Wii launched and it took years after release to verify its specs.

Why do people keep insisting that the last-gen consoles had high-end hardware at launch? They had derivatives of high-end PC hardware, but that's not the same thing at all. The power consumption of the consoles as a whole is lower than that of the high-end video cards of that era, which really should give that fact away.

Consoles never launch with high-end PC hardware. This launch is no exception, and it's generations faster than what we have now.

Agreed completely, and it only takes a little bit of research to confirm this.

Let's look at the PS3. It was launched in November 2006. It used basically the core from a 7800GTX, but with lower memory clocks and fewer ROPs; still, let's just say it was a 7800GTX for the sake of argument. At almost the exact same time, within days actually, nVidia released the 8800GTX, which was a huge leap ahead of what the 7800GTX was capable of. Not only did it provide 50-100% more performance (depending on the game) in DX9, it also introduced DX10.

The Xbox 360 wasn't that much different. It used, basically, an X1800XT GPU (again, lower clocks and memory speeds, but we'll just say X1800XT). And while it was a little more up to date when it was released, thanks to being released a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.

Agreed completely, and it only takes a little bit of research to confirm this.

Let's look at the PS3. It was launched in November 2006. It used basically the core from a 7800GTX, but with lower memory clocks and fewer ROPs; still, let's just say it was a 7800GTX for the sake of argument. At almost the exact same time, within days actually, nVidia released the 8800GTX, which was a huge leap ahead of what the 7800GTX was capable of. Not only did it provide 50-100% more performance (depending on the game) in DX9, it also introduced DX10.

The Xbox 360 wasn't that much different. It used, basically, an X1800XT GPU (again, lower clocks and memory speeds, but we'll just say X1800XT). And while it was a little more up to date when it was released, thanks to being released a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.

And again, the PS4 will not release until 2014 or very late 2013. By that time the HD8900 refresh will be out, if not the HD9900. Following the logic above, the PS4 should use at the very least an HD7950 (lower clocks, maybe a 256-bit bus), or simply an HD7870 or whatever, OR, as said, an HD8950, because the HD9900 will be around the corner.

They are talking about an APU, or in the best-case scenario maybe one paired with an HD6770 or something like that. It's as if the PS3 had shipped with a GeForce 6600 instead of a 7800. Plus, in 2004/2005 when the PS3 was spec'd, the 7800 was the fastest card.

So yeah, let's stop with that argument. No one's saying it had a high-end GPU when it launched, but it definitely had a (slower) variant of the high-end GPUs of the era in which it was designed. The PS4, by the time it launches, will have a low-end GPU three generations behind.