I should start bashing Intel products more often; then I could say whatever I wanted without getting it deleted. As for Piledriver, I read someone (nt300, I think) say there was a reason they were keeping it hushed, implying its performance was a sizable improvement. It's important to remember AMD did the same thing with Bulldozer, so I don't buy into that theory too much. I hope AMD makes me eat my words, because on the CPU side of things, aside from APUs for laptops, I have not been too impressed since I replaced my Athlon 64 X2...

Yes, AMD did fabulous marketing with Bulldozer: that awesome, high-clocked, 8-core magic processor that performed as well as a mouse's fart. Yes, I know, AMD now has a better management team. Though I do find it interesting that they are not talking much about Vishera? We shall see soon enough.


I don't believe their marketing team or strategy has really changed all that much since the management change. I have spoken plenty about Vishera, but whenever I bring up that its improvements are less than a million-fold, people accuse me of lying despite sources saying otherwise, and I'm just tired of arguing about it. I think Vishera will be what Bulldozer should have been, and hopefully it corrects a lot of the awful issues BD suffered from (reported high cache latency, unusually high power consumption for many users, poorly handled scheduling, etc.).

P.S. That question mark is grammatically incorrect since you're not asking a question; rather, you're making a statement.


This I agree with. We can only hope the improvements allow the chips to run cooler, be faster in single- and multi-threaded apps, and draw less power than BD across the PD lineup.

Sometimes I want to ask those of you who have already compared AMD and Intel CPUs: have you actually tried gaming or working with an FX/Bulldozer CPU? Did Bulldozer really make you suffer a lot while gaming and working? I compared an FX-8150, an Intel i5-2500K, and an Intel i7-2600K with a GTX 680 in real games and work apps, and I didn't see any difference between those chips apart from unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, carry a cheaper price than the competition, and maybe perform better in benchmark apps.


Yes, I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (the 2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.

I agree. I said he was lying because he misrepresented the benchmark results; I do not think he is a liar. A liar is somebody who repeatedly tells falsehoods and is conscious of it. A person can tell one lie, and that doesn't make him a liar. We are all human. Sometimes in the heat of argument people say things they later realize they did not mean. It happens to all of us. I just hope one doesn't make it an everyday practice.

Sometimes I want to ask those of you who have already compared AMD and Intel CPUs: have you actually tried gaming or working with an FX/Bulldozer CPU? Did Bulldozer really make you suffer a lot while gaming and working? I compared an FX-8150, an Intel i5-2500K, and an Intel i7-2600K with a GTX 680 in real games and work apps, and I didn't see any difference between those chips apart from unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, carry a cheaper price than the competition, and maybe perform better in benchmark apps.


My ancient Xeon 3440 beats Thuban and Bulldozer in every game I tried. Mind you, I am probably one of the most consistent AMD purchasers around. I skipped Bulldozer. I tried it on a couple of friends' rigs and laughed as my lower-end card rendered more FPS coupled to a Phenom X4 at 4 GHz while he ran his BD at 4.3 GHz.

Bulldozer sucked; it was one of the biggest letdowns I have seen in my lifetime in computers. The idea is great: parallel computing is awesome. AMD, however, needs to step up its game for it to catch on. The server market is already turning its ears toward Bulldozer; it works well with the right programs. Give them a few (mind you, I said a few) more generations and we will see some shoe exchanging. Intel has had it happen on multiple occasions: AMD innovates and Intel copies.

Yes, I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (the 2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.


Thanks. I haven't ever used a multi-GPU setup on my FX-8150 (with an ASRock 990FX Extreme4), so I can't say much about that, but I did run CrossFire HD 6870s on my Phenom II X4 955 BE in the past.

I don't believe their marketing team or strategy has really changed all that much since the management change. I have spoken plenty about Vishera, but whenever I bring up that its improvements are less than a million-fold, people accuse me of lying despite sources saying otherwise, and I'm just tired of arguing about it. I think Vishera will be what Bulldozer should have been, and hopefully it corrects a lot of the awful issues BD suffered from (reported high cache latency, unusually high power consumption for many users, poorly handled scheduling, etc.).

P.S. That question mark is grammatically incorrect since you're not asking a question; rather, you're making a statement.


If I recall correctly, there were Bulldozer shortages even months after the reviews hit; why wouldn't the same occur with Piledriver, it being a better product and all that? (Rhetorical question, not directed at you, BTW.)

I don't think good marketing matters much for AMD; it's not like they could just churn out an extra million CPUs every month even if the demand were there. And they always sell out. People might want them to be present in every tier, but does AMD want that too? Why did AMD become fabless in the first place?


Now that is objective and constructive criticism that I can relate to and accept. The trolling and baiting that some people indulge in is ridiculous though. Let us all maintain some level of intellectual honesty and respect and we will all get a lot more positives out of this forum. I appreciate your post.

Sometimes I want to ask those of you who have already compared AMD and Intel CPUs: have you actually tried gaming or working with an FX/Bulldozer CPU? Did Bulldozer really make you suffer a lot while gaming and working? I compared an FX-8150, an Intel i5-2500K, and an Intel i7-2600K with a GTX 680 in real games and work apps, and I didn't see any difference between those chips apart from unrealistic numbers from benchmark apps.


I've seen (on Tom's Hardware) and heard (around TPU) that AMD CPUs tend to become a bottleneck for multi-GPU setups a lot sooner than Intel's offerings. I don't much care for SLI/CrossFire, so it's not a huge concern for me. I can definitely say going from my Q6600 to an i5-2500K made a huge difference in a lot of games, even with the same GPU. My only conclusion is that my HD 5850 was being bottlenecked, so the CPU can matter a lot more than most people are ever willing to admit.

I would imagine the difference between rendering something in 1 hour rather than 1 hour and 20 minutes isn't a huge deal for most people. I would never say BD was completely unusable; in fact, it did pretty well in certain programs, but in my mind there was not enough reason to get it over Intel's offerings at the time.

My ancient Xeon 3440 beats Thuban and Bulldozer in every game I tried. Mind you, I am probably one of the most consistent AMD purchasers around. I skipped Bulldozer. I tried it on a couple of friends' rigs and laughed as my lower-end card rendered more FPS coupled to a Phenom X4 at 4 GHz while he ran his BD at 4.3 GHz.

Bulldozer sucked; it was one of the biggest letdowns I have seen in my lifetime in computers. The idea is great: parallel computing is awesome. AMD, however, needs to step up its game for it to catch on. The server market is already turning its ears toward Bulldozer; it works well with the right programs. Give them a few (mind you, I said a few) more generations and we will see some shoe exchanging. Intel has had it happen on multiple occasions: AMD innovates and Intel copies.


There were a lot of situations where Phenom II was better than Bulldozer (especially the 4xxx and 6xxx variants). Phenom II had the higher IPC, so with the same number of "cores" it performed better clock for clock. It was only really with software optimized for the new instruction sets that this wasn't true. It also depends greatly on the game; certain games are obviously far more CPU-dependent, primarily RTSes and MMOs.

Parallel computing is definitely the future, but it's a slow progression because software developers are slow to support higher core counts, since doing so would raise development costs. As for innovation, AMD has definitely innovated a lot; their implementation of x86-64 was the smarter move. The way I see it, historically AMD introduces new concepts but Intel really executes them.
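To put a rough number on why "more cores" alone doesn't help until software catches up, here is a minimal sketch of Amdahl's law. The parallel fractions and core counts below are illustrative assumptions, not measurements of any real chip:

```python
# Amdahl's law: overall speedup is limited by the serial fraction of a
# workload, no matter how many cores you throw at it. Illustrative only.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when `parallel_fraction` of the runtime scales
    across `cores` and the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A program that is only 50% parallel gains little past 4 cores:
for cores in (1, 2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.5, cores), 2), "x")
```

Even a hypothetical 8-core chip tops out below 1.8x in this scenario, which is roughly why lightly threaded games leave most of an FX chip's modules idle.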

Yes, I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (the 2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.


Respectfully, you're an exception (i.e., you actually bothered to try it out) rather than the norm, and multi-GPU puts you in a smaller bracket yet again, so your summary and experience might not fit another user's experience even at that time. Not a dig, just saying.


Yes, I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (the 2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.


I can report the same, but with an added twist: I used to run CrossFire Radeons on my X6 1100T rig, which went very well. Then, when I replaced those with GeForces (even on an SLI-compatible motherboard), I had tons of trouble and could not get any real performance increase over a single GeForce; instead I got lots of performance issues. Then I changed my mobo/RAM/CPU for Intel and things have been flying.

From my experience, I deduced that multi-GPU setups built around NVIDIA GPUs work a lot worse on AMD systems than those built around AMD GPUs. If history is any guide, Intel has paid NVIDIA a fortune to make that happen... Now, I'm not saying that is the case; I'm just saying Intel has such a history of monopolistic, illegal, rotten, evil business practices that it wouldn't overly surprise me if it were.

Either way, using multiple GPUs tends to move the bottleneck from the (single) GPU to the CPU, so that's where the CPU's power becomes more relevant.

It remains that in my case, it wasn't simply a situation where the second GTX 570 didn't add any performance; it made everything stutter like complete madness.
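The bottleneck point above can be sketched with toy numbers: frame rate is capped by whichever stage is slower, the CPU or the GPUs, so a second card only helps while the GPUs are the limit. Every figure below is made up purely for illustration:

```python
# Toy model: the CPU prepares frames at cpu_fps; each GPU renders at
# gpu_fps, with imperfect multi-GPU scaling. The slower stage wins.

def effective_fps(cpu_fps: float, gpu_fps: float,
                  num_gpus: int = 1, scaling: float = 0.9) -> float:
    gpu_total = gpu_fps * (1 + (num_gpus - 1) * scaling)
    return min(cpu_fps, gpu_total)

print(effective_fps(55, 45, num_gpus=1))   # GPU-bound
print(effective_fps(55, 45, num_gpus=2))   # second GPU helps until the CPU caps it
print(effective_fps(120, 45, num_gpus=2))  # a faster CPU lets both GPUs stretch
```

This is why the same CPU can look fine with one card yet become the limiting factor the moment you go CrossFire/SLI.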


Mine has none of those issues; it could have been an AMD mobo issue with drivers or the BIOS.

Rumours abound that AM3+ is still going to be used for Steamroller too. Happy days.

To the above and below posts: I can't vouch for NVIDIA on AMD as I haven't used it, but I have CrossFired on AMD (this rig) and Intel platforms with minimal issues (mostly app-specific) for years now. The main thing CrossFire users need to know is to turn AFR on; it brings a near doubling of FPS in all but NVIDIA-optimized games and Metro 2033, though it is also the most crash-prone mode.

Mine has none of those issues; it could have been an AMD mobo issue with drivers or the BIOS.


Might be; we'll never know. All I know is I first tried with an ASUS M4A79 Deluxe, which isn't SLI-certified, using patched drivers, with the old and then a new BIOS. Then with a newer mobo, a high-end MSI that had USB 3.0, with the stock and then a new BIOS. Then I just gave up.

Sometimes I want to ask those of you who have already compared AMD and Intel CPUs: have you actually tried gaming or working with an FX/Bulldozer CPU? Did Bulldozer really make you suffer a lot while gaming and working? I compared an FX-8150, an Intel i5-2500K, and an Intel i7-2600K with a GTX 680 in real games and work apps, and I didn't see any difference between those chips apart from unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, carry a cheaper price than the competition, and maybe perform better in benchmark apps.


This just reminds me of a co-worker friend of mine. He switched from Thuban to a Sandy Bridge platform; his job is mainly drawing in 2D, editing small video clips, and presenting them beautifully. He wondered why "that lag" still persisted even though he had "made a good decision."
Then he came to my workshop and tried my computer. After six hours of torturing it, his only comment was, "Did you use a highly overclocked Ivy Bridge?"

Yes, I have. I had an 8150 and a Crosshair V. Multi-GPU gaming wasn't even close to what the other company had to offer (the 2500K at the time). Gaming benchmarks from various review sites (using actual games) show the same results.


All in all, if you CrossFired or SLI'd high-end GPUs, the choice was obvious.
But when it comes to the mainstream, i.e., my system with just two HD 7850s, there is no noticeable difference from a highly overclocked 2600K + GTX 680 in 1080p gaming.

Sometimes I want to ask those of you who have already compared AMD and Intel CPUs: have you actually tried gaming or working with an FX/Bulldozer CPU? Did Bulldozer really make you suffer a lot while gaming and working? I compared an FX-8150, an Intel i5-2500K, and an Intel i7-2600K with a GTX 680 in real games and work apps, and I didn't see any difference between those chips apart from unrealistic numbers from benchmark apps.
Personally, I want AMD's upcoming Piledriver to draw less power, carry a cheaper price than the competition, and maybe perform better in benchmark apps.


I used to have an FX-8120; in certain games like StarCraft II and even Bad Company 2, I suffered frame dips to as low as 40 FPS in BC2 and 30 FPS in SC2. After I got my 2600K, everything was a locked 60 FPS with V-sync, although SC2 is a worst-case scenario for Bulldozer seeing as it only utilizes 2 threads. Other games like Operation Flashpoint barely hit 45 FPS, while the i7 was locked at 60 FPS with zero dips. There are a bunch of other games in which the FX-8120 bottlenecked my GTX 580 at the time, and god knows it would bottleneck my current GTX 680.

I also extract a lot of .rar files and do a lot of video conversion with HandBrake. The i7 was dramatically faster at extraction and compression, although I will say the FX-8120 was great in HandBrake, and overclocked it matched the i7. As for photo editing, it would be hard to tell a difference.

In gaming, though, Bulldozer really does suck in comparison. If you REALLY can't wrap your head around that, then take a look at some of the Borderlands 2 CPU-scaling benchmarks or the multi-GPU benchmarks (tweaktown.com). There are tons of proven benchmarks that aren't just "bars and graphs," and they prove this. It really DOES just suck and is behind even the Phenom II.

However, I will note that everybody's gaming experience will be based on their own standards. Some people will be fine with 30 FPS and claim they see no difference from 60 FPS (they probably need their eyes checked).