AMD FX-4350 and FX-6350 Piledriver CPUs

AMD issued a small refresh of its FX processor lineup with the launch of the FX-4350 and FX-6350. Both are based on the "Piledriver" architecture, use the AM3+ socket, and support "the latest instructions including AVX and FMA3." The new CPUs are essentially faster, less energy-efficient versions of their predecessors: the FX-4350 and FX-6350 carry a 400 MHz higher base clock and a 30 W higher TDP than the FX-4300 and FX-6300, respectively.
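For context, FMA3 fuses a multiply and an add into a single instruction, which matters for exactly the kind of media and compute workloads AMD is pitching these chips at. Below is a minimal C sketch of code that uses it, illustrative only: the function and array names are made up, and it assumes a compiler with AVX/FMA support (e.g. gcc -O2 -mavx -mfma).

#include <immintrin.h>  /* AVX and FMA3 intrinsics */

/* y[i] = a * x[i] + y[i] over n floats, 8 lanes at a time.
   Assumes n is a multiple of 8 to keep the sketch short. */
void saxpy_fma(float a, const float *x, float *y, int n)
{
    __m256 va = _mm256_set1_ps(a);            /* broadcast a into 8 lanes */
    for (int i = 0; i < n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);   /* load 8 floats */
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);     /* FMA3: vy = va*vx + vy in one instruction */
        _mm256_storeu_ps(y + i, vy);          /* store 8 results */
    }
}

Without FMA support, the same update costs a separate multiply and add, so fused hardware can roughly double peak floating-point throughput on loops like this.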

AMD is rolling out two new members of the legendary FX CPU family, and as always, they bring the unlocked performance you've come to know and love from AMD.[i]

The AMD FX-4350 processor, for enthusiasts who like to watch crisp HD video and run multiple compute-intensive apps, is a 4-core CPU that runs at up to a 4.3GHz Max Turbo frequency with 12MB of combined L2 and L3 cache in a 125W thermal envelope. With up to 10% more performance[ii] than our previous generation[iii] and a competitive price of $122,[iv] this is a hard-to-beat part.

Also announced today, the AMD FX-6350 is a 6-core CPU with up to a 4.2GHz Max Turbo operating frequency and 14MB of combined L2 and L3 cache. This 6-core CPU gives PC users the capability to accomplish breathtaking 3D modeling and HD video editing. This part is all about maximum multitasking, with up to 10% more performance[v] than our previous generation[vi], all for the great price of $132.

#4587973 Posted on: 05/01/2013 03:21 PM
What are their stock speeds? I was considering getting a 6300, which I intend to overclock. Whichever overclocks better on air is likely the one I'll be getting.

Neo Cyrus
Senior Member

Posts: 8751
Joined: 2006-02-14

#4587998 Posted on: 05/01/2013 04:20 PM

I was considering getting a 6300

Protip: Don't.

Haswell, June 4th.

schmidtbag
Senior Member

Posts: 2535
Joined: 2012-11-10

#4588010 Posted on: 05/01/2013 04:53 PM
Protip: Don't.

Haswell, June 4th.

The only thing that interests me about Haswell is the power consumption. Otherwise, I'm not paying that much for something I have little need for. I'm choosing the 6300 over the 8350 because I don't need the extra performance, and the 6300 seems to be the best-value AMD CPU today in terms of processing performance. It'll also make a nice last upgrade for my AM3 system (I have a beta BIOS that supports AM3+ CPUs).

Also, considering that the PS4 and likely the new Xbox are both going to be Piledriver-based, it wouldn't surprise me if games perform better on a 6300 than on any Haswell i5, maybe even an i7. I doubt the first year of games will manage that, though, since devs won't have worked out the kinks of micro-optimizing yet.

Anyways, I currently own an Athlon II X3 at 3.7GHz, and so far it's difficult to justify upgrading it. It doesn't max out in any live tasks such as gaming. It's relatively slow at other things like compression or encoding, but I can deal with the wait since I don't do those very often. I have no need for 8 cores, but I'm likely going to need more than 4 pretty soon. Also, being mostly a Linux user, BD/PD processors tend to perform a little better on Linux than they do on Windows.

k1net1cs
Senior Member

Posts: 3783
Joined: 2010-11-14

#4588017 Posted on: 05/01/2013 05:05 PM

Also, considering that the PS4 and likely the new Xbox are both going to be Piledriver-based, it wouldn't surprise me if games perform better on a 6300 than on any Haswell i5, maybe even an i7. I doubt the first year of games will manage that, though, since devs won't have worked out the kinks of micro-optimizing yet.

PS4 and Xbox are likely to have this technique implemented.
Your system and Piledriver-based desktop CPUs, however, won't have it.
The games are unlikely to perform better on a 6300 than on a similarly spec'd i5 system, much less an i7.

schmidtbag
Senior Member

Posts: 2535
Joined: 2012-11-10

#4588021 Posted on: 05/01/2013 05:11 PM
PS4 and Xbox are likely to have this technique implemented.

Your system and Piledriver-based desktop CPUs, however, won't have it.
The games are unlikely to perform better on a 6300 than on a similarly spec'd i5 system, much less an i7.

I don't see how hUMA would make that black-and-white of a difference. It's still the same general CPU architecture, so whatever performance I'd lose by not having hUMA, Intel would lose as well. So I don't see how that's a valid point. Also note that hUMA is not ideal for discrete GPUs - you'd lose performance if you made a discrete GPU use your system memory.

Taint3dBulge
Senior Member

Posts: 1122
Joined: 2008-10-31

#4588031 Posted on: 05/01/2013 05:43 PM
In games like BF3 or Crysis 2, the difference between an i5 and a 6300 is barely noticeable. In some games the AMD will be faster by a few frames and in some the i5 will have a few frames more; it's not like having an Intel will give you 50fps more lol. So unless you're playing old games that only use 1 core, I wouldn't spend the extra $100+ bucks on the Intel. Plus old games that only use 1 core will probably run 200+fps on any newer system anyway.

schmidtbag
Senior Member

Posts: 2535
Joined: 2012-11-10

#4588032 Posted on: 05/01/2013 05:46 PM

In games like BF3 or Crysis 2, the difference between an i5 and a 6300 is barely noticeable. In some games the AMD will be faster by a few frames and in some the i5 will have a few frames more; it's not like having an Intel will give you 50fps more lol. So unless you're playing old games that only use 1 core, I wouldn't spend the extra $100+ bucks on the Intel. Plus old games that only use 1 core will probably run 200+fps on any newer system anyway.

Very valid point, which is one of the reasons I'm doing a CPU upgrade rather than a system upgrade.

k1net1cs
Senior Member

Posts: 3783
Joined: 2010-11-14

#4588035 Posted on: 05/01/2013 05:49 PM

I don't see how hUMA would make that black-and-white of a difference. It's still the same general CPU architecture, so whatever performance I'd lose by not having hUMA, Intel would lose as well. So I don't see how that's a valid point. Also note that hUMA is not ideal for discrete GPUs - you'd lose performance if you made a discrete GPU use your system memory.

My point was the games are designed to be run on systems likely to be based on hUMA, and not on systems (with discrete GPU) like yours.

You just said it yourself that you'd lose whatever advantage(s) hUMA provides, just like any other Intel system, so why would a 6300 have an advantage over a similarly spec'd Haswell i5, or even an i7, when the games aren't likely to be optimized for non-hUMA systems?
Besides, your first argument was that games would run better on (vanilla) AMD systems because PS4 and Xbox are based on AMD hardware, but that was before hUMA entered the picture.
Care to elaborate?

And it seems you kinda missed the point of hUMA.
It's not about the GPU sharing the same memory the CPU is using; it's about the CPU being able to read what the GPU has already processed (and vice versa) without waiting for the result to be copied back and forth between VRAM and system RAM.

On a traditional shared-memory system, which is what you were thinking of, the CPU and GPU still can't see what's in each other's share of the memory space.
So there's still the overhead of copying what's in the GPU's share over to the CPU's whenever there's a CPU-GPU workload, which also means duplicate copies of the data while it runs.
With hUMA, you both reduce memory usage and remove the overhead of copying values.

Also, hUMA would still be relevant on systems with a discrete GPU.
The gfx card can still keep its VRAM for graphics processing, but CPU-GPU workloads would be faster if the GPU could just read from and write to system RAM, which means no more wasted cycles copying values back and forth between system RAM and VRAM.
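To put the copy overhead in rough code terms, here's a sketch in C. The gpu_* calls are a hypothetical API stubbed out so it compiles, not any real driver interface:

#include <stdlib.h>
#include <string.h>

/* Hypothetical GPU API, stubbed so the sketch compiles;
   a real one would talk to the driver. */
static void *gpu_alloc(size_t n)                     { return malloc(n); }
static void gpu_copy(void *dst, void *src, size_t n) { memcpy(dst, src, n); }
static void gpu_run(float *buf, size_t n)            /* pretend GPU kernel */
{
    for (size_t i = 0; i < n; i++) buf[i] *= 2.0f;
}

/* Traditional discrete path: two copies bracket every GPU step,
   and the data exists twice (once in RAM, once in VRAM). */
void traditional(float *data, size_t n)
{
    float *vram = gpu_alloc(n * sizeof(float));
    gpu_copy(vram, data, n * sizeof(float));   /* RAM -> VRAM */
    gpu_run(vram, n);
    gpu_copy(data, vram, n * sizeof(float));   /* VRAM -> RAM; only now can the CPU read the result */
    free(vram);
}

/* hUMA-style path: CPU and GPU share one coherent address space,
   so the kernel works directly on the pointer the CPU already holds. */
void huma_style(float *data, size_t n)
{
    gpu_run(data, n);   /* no copies, no duplicate buffer */
}

The more often the CPU and GPU hand results back and forth, the more those two copies in the traditional path dominate.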

schmidtbag
Senior Member

Posts: 2535
Joined: 2012-11-10

#4588044 Posted on: 05/01/2013 06:08 PM

You just said it yourself that you'd lose whatever advantage(s) hUMA provides, just like any other Intel system, so why would a 6300 have an advantage over a similarly spec'd Haswell i5, or even an i7, when the games aren't likely to be optimized for non-hUMA systems?

Because, like I said, it's the same CPU architecture. When you optimize a program for Ivy Bridge, you'll see a noticeable performance gain whether you're on an i3 or an i7, but an AMD CPU with the same instruction sets likely won't get that performance bonus because it's architecturally different.
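You can see the same effect at the compiler level. The flags below are real GCC targets (bdver2 is GCC's name for Piledriver); the file and function are just made-up examples:

/* hot.c - the kind of inner loop a game might ship */
void scale(float *v, float k, int n)
{
    for (int i = 0; i < n; i++)
        v[i] *= k;   /* auto-vectorized and scheduled per -march target */
}

/* Tuned for Intel Ivy Bridge:
       gcc -O3 -march=ivybridge -c hot.c
   Tuned for AMD Piledriver (the FX-4350/6350 family):
       gcc -O3 -march=bdver2 -c hot.c
   Same source, largely the same instruction sets, but the instruction
   mix and scheduling favor whichever pipeline you targeted. */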

Besides, your first argument was that games would run better on (vanilla) AMD systems because PS4 and Xbox are based on AMD hardware, but that was before hUMA entered the picture.
Care to elaborate?

Again, hUMA is not going to make that immense of an impact. On an APU it would give a noticeable performance improvement, but you're acting like PS4 or Xbox games will lose all their performance gains on an AMD system solely because of hUMA. That's like saying a sprinter gets nearly all his speed from his shoes. Shoes make a difference, but it's the legs that do all the work.

And it seems you kinda miss the point with hUMA.

I know the point of hUMA, but I don't think it'll have as dramatic an impact on higher-end systems as you think. At best, it fixes latency problems - which are a big deal, but there's probably a point where hUMA can't improve performance any further.

Also, hUMA would still be relevant on systems with a discrete GPU.
The gfx card can still keep its VRAM for graphics processing, but CPU-GPU workloads would be faster if the GPU could just read from and write to system RAM, which means no more wasted cycles copying values back and forth between system RAM and VRAM.

Agreed. But I don't see that happening any time soon, since such a scenario would be way too restrictive. hUMA works nicely on an APU because the CPU and GPU are fused together and you don't get another option. Suppose there were a CPU, northbridge, and discrete GPU that were all hUMA-compatible - it wouldn't surprise me if the next generation of any one of those parts broke compatibility.

Neo Cyrus
Senior Member

Posts: 8751
Joined: 2006-02-14

#4588073 Posted on: 05/01/2013 07:07 PM

It doesn't max out in any live tasks such as gaming.

I dunno what games you're playing, but even a first-gen i7 (my CPU) at 4GHz gets maxed out in some games. You probably think your CPU isn't a bottleneck because you see it at 12 or 25% usage spread across all the cores. That's 1-2 threads maxed out with the time slices distributed among all the cores; you're still limited by the maximum 1 or 2 cores can pull off.
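If you want to see the effect yourself, here's a trivial sketch (C with POSIX threads, built with gcc -pthread; the 8-thread CPU in the comment is just assumed for the arithmetic):

#include <pthread.h>

/* Each worker pins one core at 100%, forever. */
static void *spin(void *arg)
{
    (void)arg;
    volatile unsigned long x = 0;
    for (;;) x++;
    return NULL;   /* never reached */
}

int main(void)
{
    /* A "2-thread game": two cores maxed, the rest idle.
       On an 8-thread CPU a task monitor averages this to
       2/8 = 25% total usage, yet nothing here can go any
       faster - it's limited entirely by per-core speed. */
    pthread_t t1, t2;
    pthread_create(&t1, NULL, spin, NULL);
    pthread_create(&t2, NULL, spin, NULL);
    pthread_join(t1, NULL);   /* blocks forever */
    return 0;
}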

Do whatever works for you, but the single-threaded performance of AMD CPUs is too much of a handicap for my uses. And it's just too weak in anything that requires heavy FPU usage. Take a look at the mighty FX-8350 in this FPU benchmark compared to my CPU at the same frequency with screwed-up RAM timings (no idea why, don't care, upgrading to Haswell):