Computerbase.de recently published an update (translated) to an article outlining the differences between AMD's AM4 motherboard chipsets. As it stands, the X370 and B350 chipsets are set to be the most popular chipsets for desktop PCs (with X300 catering to the small form factor crowd), especially among enthusiasts. One key differentiator between the two chipsets was initially support for multi-GPU configurations on X370. Now that motherboards have been revealed and are up for pre-order, it turns out that the multi-GPU lines have been blurred a bit. As it stands, both B350 and X370 will support AMD's CrossFire multi-GPU technology, while X370 alone will also support NVIDIA's SLI technology.

The AM4 motherboards equipped with the B350 and X370 chipsets that feature two PCI-E x16 expansion slots will run each slot at x8 in a dual-GPU setup. (In a single-GPU setup, the top slot can run at full x16 speeds.) Which is to say that the slots behave the same across both chipsets. Where the chipsets differ is in support for specific GPU technologies: NVIDIA's SLI is locked to X370. TechPowerUp speculates that the decision to lock SLI to the top-end chipset is due, at least in part, to licensing costs. This is not a bad thing, as B350 was originally not going to support any dual x16 slot multi-GPU configurations at all; now motherboard manufacturers are being allowed to enable it by including a second slot, and AMD will reportedly permit CrossFire usage (which costs AMD nothing in licensing). Meanwhile, the most expensive X370 chipset will support SLI for those serious gamers that demand and can afford it. Had B350 supported SLI and carried the SLI branding, those boards would likely have been ever so slightly more expensive than they are now. Of course, DirectX 12's multi-adapter will work on either chipset so long as the game supports it.

| | X370 | B350 | A320 | X300 / B300 / A300 | Ryzen CPU | Bristol Ridge APU |
| --- | --- | --- | --- | --- | --- | --- |
| PCI-E 3.0 | 0 | 0 | 0 | 4 | 20 (18 w/ 2 SATA) | 10 |
| PCI-E 2.0 | 8 | 6 | 4 | 0 | 0 | 0 |
| USB 3.1 Gen 2 | 2 | 2 | 1 | 1 | 0 | 0 |
| USB 3.1 Gen 1 | 6 | 2 | 2 | 2 | 4 | 4 |
| USB 2.0 | 6 | 6 | 6 | 6 | 0 | 0 |
| SATA 6 Gbps | 4 | 2 | 2 | 2 | 2 | 2 |
| SATA RAID | 0/1/10 | 0/1/10 | 0/1/10 | 0/1 | N/A | N/A |
| Overclocking Capable? | Yes | Yes | No | Yes (X300 only) | N/A | N/A |
| SLI | Yes | No | No | No | N/A | N/A |
| CrossFire | Yes | Yes | No | No | N/A | N/A |

Multi-GPU is not the only differentiator, though. Moving up from B350 to X370 will get you 6 USB 3.1 Gen 1 (USB 3.0) ports versus 2 on B350/A320/X300, two more PCI-E 2.0 lanes (8 versus 6), and two more SATA ports from the chipset (4 versus 2, for 6 total usable).

Note that X370, B350, and X300 all support CPU overclocking. Hopefully this helps you when trying to decide which AM4 motherboard to pair with your Ryzen CPU once the independent benchmarks are out. In short, if you must have SLI you are stuck ponying up for X370, but if you plan to only ever run a single GPU, or tend to stick with AMD GPUs and CrossFire, B350 gets you most of the way to an X370 for a lot less money! You do not even have to give up any USB 3.1 Gen 2 ports, though you do limit your SATA drive options (it's all about M.2 these days anyway, heh).

For those curious, looking around on Newegg I notice that most of the B350 motherboards have that second PCI-E 3.0 x16 slot and CrossFire support listed in their specifications, and they seem to average around $99. Meanwhile, X370 starts at $140 and rockets up from there (up to $299!) depending on how much bling you are looking for!

Are you going for a motherboard with the B350 or X370 chipset? Will you be rocking multiple graphics cards?

Waiting... now for Naples (while completing another X99 build, ha ha). Storage connectivity is pathetic on the Ryzen platform. Only 6 SATA ports on the top-of-the-line model?! WTH?

Sounds familiar... *cough* X58 *cough* (yeah, I know those were SATA2, but at the time it was the top-of-the-line chipset). Seriously, it looks like AMD took Intel's ideas from 2008 (24 lanes / 6 SATA ports) and sprinkled them with a more modern CPU architecture. Job done.

Really, Intel has had more resources to devote to MB variants, and you really need to wait for AMD's Zen workstation/server MB variants. I'm sure there will be some 16- or 12-core Zen CPU/MB options priced lower than Naples (32-core) Zen Pro. Ryzen/AM4 is for the consumer market, so what do you expect at this early point in time from AMD? AMD's Ryzen/AM4 consumer SKUs are not what you are looking for, so move along and wait for the Zen workstation/server variants that will come with plenty of PCIe lanes and other ports that you may need.

I don't think I've ever run either Crossfire or SLI (of the Nvidia variety, I did have 2 Voodoo²s back in the day), at least not for more than a test or two. Yet I still usually choose among the cheapest boards that do support both, which had me landing in the $125 to $175 range in the past, and X370 boards seem to be in line with that.

Where is AMD being proprietary? They are offering SLI on X370-based MBs and having to pay Nvidia to even offer that SLI compatibility. AMD is licensing the SLI IP for its X370 MB chipsets, and here you are throwing out your negative spin. If you want SLI, and AMD has to pay, then AMD was good enough to pay Nvidia to license that SLI IP and include it in its X370 chipsets. It's not like Nvidia itself even cares about SLI; just look at which of Nvidia's product lines even offer SLI, and those are not any mainstream GPU offerings from Nvidia.

And what about explicit GPU multi-adapter from the Vulkan/DX12 APIs? Why don't you ask JHH over at Nvidia for any information and examples of its usage on Nvidia's part for GTX 1060/1050 GPUs? Yes, start asking the real questions about any new Vulkan/DX12 features that Nvidia may not be supporting before you try and spin negative any AM4 motherboard over Nvidia's proprietary IP that AMD was good enough to pay Nvidia for, to be able to use SLI on AMD's AM4/X370 chipsets.

AMD has been more than fair with its SLI support on its AM4/X370 chipsets, and Nvidia does not even offer SLI support across all of its GPU product lines! So who is the one offering the more limited SLI support, AMD or Nvidia!

Those 1060 and 1050 cards will work just fine with DX12 multi-GPU; two 1060s working in tandem in Ashes is fine proof of that. Now developers just need the willingness to do the optimization themselves instead of relying on AMD and nvidia. This is part of the problem with multi-GPU in DX12: developers will enable it, but some of them still depend 100% on DX11 CF and SLI support first before they can "activate" it in DX12, because they need the profile supplied by AMD and nvidia instead of doing the job themselves.

I think that's a cop-out. Three years ago when I was running an SLI setup, most of the games supported SLI. Nvidia stopped pushing multi-GPU around that time because they determined that people were able to afford more expensive cards up front and were willing to buy cards more frequently. I'm sure the lack of high-end competition factored into this.

People claim that AFR is no longer viable with modern game engines because data is preserved frame to frame, but that doesn't explain why the old fashioned split frame rendering can't be implemented.

Because using SFR defeats the purpose of going with a second GPU to begin with (scaling is terrible, and Civ: BE using SFR in Mantle shows that GPU makers still have no way to work around the scaling issue). That's why, despite its latency disadvantage, both GPU makers push AFR. You spend double the money on GPUs, so many people want that extra spending to be worth the trouble. Yes, high frame rates with terrible frametimes are also a problem; that's why nvidia worked on the issue and introduced frame pacing with their hardware and drivers.
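The AFR-vs-SFR tradeoff described above can be sketched with a toy model in Python. Note that the 20 ms frame time and the 60% SFR scaling figure are illustrative assumptions for the sake of the arithmetic, not benchmarks of any real GPU:

```python
# Toy model of AFR vs SFR on a 2-GPU setup (all numbers are assumptions).
SINGLE_GPU_FRAME_MS = 20.0  # assumed time for one GPU to render one frame

# AFR: the GPUs alternate whole frames. Throughput roughly doubles, but
# each individual frame still takes the full render time, so input-to-photon
# latency does not improve, and uneven frame completion causes micro-stutter
# unless the driver adds frame pacing.
afr_throughput_fps = 2 * (1000.0 / SINGLE_GPU_FRAME_MS)
afr_latency_ms = SINGLE_GPU_FRAME_MS

# SFR: the GPUs split each frame. Latency drops, but the workload split is
# rarely even (shading cost varies across the screen), so scaling suffers.
SFR_EFFICIENCY = 0.6  # assumed: only ~60% of ideal 2x scaling
sfr_frame_ms = SINGLE_GPU_FRAME_MS / (2 * SFR_EFFICIENCY)
sfr_throughput_fps = 1000.0 / sfr_frame_ms

print(f"AFR: {afr_throughput_fps:.0f} fps, {afr_latency_ms:.0f} ms latency")
print(f"SFR: {sfr_throughput_fps:.0f} fps, {sfr_frame_ms:.1f} ms latency")
```

Under these assumed numbers, AFR wins comfortably on throughput while SFR only wins on per-frame latency, which matches why both vendors defaulted to AFR despite its pacing headaches.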

Going multi-GPU on a mid-range card started to lose its appeal around the GTX 960 generation, I think. In the past, the x60 SLI setup had always been faster (and cheaper) than nvidia's single x80 card; 660 SLI, for example, could be faster than a single 680 if the scaling was good. But in the 960's case you need perfect scaling if you want to match 980 performance, so the appeal is lost. Not to mention people in this price category rarely do multi-GPU.

It makes virtually no noticeable difference. So, not true at all. If someone wants SLI and buys X370, anyone would be hard pressed to tell the difference between that and X99 with a 40-lane CPU, based strictly on PCIe lanes.

RE: Anon, you do need to remember that these are PCI-E 3.0 lanes, so not as much of a bottleneck as one might think. Yeah, two x16 slots would be nice, and it's irksome that PCI-E lanes are so limited on both AMD's and Intel's newest stuff, but dual-card SLI/CrossFire/DX multi-adapter still works fine. I forget who did it, but I remember at least one site did some testing on how much the x8 vs x16 slots matter with regard to gaming...
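For rough context on what x8 vs x16 means in raw numbers, the nominal PCIe 3.0 link rates can be worked out from the spec's 8 GT/s per lane and 128b/130b line encoding. A quick Python sketch of the arithmetic (these are peak theoretical rates per direction, not real-world throughput):

```python
# Nominal PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b encoding.
GT_PER_S = 8.0                         # gigatransfers per second per lane
ENCODING = 128 / 130                   # usable fraction after line encoding
GBITS_PER_LANE = GT_PER_S * ENCODING   # ~7.88 Gbit/s usable per lane
GBYTES_PER_LANE = GBITS_PER_LANE / 8   # ~0.985 GB/s usable per lane

for lanes in (8, 16):
    print(f"x{lanes}: {lanes * GBYTES_PER_LANE:.2f} GB/s per direction")
```

That puts x8 at roughly 7.9 GB/s and x16 at roughly 15.8 GB/s each way; since games rarely stream anywhere near the x8 figure over the bus mid-frame, this lines up with the "virtually no noticeable difference" observation above.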

I wonder how much their server products will cost and when they will be available. The dual-chip MCM they are supposed to be making is two 16-core dies, each supposedly with 64 PCI-E lanes. It seems like they would sell some salvaged versions in the consumer market like Intel does, if you absolutely need more PCI-E lanes. The X370 is plenty even for most enthusiasts, though. As long as I can put in an M.2 SSD or two, I don't care too much about the number of SATA ports. I think the most I have ever used is 3.

Hopefully so; I don't expect them to be cheap, though. Perhaps AMD will come out with some kind of new chipset in between X370 and Naples, along with, say, 16-core / 32-thread CPUs (realistically, from the leaks it seems possible to do any even number of cores so long as the complexes are balanced with regard to cores per complex), or 10- or 12-core CPUs binned from not-quite-32-core Naples parts, and brand this prosumer platform under... what was it... AMD's PRO R Series (the hypothetical successor to the PRO A-Series APUs).

I could see that happening, and then these chips would fill in the price and feature gaps between the $499 R7 1800X and the base $X,000+ Naples CPU. It would mostly be for workstations and SOHO but consumers that, say, needed a lot of storage IO or PCI-E lanes could also get in.

Butthurt Much Intel Stockholder! Those expensive to maintain Intel chip fabs will bleed billions as AMD racks up the Ryzen sales figures and nets AMD more market share!
The ones with more money in their bank accounts are AMD's customers and not the fools that pay through the nose to Intel! Who's The Real SUCKER Now!

Interesting point, but nVidia realized the best business move was to not support SLI for the 1060, because all the "people who are penny pinchers or make a living delivering newspapers" would buy two of them to get better performance than the 1080 for less money.

Some people have kids and such to support and their kids are not web-footed and web-handed offspring of a questionable matrimony in the closely related gene pool! So those people with real and not dysfunctional lives are always penny pinching to properly support their families and such.

Now you, Cletus, and your wife Brandine, well you are so easily fooled by JHH's marketing monkeys and are so quickly separated from your kids' food/clothing/housing/medical money. And that's all to feed that superego conflict that you developed living in the land where canoeing can lead to some serious involuntary and under great duress squeal like a pig moments and other madness that comes from such recessive genetic traits passed down over many generations. Why, I can hear the banjo picking every time you post!

Personally I'm not even interested in SLI. I don't plan to play on super high resolutions or use extremely high supersampling. If you play on FullHD, something like a single GTX 1060 or RX 480 should suffice for most games.

The Law Journal article is from 2011 and it covers Intel's legal history with the x86 ISA/CPU designs very well, other than the last part, which is in need of some updates due to some market/product speculation that did not work out. But for a legal history of Intel's legal issues up to the January 2011 date of the article, it is very useful to have this bookmarked for future reference regarding the legal/market issues of the x86 ISA and the custom designs engineered to run the x86 16/32-bit ISA from Intel and AMD (x86 64-bit ISA extensions), as well as all the other x86 licensees since the dawn of the PC/IBM PC era.

Intel's current market methods are back in the news. But that's in addition to the need to have a view of Intel's past history regarding legal market/trade issues in the x86 ISA based market place.

Is this even really a big deal? Who's going to spend extra money on a second GPU and then get a shit mobo? I don't know how much adding SLI capability adds to the cost margins, but B350 boards are budget boards. If they can make them more budget-friendly, they're maximizing cost savings (in theory) for the consumer. They're under no obligation to support their competition's standard; it's up to the company to decide whether it is more cost-effective to target budget users with a budget part, or to chase a feature that only a small fraction of the userbase at that price point will use.

Definitely some weirdness going on with X300. We know Ryzen has a bunch of PCH features on the CPU die itself (e.g. SATA ports, an extra 4 PCIe lanes for the PCH link, USB ports), and X300 appears to have pretty much the same capabilities as the bare CPU, with the exception of gaining some USB ports. And it's the only one to have PCIe 3.0 ports on the PCH rather than the PCIe 2.0 ports on all the others (WTF there AMD, M.2 has been around for some time now!).

It very much appears that either X300 is a 'non-chipset', or it's just a weird way of branding "we're hooking two of the Ryzen PCIe lanes to a USB host chip".

This has worked out well for me, I don't need the extra features of the X370 boards, and as a lot of the B350 have a legacy PCI slot, I don't have to replace my precious M-Audio Delta 66 card either - probably its last move to a board! Being able to save money on those fronts will let me pay for a higher end Ryzen CPU.