Mini-ITX is an exciting form factor: with every CPU generation we can make something smaller and faster, with better features than the previous generation. GIGABYTE recently released a mini-ITX motherboard for AMD APUs, and we have it in to test – the F2A85XN-WiFi.

At the time of writing this motherboard is $105, and the main competition comes from ASRock and MSI under a variety of chipsets. We have covered the A85X chipset before, taking one motherboard from each of the main manufacturers, but this is our first look at an A85X mini-ITX board, and this time with a Richland CPU (read Anand’s review here).

GIGABYTE F2A85XN-WiFi Overview

Connectivity is always at a premium with smaller form factor boards. The ability to fit everything into a smaller space might mean spending an extra $0.50 on the motherboard to get smaller NICs and other ICs to fit even more features on board. This gets compounded on the AMD side by virtue of the larger socket area used by AMD CPUs, requiring a good percentage of the PCB space. As a result, many of the different boards in this area are split on the most minor of features – if you want one with WiFi, then get WiFi. Conversely if you need SATA ports, find one with SATA ports. But due to the size, and the platform, the main differentiators are going to be in the minor hardware choices, and software.

GIGABYTE motherboards never leave the factory without the Ultra Durable seal of approval, and the F2A85XN-WiFi gets the 4th edition PLUS, one tier below the F2A85X-UP4 we reviewed previously. This means 40A IR3550 power stages, rated to run at lower temperatures due to their high efficiency. This goes alongside two 4-pin fan headers, four SATA 6 Gbps ports, a Realtek NIC, Realtek ALC892 audio, and dual HDMI ports on the rear IO. Dual HDMI is starting to become more of a feature on motherboards, even when manufacturers have a choice between DVI-D, HDMI and DisplayPort. Again, the choice of motherboard in this area often comes down to the minor hardware choices specific to the setup required, and how well they are executed by the manufacturer, and in that context we get WiFi on AMD. The module is dual-band enabled (useful for congested inner-city airwaves) with 802.11 b/g/n. At this price point it will be a while before we see 802.11ac, but the current onboard WiFi solution is ready for AWD (AMD Wireless Display), an alternative to WiDi.

Performance wise, as long as the user is happy with an FM2 CPU, the GIGABYTE F2A85XN-WiFi plays ball in all of our testing scenarios – even when paired with a high-end GPU in our gaming tests. Typically when a user springs for this platform they want either a mini gaming machine (perhaps with AMD Dual Graphics) or something small and usable. In this scenario there are some nice component placements on board (the 24-pin ATX and SATA ports are on the edge) but a couple that are confusing (the 4-pin CPU connector and the front panel connections). Our overclock results were pretty good, with the motherboard keeping our CPU stable all the way up to 4.9 GHz – the majority of the trek to that speed was spent at stock voltage.

The main competition for the GIGABYTE motherboard is going to be the ASRock FM2A85X-ITX. While the ASRock offers seven SATA ports and the XFast software range, with the GIGABYTE we get dual-band WiFi, dual HDMI and Ultra Durable power delivery for an additional $10. This is the front on which GIGABYTE is competing, and the WiFi edition is a good option for general usage if you do not need major storage.

Visual Inspection

Every time I open up a mini-ITX motherboard, it still feels strange that this 17cm x 17cm PCB can do most of the stuff my big PC can do. Then I hook it all up to my test bed, and the motherboard almost disappears underneath a mix of GPU, memory, cables and heatsink. The heatsink area is where I want to start, and with this being the AMD socket on a mini-ITX board, there are going to be issues. As the cooler mounts to the socket along a fixed axis, there is only one real way to orient it, and when I put my Corsair H80i on, I accidentally did it the wrong way:

I should have made a mental calculation before installing the cooler... it worked the other way around though

The pipes were clearly in the way of the first memory slot, so I had to place the cooler the other way around, and it fit perfectly. It should have been obvious from the start, but at least GIGABYTE has accommodated this scenario. Users of large heatsinks should beware of potential memory interference due to overhang.

For the fan headers, the CPU fan header is located above the socket, just to the left of the WiFi card. The SYS fan header is behind the USB 3.0 ports, to the left of the power delivery. In an ideal world I would have preferred three fan headers – one for the CPU cooler and two for case fans – though this is not a crippling issue. Perhaps a compromise would be to include a 3-pin to 2x3-pin splitter cable in the box.

The onboard WiFi card sits in the mPCIe slot, and is a dual-band 2x2:2 802.11 b/g/n Atheros solution, connected via loose cables to the rear IO antenna mounts. Based on designs of other motherboards I have seen, mounting the mPCIe slot vertically out of the motherboard near the rear IO would reduce the need for these cables, which did get in the way a few times during my testing. This is not a new phenomenon though – there is another loose wire connecting the battery, which is stuck to the rear IO to save space.

In terms of connectivity, we get four SATA 6 Gbps ports at the top of the board, next to a USB 3.0 header and a USB 2.0 header. The DRAM slots are placed on the edge of the motherboard, and are dual-latch versions – in recent reviews I have become an advocate of single-latch designs, which make memory easier to remove when a large GPU is present, although there seems to be a cost premium attached to that request. GIGABYTE is aiming for a small gaming or HTPC system here, so it is possible that the PCIe slot may be unoccupied or hold something other than a big GPU (though you can still fit one if you want to).

Before we get to the rear IO, there is one big oddity on the motherboard – the placement of the CPU 4-pin power connector. As mentioned in previous mini-ITX reviews, this connector is boxed in between the PCIe slot, the CPU socket, the power delivery and the rear IO, which is a very odd place for it. The cable has to come in from one of the four sides, and in each direction there is something potentially in the way. The only way to install the cable is thus from above, or for the 4-pin to be somewhere else on the motherboard entirely. I would suggest the latter for future revisions, just in case it becomes a sticking point for certain builds.

For the rear IO we get a set of four USB 2.0 ports, a PS/2 combination port, two HDMI outputs, DVI-D, dual ports for antennas, two USB 3.0 ports, the Realtek NIC and audio jacks.

GIGABYTE’s sole competition in the A85X mini-ITX space comes from the ASRock FM2A85X-ITX, which has been on the market longer and is slightly cheaper. The ASRock model offers seven SATA 6 Gbps ports at the expense of the WiFi, and one of the HDMI outputs becomes a VGA port. The orientation is also different, with the DRAM slots at the top, and the ASRock uses a different power delivery solution.

The A75 chipset is also available to FM2 motherboard manufacturers, and MSI offers a competitor in its FM2 A75 mini-ITX board, with WiFi and USB 3.0, although with the Realtek ALC887 audio codec (2.1 output) and no VRM heatsinks. In chipset terms, A75 reduces the available SATA 6 Gbps ports from eight to six and removes RAID 5, neither of which is that relevant in this circumstance.


31 Comments

Every time I look at the ultra-crowded layout of an mITX board I'm reminded of how dated the main 24-pin ATX power plug is and how much it would benefit from being replaced. While those rails were king in the Pentium era, with the CPU and PCI buses running on 3.3V directly and most other chips on the board designed for 5V, 3.3V and 5V are barely used at all any more yet have 4 and 5 wires in the 24-pin cable, while -5V has been removed entirely from modern versions of the spec. Dropping to a single 3.3V and a single 5V wire and removing the dead -5V pin would free 8 pins directly; and with only 4 power pins left in the legacy connector (3.3, 5, 2 x 12) there's no need for 8 ground pins either. We could probably drop 4 of them.

This would allow for a successor cable that's only half as large, freeing space on crowded boards and replacing the 24-wire cable with a 12-wire one that would be much less of a pain to route in a crowded case. I'm inclined to keep the CPU's 12V separate just to avoid trading one overly fat wire bundle for another, and because AIUI the other half of why the CPU's 12V comes in separately is to get it as close to the socket as possible without crowding the area with everything else.
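As a quick sanity check on the arithmetic, here is a sketch of the tally. The per-rail counts follow the published ATX 2.x 24-pin pinout (which has 4 wires for 3.3V, not 3); exactly which grounds survive in the slimmed-down connector is my assumption.

```python
# Pin budget of a modern ATX 2.x 24-pin connector, by rail.
atx24 = {
    "+3.3V": 4, "+5V": 5, "+12V": 2, "GND": 8,
    "+5VSB": 1, "PS_ON#": 1, "PWR_OK": 1, "-12V": 1,
    "NC": 1,  # pin 20, formerly -5V, removed from the spec
}
assert sum(atx24.values()) == 24

# Proposed successor: one 3.3V wire, one 5V wire, drop the dead pin,
# and trim the grounds to roughly match the four remaining power pins.
slim = {
    "+3.3V": 1, "+5V": 1, "+12V": 2, "GND": 4,
    "+5VSB": 1, "PS_ON#": 1, "PWR_OK": 1, "-12V": 1,
}
print(sum(slim.values()))  # 12 wires, half the original bundle
```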

While we are at it, why do motherboards virtually never come with the ATX connector at right angles rather than straight up? We get that for SATA connectors and it seriously improves cable management.

Not such a good idea for mITX boards, where you might expect to install them in small cases such as the Antec ISK110 or Minibox M350. Right-angled ATX power or SATA ports would be blocked off.

How about RAM though? Why not use SO-DIMMs, which are about 60% the size of regular DIMMs? They're readily available, and are priced the same or very close to regular-size RAM. Assuming two sticks of RAM, that would save even more room on the motherboard than a redesign of the 24-pin connector. Just look at the Asus P8H67-I Deluxe for an example. :)

It's not just mITX boards that would have a problem with right-angled ATX power sockets. Unless the PSU also included a right-angle 24-pin cable, it would be problematic in any case that uses cable management holes to route the cables behind the mobo tray. Trying to make a 90° bend in that cramped a space would put a lot of torque on the socket; a big ugly loop sticking up allows for a much looser and less stressful bend.

For DIMM sizes I think it's mostly a capacity issue. For more modest builds it probably doesn't matter, but higher capacities tend to come out a year or two sooner in full-size DIMMs because you can jam more chips onto them if need be. Currently DDR3 DIMMs and SO-DIMMs both max out at 8GB for desktops, but if you're willing to pay the price premium, server RAM is available in up to 32GB DIMMs.

True, but how about SO-DIMMs on a budget Intel H81/B85 or AMD A55/A75 board? Or one of the low-power Intel Atom or AMD Brazos ITX boards? Or even a budget Z87 ITX board, to avoid the need for a vertically mounted VRM daughterboard (unless that's actually cheaper to do)? More space for surface-mounted components, and probably a cheaper board to make. It could also mean fewer PCB layers.

You're only saving 36 pins per DIMM (204 vs 240), so there's nowhere near enough savings to drop a PCB layer. Beyond that, I'd guess that since they do offer some models with SO-DIMM slots, those just don't sell as well. If I had to guess why, it'd be that people are more likely to have spare DIMMs lying around than spare SO-DIMMs, meaning that the total build cost is lower since the RAM is free, and/or the cost savings from full-size DIMMs are enough to sway shoppers.

The one configuration I could see driving some enthusiast/gamer uptake of SO-DIMM based mITX boards, four slots instead of only two for a 32GB max instead of 16GB, is conspicuous by its absence.