MStar, Novatek, and Realtek, three vendors of scaler units for use in displays, have announced support for AMD's FreeSync. Specifically, for the Q1'15 line of monitors, these partners will provide scaler chips that use DisplayPort Adaptive-Sync and, when paired with a compatible AMD GPU, will support FreeSync.

The press release claims that these scaler chips will either drive 1080p and 1440p monitors at up to 144Hz, or 4K displays at up to 60Hz. While this is promising, at least compared to the selection at G-Sync's launch a year earlier, it does not mean that this variety of monitors will be available -- just that the internal components will be available to interested display vendors. Then again, it also suggests that there are interested display vendors.
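As a rough plausibility check (my own arithmetic, not from the press release), the two advertised limits make sense coming from a single chip family: 1440p at 144Hz and 4K at 60Hz demand roughly the same pixel throughput.

```python
# Back-of-the-envelope pixel throughput for the two advertised scaler
# limits (ignoring blanking intervals). The similar totals hint at why
# one chip can plausibly cover both modes.

def pixel_rate(width, height, refresh_hz):
    """Raw pixels per second for a given display mode."""
    return width * height * refresh_hz

qhd_144 = pixel_rate(2560, 1440, 144)  # ~531 Mpix/s
uhd_60 = pixel_rate(3840, 2160, 60)    # ~498 Mpix/s

print(f"1440p @ 144Hz: {qhd_144 / 1e6:.0f} Mpix/s")
print(f"4K    @  60Hz: {uhd_60 / 1e6:.0f} Mpix/s")
```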

AMD and partners "intend to reveal" displays via a "media review program" in Q1. This is a little later than we expected from Richard Huddy's "next month" statements, but it is possible that "Sampling" and "Media Review Program" are two different events. Even if it is "late", this is the sort of thing that is forgivable to me (missing a few months while relying on a standards body and several independent companies).

The ASUS STRIX GTX 750 Ti OC sports the custom DirectCU II cooling system, which not only improves the card's temperatures but also reduces the noise produced by the fans. It comes out of the box with an overclocked base clock of 1124MHz and a boost clock of 1202MHz, with the 2GB of VRAM set to the stock speed of 5.4GHz; [H]ard|OCP managed to push that to an impressive 1219/1297MHz, and even raised the VRAM to 6.0GHz, without increasing voltages. Unfortunately, even with that overclock it lagged behind the Sapphire Radeon R9 270 Dual-X, which happens to be about the same price at $170.

"Rounding out our look at ASUS' new STRIX technology we have another STRIX capable video card on our test bench today, this time based on the GTX 750 Ti GPU. We will take the ASUS STRIX GTX 750 Ti OC Edition and test it against an AMD Radeon R9 270 and AMD Radeon R9 265 to see what reigns supreme under $200."

As expected, the cooler is a continuation of NVIDIA's reference design, as seen on recent high-end graphics cards (such as the GeForce Titan). The interesting part is that it is rated for about 250W, whereas Maxwell is rumored to draw 180W. Given that the reference card has two six-pin PCIe power connectors, I am curious to see whether the excess cooling headroom will lead to interesting overclocks. That is not even mentioning what the AIB partners can do.

Its display connectors have been hotly anticipated. As you can see above, the GTX 980 has five outputs: three DisplayPort, one HDMI, and one DVI. Which version of HDMI? Which version of DisplayPort? No clue at the moment. There has been some speculation regarding HDMI 2.0, and the DisplayPort 1.3 standard was just released to the public today, but I would be pleasantly surprised if even one of these made it in.

Less than a year after the launch of AMD's R9 290X, we are beginning to hear rumors of a follow-up. What is being called the R9 390X (if it ends up being called anything else, that was a very short-lived branding scheme) might be liquid cooled. This would be the first single-processor reference graphics card to ship with an integrated water cooler. That said, the public evidence is not as firm as I would normally like.

According to Tom's Hardware, Asetek is working on a liquid-cooled design for "an undisclosed OEM". The product is expected to ship during the first half of 2015 and the press release claims that it will "continue Asetek's success in the growing liquid cooling market". Technically, this could be a collaboration with an AIB partner, not necessarily a GPU developer. That said, the leaked photograph looks like a reference card.

We don't really know anything more than this. I would expect that it will be a refresh based on Hawaii, but that is pure speculation. I have no evidence to support that.

Personally, I would hope that a standalone air-cooled model will also be available. While I have no experience with liquid cooling, it seems like a bit of an extra burden that not all purchasers of a top-of-the-line single-GPU add-in board would want to bear -- specifically, finding a place for the radiator, if their case even supports one. That said, having a high-performing reference card will probably make the initial benchmarks look extra impressive, which could be a win in itself.

Psst. AMD fans. Don't tell "Team Green" but Linus decided to take four R9 290X graphics cards and configure them in Quad Crossfire formation. They did not seem to have too much difficulty setting it up, although they did have trouble with throttling and setting up Crossfire profiles. When they finally were able to test it, they got a 3D Mark Fire Strike Extreme score of 14979.

Psst. NVIDIA fans. Don't tell "Team Red" but Linus decided to take four GeForce Titan Black graphics cards and configure them in Quad SLI formation. He had a bit of a difficult time setting up the machine at first, requiring a reshuffle of the cards (how would reordering PCIe slots for identical cards do anything?) and a few driver crashes, but it worked. Eventually, they got a 3D Mark Fire Strike Extreme score of around 13,300 (give or take a couple hundred).

Please keep in mind that this information has been assembled from research done by WCCF Tech and Videocardz on 3DMark entries of unreleased GPUs; we won't get the official numbers until the middle of this month. That said, rumours and guesswork about new hardware are a favourite pastime of our readers, so here is the information we've seen so far about the upcoming GM204 chip from NVIDIA. On the desktop side are the GeForce GTX 980 and GeForce GTX 970, which should both have 4GB of GDDR5 on a 256-bit bus, with GPU clock speeds ranging from 1127 to 1190 MHz. The performance shown on 3DMark has the GTX 980 beating the 780 Ti and R9 290X, and the GTX 970 performing similarly to the plain GTX 780 while falling behind the 290X. SLI scaling looks rather attractive, with a pair of GTX 980s coming within a hair of the performance of the R9 295X2.

On the mobile side, things look bleak for AMD: the GTX 980M and GTX 970M surpass the current GTX 880M, which in turn benchmarks far better than AMD's M290X chip. Again, the scaling in SLI systems will be impressive, assuming that the leaks (which you can see in depth here) are accurate. It won't be too much longer before we know one way or the other, so you might want to keep your finger off of the Buy button for a short while.

I was originally intending to test this with benchmarks but, after a little while, I realized that Ivy Bridge was not supported. This graphics driver starts and ends with Haswell. While I cannot verify their claims, Intel advertises up to 30% more performance in some OpenCL tasks and a 10% increase in games like Batman: Arkham City and Sleeping Dogs. They even claim double performance out of League of Legends at 1366x768.

Intel is giving gamers a "free lunch".

The driver also tunes Conservative Morphological Anti-Aliasing (CMAA). They claim it looks better than MLAA and FXAA, "without performance impact" (their whitepaper from March showed a ~1-to-1.5 millisecond cost on Intel HD 5000). Intel recommends disabling it after exiting games to prevent it from blurring other applications, and they automatically disable it in Windows, Internet Explorer, Chrome, Firefox, and Windows 8.1 Photo.

Adaptive Rendering Control was also added in this driver. This limits redrawing identical frames by comparing the ones it does draw with previously drawn ones, and adjusts the frame rate accordingly. This is most useful for games like Angry Birds, Minesweeper, and Bejeweled LIVE. It is disabled when not on battery power, or when the driver is set to "Maximum Performance".
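As a sketch of the general idea (Intel has not published the driver's actual logic; the class and byte-string frames below are hypothetical), a presenter can fingerprint each rendered frame and skip ones identical to the previous frame:

```python
# Hypothetical sketch of adaptive redraw limiting: hash each frame and
# skip presenting frames identical to the previous one, so mostly-static
# scenes do not burn power redrawing the same image.
import hashlib

class AdaptivePresenter:
    def __init__(self):
        self._last_digest = None
        self.presented = 0
        self.skipped = 0

    def present(self, framebuffer: bytes) -> bool:
        """Present the frame only if it differs from the previous one."""
        digest = hashlib.sha256(framebuffer).digest()
        if digest == self._last_digest:
            self.skipped += 1
            return False  # identical frame: skip the redraw
        self._last_digest = digest
        self.presented += 1
        return True

p = AdaptivePresenter()
frames = [b"menu", b"menu", b"menu", b"tick-1", b"tick-1"]
results = [p.present(f) for f in frames]
print(results)  # [True, False, False, True, False]
```

A real driver would compare framebuffers at the hardware level rather than hashing in software, but the gating decision is the same.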

The Intel Iris and HD Graphics driver is available from Intel, for both 32-bit and 64-bit Windows 7, 8, and 8.1, on many Haswell-based processors.

A few days with some magic monitors

Last month friend of the site and technology enthusiast Tom Petersen, who apparently does SOMETHING at NVIDIA, stopped by our offices to talk about G-Sync technology. A variable refresh rate feature added to new monitors with custom NVIDIA hardware, G-Sync is a technology that has been frequently discussed on PC Perspective.

The first monitor to ship with G-Sync is the ASUS ROG Swift PG278Q - a fantastic 2560x1440, 27-in monitor with a 144 Hz maximum refresh rate. I wrote a glowing review of the display here recently, with the only real negative being its high price tag: $799. But when Tom stopped by to talk about the G-Sync retail release, he happened to leave a set of three of these new displays for us to mess with in a G-Sync Surround configuration. Yummy.

So what exactly is the current experience of using a triple G-Sync monitor setup if you were lucky enough to pick up a set? The truth is that the G-Sync portion of the equation works great but that game support for Surround (or Eyefinity for that matter) is still somewhat cumbersome.
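For a sense of scale (my arithmetic, not from the article), three of these panels in Surround form an 11-megapixel canvas, which at the panels' 144 Hz ceiling is a serious pixel load for any GPU:

```python
# Pixel load of a triple PG278Q G-Sync Surround setup at its maximum
# refresh rate (raw pixels, ignoring blanking intervals).
width, height, refresh = 2560 * 3, 1440, 144

pixels_per_frame = width * height
pixels_per_second = pixels_per_frame * refresh

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second / 1e9:.2f} Gpix/s at {refresh} Hz")
```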

In this quick impressions article I'll walk through the setup and configuration of the system and tell you about my time playing seven different PC titles in G-Sync Surround.

While not fully in effect yet, AMD is cutting $500 off the R9 295X2's price tag, bringing it down to $999 USD. Currently, two models are available on Newegg USA at the reduced price, and one at Amazon for $1200. We expect to see other SKUs reduced soon, as well. This puts the water-cooled R9 295X2 just below the cost of two air-cooled R9 290X graphics cards.

If you were interested in this card, now might be the time (if one of the reduced units is available).

Matrox, along with S3, develops GPU ASICs for use with desktop add-in boards, alongside AMD and NVIDIA. Last year, they sold fewer than 7000 units in a quarter, according to my math (a market share that rounds to 0.0% implies less than 0.05% of the total market, which works out to about 7000 units that quarter). Today, Matrox Graphics Inc. announced that they will use an AMD GPU on their upcoming product line.
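The unit math can be reproduced as follows (the ~14 million quarterly total is implied by the figures above: if 7000 units is 0.05% of the market, the market was about 14 million units that quarter):

```python
# Reproducing the rounding argument. A reported share of 0.0% means the
# true share was below the 0.05% rounding threshold; with a quarterly
# add-in-board market of roughly 14 million units (implied by the
# article's own numbers), that caps Matrox at about 7000 units.

total_units = 14_000_000      # assumed quarterly AIB market size
rounding_threshold = 0.0005   # shares below 0.05% round down to 0.0%

max_units = total_units * rounding_threshold
print(f"Upper bound on shipments: {max_units:,.0f} units")
```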

While they do not mention a specific processor, they note that "the selected AMD GPU" will be manufactured on a 28nm process with 1.5 billion transistors. It will support DirectX 11.2, OpenGL 4.4, and OpenCL 1.2, and it will have a 128-bit memory bus.

Basically, it kind-of has to be Cape Verde XT (or XT GL) unless it is a new, unannounced GPU.

If it is Cape Verde XT, it would have about 1.0 to 1.2 TFLOPs of single precision performance (depending on the chosen clock rate). Whatever clock rate is chosen, the chip contains 640 shader processors. It was first released in February 2012 with the Radeon HD 7770 GHz Edition. Again, this is assuming that AMD will not release a GPU refresh for that category.
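That range checks out with standard GPU throughput arithmetic: shader count, times two FLOPs per clock (a fused multiply-add), times clock speed. The clock speeds below are my assumptions, chosen to bracket the quoted figures:

```python
# Single-precision throughput for a GCN part: each shader retires one
# fused multiply-add (2 FLOPs) per clock.

def sp_tflops(shaders, clock_ghz):
    """Peak single-precision TFLOPs."""
    return shaders * 2 * clock_ghz / 1000

# Cape Verde XT has 640 shaders; ~0.80-0.95 GHz covers the plausible
# clock range for the quoted 1.0-1.2 TFLOP figure.
low = sp_tflops(640, 0.80)    # ~1.02 TFLOPs
high = sp_tflops(640, 0.95)   # ~1.22 TFLOPs
print(f"{low:.2f} to {high:.2f} TFLOPs")
```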

Matrox will provide their PowerDesk software to configure multiple monitors. It will work alongside AMD's professional graphics drivers. It is sad to see a GPU ASIC manufacturer throw in the towel, at least temporarily, but hopefully they can use AMD's technology to remain in the business with competitive products. Who knows: maybe they will make a return when future graphics APIs reduce the burden of driver and product development?