I don't consider myself a keyboard guru, but I sure do go through a lot of them in my line of work. At any of five different workstations in our office I'll be using a different keyboard. And we tend to interchange them often enough that I would guess I have typed on as many as 15 different keyboards this year. Some for longer periods of time than others of course, but the ones that make it to my main desk get quite a workout.

When our friends at Seasonic told us they wanted to send along a Topre Type Heaven keyboard for us to try out, I told them to feel free; but in my head I was thinking "oh geez, another keyboard." Turns out I didn't give this brand and this keyboard enough credit out of the gate.

With a price tag of $150 on Amazon.com, there are going to be quite a few of you who just instantly tune out. Understandable. Others, though, will appreciate the need for a high quality input device if you do any appreciable amount of typing for work or pleasure. Using a technology called electrostatic capacitive key switches, Topre combines benefits of Cherry and standard membrane keyboards in one package.

Check out my video above for some sound comparison as well as my thoughts on using the keyboard long term. Not to spoil it, but I'm keeping this keyboard on my desk despite missing the multimedia controls of my previous keyboard.

The First Custom R9 290X

It has been a crazy launch for the AMD Radeon R9 series of graphics cards. When we first reviewed both the R9 290X and the R9 290, we came away very impressed with the GPU and the performance it provided. Our reviews of both products resulted in awards of the Gold class. The 290X was a new class of single GPU performance while the R9 290 nearly matched that performance at a crazy $399 price tag.

But there were issues. Big, glaring issues. Clock speeds had a huge amount of variance depending on the game and we saw a GPU that was rated as "up to 1000 MHz" running at 899 MHz in Skyrim and 821 MHz in Bioshock Infinite. Those are not insignificant deltas in clock rate that nearly perfectly match deltas in performance. These speeds also changed based on the "hot" or "cold" status of the graphics card - had it warmed up and been active for 10 minutes prior to testing? If so, the performance was measurably lower than with a "cold" GPU that was just started.
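Those clock readings translate into sizable percentage deficits. A quick sketch, using the figures quoted above, makes the gap concrete:

```python
# Percentage deficit from the rated "up to" clock, using the readings above.
RATED_MHZ = 1000

observed = {
    "Skyrim": 899,
    "Bioshock Infinite": 821,
}

for game, mhz in observed.items():
    deficit = (RATED_MHZ - mhz) / RATED_MHZ * 100
    print(f"{game}: {mhz} MHz, {deficit:.1f}% below the rated clock")
```

That works out to roughly a 10% deficit in Skyrim and nearly 18% in Bioshock Infinite, which lines up with the performance deltas we measured.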

That issue was not necessarily a deal killer; rather, it just made us rethink how we test GPUs. The fact that many people were seeing lower performance on retail purchased cards than with the reference cards sent to press for reviews was a much bigger deal. In our testing in November, the retail card we purchased, which used the exact same cooler as the reference model, ran 6.5% slower than we expected.

The obvious hope was that retail cards with custom PCBs and coolers from AMD partners would be released and somehow fix this whole dilemma. Today we see if that hope was justified.

A slightly smaller MARS

The NVIDIA GeForce GTX 760 was released in June of 2013. Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disabled a couple more of the clusters of processor cores to offer up impressive performance levels for a lower cost than we had seen previously. My review of the GTX 760 was very positive as NVIDIA had priced it aggressively against the competing products from AMD.

As for ASUS, they have a storied history with the MARS brand. Typically an over-built custom PCB with two of the highest end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited edition products with a lot of cachet around them. The first MARS card was a dual GTX 285 product that was the first card to offer 4GB of memory (though 2GB per GPU, of course). The MARS II took a pair of GTX 580 GPUs and pasted them on a HUGE card, and just 1000 of them were sold worldwide. It was heavy, expensive and fast; blazing fast. But at a price of $1200+ it wasn't on the radar of most PC gamers.

Interestingly, a MARS iteration for the GTX 680 never occurred, and why that is the case is still a matter of debate. Some point the finger at poor sales on ASUS' part, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.

Today's release of the ASUS ROG MARS 760 is a bit different - this is still a high end graphics card but it doesn't utilize the fastest single-GPU option on the market. Instead ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them. The MARS 760 is significantly smaller and less power hungry than previous MARS cards but it is still able to pack a punch in the performance department as you'll soon see.

Those people selling the displays? Digital Storm, Falcon Northwest, Maingear, and Overlord Computer. This creates some unfortunate requirements on potential buyers. For example, Falcon Northwest is only selling the panels to users that either are buying a new Falcon PC or already own a Falcon custom system. Digital Storm on the other hand WILL sell the monitor on its own or allow you to send in your VG248QE monitor to have the upgrade service done for you. The monitor alone will sell for $499 while the upgrade price (with module included) is $299.

This distribution model for G-Sync technology likely isn't what users wanted or expected. After all, we were promised upgrade kits for users of that specific ASUS VG248QE display and we still do not have data on how NVIDIA plans to sell them or distribute them. Being able to purchase the display from these resellers above is at least SOMETHING before the holiday, but it really isn't the way we would like to see G-Sync showcased. NVIDIA needs to get these products in the hands of gamers sooner rather than later.

NVIDIA also prepared a new video to showcase G-Sync. Unlike other marketing videos, this one wasn't placed on YouTube, as the ability to run at a fixed 60 FPS is a strict requirement, something that YouTube can't do reliably. For this video's demonstration to work correctly you need to set your display to a 60 Hz refresh rate and you should use a video player capable of maintaining static 60 FPS content decoding.

A not-so-simple set of instructions

Valve released to the world the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming, on Friday evening. We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to do an installation of the new OS on a system and see how it all worked.

Our full video tutorial of installing and configuring SteamOS

First up was selecting the hardware for the build. As is usually the case, we had a nearly-complete system sitting around that needed some tweaks. Here is a quick list of the hardware we used, with a discussion about WHY just below.

We definitely weren't targeting a low cost build with this system, but I think we did create a very powerful system to test SteamOS on. First up was the case, the new EVGA Hadron Mini ITX chassis. It's small, which is great for integration into your living room, yet can still hold a full power, full-size graphics card.

The motherboard we used was the EVGA Z87 Stinger Mini ITX - an offering that Morry just recently reviewed and recommended. Supporting the latest Intel Haswell processors, the Stinger includes great overclocking options and a great feature set that won't leave enthusiasts longing for a larger motherboard.

In any event, I thought it might be interesting to extract this 6-minute discussion we had during last night's live streamed podcast about how the emergence of Litecoin mining operations is driving up prices of GPUs, particularly the compute-capable R9 290 and R9 290X Hawaii-based cards from AMD.

The price of the GTX 770 is a bit higher than it should be while the GTX 780 and GTX 780 Ti are priced in the same range they have been for the last month or so. The same cannot be said for the AMD cards listed here - the R9 280X is selling for $130 more than its expected MSRP at a minimum but you'll see quite a few going for much higher on Amazon, Ebay (thanks TR) and others. The Radeon R9 290 has an MSRP of $399 from AMD but the lowest price we found on Amazon was $499 and anything on Newegg.com is showing at the same price, but sold out. The R9 290X is even more obnoxiously priced when you can find them.
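The markup on the R9 290 alone is easy to put in percentage terms. A quick sketch using the MSRP and the lowest Amazon price mentioned above:

```python
# Street-price markup over MSRP for the R9 290, using the prices cited above.
MSRP = 399          # AMD's stated MSRP in dollars
STREET = 499        # lowest Amazon price we found

markup_pct = (STREET - MSRP) / MSRP * 100
print(f"R9 290: ${STREET} vs ${MSRP} MSRP, about {markup_pct:.0f}% over MSRP")
```

A roughly 25% premium on a card whose entire appeal was its aggressive launch price.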

Do you have any thoughts on this? Do you think Litecoin mining is really causing these price inflations and what does that mean for AMD, NVIDIA and the gamer?

Quality time with G-Sync

Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that will offer up the smoothness advantages of having V-Sync off, while offering the tear-free images normally reserved for gamers enabling V-Sync.
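To see why a variable refresh rate matters, consider a fixed 60 Hz panel with V-Sync on: each frame is held for a whole number of ~16.7 ms refresh intervals, so a frame that takes even slightly longer than one interval to render is displayed for two. A rough sketch of that behavior:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # one refresh window is ~16.7 ms

def displayed_ms_vsync(render_ms):
    """With V-Sync on a fixed-refresh panel, a frame stays on screen for a
    whole number of refresh intervals, rounded up from its render time."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render in (10, 17, 25):
    print(f"{render} ms render -> displayed for {displayed_ms_vsync(render):.1f} ms")
```

A 17 ms frame just misses the window and is held for 33.3 ms, an effective drop to 30 FPS. On a G-Sync display the panel simply refreshes when the frame is ready, so that same frame is shown for 17 ms; that is the stutter the technology removes.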

In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.

The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same ones that will in theory be upgradeable by end users as well sometime in the future. These monitors are TN panels, 1920x1080 and though they have incredibly high refresh rates, aren't usually regarded as the highest image quality displays on the market. However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.

Streaming games straight from NVIDIA

Over the weekend NVIDIA released a December update for the SHIELD Android mobile gaming device that included a very interesting, and somewhat understated, new feature: Beta support for NVIDIA GRID.

You have likely heard of GRID before; NVIDIA has been pushing it as part of the company's vision of bringing GPU computing to every facet and market. GRID was aimed at creating GPU-based server farms to enable mobile, streaming gaming for users across the country and across the world. While initially NVIDIA only talked about working with partners to launch streaming services based on GRID, they have obviously changed their tune slightly with this limited release.

If you own a SHIELD, and install the most recent platform update, you'll find a new icon in your NVIDIA SHIELD menu called GRID Beta. The first time you start this new application, it will attempt to measure your bandwidth and latency to offer up an opinion on how good your experience should be. NVIDIA is asking for at least 10 Mbps of sustained bandwidth, and wants round trip latency under 60 ms from your location to their servers.
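The thresholds NVIDIA quotes are simple gates, and the in-app test is presumably doing something like the following (a hypothetical sketch, not NVIDIA's actual code):

```python
# Hypothetical pre-flight check mirroring the thresholds NVIDIA quotes
# for the GRID beta: at least 10 Mbps sustained bandwidth and under
# 60 ms round-trip latency to their servers.
MIN_BANDWIDTH_MBPS = 10
MAX_LATENCY_MS = 60

def grid_ready(bandwidth_mbps, latency_ms):
    """Return True when a connection meets both published thresholds."""
    return bandwidth_mbps >= MIN_BANDWIDTH_MBPS and latency_ms < MAX_LATENCY_MS

print(grid_ready(25, 40))   # comfortable connection
print(grid_ready(25, 120))  # too far from the Northern California servers
```

Note that the latency requirement is the harder one to meet: bandwidth can be bought, but round-trip time is bounded by distance to the servers.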

Currently, servers are ONLY located in Northern California, so the further out you are, the more likely you will be to run into problems. However, doing some testing in Kentucky and Ohio resulted in very playable gaming scenarios, though we did run into some connection problems that might be load-based or latency-based.

After the network setup portion, users are shown 8 different games that they can try: Darksiders, Darksiders II, Street Fighter X Tekken, Street Fighter IV, Alan Wake, The Witcher 2, Red Faction: Armageddon and Trine 2. You can play them free of charge during this beta, though I think you can be sure they will be removed and erased at some point; just a reminder. Saves work well, and we were able to save and resume games of Darksiders II on GRID easily and quickly.

Starting up the game was fast, about on par with starting up a game on a local PC, though obviously the server is loading it in the background. Once the game is up and running, you are met with some button mapping information provided by NVIDIA for that particular game (great addition) and then you jump into the menus as if you were running it locally.