How to Buy the Best GPU for Gaming

If you're a PC gamer, or a content creator who lives and dies by the speed of your graphics-accelerated software, your video card is the engine that powers what you can do—or how lustily you can brag. And until just a few months ago, that bragging was really expensive.

Over the last two years, buying a video card at times felt like dishing out for a rare flower bulb, not a PC component, amid some 21st-century tulip frenzy. The cryptomining crazes of 2017 and 2018 drove wild demand for graphics horsepower—the kind of computing muscle best suited to amateur and professional digital currency mining—and thus for certain video cards. Prices for even modest mainstream cards flew sky-high. For a time, the market went downright bonkers. Some cards traded for double their list prices or more, if you could find them in stock at all.

Here in early '19, that hubbub has died down—at least for the moment. Our guide will help you sort through the best video-card options for your desktop PC, what you need to know to upgrade a system, and how to evaluate whether a particular card is a good buy. We'll also touch on some upcoming trends—they could affect which card you choose.

Even with cryptomania on the wane for now, you do need to choose with care. After all, consumer video cards range from under $50 to well over $1,000. It's easy to overpay or underbuy. (We won't let you do that, though.)

Who's Who in GPUs

First off, what does a graphics card do? And do you really need one?

Look at any given pre-built desktop PC on the market, and unless it's a gaming-oriented machine, you'll find its maker de-emphasizing the graphics card in favor of promoting CPU, RAM, or storage options. Sometimes that's for good reason; a low-cost PC may not have a graphics card at all, relying instead on the graphics-acceleration silicon built into its CPU (an "integrated graphics processor," commonly called an "IGP"). There's nothing inherently wrong with relying on an IGP—most business laptops, inexpensive consumer laptops, and budget-minded desktops have them—but if you're a gamer or a creator, the right graphics card is crucial.

A modern graphics solution, whether it's a discrete video card or an IGP, handles the display of 2D and 3D content, drawing the desktop, and decoding and encoding video content in programs and games. All of the discrete video cards on the consumer market are built around large graphics processing chips designed by one of two companies: AMD or Nvidia. These processors are referred to as "GPUs," for "graphics processing units," a term that is also applied, confusingly, to the graphics card itself. (Nothing about graphics cards...ahem, GPUs...is simple!)

The two companies work up what are known as "reference designs" for their video cards, a standardized version of a card built around a given GPU. Sometimes these reference-design cards are sold directly by Nvidia (or, less often, by AMD) to consumers. More often, though, they are duplicated by third-party card makers (companies referred to in industry lingo as AMD or Nvidia "board partners"), such as Asus, EVGA, MSI, Gigabyte, Sapphire, XFX, and Zotac. Depending on the graphics chip in question, these board partners may sell their own self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they may design their own custom products, with different cooler designs, modest factory overclocking, or features such as LED mood illumination. Some board partners will do both—that is, sell reference versions of a given GPU, as well as their own, more radical designs.

Who Needs a Discrete GPU?

We mentioned integrated graphics (IGPs) above. IGPs are capable of meeting the needs of most general users today, with three broad exceptions...

Professional Workstation Users. These folks, who work with CAD software or in video and photo editing, will still benefit greatly from a discrete GPU. Some of their key applications can transcode video from one format to another or perform other specialized operations using resources from the GPU instead of (or in addition to) those of the CPU. Whether this is faster will depend on the application in question, which specific GPU and CPU you own, and other factors.

Productivity-Minded Users With Multiple Displays. People who need a large number of displays can also benefit from a discrete GPU. Desktop operating systems can drive displays connected to the IGP and discrete GPUs simultaneously. If you've ever wanted five or six displays hooked up to a single system, you can combine an IGP and a discrete GPU to get there.

That said, you don't necessarily need a high-end graphics card to do that. If you're simply displaying business applications, multiple browser windows, or lots of static windows across multiple displays (i.e., not demanding PC games), all you need is a card that supports the display specifications, resolutions, monitor interfaces, and number of panels you need. If you're showing four web browsers across four display panels, a GeForce GTX 1080, say, won't confer any greater benefit than a GTX 1060 with the same supported outputs.

Gamers. And of course, there's the gaming market, to whom the GPU is arguably the most important component. RAM and CPU choices both matter, but if you have to pick between a top-end system circa 2015 with a 2019 GPU or a top-end system today using the highest-end GPU you could buy in 2015, you'd want the former.

Graphics cards fall into two distinct classes: consumer cards meant for gaming and light content creation work, and dedicated cards meant for professional workstations and geared toward scientific computing, calculations, and artificial intelligence work. This guide, and our reviews, will focus on the former, but we'll touch on workstation cards a little bit, later on. The key sub-brands you need to know across these two fields are Nvidia's GeForce and AMD's Radeon RX (on the consumer side of things), and Nvidia's Titan and Quadro, as well as AMD's Radeon Pro and Radeon Instinct (in the pro workstation field). As recently as 2017, Nvidia had the very high end of the consumer graphics-card market more or less to itself, and it still dominates there.

We'll focus here on the consumer cards. Nvidia's consumer card line in 2019 is broken into two distinct classes, both united under the long-running GeForce brand: GeForce GTX, and GeForce RTX. AMD's consumer cards, meanwhile, comprise the Radeon RX and Radeon RX Vega families, as well as the new Radeon VII.

Before we get into the individual lines in detail, though, let's outline a very important consideration for any video-card purchase.

Target Resolution: Your First Consideration

Resolution is the horizontal-by-vertical pixel count at which your video card will drive your monitor. This has a huge bearing on which card to buy, and how much you need to spend, when looking at a video card from a gaming perspective.

If you are a PC gamer, a big part of what you'll want to consider is the resolution(s) at which a given video card is best suited for gaming. Nowadays, even low-end cards will display everyday programs at lofty resolutions like 3,840 by 2,160 pixels (a.k.a., 4K). But for strenuous PC games, those same cards will not have nearly the power to drive smooth frame rates at such high resolutions. In games, the video card is what calculates positions, geometry, and lighting, and renders the onscreen image in real time. The higher the in-game detail level and monitor resolution you're running, the more graphics-card muscle that requires.

The three most common resolutions at which today's gamers play are 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p or 4K (3,840 by 2,160 pixels). Generally speaking, you'll want to choose a card suited for your monitor's native resolution. (The "native" resolution is the highest supported by the panel, and the one at which the display looks the best.) You'll also see ultra-wide-screen monitors with in-between resolutions (3,840 by 1,440 pixels is a common one); you can gauge these versus 1080p, 1440p, and 2160p by calculating the raw number of pixels for each (multiply the vertical number by the horizontal one) and seeing where that screen resolution fits in relative to the common ones. (See our targeted roundup of the best graphics cards for 1080p play.)
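If you want to run that comparison yourself, here's a minimal sketch of the arithmetic in Python, using the resolutions named above (the labels are just for readability):

```python
# Rank common gaming resolutions (plus a common ultra-wide panel) by
# raw pixel count -- the number the GPU must render every frame.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "ultra-wide 1440p": (3840, 1440),
    "4K": (3840, 2160),
}

def total_pixels(width, height):
    """Raw number of pixels in one frame at the given resolution."""
    return width * height

for name, (w, h) in sorted(RESOLUTIONS.items(),
                           key=lambda kv: total_pixels(*kv[1])):
    print(f"{name:>17}: {total_pixels(w, h):,} pixels")
```

Run it, and you'll see that the 3,840-by-1,440 ultra-wide lands squarely between 1440p and 4K in rendering load, which is how you should gauge the card it needs.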

Now, of course, you can always dial down the detail levels for a game to make it run acceptably at a higher-than-recommended resolution, or dial back the resolution itself. But to an extent, that defeats the purpose of a graphics card purchase. The highest-end cards are meant for 4K play or for playing at very high refresh rates at 1080p or 1440p; you don't have to spend $1,000 or even $500 to play more than acceptably at 1080p. A secondary consideration nowadays, though, is running games at ultra-high frame rates to take advantage of the extra-fast refresh abilities of some new monitors; more on that later. Let's look at the graphics card makers' lines first, and see which ones are suited for what gaming resolutions.

Meet the Radeon and GeForce Families

The GPU lines of the two big graphics-chip makers are constantly evolving, with low-end models suited to low-resolution gameplay ranging up to elite-priced models for gaming at 4K and/or very high refresh rates. Let's look at Nvidia's first.

Nvidia's Lineup, Early 2019

The company's current line is split between cards using last-generation (a.k.a. "10-series") GPUs dubbed the "Pascal" line, and a new-in-2018 20-series line, based on GPUs called "Turing." Its Titan cards are outliers; more on them in a bit.

Here's a quick rundown of the card classes in the Pascal and Turing families, their rough pricing, and their usage cases...

Note that the GeForce GTX 1070- and 1080-class Pascal cards are being allowed to sell through and are going off the market in 2019 in favor of their GeForce RTX successors. We expect the same to happen soon for the GeForce GTX 1060, and eventually, the lesser Pascal cards. We'd class the GT 1030 to GTX 1050 as low-end cards, coming under $100 or a little above. The GTX 1050 Ti to GTX 1070 Ti make up Nvidia's comprehensive midrange, spanning from about $150 to $350, with a few of these cards ranging close to $500.

With apologies to nu-soul and nu-metal, the end-of-lifing GTX 1080-class cards, as well as the new RTX 2060 and RTX 2070, constitute what we'd call the "nu-high end," as they take the place of the old top-end GeForces in the $350 to $700 range. The RTX 2080 and RTX 2080 Ti cards, finally, we'd call a new "elite class."

As for the Titan cards, these are essentially stripped-down workstation cards that bridge the pro graphics and high-end/4K gaming worlds. For most gamers, the Titans won't be of interest due to their pricing. But know that the Titan Xp (older, around $1,200) and newer Titan RTX ($2,500) and Titan V ($2,999) cards are options for Powerball-winning gamers or pro/academic GPU-bound calculation work.

AMD's Lineup, Early 2019

As for AMD's card classes, here in early '19 the company is stronger competing with Nvidia's low-end and mainstream cards than its high-end ones, and it fields just a single contender in the elite class...

The Radeon RX 550 and 560 comprise the low end, while the RX 570 to 590 make up the midrange and are ideal for 1080p gaming. The RX 580 and the RX Vega 56 and Vega 64 cards (the Vegas good for 1080p and 1440p play) were hit particularly hard by the crypto craze but have since come back down to earth. The Radeon VII is AMD's sole player in the elite bracket; it trades blows with the GeForce RTX 2080 at 4K but generally performs less well at lower resolutions in games.

Graphics Card Basics: Understanding the Core Specs

Now, the rundowns above should give you a good idea of which card families you should be looking at, based on your monitor and your target resolution. A few key numbers are worth keeping in mind when comparing cards, though: the graphics engine's clock speed, the onboard VRAM (that is, how much video memory it has), and—of course!—the pricing. And then there's adaptive sync.

Engine Clock Speed

When comparing GPUs from the same family, a higher base clock speed (that is, the speed at which the graphics core works) and more cores signify a faster GPU. Again, though: That's only a valid comparison between cards in the same product family. For example, the base clock on the reference version of the venerable GeForce GTX 1080 is 1,607MHz, while the factory-overclocked Republic of Gamers Strix version of the GTX 1080 from Asus runs a higher base clock (1,670MHz) in its out-of-the-box Gaming Mode.

Note that this base clock measure is distinct from the graphics chip's boost clock. The boost clock is the speed to which the graphics chip can accelerate temporarily when under load, as thermal conditions allow. This can also vary from card to card in the same family. It depends on the robustness of the cooling hardware on the card and the aggressiveness of the manufacturer in its factory settings. The top-end partner cards with giant multifan coolers will tend to have the highest boost clocks for a given GPU.

Onboard Memory

The amount of onboard video memory (sometimes referred to by the rusty term "frame buffer") is usually matched to the requirements of the games or programs that the card is designed to run. In a certain sense, from a PC-gaming perspective, you can count on a video card to have enough memory to handle current demanding games at the resolutions and detail levels that the card is suited for. In other words, a card maker generally won't overprovision a card with more memory than it can realistically use; that would inflate the pricing and make the card less competitive. But there are some wrinkles to this.

A card designed for gameplay at 1,920 by 1,080 pixels (1080p) these days will generally be outfitted with 4GB or 6GB of RAM, while cards geared more toward play at 2,560 by 1,440 pixels (1440p) or 3,840 by 2,160 (2160p, or 4K) tend to deploy 8GB or more. Usually, for cards based on a given GPU, all of the cards have a standard amount of memory. The wrinkles: In some isolated but important cases, card makers offer versions of a card with the same GPU but different amounts of VRAM. The key ones to know nowadays: cards based on the GeForce GTX 1060 (some lesser versions offer 3GB, versus the full-fat 6GB), and the Radeon RX 580 (4GB versus 8GB). Both are GPUs you'll find in popular midrange cards a bit above or below $200, so mind the memory amount on these. The cheaper versions will have less.

Now, if you're looking to spend $150 or more on a video card, with the idea of all-out 1080p gameplay, a card with at least 4GB of memory really shouldn't be negotiable. Both AMD and Nvidia now outfit their $200-plus GPUs with more RAM than this. (AMD has stepped up to 8GB on its RX Vega cards, with 16GB on its Radeon VII, while Nvidia is using 6GB or 8GB on most, with 11GB on its elite GeForce RTX 2080 Ti.) Either way, sub-4GB cards should only be used for secondary systems, gaming at low resolutions, or simple or older games that don't need much in the way of hardware resources.

Memory bandwidth is another spec you will see. It refers to how quickly data can move into and out of the GPU. More is generally better, but again, AMD and Nvidia have different architectures and sometimes different memory bandwidth requirements, so numbers are not directly comparable.

Pricing: How Much Should You Spend?

Generations of cards come and go, but the price bands were constant for years—at least, when the market was not distorted by crypto. Now that the rush has abated, AMD and Nvidia are both targeting light 1080p gaming in the $100-to-$150 price range, higher-end 1080p and entry-level 1440p with cards between $200 and $300, and light to high-detail 1440p gaming between $300 and $400.

If you want a card that can handle 4K handily, you'll need to spend more than $400; a GPU that can push 4K gaming at high detail levels will cost $500 to $1,200. Cards in the $150-to-$350 market generally offer performance improvements in line with their additional cost: if one card is a certain amount costlier than another, the performance gain is usually roughly proportional. In the high-end and elite-level card stacks, though, this rule falls away; spending more money yields diminishing returns.
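To see what that proportionality (and its breakdown at the top of the market) looks like arithmetically, here's a quick sketch. The prices and frame rates below are made-up illustrative figures, not benchmark results:

```python
# Performance-per-dollar with hypothetical numbers: in the midrange,
# frame rates scale roughly with price; at the elite end, each extra
# dollar buys noticeably less.
cards = {
    # name: (price_usd, avg_fps)  -- illustrative figures only
    "midrange A": (200, 60),
    "midrange B": (300, 88),    # ~50% more money, ~47% more speed
    "elite C":    (1200, 140),  # 4x the money of B, ~60% more speed
}

for name, (price, fps) in cards.items():
    print(f"{name:>10}: {fps / price:.3f} fps per dollar")
```

The two midrange entries land near the same fps-per-dollar figure; the elite card's ratio drops well below them—the diminishing returns described above.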

Once a Religious Issue: FreeSync Vs. G-Sync

Should you buy a card based on whether it supports one of these two venerable specs for smoothing gameplay? It depends on the monitor you have.

FreeSync (AMD's solution) and G-Sync (Nvidia's) are two sides of the same coin, a technology called adaptive sync. With adaptive sync, the monitor displays at a variable refresh rate led by the video card; the screen draws at a rate that scales up and down according to the card's output capabilities at any given moment in a game. Without it, wobbles in the frame rate can lead to artifacts, staggering/stuttering action, or screen tearing, in which mismatched screen halves display momentarily. Under adaptive sync, the monitor draws a full frame only when the video card can deliver a whole frame.

The monitor you own may support FreeSync or G-Sync, or neither one. FreeSync is much more common, as it doesn't add to a monitor's manufacturing cost; G-Sync requires dedicated hardware inside the display. You may wish to opt for one GPU maker's wares or the other's based on this, but know that the tides are changing on this front. At CES 2019, Nvidia announced a driver tweak that will allow FreeSync-compatible monitors to use adaptive sync with late-model Nvidia GeForce cards, and a small subset of FreeSync monitors have been certified by Nvidia as "G-Sync Compatible." So the choice may not be as black and white (or as red or green) as it has been for years.

Upgrading a Pre-Built Desktop With a New Graphics Card

Assuming the chassis is big enough, most pre-built desktops these days have enough cooling capability to accept a new discrete GPU with no problems.

The first thing to do before buying or upgrading a GPU is to measure the inside of your chassis for the available card space. In some cases, you've got a gulf between the far right-hand edge of the motherboard and the hard drive bays. In others, you might have barely an inch. (See our favorite graphics cards for compact PCs.)

Next, check your graphics card's height. The card partners sometimes field their own card coolers that depart from the standard AMD and Nvidia reference designs. Make certain that if your chosen card has an elaborate cooler design, it's not so tall that it keeps your case from closing.

Finally: the power supply unit (PSU). Your system needs to have a PSU that's up to the task of giving a new card enough juice. This is something to be especially wary of if you're putting a high-end video card in a pre-built PC that was equipped with a low-end card, or no card at all. Doubly so if it's a budget-minded or business system; these PCs tend to have underpowered or minimally provisioned PSUs.

The two most important factors to be aware of here are the number of six-pin and eight-pin cables on your PSU, and the maximum wattage the PSU is rated for. Most modern systems, including those sold by OEMs like Dell, HP, and Lenovo, employ power supplies that include at least one six-pin power connector meant for a video card, and some have both a six-pin and an eight-pin connector. Midrange and high-end graphics cards will require a six-pin cable, an eight-pin cable, or some combination of the two to provide working power to the card. (The lowest-end cards draw all the power they need from the PCI Express slot.) Make sure you know what your card needs in terms of connectors.

Nvidia and AMD both outline recommended power supply wattage for each of their graphics-card families. Take these guidelines seriously, but they are just guidelines, and they are generally conservative. If AMD or Nvidia says you need at least a 500-watt PSU to run a given GPU, don't chance it with the 300-watter you may have installed, but know that you don't need an 800-watt PSU to guarantee enough headroom, either.

Ports and Preferences: Understanding Video Card Connections

Three kinds of port are common on the rear edge of a current graphics card: DVI, HDMI, and DisplayPort. Some systems and monitors still use DVI, but it's the oldest of the three standards and is being phased out on many high-end cards here in 2019.

Most cards have several DisplayPorts (often three) and one HDMI port. When it comes to HDMI versus DisplayPort, note some differences. First, if you plan on using a 4K display, now or in the future, your card needs to at least support HDMI 2.0a or DisplayPort 1.2/1.2a. It's fine if the GPU supports anything above those labels, like HDMI 2.0b or DisplayPort 1.4, but that's the minimum you'll want for smooth 4K playback or gaming. (The latest-gen cards from both makers will be fine on this score.)

Note that some of the very latest cards from Nvidia in its GeForce RTX series employ a new port, called VirtualLink. This port looks like (and can serve as) a USB Type-C port that also supports DisplayPort over USB-C. What the port is really designed for, though: attaching future generations of virtual-reality (VR) headsets, providing power and bandwidth adequate to the needs of VR head-mounted displays (HMDs). It's nice to have, but no VR hardware supports it yet.

Looking Forward: Graphics Card Trends

Nvidia has been in the consumer video card driver's seat for a few years now, but 2019 should see more action than any year in recent memory to shake things up between the two big players.

GeForce Vs. Radeon: Looking Ahead

If your goal is a high-end graphics card (we define that, these days, as cards at $500 or more) for playing games at 4K, and you plan to use the card for three to five years, the upper end of the market is mostly Nvidia's game at the moment. But that could shift as 2019 progresses, with AMD's next-generation "Navi" cards expected later this year. Based on a new 7nm manufacturing process, these cards could change AMD's fortunes in the graphics space. The Radeon VII, its first 7nm-built video card, is a competent offering for 1440p/4K play and content creators, but it doesn't quite topple the RTX 2080 in most respects. (See our face-off AMD Radeon VII vs. Nvidia RTX 2080: Which High-End Gaming Card to Buy?)

VR: New Interfaces, New HMDs?

As we alluded to with VirtualLink, VR is another consideration. VR's requirements are slightly different from those of simple monitors. Both of the mainstream VR HMDs, the HTC Vive and Oculus Rift, have an effective resolution across both eyes of 2,160 by 1,200. That's significantly lower than 4K, and it's why midrange GPUs like AMD's Radeon RX 580 or Nvidia's GeForce GTX 1060 can be used for VR. On the other hand, VR demands higher frame rates than conventional gaming; low frame rates in VR (anything below 90 frames per second is considered low) make for a bad experience. Higher-end GPUs in the $300-plus category will offer better VR experiences today and more longevity overall, but VR with current-generation headsets can be sustained on a lower-end card than 4K can. Coming cards in 2019, we suspect, may push the VR bar lower.
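The trade-off is easy to check with back-of-the-envelope math: VR pushes fewer pixels per frame than 4K, but many more frames per second. A quick sketch using the figures above (it ignores real-world VR overhead such as supersampling):

```python
# Compare rendering throughput: a Vive/Rift-class headset at 90Hz
# versus a 4K monitor at 60Hz, in raw pixels per second.
def pixels_per_second(width, height, fps):
    """Pixels the GPU must render each second at a given resolution/rate."""
    return width * height * fps

vr  = pixels_per_second(2160, 1200, 90)  # both eyes combined, 90Hz
uhd = pixels_per_second(3840, 2160, 60)  # 4K monitor, 60Hz

print(f"VR HMD @ 90Hz : {vr:,} pixels/s")
print(f"4K     @ 60Hz : {uhd:,} pixels/s")
```

Even at 90 frames per second, the headset's total pixel throughput works out to less than half that of 4K at 60Hz, which is why a midrange card can sustain current-generation VR but not 4K gaming.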

High-Refresh: A New Frontier for Serious Gamers

Finally, bear in mind a further trend gaining momentum on the monitor side of things: high-refresh gaming monitors. For ages, 60Hz (or 60 screen redraws a second) was the panel-refresh ceiling for most PC monitors. We're seeing the emergence of lots of models now with higher refresh ceilings, designed especially for gamers. These panels may support up to 120Hz, 144Hz, or more for smoother gameplay. (This ability can also be piggybacked with FreeSync or G-Sync adaptive sync to enable smooth frame rates when the card is pushed to its limit.)

What this means: If you have a video card that can consistently push frames in a given game in excess of 60fps, you may be able to see those formerly "wasted" frames in the form of smoother game motion. Most casual gamers won't care, but the difference is marked if you play fast-action titles, and competitive e-sports hounds will find the fluidity a competitive advantage. (See our picks for the best gaming monitors, including high-refresh models.) In short: Buying a powerful video card that pushes high frame rates can be a boon nowadays even for play at a pedestrian resolution like 1080p, if paired with a high-refresh monitor. High-refresh/high-detail gaming at 4K, though, is still beyond today's cards in the most demanding games.
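One way to think about high-refresh play is in terms of the per-frame time budget—the window the GPU has to finish rendering each frame if it's to saturate the monitor. A quick sketch of the arithmetic (adaptive sync relaxes this constraint by letting the monitor follow the card's actual output rate):

```python
# Frame-time budgets at common monitor refresh ceilings.
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz:>3}Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

Going from 60Hz to 144Hz cuts the budget from roughly 16.7ms to about 6.9ms per frame—which is why high-refresh gaming demands a much more powerful card even at 1080p.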

Ready for Our Recommendations?

The GPUs below span the spectrum of budget to high-end, representing a wide range of the best cards that are available now. We'll update this story as the graphics card landscape changes, so check back often for the latest products and buying advice.

Note that we've factored in just a sampling of third-party cards here; many more fill out the market. You can take our recommendation of a single reference card in a given card class (like the GeForce GTX 1060, or Radeon RX Vega 64) as a similar endorsement of the family as a whole.

Best Graphics Cards Featured in This Roundup:

Cons: 12.9-inch length means it won't fit in many cases. Cooling design exhausts air into chassis. Ray-tracing and DLSS features remain underutilized, like with all RTX cards.

Bottom Line: A massive air cooler and dual eight-pin power connectors make MSI's GeForce RTX 2080 Gaming X Trio one of the most robust RTX 2080 partner cards we've seen. The only challenge? Fitting it in your PC's case.

Cons: Priced higher than GTX 1070 it replaces. Not powerful enough for maxed-out 4K gaming in every game.

Bottom Line: MSI's GeForce RTX 2070 Armor graphics card has great cooling, plus overclocking headroom to spare. It's solid for 1440p play, but it performs much like the outgoing GTX 1080, so step up to an RTX 2080 if you want a generational performance gain.

Pros: Delivers excellent graphics, especially at 1080p. One of the least power-hungry cards of the current generation.

Cons: Less-expensive competing AMD card can be faster. Founders Edition version is costly. No support for even dual-card SLI.

Bottom Line: This Founders Edition card lowers the admission cost into VR by a bunch, and generally outpaces the AMD Radeon RX 480. But it also costs more than AMD's card, and the RX 480 was more competitive in some newer tests.

Cons: Hiked-up price, versus GTX 1080 Founders Edition. Hard to judge value of ray tracing and DLSS until games come to market. Cooling design exhausts most air into case, not out.

Bottom Line: An exceptionally powerful graphics card, the GeForce RTX 2080 Founders Edition is a home run for gaming at 4K or high refresh rates. Only its pricing and the lack of games supporting ray tracing and DLSS keep it from being a grand slam right from launch.

Cons: Founders Edition commands a $200 premium over an already expensive base/reference card. Games will take time to adopt ray tracing and DLSS.

Bottom Line: A Ferrari among gaming GPUs, Nvidia's GeForce RTX 2080 Ti Founders Edition represents the fastest class of cards that money can buy. Just be ready to pay a supercar price to enjoy its luxury ride for 4K and high-refresh gaming.

Cons: High power requirements. Physically large for a mid-level graphics card. The Radeon RX 580 is considerably cheaper and not much slower.

Bottom Line: If you ignore power consumption, the Radeon RX 590 is the best-performing midrange card you can buy (as tested in this XFX model), showing double-digit gains over the GeForce GTX 1060. However, the existing Radeon RX 580 has the economic edge.

Bottom Line: AMD's new Radeon flagship graphics card, the Radeon VII is a worthwhile if power-hungrier alternative to the GeForce RTX 2080 for 4K gaming, but it generally isn't as fast at 1080p or 1440p resolutions.