How to Buy the Best GPU for Gaming

If you're a PC gamer, or a content creator who lives and dies by the speed of your graphics-accelerated software, your video card is the engine that powers what you can do—or how lustily you can brag. Our guide will help you sort through the best video-card options for your desktop PC, what you need to know to upgrade a system, and how to evaluate whether a particular card is a good buy. We'll also touch on some upcoming trends that could affect which card you choose. After all, consumer video cards range from under $50 to well over $1,000. It's easy to overpay or underbuy. (We won't let you do that, though.)

Who's Who in GPUs: AMD vs. Nvidia

First off, what does a graphics card do? And do you really need one? If you're looking at any given pre-built desktop PC on the market, unless it's a gaming-oriented machine, PC makers will de-emphasize the graphics card in favor of promoting the CPU, RAM, or storage options. Indeed, sometimes that's for good reason; a low-cost PC may not have a graphics card at all, relying instead on the graphics-acceleration silicon built into its CPU (an "integrated graphics processor," commonly called an "IGP"). There's nothing inherently wrong with relying on an IGP—most business laptops, inexpensive consumer laptops, and budget-minded desktops have them—but if you're a gamer or a creator, the right graphics card is crucial.

A modern graphics solution, whether it's a discrete video card or an IGP, handles the display of 2D and 3D content, drawing the desktop, and decoding and encoding video content in programs and games.

All of the discrete video cards on the consumer market are built around large graphics processing chips designed by one of two companies: AMD or Nvidia. These processors are referred to as "GPUs," for "graphics processing units," a term that is also applied, confusingly, to the graphics card itself. (Nothing about graphics cards...ahem, GPUs...is simple!)

The two companies work up what are known as "reference designs" for their video cards, standardized versions of a card built around a given GPU. Sometimes these reference-design cards are sold directly by Nvidia (or, less often, by AMD) to consumers. More often, though, they are duplicated by third-party card makers (companies referred to in industry lingo as AMD or Nvidia "board partners"), such as Asus, EVGA, MSI, Gigabyte, Sapphire, XFX, and Zotac. Depending on the graphics chip in question, these board partners may sell their own self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they may fashion their own custom products, with different cooler designs, slight factory overclocking, or features such as LED mood illumination. Some board partners will do both—that is, sell reference versions of a given GPU, as well as their own, more radical designs.

Who Needs a Discrete GPU?

We mentioned integrated graphics (IGPs) above. IGPs are capable of meeting the needs of most general users today, with three broad exceptions...

Professional Workstation Users. These folks, who work with CAD software or in video and photo editing, will still benefit greatly from a discrete GPU. Some of their key applications can transcode video from one format to another or perform other specialized operations using resources from the GPU instead of (or in addition to) those of the CPU. Whether this is faster will depend on the application in question, which specific GPU and CPU you own, and other factors.
Productivity-Minded Users With Multiple Displays. People who need a large number of displays can also benefit from a discrete GPU. Desktop operating systems can drive displays connected to the IGP and a discrete GPU simultaneously. If you've ever wanted five or six displays hooked up to a single system, you can combine an IGP and a discrete GPU to get there.

That said, you don't necessarily need a high-end graphics card to do that. If you're simply displaying business applications, multiple browser windows, or lots of static windows across multiple displays (i.e., not demanding PC games), all you need is a card that supports the display specifications, resolutions, monitor interfaces, and number of panels you need. If you're showing four web browsers across four display panels, a GeForce RTX 2080 Super card, say, won't confer any greater benefit than a GeForce GTX 1660 card with the same supported outputs.

Gamers. And of course, there's the gaming market, for whom the GPU is arguably the most important component. RAM and CPU choices both matter, but if you had to pick between a top-end system circa 2016 with a 2019 GPU or a top-end system today using the highest-end GPU you could buy in 2016, you'd want the former.

Graphics cards fall into two distinct classes: consumer cards meant for gaming and light content creation work, and dedicated cards meant for professional workstations and geared toward scientific computing, calculations, and artificial intelligence work. This guide, and our reviews, will focus on the former, but we'll touch on workstation cards a little bit, later on. The key sub-brands you need to know across these two fields are Nvidia's GeForce and AMD's Radeon RX (on the consumer side of things), and Nvidia's Titan and Quadro, as well as AMD's Radeon Pro and Radeon Instinct (in the pro workstation field).

As recently as 2017, Nvidia had the very high end of the consumer graphics-card market more or less to itself, and it still dominates there. We'll focus here on the consumer cards. Nvidia's consumer card line in late 2019/early 2020 is broken into two distinct classes, both united under the long-running GeForce brand: GeForce GTX and GeForce RTX. AMD's consumer cards, meanwhile, comprise the Radeon RX and (now fading) Radeon RX Vega families, as well as the Radeon VII. Before we get into the individual lines in detail, though, let's outline a very important consideration for any video-card purchase.

Target Resolution: Your First Consideration

Resolution is the horizontal-by-vertical pixel count at which your video card will drive your monitor. This has a huge bearing on which card to buy, and how much you need to spend, when looking at a video card from a gaming perspective.

If you are a PC gamer, a big part of what you'll want to consider is the resolution(s) at which a given video card is best suited for gaming. Nowadays, even low-end cards will display everyday programs at lofty resolutions like 3,840 by 2,160 pixels (a.k.a. 4K). But for strenuous PC games, those cards will not have nearly the power to drive smooth frame rates at high resolutions like those. In games, the video card is what calculates positions, geometry, and lighting, and renders the onscreen image in real time. For that, the higher the in-game detail level and monitor resolution you're running, the more graphics-card muscle is required.
The three most common resolutions at which today's gamers play are 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p or 4K (3,840 by 2,160 pixels). Generally speaking, you'll want to choose a card suited for your monitor's native resolution. (The "native" resolution is the highest supported by the panel, and the one at which the display looks the best.) You'll also see ultra-wide-screen monitors with in-between resolutions (3,440 by 1,440 pixels is a common one); you can gauge these versus 1080p, 1440p, and 2160p by calculating the raw number of pixels for each (multiply the vertical number by the horizontal one) and seeing where that screen resolution fits in relative to the common ones. (See our targeted roundups of the best graphics cards for 1080p play and the best graphics cards for 4K gaming.)
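If you'd like to see that arithmetic spelled out, here's a minimal Python sketch (our illustration, not any official tool) that ranks the resolutions mentioned above by raw pixel count:

```python
# Rank common gaming resolutions by raw pixel count (width x height).
# The resolution list below is just the examples from this article.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "ultra-wide 1440p": (3440, 1440),
    "4K/2160p": (3840, 2160),
}

def pixel_count(width: int, height: int) -> int:
    """Raw pixels per frame: the horizontal count times the vertical count."""
    return width * height

baseline = pixel_count(1920, 1080)
for name, (w, h) in sorted(RESOLUTIONS.items(), key=lambda kv: pixel_count(*kv[1])):
    print(f"{name:>16}: {pixel_count(w, h):>9,} pixels ({pixel_count(w, h) / baseline:.2f}x 1080p)")
```

Run it, and you'll see that a 3,440-by-1,440 ultra-wide works out to about 34 percent more pixels than standard 1440p, while 4K is fully four times the pixel load of 1080p. That ratio is a decent first-order proxy for how much harder a card has to work at a given resolution.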
Now, of course, you can always dial down the detail levels for a game to make it run acceptably at a higher-than-recommended resolution, or dial back the resolution itself. But to an extent, that defeats the purpose of a graphics card purchase. The highest-end cards are meant for 4K play or for playing at very high refresh rates at 1080p or 1440p; you don't have to spend $1,000 or even $500 to play more than acceptably at 1080p. A secondary consideration nowadays, though, is running games at ultra-high frame rates to take advantage of the extra-fast refresh abilities of some new monitors; more on that later. Let's look at the graphics card makers' lines first, and see which ones are suited for what gaming resolutions.

Meet the Radeon and GeForce Families

The GPU lines of the two big graphics-chip makers are constantly evolving, with low-end models suited to low-resolution gameplay ranging up to elite-priced models for gaming at 4K and/or very high refresh rates. Let's look at Nvidia's first.

A Look at Nvidia's Lineup

The company's current line is split between cards using last-generation (a.k.a. "10-series") GPUs dubbed the "Pascal" line, and the newer GTX 1600- and RTX 2000-series lines, based on GPUs using an architecture called "Turing." Its Titan cards are outliers; more on them in a bit. Here's a quick rundown of the currently relevant card classes in the Pascal and Turing families, their rough pricing, and their usage cases...

If you are a keen observer of the market, you may notice that many of the familiar GeForce GTX Pascal cards like the GTX 1070 and GTX 1080 are not listed above. They are being allowed to sell through and are largely going off the market in 2019 and 2020 in favor of their GeForce RTX successors. We expect the same to happen soon for the GeForce GTX 1060, due to the release of the GeForce GTX 1660 and 1660 Ti, and eventually for the lesser Pascal cards.

We'd class the GT 1030 to GTX 1050 as low-end cards, coming in under $100 or a little above. The GTX 1650/1650 Super to GTX 1660 Ti make up Nvidia's current midrange, spanning from about $150 to $300, or a little more. With apologies to nu-soul and nu-metal, the end-of-lifing GTX 1080-class cards, as well as the GeForce RTX 2060 and RTX 2070 (both the originals and the newer "Super" variants), constitute what we'd call the "nu-high end," as they take the place of the old top-end GeForces in the $350 to $700 range. The RTX 2080 Super and RTX 2080 Ti cards, finally, form what we'd call a new "elite class."

As for the Titan cards, these are essentially stripped-down workstation cards that bridge the pro graphics and high-end/4K gaming worlds. For most gamers, the Titans won't be of interest due to their pricing. But know that the Titan Xp (older, around $1,200) and the newer Titan RTX ($2,500) and Titan V ($2,999) cards are options for Powerball-winning gamers, machine-learning pioneers, AI developers, or folks involved in pro/academic GPU-bound calculation work.

A Look at AMD's Lineup

As for AMD's card classes, as 2020 dawns the company is stronger than it has been for some time, competing ably enough with Nvidia's low-end and mainstream cards. It's weaker at the high end, though, and it puts up no resistance against the elite class...

The aging Radeon RX 550 and 560 comprise the low end, while the Radeon RX 570 to RX 590 are the midrange, ideal for 1080p gaming, though one must suspect that their time is limited, given the company's newest additions to its 1080p-play arsenal, the Radeon RX 5500 XT and the Radeon RX 5600 XT. The Radeon RX 580, RX Vega 56, and RX Vega 64 (the first a great-value 1080p card, the latter two good for both 1080p and 1440p play) were hit particularly hard by the cryptocurrency craze of 2017-2018, which inflated card prices to the sky; they have since come back down to earth as they sell through and fade away in favor of newer AMD cards.

Indeed, the 1080p and especially the 1440p AMD cards have seen a shakeup. The company released the first of its new, long-awaited line of 7nm-based "Navi" midrange graphics cards in July, based on a whole new architecture AMD calls RDNA. The first three cards are the Radeon RX 5700, the Radeon RX 5700 XT, and the limited-run Radeon RX 5700 XT Anniversary Edition. All these cards have their sights pinned on the 1440p gaming market, and each indeed powers even the most demanding AAA titles at above 60fps in that resolution bracket.

The Radeon VII is AMD's sole player in the elite bracket; it trades blows with the GeForce RTX 2080 at 4K but generally performs less well at lower resolutions in games. It's nearing its end of life. We suspect that AMD will replace it with a higher-end Navi/RDNA card before too long, and as of February 2020, rumors are circulating that AMD's "Big Navi" GeForce RTX 2080 Ti competitor could be just around the corner, slated for a potential announcement sometime in March of this year.

Graphics Card Basics: Understanding the Core Specs

Now, our comparison charts above should give you a good idea of which card families you should be looking at, based on your monitor and your target resolution. A few key numbers are worth keeping in mind when comparing cards, though: the graphics processor's clock speed, the onboard VRAM (that is, how much video memory it has), and—of course!—the pricing. And then there's adaptive sync.

Clock Speed

When comparing GPUs from the same family, a higher base clock speed (that is, the speed at which the graphics core works) and more cores signify a faster GPU. Again, though: That's only a valid comparison between cards in the same product family. For example, the base clock on the reference GeForce GTX 1080 is 1,607MHz, while the base clock is 1,759MHz on a (factory-overclocked) Republic of Gamers Strix version of the GTX 1080 from Asus in its out-of-the-box Gaming Mode.

Note that this base clock measure is distinct from the graphics chip's boost clock. The boost clock is the speed to which the graphics chip can accelerate temporarily when under load, as thermal conditions allow. This can also vary from card to card in the same family, depending on the robustness of the cooling hardware on the card and the aggressiveness of the manufacturer's factory settings. The top-end partner cards with giant multifan coolers will tend to have the highest boost clocks for a given GPU.

This is to say nothing of AMD's new category, "game clock." According to the company, game clock represents the "average clock speed gamers should expect to see across a wide range of titles," a number that the company's engineers gathered during a test of 25 different titles on the new Navi lineup of cards. We mention this so that you don't compare game clocks to boost or base clocks, which game clock decidedly is not.
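To make the "same family only" rule concrete, here's a small hypothetical sketch; the partner-card figures are placeholders of our own, not vendor specs:

```python
# Compare two cards by clocks, but only within one GPU family.
from dataclasses import dataclass

@dataclass
class CardSpec:
    name: str
    gpu: str        # the underlying GPU, e.g. "GTX 1080"
    base_mhz: int   # guaranteed core clock
    boost_mhz: int  # opportunistic clock under load, thermals permitting

def faster_by_clocks(a: CardSpec, b: CardSpec) -> CardSpec:
    """Pick the likely faster card by clocks; cross-family clocks prove nothing."""
    if a.gpu != b.gpu:
        raise ValueError("Different GPU families: clock speeds are not comparable")
    return a if (a.base_mhz, a.boost_mhz) >= (b.base_mhz, b.boost_mhz) else b

reference = CardSpec("GTX 1080 reference", "GTX 1080", 1607, 1733)
partner = CardSpec("Hypothetical partner OC card", "GTX 1080", 1670, 1809)
print(faster_by_clocks(reference, partner).name)  # the factory-overclocked partner card
```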
Onboard Memory

The amount of onboard video memory (sometimes referred to by the rusty term "frame buffer") is usually matched to the requirements of the games or programs that the card is designed to run. From a PC-gaming perspective, you can generally count on a video card to have enough memory to handle current demanding games at the resolutions and detail levels that the card is suited for. In other words, a card maker generally won't overprovision a card with more memory than it can realistically use; that would inflate the pricing and make the card less competitive. But there are some wrinkles to this.

A card designed for gameplay at 1,920 by 1,080 pixels (1080p) these days will generally be outfitted with 4GB or 6GB of RAM, while cards geared more toward play at 2,560 by 1,440 pixels (1440p) or 3,840 by 2,160 pixels (2160p, or 4K) tend to deploy 8GB or more. Usually, for cards based on a given GPU, all of the cards have a standard amount of memory.

The wrinkles: In some isolated but important cases, card makers offer versions of a card with the same GPU but different amounts of VRAM. Some key ones to know nowadays are cards based on the Radeon RX 5500 XT and RX 580 (4GB versus 8GB). Both are GPUs you'll find in popular midrange cards a bit above or below $200, so mind the memory amount on these; the cheaper versions will have less.

Now, if you're looking to spend $150 or more on a video card, with the idea of all-out 1080p gameplay, a card with at least 4GB of memory really shouldn't be negotiable. Both AMD and Nvidia now outfit their $200-plus GPUs with more RAM than this. (AMD has stepped up to 8GB on its RX-series cards, with 16GB on its Radeon VII, while Nvidia is using 6GB or 8GB on most, with 11GB on its elite GeForce RTX 2080 Ti.) Either way, sub-4GB cards should be reserved for secondary systems, gaming at low resolutions, or simple or older games that don't need much in the way of hardware resources.

Memory bandwidth is another spec you will see. It refers to how quickly data can move into and out of the GPU. More is generally better, but again, AMD and Nvidia have different architectures and sometimes different memory bandwidth requirements, so these numbers are not directly comparable.
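If you want those rules of thumb in checkable form, here's a tiny sketch that encodes the VRAM floors described above; the thresholds are this guide's guidance, not hard requirements:

```python
# Rule-of-thumb minimum VRAM by target gaming resolution, per the text above.
MIN_VRAM_GB = {
    "1080p": 4,   # 4GB-6GB is typical for 1080p-class cards
    "1440p": 8,   # 1440p-and-up cards tend to carry 8GB or more
    "4K": 8,
}

def vram_ok(target: str, card_vram_gb: int) -> bool:
    """True if a card meets the rule-of-thumb VRAM floor for a resolution."""
    return card_vram_gb >= MIN_VRAM_GB[target]

print(vram_ok("1080p", 4))  # True: a 4GB Radeon RX 5500 XT clears the 1080p floor
print(vram_ok("1440p", 4))  # False: for 1440p, step up to the 8GB version
```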
Pricing: How Much Should You Spend?

Generations of cards come and go, but the price bands have been constant for years—at least, when the market was not distorted in 2017-18 by cryptocurrency miners. Now that the rush has abated, AMD and Nvidia are both targeting light 1080p gaming in the $100-to-$180 price range, higher-end 1080p and entry-level 1440p in cards between $200 and $300, and light-to-high-detail 1440p gaming between $300 and $400. If you want a card that can handle 4K handily, you'll need to spend more than $400... at the least. A GPU that can push 4K gaming at high detail levels will cost $500 to $1,200.

Cards in the $150-to-$350 market generally offer performance improvements in line with their additional cost: if a card is a certain amount costlier than another, the increase in performance is usually proportional to the increase in price. In the high-end and elite-level card stacks, though, this rule falls away; spending more money yields diminishing returns.

Once a Religious Issue: FreeSync vs. G-Sync

Should you buy a card based on whether it supports one of these two venerable specs for smoothing gameplay? It depends on the monitor you have.

FreeSync (AMD's solution) and G-Sync (Nvidia's) are two sides of the same coin, a technology called adaptive sync. With adaptive sync, the monitor displays at a variable refresh rate led by the video card; the screen draws at a rate that scales up and down according to the card's output capabilities at any given time in a game. Without it, wobbles in the frame rate can lead to artifacts, staggering/stuttering of the onscreen action, or screen tearing, in which mismatched screen halves display momentarily. Under adaptive sync, the monitor draws a full frame only when the video card can deliver a whole frame.

The monitor you own may support FreeSync or G-Sync, or neither one. FreeSync is much more common, as it doesn't add to a monitor's manufacturing cost; G-Sync requires dedicated hardware inside the display. You may wish to opt for one GPU maker's wares or the other's based on this, but know that the tides are changing on this front. At CES 2019, Nvidia announced a driver tweak that allows FreeSync-compatible monitors to use adaptive sync with late-model Nvidia GeForce cards, and a growing subset of FreeSync monitors has been certified by Nvidia as "G-Sync Compatible." So the choice may not be as black and white (or as red or green) as it has been for years.

Upgrading a Pre-Built Desktop With a New Graphics Card

Assuming the chassis is big enough, most pre-built desktops these days have enough cooling capability to accept a new discrete GPU with no problems. The first thing to do before buying or upgrading a GPU is to measure the inside of your chassis for the available card space. In some cases, you've got a gulf between the far right-hand edge of the motherboard and the hard drive bays. In others, you might have barely an inch. (See our favorite graphics cards for compact PCs.)

Next, check your graphics card's height. The card partners sometimes field their own card coolers that depart from the standard AMD and Nvidia reference designs. Make certain that if your chosen card has an elaborate cooler design, it's not so tall that it keeps your case from closing.

Finally: the power supply unit (PSU). Your system needs a PSU that's up to the task of giving a new card enough juice. This is something to be especially wary of if you're putting a high-end video card in a pre-built PC that was equipped with a low-end card, or no card at all. Doubly so if it's a budget-minded or business system; these PCs tend to have underpowered or minimally provisioned PSUs.

The two most important factors to be aware of here are the number of six-pin and eight-pin cables on your PSU, and the maximum wattage the PSU is rated for. Most modern systems, including those sold by OEMs like Dell, HP, and Lenovo, employ power supplies that include at least one six-pin power connector meant for a video card, and some have both a six-pin and an eight-pin connector. Midrange and high-end graphics cards will require a six-pin cable, an eight-pin cable, or some combination of the two to provide working power to the card. (The lowest-end cards draw all the power they need from the PCI Express slot.) Make sure you know what your card needs in terms of connectors.

Nvidia and AMD both outline recommended power supply wattage for each of their graphics-card families. Take these guidelines seriously, but know that they are just guidelines, and generally conservative ones. If AMD or Nvidia says you need at least a 500-watt PSU to run a given GPU, don't chance it with the 300-watter you may have installed, but know that you don't need an 800-watt PSU to guarantee enough headroom, either.
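Here's a minimal pre-upgrade checklist sketch, assuming you've pulled the recommended wattage and connector requirements from your card's spec page (the figures in the example are hypothetical):

```python
# Check a PSU against a card's recommended wattage and PCIe power connectors.
def psu_ok(psu_watts: int, psu_6pin: int, psu_8pin: int,
           rec_watts: int, card_6pin: int, card_8pin: int) -> bool:
    """True if the PSU meets the card maker's wattage guideline and has
    enough six-pin and eight-pin PCIe power cables for the card."""
    return (psu_watts >= rec_watts
            and psu_6pin >= card_6pin
            and psu_8pin >= card_8pin)

# Hypothetical midrange card: 500W recommended, one eight-pin connector needed.
print(psu_ok(psu_watts=450, psu_6pin=1, psu_8pin=0,
             rec_watts=500, card_6pin=0, card_8pin=1))  # False: this PSU falls short
```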
Ports and Preferences: What Connections Should My Graphics Card Have?

Three kinds of port are common on the rear edge of a current graphics card: DVI, HDMI, and DisplayPort. Some systems and monitors still use DVI, but it's the oldest of the three standards and is being phased out on many high-end cards these days. Most cards have several DisplayPorts (often three) and one HDMI port.

When it comes to HDMI versus DisplayPort, note some differences. First, if you plan on using a 4K display, now or in the future, your card needs to support at least HDMI 2.0a or DisplayPort 1.2/1.2a. It's fine if the GPU supports anything newer than those versions, like HDMI 2.0b or DisplayPort 1.4, but that's the minimum you'll want for smooth 4K playback or gaming. (The latest-gen cards from both makers will be fine on this score.) Also, for now, only DisplayPort 1.4 will support 4K gaming at anything above 60Hz, and DisplayPort is also the only way you'll be able to push 1440p resolution above 60Hz with G-Sync turned on. Make sure you're buying a card with at least a DisplayPort 1.4 output if you plan to use either of these resolutions above the 60Hz threshold with G-Sync enabled.
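Why the port version matters comes down to bandwidth. As a rough illustration, here's the raw pixel-payload math; real links also carry blanking intervals and encoding overhead, so treat these figures as loose lower bounds:

```python
# Back-of-the-envelope display bandwidth: pixels x refresh rate x bits per pixel.
def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Raw uncompressed pixel data rate in gigabits per second (no overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

for hz in (60, 120, 144):
    print(f"4K @ {hz}Hz: ~{raw_gbps(3840, 2160, hz):.1f} Gbps raw")
# 4K @ 60Hz: ~11.9 Gbps -- within HDMI 2.0's roughly 18Gbps ceiling
# 4K @ 120Hz: ~23.9 Gbps -- beyond HDMI 2.0, into DisplayPort 1.4 territory
```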
Note that some of the very latest cards from Nvidia in its GeForce RTX series employ a new port, called VirtualLink. This port looks like (and can serve as) a USB Type-C port that also supports DisplayPort over USB-C. What the port is really designed for, though, is attaching future generations of virtual-reality (VR) headsets, providing power and bandwidth adequate to the needs of VR head-mounted displays (HMDs). It's nice to have, but no VR hardware supports it yet.

Looking Forward: Graphics Card Trends

Nvidia has been in the consumer video card driver's seat for a few years now, but 2020 should see more action than any year in recent memory to shake things up between the two big players.

GeForce Vs. Radeon: Looking Ahead

If your goal is a high-end graphics card (we define that, these days, as cards at $500 or more) for playing games at 4K, and you plan to use the card for three to five years, the upper end of the market is mostly Nvidia's game at the moment. But that could shift as 2020 progresses, with AMD's next-generation "Navi 20" cards expected to roll out. Based on the same 7nm manufacturing process as the first Navis, these cards could change AMD's fortunes in the high-end graphics space. The Radeon VII, AMD's first 7nm-built video card, is a competent offering for 1440p/4K play and content creators, but it doesn't quite topple the RTX 2080 and the newer GeForce RTX 2080 Super in most respects. (See our face-off, AMD Radeon VII vs. Nvidia RTX 2080: Which High-End Gaming Card to Buy?)

VR: New Interfaces, New HMDs?

As we alluded to with VirtualLink, VR is another consideration. VR's requirements are slightly different from those of simple monitors. Both of the original mainstream VR HMDs, the HTC Vive and Oculus Rift, have an effective resolution across both eyes of 2,160 by 1,200 pixels. That's significantly lower than 4K, and it's the reason why midrange GPUs like AMD's Radeon RX 5700 XT or Nvidia's GeForce GTX 1660 Super can be used for VR. On the other hand, VR demands higher frame rates than conventional gaming; low frame rates in VR (anything below 90 frames per second is considered low) can result in a bad VR gaming experience. Higher-end GPUs in the $300-plus category are going to offer better VR experiences today and more longevity overall, but VR with current-generation headsets can be sustained on a lower-end card than 4K requires.

That said, in 2019, two new headsets upped the power requirements a bit. The Oculus Rift S has raised the bar to a resolution of 2,560 by 1,440 pixels across both eyes, while the hotly anticipated Valve Index has pumped its numbers up to 1,440 by 1,600 pixels per eye, or 2,880 by 1,600 pixels in total. If you decide to splurge on one of these newer headsets, you'll need a graphics card that can keep up with their intense demands (80Hz refresh on the Rift S, and up to 144Hz refresh on the Index). Valve, for one, recommends having at least a GeForce GTX 1070 installed if you want to run its new headset at a full clip.

Image Sharpening May Change the Game

Another major change to the landscape of gaming in the course of 2019 was the addition of image-sharpening technologies: Radeon Image Sharpening (RIS) from AMD, and Freestyle from Nvidia. But what are these programs, exactly, and how do they help gamers who are shopping on a budget?

It all has to do with "render scaling." In most modern games, you've likely seen something in your graphics settings that lets you change the render scale of a game. In essence, this takes the resolution you have the game set to (in this example, let's say it's 1440p) and scales the render resolution down by a particular percentage, perhaps down to 2,048 by 1,152 pixels (again, for the sake of this example). But wait...who would make their game look worse on purpose? Users of game sharpeners, that's who. Image-sharpening technologies let you scale down a game's render resolution, thereby increasing the frame rate (lower resolutions mean fewer pixels for the GPU to draw, and thereby less work), while a sharpener cleans things up on the back end for a modest performance cost. "Cleaning things up" involves applying a sharpening filter to the downsampled image, and if you can tune it just right (an 85 percent render scale with a 35 percent sharpen scale is a popular ratio), in theory you can gain a significant amount of performance with little discernible loss in visual clarity.
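Here's the render-scale math in sketch form. Note that the scale percentage applies to each axis, so the actual pixel savings are larger than the slider suggests (this reuses the hypothetical 1440p figures above):

```python
# Compute the downscaled render resolution and the fraction of pixels kept.
def render_resolution(native_w: int, native_h: int, scale_pct: float):
    """Scale each axis by scale_pct; the pixel count shrinks with the square."""
    w = round(native_w * scale_pct / 100)
    h = round(native_h * scale_pct / 100)
    kept = (w * h) / (native_w * native_h)
    return w, h, kept

w, h, kept = render_resolution(2560, 1440, 80)
print(f"{w} x {h}, rendering {kept:.0%} of native pixels")
# -> 2048 x 1152, rendering 64% of the native pixel load on the GPU
```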
Why is this important? If you can render your game down without losing visual quality, ultimately you can render down the impact on your wallet, too. We've pushed image-sharpening technologies to their limit, and in our testing found that the peak downsample is about 30 percent. This means you can buy a card that's nearly a third cheaper than the one you were originally looking at, sharpen the image back up using one of the aforementioned tools, and still get close to the same high-definition gaming experience you would expect from running the game at its native resolution, with no render scaling applied in the first place. When and how you can apply these filters to games has some limitations and caveats, however. We'll get deeper into the nuances (and there are many) in a future article we are working on.

High-Refresh Monitors: A New Frontier for Serious Gamers

Finally, bear in mind a further trend gaining momentum on the monitor side of things: high-refresh gaming monitors. For ages, 60Hz (or 60 screen redraws a second) was the panel-refresh ceiling for most PC monitors. We're now seeing the emergence of lots of models with higher refresh ceilings, designed especially for gamers. These panels may support 120Hz, 144Hz, or more for smoother gameplay. Rumor has it we could even see monitors pushing upward of 300Hz at CES this year. (This ability can also be piggybacked with FreeSync or G-Sync adaptive sync to enable smooth frame rates when the card is pushed to its limit.)

What this means: If you have a video card that can consistently push frames in a given game in excess of 60fps, on a high-refresh monitor you may be able to see those formerly "wasted" frames in the form of smoother game motion. Most casual gamers won't care, but the difference is marked if you play fast-action titles, and e-sports hounds find the fluidity a competitive advantage. (See our picks for the best gaming monitors, including high-refresh models.) In short: Buying a powerful video card that pushes high frame rates can be a boon nowadays even for play at a pedestrian resolution like 1080p, if paired with a high-refresh monitor.
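One way to internalize what a high-refresh panel asks of a card is the per-frame time budget, which is simply the reciprocal of the refresh rate:

```python
# Frame-time budget at common refresh ceilings: 1,000ms divided by the rate.
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz panel: {1000 / hz:.1f} ms per frame")
# A card feeding a 144Hz panel has ~6.9ms to render each frame, well under
# half the ~16.7ms budget of a traditional 60Hz display.
```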
Ready for Our Recommendations?

The GPUs below span the spectrum of budget to high-end, representing a wide range of the best cards that are available now. We'll update this story as the graphics card landscape changes, so check back often for the latest products and buying advice. Note that we've factored in just a sampling of third-party cards here; many more fill out the market. You can take our recommendation of a single reference card in a given card class (like the GeForce RTX 2060 Super, or Radeon RX 5700 XT) as a similar endorsement of the family as a whole.

AMD Radeon RX 5700 XT Review
MSRP: $399.00
Pros: Strong value proposition for a midrange video card. 1440p results above 60fps across all benchmarks. Radeon Image Sharpening improves game visual fidelity.
Cons: Only modest early returns on overclocking our sample. No hardware ray-tracing support. Loud blower cooler.
Bottom Line: With its Radeon RX 5700 XT, AMD introduces its new graphics architecture, a suite of software improvements, and enough speed for the money to keep 1440p gamers very happy, and competing GeForce cards at bay.
Read Review

Nvidia GeForce RTX 2070 Super Review
MSRP: $499.00
Pros: Superb price for performance. Stable overclocking results help it rival the original RTX 2080. Runs quiet. RT and Tensor cores ensure future-proofing.
Cons: It didn't come sooner.
Bottom Line: Packing near-RTX 2080 performance and similar specs, Nvidia's GeForce RTX 2070 Super rules for solid 4K play at 60Hz and high-refresh gameplay at 1440p. It's a killer card for the money.
Read Review

MSI GeForce RTX 2080 Gaming X Trio Review
MSRP: $849.99
Pros: Excellent cooling. Aggressive factory overclock. Dual eight-pin power connectors and higher power rating. Two-zone RGB LED lighting.
Cons: 12.9-inch length means it won't fit in many cases. Cooling design exhausts air into the chassis. Ray-tracing and DLSS features remain underutilized, as with all RTX cards.
Bottom Line: A massive air cooler and dual eight-pin power connectors make MSI's GeForce RTX 2080 Gaming X Trio one of the most robust RTX 2080 partner cards we've seen. The only challenge? Fitting it in your PC's case.
Read Review

Sapphire Pulse Radeon RX 5600 XT Review
MSRP: $279.00
Pros: Fast in its price class. Runs cool. Good for high-end 1080p gaming or midrange 1440p. Radeon Boost works as advertised.
Cons: Card is long and bulky. Almost no overclocking potential with the Performance BIOS.
Bottom Line: The AMD Radeon RX 5600 XT is a powerful if bulky graphics card that gives low-end Nvidia GeForce RTX cards a run for their money, while also flat-out dominating GeForce GTX entries from 2019.
Read Review

Zotac GeForce GTX 1650 Super Twin Fan Review
MSRP: $159.00
Pros: Much faster than the original non-Super GeForce GTX 1650 in 1080p and 1440p gaming. Runs quiet. Priced competitively. Impressively small in our Zotac test sample.
Cons: Underperforms on some games. Runs hotter than the non-Super GTX 1650.
Bottom Line: Zotac's punchy GeForce GTX 1650 Super Twin Fan is markedly better than the non-Super GTX 1650 and a solid version of this mainstream GPU. It gives budget-focused 1080p gamers a better option against competing AMD cards in the same price bracket.
Read Review

AMD Radeon RX 5700 Review
Pros: Excellent price-to-performance ratio. Solid results in 1440p gaming. Blower cooler vents most heat out of your PC case.
Cons: Blower cooling is still pretty loud. Overclocking ceiling is low, at this writing.
Bottom Line: Gaming at 1440p is the strength of the Radeon RX 5700, a solid entry that, alongside its RX 5700 XT kin, makes AMD a real player again in the field of midrange video cards.
Read Review

Nvidia GeForce RTX 2060 Super Review
Pros: Strong results in high-refresh and 1440p gaming. Competitively priced. Highly attractive card build. DVI port for legacy monitors.
Cons: Not powerful enough to handle all AAA games at higher resolutions.
Bottom Line: Nvidia's GeForce RTX 2060 Super straddles the fine line between a video card made for esports and one for AAA gaming performance, and delivers consistently on each side where it matters most.
Read Review

Nvidia GeForce RTX 2080 Super Review
Pros: Better frame rates than the GeForce RTX 2080 for the same price as the RTX 2080 base reference card. Strong 4K gaming results.
Cons: Minimal overclocking headroom. Overkill for 1440p gaming (unless you're using a high-refresh panel).
Bottom Line: It doesn't deliver quite the performance leap that Nvidia's other RTX Super cards do, but the GeForce RTX 2080 Super is a strong 4K gaming graphics card that improves on the original while keeping its price in check.
Read Review

Nvidia GeForce RTX 2080 Ti Founders Edition Review
Pros: Sets a new bar for single-GPU performance. Quiet, cool-running design. Supports ray tracing and DLSS for future games. Easy to attain at least modest overclocks.
Cons: Founders Edition commands a $200 premium over an already expensive base/reference card. Games will take time to adopt ray tracing and DLSS.
Bottom Line: A Ferrari among gaming GPUs, Nvidia's GeForce RTX 2080 Ti Founders Edition represents the fastest class of cards that money can buy. Just be ready to pay a supercar price to enjoy its luxury ride for 4K and high-refresh gaming.
Read Review

AMD Radeon RX 5500 XT Review
Pros: Performs in line with price. Adrenalin 2020 version of AMD's Radeon Software is feature-rich.
Cons: Inconsistent performance. Not much real-world result from overclocking on our Sapphire sample. A little more expensive than parallel Nvidia cards.
Bottom Line: The Radeon RX 5500 XT plays in the video-card zone AMD knows best: budget gaming at 1080p. It's solid and close, but slightly inconsistent performance in the early going allows Nvidia's latest line of GeForce GTX Super GPUs to edge it out.
Read Review
