Everything You Need To Know About Gaming Monitors in 2021

One of these days, you’ll decide to upgrade that dusty old monitor of yours for a proper gaming display. And when that day comes, you’ll probably be bombarded with flashy advertisements, as well as uncalled-for advice from online buddies. They’ll shout about the refresh rates you’ll need and the latency you’ll want to avoid, but let’s be real: what do all those terms actually mean? Well, let us explain.

Sure, if you’re fairly new to displays in general, gaming monitor specifications can get overwhelming. But it’s not exactly rocket science, either. Some basic knowledge of these terms and standards ensures you can find a monitor that better suits your tastes. Yes, more Hertz sounds nice, but why?

To get you up to speed, we’ll go over the most important selling points for gaming monitors as of 2021. These are the terms and technologies you’ll need to know when reading gaming display specifications.


Monitor size — Bigger doesn’t have to be better

Specifications rarely get this basic when buying new hardware. It’s all about physical size, nothing more than that. These dimensions should be the first choice you make, as preferred sizes heavily depend on your budget, as well as personal taste. Bigger screens usually cost more, while also posing placement issues for smaller set-ups.

As is the standard, monitor size is measured as the diagonal of the display panel itself, expressed in inches. For gaming monitors, 27” serves as a solid starting point — whether you’re on a budget or not. Most manufacturers launch new display technology as a 27” monitor first, later expanding their range with bigger and/or wider panels.

That’s not that weird when you think about it. On your everyday desk, 27” is big enough for comfortable viewing and gaming, with fine pixel densities for anything 1080p and up. As long as you’re not seated uncomfortably close to (or far away from) your main display, 27” is a great place to start. Additionally, high-end 27” models from a few years ago can easily become next year’s budget picks.
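Curious how size and resolution combine into pixel density? Divide the diagonal pixel count by the diagonal in inches. A quick sketch in Python, purely for illustration:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Pixel density of a 27" panel at common gaming resolutions:
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f'27" {name}: {ppi(w, h, 27):.0f} PPI')
# 27" 1080p: 82 PPI | 27" 1440p: 109 PPI | 27" 4K: 163 PPI
```

Around 109 PPI, which a 27” 1440p panel delivers, is often cited as a sweet spot for typical desk distances.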

Promotional render of a 32:9 ultra-wide gaming monitor

Aspect ratio — Your usual 16:9, going wide

As Earthlings, we’ve all but standardized the width-to-height ratio of our displays — at least those in our living rooms and on our desks. Whether they’re meant for productivity or gaming, most screens feature a 16:9 aspect ratio.

And let’s be clear: there’s nothing wrong with that. After all, games (as well as entire console generations) are built around that aspect ratio. The same goes for YouTube videos, Twitch streams and the majority of Netflix shows: it’s all 16:9. By going for that aspect ratio, you inherently opt for the least amount of “letterboxing” on your non-gaming content. Black bars, begone.

There are slight variations on the formula, though. For a more cinematic experience, you can opt for a 21:9 monitor, often called an “ultra-wide” display. This matches most modern cinema films and looks great in immersive video games, but it isn’t always fully supported. The same goes for 32:9, which is effectively two 16:9 displays stapled together.

These widened ratios are great for movies, multitasking, and some immersive gaming, but be warned: they mostly net you zero advantages in competitive gaming. Some games might not support the resolution natively; others work around your wide view. A game like Overwatch will let you play on any aspect ratio, but tweaks your field-of-view (FOV) to make sure you can’t “see more” than your opponent. Take factors like these into consideration when browsing screen widths.

Resolution — Do you even want 4K?

Apart from the actual display size, resolution stands for the number of pixels your monitor displays. From Full HD to 8K: a higher native pixel count makes for crisper picture quality. Do note: by raising the resolution, the pressure on your gaming hardware increases significantly. Churning out beautiful 4K visuals can still only be done by the most expensive of video cards.

Therefore, the so-called “2K” resolution has quickly become one of gaming’s favorites. By displaying 2560 by 1440 pixels, you get roughly 78% more pixels than good old 1080p — without immediately turning your graphics card into dust. These 1440p monitors offer a nice middle ground for gamers, especially if your hardware isn’t all that next-gen yet.
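That 78% figure is simple arithmetic: multiply width by height and compare the totals. A quick sketch in Python, purely for illustration:

```python
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "1440p ('2K')":    (2560, 1440),
    "2160p (4K)":      (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x) | 1440p: 3,686,400 (1.78x) | 4K: 8,294,400 (4.00x)
```

Note that 4K is a full four times the pixel count of 1080p, which is why it punishes graphics cards so hard.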

In addition, 1440p might give your hardware some headroom to experiment with higher refresh rates. If you’re into competitive gaming, that might be more important than the sharpness of your screen…

Refresh rates — How a higher frame rate can Hertz you

A frame rate is how many frames of gameplay your system is actively churning out. To display those frames, every monitor has a refresh rate at which it tops out, usually defined in Hertz (Hz). A speedier refresh rate means the monitor can keep up with higher frame rates, which is especially helpful in competitive gaming. Having a higher “ceiling” when it comes to frames ensures your display is never the bottleneck of your set-up.

Our trusty 60 frames per second (fps), and thereby 60Hz, has long been the everyday monitor’s standard. But as display panels grew faster over the years, modern gaming displays feature much higher refresh rates. Some particularly beefed-up monitors can display up to 360 frames per second, although that’s more of “a weird flex” from the manufacturer than something you’ll need.
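To put those numbers in perspective: a refresh rate translates directly into a time budget per frame, namely 1000 milliseconds divided by the Hertz. A quick sketch in Python, purely for illustration:

```python
# Each refresh gets a time budget of 1000 ms divided by the refresh rate.
for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per refresh")
# 60 Hz -> 16.7 ms | 144 Hz -> 6.9 ms | 240 Hz -> 4.2 ms | 360 Hz -> 2.8 ms
```

Notice the diminishing returns: going from 60Hz to 144Hz shaves off nearly 10 ms per frame, while going from 240Hz to 360Hz saves less than 1.5 ms.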

If you’re looking at monitors for competitive gaming, you’ll notice that 144Hz has swiftly become the new gold standard for this class of displays. Anything higher than that can generate buttery smooth visuals, but many competitive games, and especially their servers, update less often than that anyway.

Promotional photograph showing a PC gamer using a 360 Hz gaming display.

Input latency — How low can you go?

Latency, often referred to as “lag”, is what you want the least of. It’s usually defined in milliseconds (ms), and in the case of display technology it’s mostly measured “Gray to Gray” (GtG). A display boasting “1 ms (GtG)” means it can change any pixel from one shade of gray to another in 1 millisecond. Strictly speaking, GtG measures pixel response time rather than input lag, but the idea is the same: the shorter it takes to display changes, the more it feels like you’re in instantaneous control of your gameplay.

Actual latency when gaming can differ greatly — input devices and many other factors add to the total amount of lag — but overall, less is better. Most gamers won’t see or feel the difference between 5 ms (GtG) and 1 ms (GtG), but some competitive esports players simply won’t accept anything above 1 ms. This debate often boils down to a matter of your own reflexes and in-game needs.
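To see why GtG is only part of the story, it helps to think of lag as a chain of delays that add up. The sketch below uses made-up, merely plausible numbers; treat it as an illustration, not a benchmark:

```python
# Hypothetical end-to-end lag chain. These numbers are invented for
# illustration, not measurements of any real hardware.
chain_ms = {
    "mouse polling (1000 Hz)": 1.0,
    "game + render (at 144 fps)": 6.9,
    "monitor processing": 2.0,
    "pixel response (GtG)": 1.0,
}
for stage, ms in chain_ms.items():
    print(f"{stage}: {ms} ms")
print(f"total: ~{sum(chain_ms.values()):.1f} ms")  # GtG is only one slice
```

In other words: shaving a panel from 5 ms to 1 ms (GtG) only trims one link in a much longer chain.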

Panel technology — TN versus IPS versus VA

How are gaming display panels built? In most cases, it’s either twisted nematic (TN), in-plane switching (IPS), or vertical alignment (VA). As TN is one of the oldest and cheapest flatscreen technologies, it was the first kind of panel to get boosted to higher refresh rates. Thanks to years of manufacturers experimenting with these classic panels, TN still boasts the lowest latencies, even on more affordable models.

IPS, and especially VA, bring richer contrasts and deeper colors, while offering broader viewing angles too. In exchange, vertical alignment in particular comes with a somewhat higher input latency. As their novelty fades, manufacturers have found ways to make these panels faster, but this usually comes with elevated price points.

If you want a swift display for competitive gaming on a budget, you’ll usually end up with TN. Nothing wrong with that, as long as you don’t expect the most vibrant color reproduction.

A line-up of an esports team, all behind their gaming monitors.

Bit depth — SDR, HDR, and higher

A less important spec for competitive gaming, but always a “nice-to-have” feature: depth of color. A panel is usually built for certain color spaces, defined by bit depth. For most, 8-bit has long been the standard, while 10-bit and above have been on the rise as of late. 8-bit is considered the old Standard Dynamic Range (SDR), while anything from 10-bit and up puts your screen into High Dynamic Range (HDR) territory. More bit depth per pixel effectively means a wider spectrum of color to display.
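The jump from 8-bit to 10-bit sounds small, but bit depth scales exponentially: each channel (red, green, blue) gets 2^bits levels, and a pixel’s palette is that number cubed. A quick sketch in Python, purely for illustration:

```python
# Each channel (R, G, B) gets 2**bits levels; a pixel's palette is that cubed.
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>4} levels/channel -> {levels ** 3:,} colors")
# 6-bit: 262,144 | 8-bit: 16,777,216 (~16.7M) | 10-bit: 1,073,741,824 (~1.07B)
```

That’s the familiar “16.7 million colors” of SDR versus over a billion in 10-bit territory.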

How vibrant this new HDR depth actually looks is further altered by peak screen brightness — the amount of nits, or cd/m². These factors aren’t that important to most PC gamers, but the “HDR wave” once seen in televisions is now growing within gaming monitors too. If you want your display to pull you deeper into beautiful single-player games, you might prefer a brighter display to a faster one.

Side-note: do keep in mind that your graphics card, your cabling, as well as the games themselves need to bring their own support for HDR. A 10-bit display will still only churn out 8-bit colors if the connected hardware and/or media doesn’t make use of the wider palette of shades. HDR has to come from all sides of the equation, or it doesn’t work at all. It has been on the rise in gaming, though.

Color accuracy — How much coverage do you need for gaming?

Probably the least of a gamer’s worries: how accurate are the colors on your display? Just like HDR, color accuracy is a “nice-to-have” feature, but more so if you intend to retouch photos or edit videos on your gaming monitor too. Actual accuracy is usually defined against standardized color gamuts, as a percentage of coverage of that set spectrum.

Anything close to 100% sRGB is usually good enough for most gaming sessions. Creative professionals might want something nearing 100% coverage of Adobe RGB, or the even wider DCI-P3 gamut. But as of now, displays with that kind of color reproduction rarely come with higher refresh rates.

Photograph of an ASUS ProArt monitor, meant for true-to-life color reproduction.

HDMI or DisplayPort? And what versions?

Straightforward as can be: you need to plug in your display somehow. As PC monitors have made a swift transition from analog to digital signals, gaming displays have mirrored that move. You can finally recycle that old VGA cable — HDMI and DisplayPort (sometimes over USB-C) are basically the only video connectors left for PCs.

The specific iteration of HDMI and DP, as well as the number of ports, depends on your devices. HDMI 2.0 nets you a bandwidth ceiling of 4K/60Hz with some HDR, while HDMI 2.1 goes up to 10K/120Hz with native support for variable refresh rates (VRR) and better audio codecs. Depending on your set-up you might need those specifications, but for most PC users, it’s already overkill. HDMI 2.1 in particular seems to have more use cases on gaming consoles and televisions than on PC monitors.

If your graphics card supports it, DisplayPort (preferably 1.4 or up) should be your go-to connector. All types of DP support HDR bit depths, while the maximum bandwidth on version 1.4 goes up to 4K/120Hz video — or 240Hz at 1440p, for example. If you’re looking for the lowest latencies and the easiest way of pumping more frames through one cable, DisplayPort is where it’s at.
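If you want to sanity-check whether a given resolution and refresh rate fit through a cable, a rough uncompressed estimate is width × height × refresh rate × bits per pixel. The sketch below is purely illustrative and ignores blanking intervals and link overhead, which real connections add on top:

```python
def bandwidth_gbps(w, h, hz, bits_per_channel=8):
    """Rough uncompressed video bandwidth (ignores blanking and link overhead)."""
    return w * h * hz * bits_per_channel * 3 / 1e9  # 3 channels: R, G, B

print(f"4K/60Hz, 8-bit:     {bandwidth_gbps(3840, 2160, 60):.1f} Gbps")   # ~11.9
print(f"4K/120Hz, 8-bit:    {bandwidth_gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9
print(f"1440p/240Hz, 8-bit: {bandwidth_gbps(2560, 1440, 240):.1f} Gbps")  # ~21.2
```

For reference: DisplayPort 1.4 offers roughly 25.9 Gbps of usable data rate, so all three examples above fit without compression.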

G-Sync and FreeSync compatibility

If you’re looking for a high-speed gaming display, it’s probably going to support either NVIDIA’s G-Sync or AMD’s FreeSync. What this entails boils down to an optimized synchronization between your graphics card and the screen itself. By making sure every frame is accounted for, the monitor eliminates screen tearing and stuttering. By continuously syncing the video stream, these screens can also fluidly adapt to variable frame rates.

It’s not exactly a cure-all for every problem haunting displays. It’ll make sure parts of your screen won’t tear, but even with G-Sync or FreeSync, you may still experience some ghosting from swift animations. The higher your synchronized frame rates, the less you’ll see of these visual anomalies.

G-Sync works with NVIDIA graphics cards, while FreeSync is mostly meant for AMD cards. Through driver updates, NVIDIA cards can nowadays drive many FreeSync monitors (look for “G-Sync Compatible” branding), but it’s not always guaranteed. Make sure to check up on your own hardware before you opt for one or the other when browsing gaming displays.

Ports and other I/O

Some more luxurious gaming displays might feature a pass-through hub with some USB ports, but it’s rarely an actual selling point. Consider that these ports are almost always placed in horribly uncomfortable spots, and you can understand why most people just opt for better USB management elsewhere on their desks.

The same goes for headphone jacks on your monitor itself. It may be of use to some, but it’s best to just let the internal sound card on your PC (or digital headset) handle that audio stream. 


You get the picture?

And thus concludes our list of terms and specifications you need to know about gaming monitors. Indeed, it’s a hefty amount of talk about pixels and input speed, but it’ll at least get you looking at the right hardware. Armed with this knowledge, you can determine what kind of display would fit your (upcoming) gaming set-up. No amount of Hertz or pixels should overwhelm you now.

Is there anything you’re still unclear on? Or do you perhaps feel there’s something your fellow gamers should be educated on? Be sure to let us know! By chiming in with your own takes in the comments, we can better help each other out. We’d be delighted to keep talking about display technologies, as well as how they can amplify your gaming sessions.
