G-Sync Technology Overview | A Brief History of Fixed Refresh Rate

Once upon a time, monitors were bulky, built around cathode ray tubes and electron guns. The electron guns bombard the screen with electrons, lighting up the colored phosphor dots we call pixels. They draw each line from left to right, "scanning" from top to bottom. Varying the electron gun's speed from one complete refresh to the next was never really practiced, and before the arrival of 3D games there was no particular need for it. So CRTs and the analog video standards built around them were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI, and DisplayPort) replaced analog ones (VGA). But the bodies responsible for standardizing video signals (led by VESA) did not move away from fixed refresh rates. Film and television still rely on an input signal with a constant frame rate. Once again, switching to a variable refresh rate did not seem all that necessary.

Variable frame rates and fixed refresh rates are not the same

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem surfaced with powerful GPUs: the rate at which a GPU renders individual frames (the frame rate, usually expressed in FPS, or frames per second) is not constant. It changes over time: in graphically heavy scenes a card might deliver 30 FPS, while staring at an empty sky it might hit 60 FPS.

Disabling synchronization causes tearing

It turns out that the variable frame rate of the GPU and the fixed refresh rate of the LCD panel do not work well together. In this configuration we encounter a graphical artifact called "tearing." It occurs when two or more partial frames are drawn during the same monitor refresh cycle. They are usually misaligned, which produces a very unpleasant effect in motion.

The image above shows two well-known artifacts that are common but difficult to capture. Because these are display artifacts, you won't see them in regular game screenshots, but our images show what you actually see while playing. To capture them you need a camera with a high-speed shooting mode, or, if you have a card that supports video capture, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the method we use for FCAT tests. Still, the described effect is best observed with your own eyes.

The tearing effect is visible in both images. The top one was taken with a camera, the bottom one through the video capture function. The bottom picture is "cut" horizontally and looks displaced. In the top pair, the left photo was taken on a Sharp screen with a 60 Hz refresh rate, the right one on an Asus display at 120 Hz. Tearing on the 120 Hz display is less pronounced because the refresh rate is twice as high, but the effect is still visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) turned off.

Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second artifact visible in the BioShock: Infinite images is called ghosting. It is especially noticeable at the bottom left of the photo and is caused by pixels responding too slowly relative to the screen refresh. In short, individual pixels do not change color quickly enough, which produces this kind of trailing. A single frame cannot convey the effect ghosting has in the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurry image with any movement on screen. This is why such displays are generally not recommended for first-person shooters.

V-sync: "wasted on soap"

Vertical sync, or V-sync, is a very old solution to the tearing problem. When it is enabled, the graphics card tries to match the screen's refresh rate, eliminating tearing entirely. The problem is that if your graphics card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between multiples of the screen refresh rate (60, 30, 20, 15 FPS, and so on), which in turn leads to noticeable stuttering.

When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering
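The quantization described above is easy to reproduce with a short sketch. The following Python snippet is purely illustrative (a simplified double-buffered V-sync model on a 60 Hz panel, not driver code); the function name and constants are ours.

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between refreshes


def effective_fps_with_vsync(render_time_ms: float) -> float:
    """Displayed frame rate when every frame takes render_time_ms to render."""
    # A frame is shown on the first refresh boundary after it finishes,
    # so it stays on screen for a whole number of refresh intervals.
    refreshes_per_frame = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return REFRESH_HZ / refreshes_per_frame


for gpu_fps in (70, 60, 55, 45, 35, 25):
    render_ms = 1000.0 / gpu_fps
    print(f"GPU renders at {gpu_fps} FPS -> display shows "
          f"{effective_fps_with_vsync(render_ms):.0f} FPS")
```

Under this model a GPU averaging 55 FPS is shown at only 30 FPS, which is why the drop feels so abrupt.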

Moreover, since V-sync makes the graphics card wait, and relies on a back buffer (an invisible surface), it can add extra input lag to the render chain. Thus V-sync is both a blessing and a curse: it solves some problems while creating other drawbacks. An informal survey of our staff found that gamers tend to leave V-sync off, turning it on only when tearing becomes unbearable.

Get Creative: Nvidia Unveils G-Sync

With the launch of the GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync, which tries to mitigate these problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance drops below it. Although the technology did its job well, it was a workaround that did not eliminate tearing whenever the frame rate was lower than the monitor's refresh rate.
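As a rough illustration of that policy, here is a minimal sketch of the Adaptive V-sync idea. It is an assumption about the general behavior, not Nvidia's driver logic, and the 60 Hz threshold and function name are ours.

```python
REFRESH_HZ = 60  # the monitor's fixed refresh rate


def vsync_should_be_enabled(current_fps: float) -> bool:
    """Adaptive policy: V-sync stays on only while the GPU keeps up."""
    return current_fps >= REFRESH_HZ


for fps in (75, 61, 59, 40):
    enabled = vsync_should_be_enabled(fps)
    print(f"{fps:>3} FPS -> V-sync {'ON' if enabled else 'OFF'} "
          f"({'no tearing' if enabled else 'tearing possible, but no quantized stutter'})")
```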

The G-Sync implementation is far more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable frequency.

The GPU's frame rate determines the monitor's refresh rate, removing the artifacts associated with enabling or disabling V-sync

The packet-based data transfer of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that handles variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the video card is outputting (within the panel's supported refresh range). In practice, Nvidia got creative with these features of the DisplayPort interface and tried to kill two birds with one stone.
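Conceptually, the variable-blanking approach can be sketched as follows. This is a simplified model under our own assumptions (a 30-144 Hz panel and illustrative function names), not the actual G-Sync module firmware: the panel simply refreshes when a frame arrives, as long as the resulting interval stays inside its supported range.

```python
MIN_INTERVAL_MS = 1000.0 / 144.0  # fastest the panel can refresh (~6.9 ms)
MAX_INTERVAL_MS = 1000.0 / 30.0   # longest it can hold one frame (~33.3 ms)


def next_refresh_interval(frame_render_ms: float) -> float:
    """Refresh as soon as the frame is ready, clamped to the panel's limits."""
    return min(max(frame_render_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)


for render_ms in (5.0, 11.0, 16.7, 25.0, 40.0):
    interval = next_refresh_interval(render_ms)
    print(f"frame ready after {render_ms:5.1f} ms -> panel refreshes after "
          f"{interval:5.1f} ms ({1000.0 / interval:5.1f} Hz)")
```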

Even before the tests begin, I would like to commend the team for their creative approach to solving a real problem affecting PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor with its scaler replaced by a G-Sync module. We are already familiar with this display: it was the subject of "Asus VG248QE Review: 24-Inch 144Hz Gaming Monitor for $400," where it earned Tom's Hardware's Smart Buy award. Now it is time to find out how Nvidia's new technology affects the most popular games.

G-Sync Technology Overview | 3D LightBoost, built-in memory, standards and 4K

As we reviewed Nvidia's press materials, we asked ourselves many questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed was that the Asus VG248QE monitor Nvidia sent had been modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, strobing the panel backlight to reduce ghosting (motion blur). Naturally, we wondered whether this technology is used in G-Sync.

Nvidia gave a negative answer. Although using both technologies simultaneously would be ideal, today strobing the backlight at a variable refresh rate leads to flicker and brightness problems. Solving them is incredibly difficult, because brightness has to be adjusted and the pulses tracked. As a result, the choice for now is between the two technologies, although the company is trying to find a way to use them together in the future.

The G-Sync module's built-in memory

As we already know, G-Sync eliminates the stepwise input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to travel through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory performs other functions. In any case, the processing time for G-Sync is noticeably less than one millisecond. In fact, it is almost the same delay we encounter with V-sync turned off, and it depends on the game, the video driver, the mouse, and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard, providing variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI, or DVI. G-Sync already works over DisplayPort 1.2, so the standard does not need to change.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to reduce the cost of G-Sync modules and make them more affordable.

G-Sync at Ultra HD resolutions

Nvidia promises monitors with G-Sync support at resolutions up to 3840x2160 pixels. However, the Asus model we are looking at today only supports 1920x1080. Current Ultra HD monitors use the STMicro Athena controller, which has two scalers to create a tiled display, so we wondered whether the G-Sync module will support MST configurations.

In truth, 4K displays with variable refresh rates will have to wait. There is no standalone scaler that supports 4K resolution yet; the first should appear in the first quarter of 2014, and monitors equipped with it not until the second quarter. Since the G-Sync module replaces the scaler, compatible panels will begin to appear after that point. Fortunately, the module natively supports Ultra HD.

In the good old days, when PC owners squinted at huge CRT monitors and earned themselves astigmatism, nobody talked about image smoothness. The technology of the time did not really handle 3D, so poor users had to be content with what they had. But time passes, technology develops, and many are no longer willing to put up with frame tearing in dynamic games. This is especially true for so-called cyber-athletes, for whom a split second makes all the difference. What is to be done?

Progress does not stand still, so what once seemed impossible can now be taken for granted. The same goes for image quality on a computer. Manufacturers of video cards and other PC components are working hard to solve the problem of poor image output on monitors, and they have already come quite far. Just a little more, and the picture on the monitor will be perfect. But that is a digression; let's return to our main topic.

A little history

Monitor makers tried hard to overcome tearing and improve the image. What didn't they come up with: they raised the monitor's refresh rate, they turned on V-Sync. Nothing helped. Then one day the well-known video card manufacturer NVIDIA presented its G-Sync technology, promising "unreal" image smoothness without any artifacts. That sounds good, but there is one small yet very serious "but": to use this option you need a monitor that supports G-Sync. Monitor manufacturers had to get to work and push a couple of dozen such models onto the market. What next? Let's look at the technology and try to figure out whether it is any good.

What is G-Sync?

G-Sync is a screen output technology from NVIDIA. Its hallmark is smooth frame delivery without any artifacts: no image tearing, no stuttering. For this technology to work properly, a fairly powerful computer is needed, since processing the digital signal requires considerable power. That is why only newer NVIDIA video card models are equipped with it. In addition, G-Sync is a proprietary NVIDIA feature, so owners of video cards from other manufacturers are out of luck.

In addition, a G-Sync monitor is required, because such monitors are equipped with a board containing a digital signal converter. Owners of regular monitors will not be able to take advantage of this remarkable option. It's unfair, of course, but such is the policy of modern manufacturers: squeeze as much money as possible out of the poor user. If your PC configuration allows you to use G-Sync, and your monitor happens to support it, then you can fully appreciate all the delights of this technology.

How G-Sync works

Let's try to explain in simplified terms how G-Sync works. A regular GPU (video card) simply sends a digital signal to the monitor without taking the monitor's refresh rate into account. That is why the signal looks "ragged" when displayed: the stream coming from the GPU is chopped up by the monitor's refresh cycle and ends up looking unsightly, even with the V-Sync option enabled.

With G-Sync, the GPU itself regulates the monitor's refresh rate, so new frames reach the matrix exactly when they are needed. This makes it possible to avoid image tearing and improve the smoothness of the picture as a whole. Since conventional monitors do not let the GPU control them, the G-Sync monitor was invented, containing an NVIDIA board that regulates the refresh rate. This is why ordinary monitors cannot be used.

Monitors supporting this technology

Gone are the days when users ruined their eyesight by staring at ancient CRT monitors for hours. Today's models are elegant and harmless. So why not give them some new technology? The first monitor with NVIDIA G-Sync support and 4K resolution was released by Acer, and the new product caused quite a stir.

High-quality monitors with G-Sync are still quite rare, but manufacturers plan to make these devices standard. Most likely, within five years monitors supporting this technology will become a standard solution even for an office PC. In the meantime, all that remains is to look at these new products and wait for their widespread adoption, which is when they will become cheaper.

After that, monitors with G-Sync support began to be churned out by everyone, and even budget models with this technology appeared. Although what use is this technology on a budget screen with a poor panel? Still, such models do exist. The best fit for this feature is a fast gaming monitor, on which G-Sync can work at full strength.

The best monitors with G-Sync

Monitors with G-Sync technology stand out as a special line of devices. They must have the characteristics needed for this feature to work fully, and clearly not every screen is up to the task. Several leaders in the production of such monitors have already emerged, and their models have turned out very well.

For example, one G-Sync monitor is among the brightest representatives of this line. It is a premium device. Why? Judge for yourself: a 34-inch diagonal, 4K resolution, 1000:1 contrast, 100 Hz, and a 5 ms matrix response time. Plenty of people would like to get this "monster" for themselves. It goes without saying that it handles G-Sync technology with ease. It has no analogues yet; you can safely call it the best in its class and not be mistaken.

In general, ASUS G-Sync monitors now sit at the top of Olympus. No other manufacturer has yet managed to surpass this company, and it is unlikely to happen any time soon. ASUS can be called a pioneer in this regard, and its G-Sync monitors are selling like hotcakes.

The future of G-Sync

There are now active efforts to bring G-Sync technology to laptops, and some manufacturers have even released a couple of such models. Moreover, they can work without a G-Sync board in the display, which is understandable: a laptop has somewhat different design constraints, and a video card that supports the technology is sufficient.

It is likely that NVIDIA G-Sync will soon occupy a significant place in the computer industry. Monitors with this technology should become cheaper, and eventually the option should be available everywhere; otherwise, what would be the point of developing it? In any case, everything is not so rosy yet, and there are still some problems with the adoption of G-Sync.

In the future, G-Sync technology may become as commonplace as the VGA port for connecting a monitor once was. Next to it, the various forms of "vertical synchronization" look like a blatant anachronism: not only do these outdated techniques fail to provide satisfactory picture quality, they also consume a fair amount of system resources. With the advent of G-Sync, their place is clearly in the dustbin of history.

We have long been accustomed to monitors having a fixed refresh rate, usually 60 Hz. The fixed rate dates back to CRT televisions, when the video stream had a clearly defined number of frames per second, usually 24. But in games the frame rate is not constant: it can vary over a very wide range, and because the refresh rate does not coincide with the rate at which the video card renders frames, the image tears, which interferes with comfortable gaming. This happens because the display refreshes even if part of the previous frame has not yet been fully delivered; the rest of the buffer is picked up by the current screen refresh. That is why, when the two rates do not match, each frame shown on the monitor essentially consists of pieces of two frames rendered by the video card.
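A toy model makes the mechanism clearer. The sketch below uses our own simplified assumptions (a 60 Hz, 1080-line panel and a linear scanout), not real scanout code: the position of the tear line follows from the moment the buffer swap lands within the refresh cycle.

```python
REFRESH_MS = 1000.0 / 60.0  # one scanout pass of a 60 Hz panel
SCREEN_LINES = 1080         # vertical resolution of the panel


def tear_line(swap_ms_into_refresh: float) -> int:
    """Scanline at which the image switches from the old frame to the new one."""
    return int(swap_ms_into_refresh / REFRESH_MS * SCREEN_LINES)


for swap_ms in (4.0, 8.3, 12.0):
    line = tear_line(swap_ms)
    print(f"buffer swap {swap_ms:4.1f} ms into the refresh -> tear at line {line}: "
          f"lines above show the old frame, lines below the new one")
```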

Vertical Sync

The simplest way to solve the problem is to enable vertical sync. What does it do? It sends the image to the monitor only when the frame is completely ready. Accordingly, if you have a 60 Hz monitor and the video card produces more than 60 fps, you get a smooth picture without a single tear or artifact (and the video card will not be loaded to 100%). But this creates another problem: output latency. If the monitor refreshes 60 times per second, 16.7 ms is spent on each frame, and even if the video card prepared the frame in 5 ms, the monitor will still wait out the remaining 11 ms:

As a result, the controls feel "sticky": when you move the mouse, the response on the monitor comes with a slight delay, making it harder to place the crosshair in shooters and other dynamic games. It is even worse if the video card cannot deliver 60 fps in the game. If fps is 50 and vertical sync is enabled, then every second there will be 10 refreshes with no new information on the screen; that is, every second there will be 50 frames delayed by up to 16.7 ms and 10 frames delayed by 33.4 ms, so the picture becomes jerky and practically unplayable.
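The arithmetic above can be checked with a small sketch. It uses the same simplifying assumptions as the text (a 60 Hz panel, frames shown only on refresh boundaries); the function name is ours.

```python
REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms between refreshes


def wait_before_display(render_ms: float) -> float:
    """Time a finished frame waits for the next refresh boundary."""
    next_boundary = (render_ms // REFRESH_MS + 1) * REFRESH_MS
    return next_boundary - render_ms


for render_ms in (5.0, 15.0, 20.0):  # 20 ms per frame corresponds to 50 FPS
    wait = wait_before_display(render_ms)
    print(f"frame rendered in {render_ms:4.1f} ms -> shown after "
          f"{render_ms + wait:4.1f} ms (waited {wait:4.1f} ms)")
```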

So until recently players had only two options: either enable vertical sync (when fps stays above 60) and put up with less responsive controls, or disable it and endure image artifacts.

AMD FreeSync and Nvidia G-Sync

Of course, the big companies found a solution: synchronize the refresh rate with the video card's frame rendering. That is, if the video card rendered a frame in 5 ms, the monitor displays it after 5 ms, without waiting for anything. If the next frame takes 20 ms to render, the monitor simply keeps the previous frame on screen for 20 ms:


What does this give us? First, since the monitor displays only completely finished frames and they are synchronized with the refresh, there are no artifacts. Second, since the monitor displays each frame as soon as it is ready, without waiting for anything, there is no "sticky" feel to the controls: the image on the monitor changes the moment you move the mouse.
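For the 5 ms / 20 ms example above, a short sketch shows the difference between the two modes. It is illustrative only (a 60 Hz panel for the V-sync case, ideal adaptive sync within the panel's limits), not vendor code.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh for the V-sync case


def vsync_display_time(render_ms: float) -> float:
    """With V-sync, the frame appears at the first 60 Hz boundary after rendering."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS


def adaptive_display_time(render_ms: float) -> float:
    """With adaptive sync, the frame appears as soon as it is ready."""
    return render_ms


for render_ms in (5.0, 20.0):
    print(f"render {render_ms:4.1f} ms -> V-sync shows it after "
          f"{vsync_display_time(render_ms):4.1f} ms, adaptive sync after "
          f"{adaptive_display_time(render_ms):4.1f} ms")
```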

Differences between FreeSync and G-Sync

Each vendor went its own way. With AMD, the refresh rate is controlled by the video card itself, and the monitor must be connected via DisplayPort. On the one hand, this is bad: if the video card lacks hardware support for FreeSync, you cannot use it. Given that the technology is supported only by R7 and R9 chips starting with the 200 series, plus the Fury and RX lines, the HD 7000 chips are left out, even though some of them are essentially no different from 200-series chips (yes, a plain rebranding). Mobile AMD video cards do not support FreeSync at all, even when they are more powerful than desktop cards that do. On the other hand, since essentially all the control comes from the video card, a FreeSync monitor ends up $80-100 cheaper than one with G-Sync, which is quite noticeable.

Nvidia took a different route: the refresh rate is controlled by the monitor itself, which has a special chip built into it. On the one hand, this is good: video cards from the GTX 650 Ti onward are supported, as well as mobile solutions from the 965M onward. On the other hand, the chip costs money, so monitors with G-Sync are more expensive.

The supported refresh ranges also differ: for AMD it is 9-240 Hz, for Nvidia 30-144 Hz. The 9 Hz figure mostly raises a smile (that is a slideshow), and Nvidia's 30 Hz can, in principle, be considered an acceptable minimum. Nvidia's 144 Hz ceiling, however, may not be enough, since top gaming monitors now reach 240 Hz. Then again, AMD does not yet have video cards that can push more than 200 fps in e-sports titles, so 240 Hz is for the moment just good headroom for the future. On the other hand, if the frame rate in a game drops below the monitor's minimum refresh rate, AMD simply forces that minimum rate, which brings back the same problems as vertical sync. Nvidia did something cleverer: the G-Sync chip can duplicate frames in order to stay within the monitor's operating range, so there are no control delays or artifacts:
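The frame-duplication trick can be sketched roughly as follows. This is our assumption about the general approach (using the 30-144 Hz range mentioned above and made-up function names), not the actual logic of the G-Sync module.

```python
import math

MIN_INTERVAL_MS = 1000.0 / 144.0  # panel's fastest refresh
MAX_INTERVAL_MS = 1000.0 / 30.0   # longest the panel can hold one frame


def plan_scanout(frame_time_ms: float):
    """Return (number of scanouts, interval per scanout) for one game frame."""
    repeats = max(1, math.ceil(frame_time_ms / MAX_INTERVAL_MS))
    interval = max(frame_time_ms / repeats, MIN_INTERVAL_MS)
    return repeats, interval


for fps in (120, 60, 25, 12):  # 25 and 12 FPS fall below the 30 Hz minimum
    frame_time = 1000.0 / fps
    repeats, interval = plan_scanout(frame_time)
    print(f"{fps:>3} FPS (new frame every {frame_time:5.1f} ms) -> "
          f"{repeats} scanout(s) of {interval:5.1f} ms each "
          f"({1000.0 / interval:5.1f} Hz at the panel)")
```

In this model, a game running at 25 FPS has each frame scanned out twice, so the panel still refreshes at 50 Hz and never drops out of its supported window.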

Another plus for AMD is the absence of small delays in data transfer to the monitor: FreeSync uses the Adaptive-Sync technology of the DisplayPort standard to learn the monitor's minimum and maximum refresh rates in advance, so the data transfer is not interrupted to coordinate the video card with a G-Sync module in the monitor, as it is with Nvidia. In practice, however, the difference turns out to be no more than 1-2%, so it can be neglected.

Of course, the question arises: do frame synchronization technologies affect gaming performance? The answer is no, they do not. The difference between synchronization turned off and FreeSync or G-Sync turns out to be zero, which makes sense: these technologies do not force the video card to compute more data, they simply deliver the finished frames sooner.

So which is better in the end? Funny as it may sound, users have no choice: those on "red" hardware are forced to use FreeSync, and those on "green" hardware can likewise only use G-Sync. At the moment, though, the two technologies produce similar results, so the choice really comes down to the manufacturer and the power of the video card.

The hero of our review, the ASUS ROG SWIFT PG278Q, is one of the first monitors to implement Nvidia's new G-Sync technology. It is also the first and so far only WQHD (2560x1440) monitor that supports a 144 Hz refresh rate and is therefore capable of displaying stereoscopic images with shutter glasses.

CRT monitors do not have this drawback. Since the image is drawn line by line by a scanning beam and the phosphor of individual pixels fades quickly, any given part of the screen is presented to the viewer for only a short time. The motion of an object is represented by separate pulses spaced out in time, between which the image is filled with black. As the eye moves across the black field following the perceived trajectory, the image on the retina is not smeared.

The illustration from the EIZO Foris FG2421 review applies to ULMB, taking into account that the backlight in the ASUS ROG SWIFT PG278Q does not flicker at 120 Hz

Thanks to the backlight flickering in sync with the screen refresh, the displayed frames turn into pulses separated by moments of darkness, much like on a CRT, ultimately relieving the viewer of the blur of moving objects. In the current version, ULMB is not compatible with the G-Sync function.

Best monitors for gaming | Models supporting Nvidia G-Sync technology

Variable, or adaptive, refresh rate technology comes in two varieties: AMD FreeSync and Nvidia G-Sync. They perform the same function: they match the refresh rate of the display to the frame rate of the source (the video card) to prevent annoying frame tearing during fast movement in games. FreeSync is part of the DisplayPort specification, while G-Sync requires additional hardware licensed from Nvidia; implementing G-Sync adds about $200 to the price of a monitor. If you already have a modern GeForce graphics card, the choice is obvious. If you are still undecided, you should know that G-Sync has one advantage: when the frame rate drops below the G-Sync threshold of 40 fps, frames are duplicated to prevent tearing. FreeSync has no such feature.


Summary table


Model AOC G2460PG Asus RoG PG248Q Dell S2417DG Asus ROG SWIFT PG279Q
Category FHD FHD QHD QHD
Best price in Russia, rub. 24300 28990 31000 58100
Panel/backlight type TN/W-LED TN/W-LED edge array TN/W-LED edge array AHVA/W-LED edge array
24" / 16:9 24" / 16:9 24" / 16:9 27" / 16:9
Curvature radius No No No No
Maximum resolution/refresh rate 1920x1080 @ 144 Hz 1920x1080 @ 144 Hz, 180 Hz overclocked 2560x1440 @ 144 Hz, 165 Hz overclocked 2560x1440 @ 165 Hz
FreeSync operating range No No No No
Color depth/color gamut 8-bit (6-bit with FRC) / sRGB 8-bit/sRGB 8-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 1 1 1 4
Brightness, cd/m2 350 350 350 350
Speakers No No No (2) 2 W
Video inputs (1) DisplayPort (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors No (1) 3.5mm headphone output (1) 3.5 mm Stereo in, (1) 3.5 mm headphone output (1) 3.5mm headphone output
USB v3.0: (1) input, (2) outputs; v2.0: (2) outputs v3.0: (1) input, (2) output v3.0: (1) input, (4) output v3.0: (1) input, (2) output
Energy consumption, W 40 typical 65 max. 33 typical 90 max., 0.5 expected
Dimensions LxHxW (with base), mm 559x391-517x237 562x418-538x238 541x363x180 620x553x238
Panel thickness, mm 50 70 52 66
Frame width, mm 16-26 11 top/side: 6, bottom: 15 8-12
Weight, kg 6.5 6.6 5.8 7
Warranty 3 years 3 years 3 years 3 years

Model Acer Predator XB271HK Acer Predator XB321HK Asus ROG PG348Q Acer Predator Z301CT
Category UHD UHD WQHD QHD
Best price in Russia, rub. 43900 62000 102000 58000
Panel/backlight type AHVA/W-LED edge array IPS/W-LED edge array AH-IPS/W-LED edge array AMVA/W-LED, edge array
Screen diagonal/aspect ratio 27" / 16:9 32" / 16:9 34" / 21:9 30" / 21:9
Curvature radius No No 3800 mm 1800 mm
Maximum resolution/refresh rate 3840x2160 @ 60 Hz 3840x2160 @ 60 Hz 3440x1440 @ 75 Hz, 100 Hz overclocked 2560x1080 @ 144 Hz, 200 Hz overclocked
FreeSync operating range No No No No
Color depth/color gamut 10-bit/sRGB 10-bit/sRGB 10-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 4 4 5 4
Brightness, cd/m2 300 350 300 300
Speakers (2) 2 W, DTS (2) 2 W, DTS (2) 2 W (2) 3W, DTS
Video inputs (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort, (1) HDMI (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output
USB v3.0: (1) input, (4) output v3.0: (1) input, (4) output v3.0: (1) input, (4) output v3.0: (1) input, (3) output
Energy consumption, W 71.5 typical 56 typical 100 max. 34 W at 200 nits
Dimensions LxHxW (with base), mm 614x401-551x268 737x452-579x297 829x558x297 714x384-508x315
Panel thickness, mm 63 62 73 118
Frame width, mm top/side: 8, bottom: 22 top/side: 13, bottom: 20 top/side: 12, bottom: 24 top/side: 12, bottom: 20
Weight, kg 7 11.5 11.2 9.7
Warranty 3 years 3 years 3 years 3 years

AOC G2460PG – FHD 24 inches


  • Best price in Russia: 24,300 rub.

ADVANTAGES

  • Excellent implementation of G-Sync
  • Screen refresh rate 144 Hz
  • ULMB Motion Blur Suppression
  • High build quality
  • Very high quality color rendering and shades of gray

FLAWS

VERDICT

Although G-Sync remains a premium and expensive option, the AOC G2460PG is the first monitor in this segment that is aimed at the budget buyer. It costs about half the price of the Asus ROG Swift, so you can save a little money, or install two monitors on your desk at once.

Asus RoG PG248Q – FHD 24 inches


  • Best price in Russia: RUB 28,990.

ADVANTAGES

  • G-Sync
  • 180 Hz
  • Low latency
  • Responsiveness
  • Color accuracy with calibration
  • Sleek appearance
  • Build quality

FLAWS

  • To achieve the best picture, adjustments are required
  • Contrast
  • Expensive

VERDICT

The PG248Q is like an exotic sports car - expensive and impractical to operate. But if you set the correct settings during installation, you will get an excellent gaming experience. In terms of smoothness and responsiveness, this monitor is perhaps the best we've tested to date. It is worth the money and time spent. Highly recommended.

Dell S2417DG – QHD 24 inches


  • Best price in Russia: 31,000 rub.

ADVANTAGES

  • Superior motion handling
  • Color accuracy at factory settings
  • QHD resolution
  • 165 Hz refresh rate
  • Gaming features
  • Thin 6 mm bezel

FLAWS

  • Contrast
  • Gamma curve accuracy
  • ULMB reduces light output and contrast
  • Viewing angles

VERDICT

If Dell had fixed the gamma issues we encountered in testing, the S2417DG would have earned our Editor's Choice award. The monitor conveys motion incredibly smoothly, with absolutely no ghosting, judder, or tearing - you can't take your eyes off it. The benefit of the ULMB function is minor, but it is there. It's not the cheapest 24-inch gaming monitor, but it beats out more expensive competitors and deserves a spot on the list.

Asus RoG Swift PG279Q – QHD 27 inches


  • Best price in Russia: 58,100 rub.

ADVANTAGES

  • Stable operation at 165 Hz
  • G-Sync
  • Vivid and sharp images
  • Rich color
  • GamePlus
  • Joystick for OSD menu
  • Stylish appearance
  • High build quality

FLAWS

  • Significant reduction in luminous flux in ULMB mode
  • Calibration is required to achieve the best image quality
  • Expensive

VERDICT

Asus' new addition to the ROG lineup isn't perfect, but it's definitely worth a look. The PG279Q has everything an enthusiast needs, including a crisp and bright IPS panel, 165 Hz refresh rate, and G-Sync. This monitor isn't cheap, but we haven't heard of users regretting the purchase yet. We enjoyed playing on this monitor, and you'll probably enjoy it too.

Acer Predator XB271HK – UHD 27 inches


  • Best price in Russia: 43,900 rub.

ADVANTAGES

  • Rich colors
  • Image accuracy at factory settings
  • G-Sync
  • Ultra HD resolution
  • Viewing angles
  • Build quality

FLAWS

  • Expensive
