Graphics cards with G-Sync support. A review of NVIDIA G-Sync technology and the ASUS ROG SWIFT PG278Q monitor. The best monitors with G-Sync

Best gaming monitors | Models supporting Nvidia G-Sync technology

Variable (adaptive) refresh rate technology comes in two varieties: AMD FreeSync and Nvidia G-Sync. Both do the same job - they match the refresh rate of the display to the output of the source (the video card) to prevent annoying frame tearing during fast movement in games. FreeSync is part of the DisplayPort specification, while G-Sync requires additional hardware licensed from Nvidia. Implementing G-Sync adds about $200 to the price of a monitor. If you already have a modern GeForce graphics card, the choice is obvious. If you are still undecided, you should know that G-Sync has one advantage: when the frame rate drops below the G-Sync threshold of 40 fps, frames are duplicated to prevent image tearing. FreeSync has no such feature.


Comparison table


Model AOC G2460PG Asus RoG PG248Q Dell S2417DG Asus ROG SWIFT PG279Q
Category FHD FHD QHD QHD
Best price in Russia, rub. 24300 28990 31000 58100
Panel/backlight type TN/W-LED TN/W-LED edge array TN/W-LED edge array AHVA/W-LED edge array
24" / 16:9 24" / 16:9 24" / 16:9 27" / 16:9
Curvature radius No No No No
Maximum resolution/refresh rate 1920x1080 @ 144 Hz 1920x1080 @ 144 Hz, 180 Hz overclocked 2560x1440 @ 144 Hz, 165 Hz overclocked 2560x1440 @ 165 Hz
FreeSync operating range No No No No
Color depth/color gamut 8-bit (6-bit with FRC) / sRGB 8-bit/sRGB 8-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 1 1 1 4
Brightness, cd/m2 350 350 350 350
Speakers No No No (2) 2 W
Video inputs (1) DisplayPort; (1) DisplayPort v1.2, (1) HDMI v1.4; (1) DisplayPort v1.2, (1) HDMI v1.4; (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors No; (1) 3.5 mm headphone output; (1) 3.5 mm stereo in, (1) 3.5 mm headphone output; (1) 3.5 mm headphone output
USB v3.0: (1) upstream, (2) downstream, v2.0: (2) downstream; v3.0: (1) upstream, (2) downstream; v3.0: (1) upstream, (4) downstream; v3.0: (1) upstream, (2) downstream
Power consumption, W 40 typical 65 max. 33 typical 90 max., 0.5 standby
Dimensions LxHxW (with base), mm 559x391-517x237 562x418-538x238 541x363x180 620x553x238
Panel thickness, mm 50 70 52 66
Frame width, mm 16-26 11 top/side: 6, bottom: 15 8-12
Weight, kg 6.5 6.6 5.8 7
Warranty 3 years 3 years 3 years 3 years

Model Acer Predator XB271HK Acer Predator XB321HK Asus ROG PG348Q Acer Predator Z301CT
Category UHD UHD WQHD QHD
Best price in Russia, rub. 43900 62000 102000 58000
Panel/backlight type AHVA/W-LED edge array IPS/W-LED edge array AH-IPS/W-LED edge array AMVA/W-LED, edge array
Screen diagonal/aspect ratio 27" / 16:9 32" / 16:9 34" / 21:9 30" / 21:9
Curvature radius No No 3800 mm 1800 mm
Maximum resolution/refresh rate 3840x2160 @ 60 Hz 3840x2160 @ 60 Hz 3440x1440 @ 75 Hz, 100 Hz overclocked 2560x1080 @ 144 Hz, 200 Hz overclocked
FreeSync operating range No No No No
Color depth/color gamut 10-bit/sRGB 10-bit/sRGB 10-bit/sRGB 8-bit/sRGB
Response time (GTG), ms 4 4 5 4
Brightness, cd/m2 300 350 300 300
Speakers (2) 2 W, DTS (2) 2 W, DTS (2) 2 W (2) 3W, DTS
Video inputs (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort, (1) HDMI (1) DisplayPort v1.2, (1) HDMI v1.4 (1) DisplayPort v1.2, (1) HDMI v1.4
Audio connectors (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output (1) 3.5mm headphone output
USB v3.0: (1) upstream, (4) downstream; v3.0: (1) upstream, (4) downstream; v3.0: (1) upstream, (4) downstream; v3.0: (1) upstream, (3) downstream
Power consumption, W 71.5 typical 56 typical 100 max. 34 W at 200 nits
Dimensions LxHxW (with base), mm 614x401-551x268 737x452-579x297 829x558x297 714x384-508x315
Panel thickness, mm 63 62 73 118
Frame width, mm top/side: 8, bottom: 22 top/side: 13, bottom: 20 top/side: 12, bottom: 24 top/side: 12, bottom: 20
Weight, kg 7 11.5 11.2 9.7
Warranty 3 years 3 years 3 years 3 years

AOC G2460PG – FHD 24 inches


  • Best price in Russia: 24,300 rub.

ADVANTAGES

  • Excellent implementation of G-Sync
  • Screen refresh rate 144 Hz
  • ULMB Motion Blur Suppression
  • High build quality
  • Very accurate color and grayscale reproduction

FLAWS

  • Non-standard gamma
  • Insufficient brightness for optimal ULMB performance
  • Not IPS

VERDICT

Although G-Sync remains a premium and expensive option, the AOC G2460PG is the first monitor in this segment that is aimed at the budget buyer. It costs about half the price of the Asus ROG Swift, so you can save a little money, or install two monitors on your desk at once.

Asus RoG PG248Q – FHD 24 inches


  • Best price in Russia: RUB 28,990.

ADVANTAGES

  • G-Sync
  • 180 Hz
  • Low latency
  • Responsiveness
  • Color accuracy with calibration
  • Sleek appearance
  • Build quality

FLAWS

  • To achieve the best picture, adjustments are required
  • Contrast
  • Expensive

VERDICT

The PG248Q is like an exotic sports car - expensive and demanding to run. But dial in the right settings during setup and you will get an excellent gaming experience. In terms of smoothness and responsiveness, this monitor is perhaps the best we have tested to date. It is worth the money and time spent. Highly recommended.

Dell S2417DG – QHD 24 inches


  • Best price in Russia: 31,000 rub.

    ADVANTAGES

    • Superior motion processing quality
    • Color accuracy at factory settings
    • Resolution QHD
    • Refresh rate 165 Hz
    • Gaming Features
    • Slim 6 mm frame

    FLAWS

    • Contrast
    • Gamma Curve Accuracy
    • ULMB reduces light output and contrast
    • Viewing Angles

    VERDICT

    If Dell had fixed the gamma issues we encountered in testing, the S2417DG would have earned our Editor's Choice award. The monitor conveys movements incredibly smoothly, with absolutely no ghosting, shaking or tearing - you can’t take your eyes off it. The benefit from the ULMB function is minor, but nevertheless it is present. It's not the cheapest 24-inch gaming monitor, but it beats out its more expensive competitors and deserves a spot on the list.

    Asus RoG Swift PG279Q – QHD 27 inches


    • Best price in Russia: 58,100 rub.

    ADVANTAGES

    • Stable operation at 165 Hz
    • G-Sync
    • Vivid and sharp images
    • Saturated color
    • GamePlus
    • Joystick for OSD menu
    • Stylish appearance
    • High build quality

    FLAWS

    • Significant reduction in luminous flux in ULMB mode
    • Calibration required for the best image quality
    • Expensive

    VERDICT

    Asus' new addition to the ROG lineup isn't perfect, but it's definitely worth a look. The PG279Q has everything an enthusiast needs, including a crisp and bright IPS panel, 165Hz refresh rate, and G-Sync. This monitor isn't cheap, but we haven't heard of users regretting the purchase yet. We enjoyed playing on this monitor, and you'll probably enjoy it too.

    Acer Predator XB271HK – UHD 27 inches


    • Best price in Russia: 43,900 rub.

    ADVANTAGES

    • Rich colors
    • Image accuracy at factory settings
    • G-Sync
    • Ultra HD resolution
    • Viewing Angles
    • Build quality

    FLAWS

    • Expensive

    We have long been accustomed to monitors having a fixed refresh rate - usually 60 Hz. The fixed rate is a legacy of CRT televisions, when video had a clearly defined number of frames per second - usually 24. But in games the frame rate is not constant - it can vary within very wide limits, and because the screen's refresh rate does not coincide with the rate at which the video card renders frames, image tearing appears and gets in the way of comfortable gameplay. Tearing happens because the display starts showing a new frame before the previous one has finished scanning out - the rest of the current refresh is filled from the new buffer. As a result, whenever the two rates do not match, a single refresh shown on the monitor is essentially stitched together from two frames rendered by the video card.
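    To make the mismatch concrete, here is a minimal sketch (Python, with illustrative numbers only - not taken from the article) that counts how many 60 Hz refreshes end up stitched from two frames when the video card delivers a steady 50 fps with synchronization disabled:

```python
# Minimal sketch: count torn refreshes when a fixed 60 Hz scanout meets a
# steady 50 fps render rate with buffers swapped the moment a frame is ready
# (synchronization off). Integer "ticks" of 1/3000 s keep the math exact.
REFRESH_TICKS = 50        # one 60 Hz refresh = 1/60 s = 50 ticks
FRAME_TICKS = 60          # one 50 fps frame  = 1/50 s = 60 ticks
TICKS_PER_SECOND = 3000

# Moments at which the video card finishes (and immediately swaps in) a frame.
flips = list(range(FRAME_TICKS, TICKS_PER_SECOND, FRAME_TICKS))

torn = 0
for start in range(0, TICKS_PER_SECOND, REFRESH_TICKS):
    end = start + REFRESH_TICKS
    # A buffer swap strictly inside this scanout means the refresh shows parts
    # of two different rendered frames -> a visible tear line.
    if any(start < t < end for t in flips):
        torn += 1

print(f"{torn} of {TICKS_PER_SECOND // REFRESH_TICKS} refreshes contain a tear line")
```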

    Vertical Sync

    The simplest way to solve the problem is to enable vertical sync. What does it do? It sends the image to the monitor only when a frame is completely ready. So if you have a 60 Hz monitor and the video card produces more than 60 fps, you get a smooth picture without a single tear or artifact (and the video card is never loaded to 100%). But this creates another problem - delayed image output. If the monitor is updated 60 times per second, then 16.7 ms is allotted to each frame, and even if the video card prepared the frame in 5 ms, the monitor will still wait out the remaining ~11.7 ms.

    As a result the controls feel "sticky" - when you move the mouse, the response on the monitor comes with a slight delay, which makes it harder to place the crosshair in shooters and other dynamic games. It is even worse if the video card cannot deliver 60 fps: for example, at 50 fps with vertical synchronization enabled, 10 refreshes every second will show no new information; in other words, every second you get 50 frames delayed by up to 16.7 ms and 10 frames delayed by up to 33.4 ms - the picture becomes jerky and uncomfortable to play.
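    The arithmetic above can be reproduced with a short, purely illustrative sketch (Python; the same 1/3000 s tick convention as before, chosen as an assumption so that 60 Hz and 50 fps line up exactly):

```python
# Same 1/3000 s tick convention: a 60 Hz refresh is 50 ticks, a frame rendered
# at a steady 50 fps takes 60 ticks. With V-Sync, a finished frame always
# waits for the next refresh boundary before it is shown.
REFRESH = 50
FRAME = 60

def next_refresh(t):
    """Round t up to the next refresh boundary (ceiling division)."""
    return -(-t // REFRESH) * REFRESH

# A frame rendered in 5 ms (15 ticks) still waits for the 16.7 ms boundary:
render = 15
print(round((next_refresh(render) - render) / 3, 1), "ms of added waiting")   # ~11.7

# A steady 50 fps stream on a 60 Hz display: how many refreshes does each
# frame stay on screen? Every fifth frame lingers twice as long -> judder.
done = [i * FRAME for i in range(1, 51)]            # completion times over 1 s
shown = [next_refresh(t) for t in done]
held = [(b - a) // REFRESH for a, b in zip(shown, shown[1:])]
print(held[:10])   # [1, 1, 1, 1, 2, 1, 1, 1, 1, 2]
```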

    So until recently players had two options - either enable vertical synchronization (if fps stays above 60) and put up with less responsive controls, or disable synchronization and put up with image artifacts.

    AMD FreeSync and Nvidia G-Sync

    Of course, the big companies found a solution to the problem - forced synchronization of the scan rate with the video card's frame rendering. That is, if the video card spent 5 ms on a frame, the monitor shows the previous frame for 5 ms without waiting for anything. If the next frame took 20 ms to render, the monitor again keeps the previous frame on screen for 20 ms.


    What does this give? Firstly, since the monitor displays only fully finished frames and they are synchronized with the scan rate, there are no artifacts. Secondly, since the monitor displays a frame as soon as it is ready, without waiting for anything, there is no "stickiness" in the controls - the image on the monitor changes as soon as you move the mouse.

    Differences between FreeSync and G-Sync

    Each vendor went its own way: with AMD the refresh rate is controlled by the video card itself, and the monitor must be connected via DisplayPort. On the one hand this is bad - if the video card has no hardware support for FreeSync, you cannot use it. Considering that the technology is supported only by R7 and R9 chips starting from the 200 series, plus the Fury and RX lines, the HD 7000 series is left out, even though some of those chips are, frankly, no different from the 200 series (yes, a simple rebranding). Mobile AMD video cards do not support FreeSync at all, even when they are more powerful than desktop cards that do. On the other hand, since essentially all the control comes from the video card, a FreeSync monitor ends up $80-100 cheaper than one with G-Sync, which is quite noticeable.

    Nvidia took a different route - the scan frequency is controlled by the monitor itself, which has a special chip built into it. On the one hand, this is good - video cards starting from the GTX 650 Ti are supported, as well as mobile solutions starting from the 965M. On the other hand, the chip costs money, so monitors with G-Sync are more expensive.

    The permissible refresh ranges also differ: for AMD it is 9-240 Hz, for Nvidia 30-144 Hz. The 9 Hz figure mostly raises a smile (that is a slide show), and Nvidia's 30 Hz can, in principle, be considered an acceptable minimum. The 144 Hz ceiling on Nvidia's side may look limiting, since top gaming monitors now reach 240 Hz; but AMD does not yet have video cards that can push more than 200 fps in e-sports titles, so for now 240 Hz is simply headroom for the future. On the other hand, if the frame rate in a game drops below the monitor's minimum refresh rate, AMD simply locks the refresh at that minimum, which brings back the same problems as with vertical synchronization. Nvidia handled this more cleverly - the G-Sync module can duplicate frames to stay within the monitor's operating range, so there are no control delays or artifacts.
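    As an illustration of that frame-duplication idea (a simplified model, not NVIDIA's actual algorithm; the panel limits below are assumptions), each frame can simply be repeated enough times to lift the effective refresh back into the supported range:

```python
# Simplified model of the frame-duplication idea (not NVIDIA's actual
# algorithm): when the game's frame rate falls below the panel's minimum
# refresh, repeat each frame enough times to climb back into the range.
PANEL_MIN_HZ = 30      # assumed panel limits for the illustration
PANEL_MAX_HZ = 144

def effective_refresh(game_fps):
    """Return (times each frame is shown, refresh rate actually driven)."""
    if game_fps >= PANEL_MIN_HZ:
        return 1, min(game_fps, PANEL_MAX_HZ)
    multiplier = -(-PANEL_MIN_HZ // game_fps)      # ceiling division
    return multiplier, game_fps * multiplier

for fps in (120, 45, 24, 11):
    shows, hz = effective_refresh(fps)
    print(f"{fps:>3} fps -> each frame shown {shows}x, panel refreshes at {hz} Hz")
```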

    Another point in AMD's favor is the absence of small delays in transferring data to the monitor: FreeSync uses the Adaptive-Sync mechanism of the DisplayPort standard, so the video card knows the monitor's minimum and maximum refresh rates in advance, and data transfer is never interrupted to negotiate with a module in the monitor, as happens with Nvidia's G-Sync. In practice, however, the difference turns out to be no more than 1-2%, so it can be neglected.

    Of course, the question arises - do frame synchronization technologies affect gaming performance? The answer is no: the difference between having synchronization disabled and using FreeSync or G-Sync turns out to be zero, which makes sense - these technologies do not force the video card to compute any more data, they simply deliver already-rendered frames to the screen at the right moment.

    In the end - which is better? No matter how funny it may sound, users have no choice: those who use the “red” products are forced to use FreeSync. Those who use green products can similarly only use G-Sync. But, in principle, at the moment the technologies produce similar results, so the choice really only lies in the manufacturer and power of the video card.

    In the good old days, when owners of personal computers stared at huge CRT monitors and earned themselves astigmatism, nobody talked about image smoothness. The technology of the time did not really support 3D, so users simply had to make do with what they had. But time passes, technology develops, and many are no longer willing to put up with frame tearing in dynamic games. This is especially true for so-called cyber-athletes, for whom a split second makes all the difference. What can be done?

    Progress does not stand still, so what once seemed impossible can now be taken for granted. The same goes for image quality on a computer. Manufacturers of video cards and other PC components are working hard on the problem of poor-quality image output on monitors, and they have already come quite far: just a little more and the picture on the monitor will be perfect. But that is all a lyrical digression - let's return to our main topic.

    A little history

    Monitor manufacturers tried hard to overcome tearing and improve the image. What didn't they come up with: they raised the monitor's "hertz", they turned on V-Sync - nothing helped. And then one fine day the famous video card maker NVIDIA presented G-Sync technology, which promises "unreal" image smoothness without any artifacts. That sounds good, but there is one small yet very serious "but": to use this option you need a monitor that supports G-Sync. Monitor makers had to get to work and "throw" a couple of dozen models onto the market. What next? Let's look at the technology and try to figure out whether it is any good.

    What is G-Sync?

    G-Sync is a display output technology from NVIDIA. Its hallmark is smooth frame delivery without any artifacts - no image tearing and no stuttering. For the technology to work properly, a fairly powerful computer is required, since processing the digital signal takes considerable compute power. That is why only newer NVIDIA video card models support it. In addition, G-Sync is a proprietary NVIDIA feature, so owners of video cards from other manufacturers are out of luck.

    In addition, a G-Sync monitor is required. The fact is that they are equipped with a board with a digital signal converter. Owners of regular monitors will not be able to take advantage of this amazing option. It’s unfair, of course, but this is the policy of modern manufacturers - to siphon as much money as possible from the poor user. If your PC configuration allows you to use G-Sync, and your monitor miraculously supports this option, then you can fully appreciate all the delights of this technology.

    How G-Sync works

    Let's try to explain in simplified terms how G-Sync works. An ordinary GPU (video card) simply sends a digital signal to the monitor without taking its refresh rate into account. That is why the image looks "ragged" on screen: the stream coming from the GPU gets chopped up by the monitor's fixed refresh and ends up looking untidy, even with the V-Sync option enabled.

    With G-Sync, the GPU itself regulates the monitor's refresh rate, so frames reach the panel exactly when they are needed. This avoids image tearing and improves the overall smoothness of the picture. Since ordinary monitors do not let the GPU control them, G-Sync monitors were created with an NVIDIA board inside that regulates the refresh rate - which is why ordinary monitors cannot be used.

    Monitors that support this technology

    Gone are the days when users killed their eyesight by staring at ancient CRT monitors for hours. Current models are elegant and harmless. So why not give them some new technology? The first monitor with NVIDIA G-Sync support and 4K resolution was released by Acer. The new product created quite a sensation.

    For now, good-quality monitors with G-Sync are quite rare, but manufacturers plan to make these devices commonplace. Most likely, within five years monitors supporting this technology will become a standard solution even for office PCs. In the meantime, all that remains is to look at these new products and wait for their wide distribution - that is when they will become cheaper.

    After that, monitors with G-Sync support began to be churned out by everyone and their brother. Even budget models with this technology have appeared - although what use is it on a cheap screen with a poor panel? Be that as it may, such models exist. The best home for this option is still a high-quality monitor, where G-Sync can work at full strength.

    The best monitors with G-Sync

    Monitors with G-Sync technology stand out as a special line of devices. They must have the characteristics needed for this feature to work properly, and clearly not every screen can cope with the task. Several leaders in the production of such monitors have already emerged, and their models have turned out very well.

    For example, one G-Sync monitor is among the brightest representatives of this line. It is a premium device. Why? Judge for yourself: a 34-inch diagonal, 4K resolution, 1000:1 contrast, 100 Hz and a 5 ms panel response time. Plenty of people would like to get their hands on this "monster", and it clearly handles G-Sync technology with flying colors. It has no analogues yet; you can safely call it the best in its class and not be mistaken.

    In general, ASUS G-Sync monitors are now at the top of Olympus. Not a single manufacturer has yet been able to surpass this company. And it is unlikely that this will ever happen. ASUS can be called a pioneer in this regard. Their monitors that support G-Sync are selling like hotcakes.

    The future of G-Sync

    There are now active efforts to bring G-Sync technology to laptops, and some manufacturers have even released a couple of such models. Notably, they can work without a G-Sync module in the display - which is understandable, since a laptop has a somewhat different design; a video card that supports the technology is enough.

    It is likely that NVIDIA G-Sync will soon occupy a significant place in the computer industry. Monitors with this technology should become cheaper. Eventually this option should become available everywhere. Otherwise, what's the point in developing it? In any case, everything is not so rosy yet. There are some problems with the implementation of G-Sync.

    In the future, G-Sync technology will likely become as commonplace as the VGA port for connecting a monitor once was. Next to it, the various forms of "vertical synchronization" look like a blatant anachronism: not only do these outdated techniques fail to provide satisfactory picture quality, they also consume a fair amount of system resources. With the advent of G-Sync, their place is definitely in the dustbin of history.

    Do you have a G-SYNC capable monitor and an NVIDIA graphics card? Let's look at what G-SYNC is, how to enable it and configure it correctly in order to fully use the potential and capabilities of this technology. Keep in mind that just turning it on isn't everything.

    Every gamer knows what vertical synchronization (V-Sync) is. This function synchronizes image frames so as to eliminate the screen tearing effect. If you disable vertical sync on a regular monitor, input lag (delay) decreases and you will notice the game responding better to your commands, but the frames are no longer properly synchronized and screen tearing appears.

    V-Sync eliminates screen tearing but at the same time increases the delay between your input and the image on screen, making the game less comfortable: every time you move the mouse, the movement seems to register with a slight lag. This is where the G-SYNC function comes to the rescue, eliminating both of these shortcomings.

    What is G-SYNC?

    A rather expensive but effective solution for NVIDIA GeForce video cards is G-SYNC technology, which eliminates screen tearing without adding extra delay (input lag). To use it you need a monitor that contains the G-SYNC module. The module adjusts the screen's refresh rate to the number of frames per second, so there is no additional delay and the screen tearing effect is eliminated.

    Many users, after buying such a monitor, simply enable G-SYNC in the NVIDIA Control Panel settings, convinced that this is all they need to do. In theory that is enough - G-SYNC will work - but if you want to use the technology to its full potential, you need a few additional settings related to classic vertical synchronization and to limiting the in-game FPS to a value a few frames below the monitor's maximum refresh rate. Why? You will learn all of this from the recommendations below.

    Enabling G-SYNC in the NVIDIA Control Panel

    Let's start with the most basic step - enabling G-SYNC itself. This is done from the NVIDIA Control Panel: right-click on your desktop and select NVIDIA Control Panel.

    Then go to Display - Set up G-SYNC. Here you can enable the technology with the "Enable G-SYNC" checkbox. Check it.

    You can then specify whether the technology should work only in full screen mode, or also in games running in a window or in a borderless full-screen window.

    If you select "Enable G-SYNC for full screen mode", the feature will only work in games set to exclusive full screen (this can be changed in each game's settings). Games running in a window or borderless full screen will not use the technology.

    If you want windowed games to use G-SYNC as well, enable "Enable G-SYNC for windowed and full screen mode". With this option the feature hooks the currently active window, allowing it to drive the variable refresh rate. You may need to restart your computer for the option to take effect.

    How do you check that the technology is actually enabled? Open the Display menu at the top of the window and tick the "G-SYNC Indicator" item. An on-screen indicator will then tell you whether G-SYNC is active when you launch a game.

    Next, go to the Manage 3D Settings tab in the side menu. In the Global Settings section, find the "Preferred refresh rate" option.

    Set it to "Highest available". Some games impose their own refresh rate, which can prevent G-SYNC from being used to the full. With this parameter the game's setting is overridden and the monitor's maximum refresh rate is always used, which on G-SYNC devices is most often 144 Hz.

    That is the basic setup you need to do to enable G-SYNC. But if you want to squeeze the full potential out of your equipment, read the instructions below.

    What should I do with V-SYNC if I have G-SYNC? Leave it on or turn it off?

    This is the most common dilemma of G-SYNC monitor owners. It is generally accepted that this technology completely replaces the classic V-SYNC, which can be completely disabled in the NVIDIA Control Panel or simply ignored.

    First you need to understand the difference between them. The task of both functions is theoretically the same - to overcome the effect of screen tearing. But the method of action is significantly different.

    V-SYNC synchronizes frames to the monitor's constant refresh rate. It acts as an intermediary, grabbing each finished frame and holding its display so that it fits the fixed refresh cycle, thereby preventing image tearing. The downside is input lag (delay): V-SYNC must first "capture and queue" the image and only then display it on the screen.

    G-SYNC works exactly the opposite. It adjusts not the image, but the monitor refresh rate to the number of frames displayed on the screen. Everything is done in hardware using the G-SYNC module built into the monitor, so there is no additional delay in displaying the image, as is the case with vertical synchronization. This is its main advantage.

    The whole catch is that G-SYNC only works well when the FPS stays within the supported refresh range, which covers frequencies from 30 Hz up to the monitor's maximum (60 Hz or 144 Hz). In other words, the technology works to its full potential as long as the FPS does not drop below 30 and does not exceed 60 or 144 frames per second, depending on the maximum supported refresh rate. An infographic created by Blur Busters illustrates this very well.

    What happens if the FPS goes outside this range? G-SYNC cannot adjust the screen refresh, so outside the range it simply stops working and you run into exactly the same problems as on a regular monitor without G-SYNC: classic vertical sync takes over. If it is turned off, screen tearing appears; if it is turned on, you will not see tearing, but input lag (delay) shows up.

    It is therefore in your interest to stay within the G-SYNC range, which runs from a minimum of 30 Hz up to whatever the monitor tops out at (144 Hz is most common, but 60 Hz displays exist too). How? With the appropriate vertical synchronization settings and by limiting the maximum FPS.

    What is the conclusion from all this? If the frame rate drops below 30 FPS, vertical sync should still be left enabled. These are rare cases, but when they do happen, V-SYNC guarantees there will be no tearing. As for the upper limit, things are simple: cap the maximum number of frames per second so that you never reach the ceiling at which V-SYNC would kick in, thereby keeping G-SYNC in continuous operation.

    Therefore, if you have a 144 Hz monitor, you need to enable the FPS cap at 142 to avoid going too close to the upper limit. If the monitor is 60 Hz, set the limit to 58. Even if the computer is able to make more FPS, it will not do this. Then V-SYNC will not turn on and only G-SYNC will be active.
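    The whole recommendation can be condensed into a small illustrative helper (a sketch of the logic described above, not an official tool; the 165 Hz and 240 Hz rows simply apply the same two-frame margin):

```python
# Sketch of the setup logic described above: cap the FPS a couple of frames
# below the monitor's maximum refresh, and note which mechanism handles any
# given frame rate once V-SYNC is left on and the limiter is active.
def fps_cap(max_refresh_hz: int, margin: int = 2) -> int:
    return max_refresh_hz - margin

def handled_by(fps: float, max_refresh_hz: int, gsync_floor: int = 30) -> str:
    if fps < gsync_floor:
        return "V-SYNC (below the G-SYNC floor)"
    if fps <= fps_cap(max_refresh_hz):
        return "G-SYNC (inside the variable refresh range)"
    return "frame limiter (keeps FPS under the cap)"

for hz in (144, 60, 165, 240):
    print(f"{hz} Hz monitor -> set the limiter to {fps_cap(hz)} FPS")

for fps in (24, 90, 150):
    print(f"{fps} fps on a 144 Hz panel -> {handled_by(fps, 144)}")
```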

    Enabling Vsync in NVIDIA Settings

    Open the NVIDIA Control Panel and go to the “Manage 3D Settings” tab. In the Global Setting section, find the Vertical Sync option and set the option to “On”.

    This way vertical synchronization stands ready to take over if the FPS drops below 30, where a G-SYNC monitor can no longer cope on its own.

    Limit FPS to less than maximum screen refresh rate

    A convenient way to limit frames per second is the RTSS (RivaTuner Statistics Server) program. The best solution, of course, is a limiter built into the game itself, but not every game has one.

    Download and run the program, then in the list of games on the left select the Global profile - this sets a common limit for all applications. On the right, find the "Framerate limit" field and set the limit: 142 FPS for 144 Hz monitors and, correspondingly, 58 FPS for 60 Hz devices.

    With the limit in place, classic vertical synchronization never kicks in to add latency, and playing becomes much more comfortable.

    Besides realistic, beautiful graphics, getting the most out of a computer game requires a stable frame rate. Even slight dips, as well as visual artifacts such as frame tearing, can make a game unplayable. In the past, V-Sync technology was used to combat these problems, but the results often left much to be desired. Today both major GPU manufacturers, NVIDIA and AMD, have developed their own synchronization technologies - G-Sync and FreeSync respectively - which make games look much smoother than before.

    If the frequencies are not synchronized...

    Why does stuttering or frame tearing occur during the game? The fact is that the screen refresh rate of computer monitors is usually fixed at 60 Hz, that is, the screen is updated 60 times per second. The number of frames created per second by the video card during the game is not constant. It is when the screen refresh rate does not match the frame rate of the video card that we can observe the above-mentioned effects that unpleasantly affect the perception of the game. Considering that the performance of video cards is constantly improving, the frame rate in games is increasing - even at high image quality settings, but the screen refresh rate of most monitors remains at 60 Hz. As a result of this inconsistency, we end up with even more unwanted visual artifacts while gaming.

    V-Sync technology

    In the past, we turned on V-Sync to make games look smoother. This function tells the video card to output exactly 60 frames per second to a display that refreshes the same 60 times per second, and the result is no frame tearing. However, it limits the performance of flagship video cards. Moreover, if the game cannot reach 60 frames per second - for example because a scene is too graphically complex - the video card is forced to drop all the way to 30 frames per second until a simpler scene arrives, after which the frame rate jumps back up to 60. Because the swing is so large, this switching between 30 and 60 frames per second is visible to the eye as a "slowdown" of the picture.
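    A simplified model of that behavior (double-buffered V-Sync on a 60 Hz display; an illustration, not a measurement) shows how the delivered rate snaps to divisors of the refresh rate instead of degrading smoothly:

```python
import math

# Simplified model of double-buffered V-Sync on a 60 Hz display: a frame that
# misses a refresh waits for the next one, so each frame occupies a whole
# number of refreshes and the delivered rate snaps to 60, 30, 20, 15... fps.
REFRESH_HZ = 60

def vsync_fps(render_fps: float) -> float:
    frame_ms = 1000.0 / render_fps
    refresh_ms = 1000.0 / REFRESH_HZ
    refreshes_per_frame = math.ceil(frame_ms / refresh_ms)
    return REFRESH_HZ / refreshes_per_frame

for fps in (75, 59, 45, 31, 25):
    print(f"GPU renders {fps} fps -> V-Sync delivers {vsync_fps(fps):.0f} fps")
```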

    NVIDIA G-Sync technology

    Both major GPU manufacturers, NVIDIA and AMD, strive to improve image quality in games, and both have solved the problem of synchronizing the video card with the monitor - but in different ways. NVIDIA was the first to introduce its technology, called G-Sync. It is a hardware solution that requires a special chip built directly into the monitor. The chip has buffer memory through which it exchanges data with an NVIDIA GeForce series video card: it reads frames from the video card and automatically changes the monitor's refresh rate to match the current game speed. The screen refresh rate can vary dynamically in real time, in a range from 0 to 240 Hz. The result is a complete absence of visual artifacts in games, such as frame tearing, stuttering and screen flickering.

    AMD FreeSync Technology

    AMD's FreeSync technology appeared later but has already reached its second version. Like G-Sync, it dynamically changes the monitor's refresh rate to match the frame output of a Radeon series video card. This eliminates the stuttering typical of traditional V-Sync and produces a smoother in-game picture.

    Unlike NVIDIA's G-Sync, which requires a discrete chip, FreeSync is implemented at the interface level. It is an implementation of the industry-standard DisplayPort Adaptive-Sync, which allows the refresh rate to be changed in real time over the DisplayPort interface; the latest version of the HDMI interface also supports FreeSync. The technology does not require adding any chips to the monitor, but it does need a monitor with a DisplayPort or HDMI input and a video card with a Radeon GPU.

    Beyond smoother motion, FreeSync 2 technology also improves picture quality. Low Framerate Compensation (LFC) has expanded the refresh range of FreeSync-enabled monitors, keeping the image smooth even when the game speed drops below 30 frames per second. In addition, FreeSync 2 supports twice the sRGB color gamut and HDR (high dynamic range).


    Conclusion

    Both adaptive synchronization technologies, NVIDIA G-Sync and AMD FreeSync, work effectively. The improvement in the image after their activation will be noticeable, especially in dynamic games with complex scenes, such as first-person shooters, racing, and sports simulators. However, it is very difficult to compare both technologies with each other based on any quantitative data. We can only rely on our own eyes, so each user will have their own subjective opinion about which one is better.

    The objective difference between the two technologies is cost. Besides a compatible graphics card with the appropriate GPU, NVIDIA G-Sync requires a monitor with an additional discrete chip and a paid license from NVIDIA, which affects the cost of the finished product. G-Sync monitors are therefore more expensive than their AMD FreeSync counterparts, which is why there are more FreeSync monitors on the market. On the other hand, the money is arguably well spent: G-Sync monitors are usually higher-end in terms of specifications and, paired with high-performance GeForce video cards, produce a great picture. Taking the price difference into account, monitors with FreeSync plus Radeon graphics cards are the more profitable solution in terms of return on investment.
