NVIDIA G-Sync: which cards support it. NVIDIA G-Sync technology. Enabling V-Sync in NVIDIA settings

This photo shows a frame from the game that appears to be split in two. Why did this happen? It is simpler than it looks. The video card renders the image frame by frame, and each new frame appears on your monitor screen. But not immediately after it is created, only at the next refresh of the image on the monitor. The image is refreshed a fixed number of times per second and, importantly for our case, not all at once but line by line, from top to bottom. Meanwhile, the video card does not prepare frames at a constant pace: each one takes a different amount of time depending on the complexity of the scene. To somehow synchronize the preparation of frames by the graphics adapter with their output to the monitor, video adapters use two so-called buffers: the monitor reads a new frame from one buffer while the next frame is being drawn in the other. However, this is often not enough to ensure a smooth picture, and we end up seeing halves of two different frames, or even several frames, on the screen at once.
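To make the timing concrete, here is a minimal sketch in plain C++ with made-up numbers (the refresh rate, resolution and render time are assumptions for illustration); it only estimates where the tear line would land when a buffer swap happens in the middle of scanout, and is not how any driver actually works.

```cpp
// A minimal sketch: with synchronization off, the buffers can be swapped while the
// monitor is still scanning out the previous frame, so the scanline where the swap
// happened becomes the visible "tear".
#include <cstdio>

int main() {
    const double refresh_hz   = 60.0;                 // panel refresh rate (assumed)
    const double scanout_ms   = 1000.0 / refresh_hz;  // time to draw one full screen, top to bottom
    const int    screen_lines = 1080;                 // vertical resolution (assumed)
    const double frame_ms     = 11.0;                 // hypothetical GPU render time (~90 FPS)

    // The GPU finishes a new frame frame_ms after scanout of the current one began.
    // Everything above this line comes from the old frame, everything below from the new one.
    int tear_line = static_cast<int>(screen_lines * (frame_ms / scanout_ms));
    std::printf("Tear line appears around scanline %d of %d\n", tear_line, screen_lines);
    return 0;
}
```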

There is a way out!

To avoid watching this mess constantly, there is vertical synchronization: the output of frames by the video card is synchronized with the refresh of the image on the monitor. However, if your video card cannot keep up and new frames are ready less often than the picture on the monitor is refreshed, the monitor will show the same frame two or more times until the next one is ready. Even if, on average, your graphics accelerator sustains a high frame rate in a particular game at particular settings, as soon as a single frame misses the next refresh cycle, the time until that frame appears doubles. This is called a stutter, or "freeze". If you play fast-paced shooters, any delay in the display can cost you an unkilled enemy or even the life of your virtual alter ego.
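The arithmetic behind that doubling is easy to show. The following snippet (illustrative C++ with hypothetical frame times) simply rounds each frame's render time up to a whole number of 60 Hz refresh intervals, which is effectively what V-Sync does:

```cpp
// Illustrative arithmetic only: with V-Sync on, a frame that misses a refresh
// is held until the next one, so its time on screen rounds up to a whole
// number of refresh intervals.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;              // 60 Hz panel -> ~16.7 ms per refresh
    const double render_times_ms[] = {15.0, 17.0, 34.0};  // hypothetical GPU frame times

    for (double render_ms : render_times_ms) {
        double displayed_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        std::printf("rendered in %.1f ms -> shown for %.1f ms\n", render_ms, displayed_ms);
    }
    return 0;
}
```

A frame rendered in 17 ms just misses the 16.7 ms deadline and ends up on screen for 33.3 ms, exactly the doubling described above.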

NVIDIA G-SYNC technology is designed to rid gamers of both of these "monitor" ailments at once. True, it is not the first attempt by the "greens" to eliminate tearing and freezes. The previous one was a technology called Adaptive V-Sync, which simply turns vertical sync on and off automatically depending on the frame rate.

What about AMD? The company also understands that it needs to look after gamers' nerves, since they are effectively the engine of the PC industry and much of what is connected with it. Thanks to AMD, the upcoming DisplayPort revision, 1.2a, includes technology that lets the monitor's refresh rate be adjusted to match the video card's frame rate, without any proprietary boards inside the monitor. It is an interesting and certainly less expensive alternative, but it has not yet seen the light of day, while more than one manufacturer has already announced monitors with G-SYNC.

In the good old days, when owners of personal computers used huge CRT monitors and earned themselves astigmatism, there was no talk of image smoothness. The technologies of the time did not really support 3D, so poor users had to be content with what they had. But time passes, technology develops, and many are no longer willing to put up with frame tearing in dynamic games. This is especially true for so-called cyber-athletes, for whom a split second makes all the difference. What is to be done?

Progress does not stand still, so what once seemed impossible can now be taken for granted. The same applies to image quality on a computer. Manufacturers of video cards and other PC components are working hard to solve the problem of poor image output on monitors, and it must be said they have already come quite far. Only a little remains before the picture on the monitor is perfect. But this is all a lyrical digression; let's return to our main topic.

A little history

Monitor manufacturers tried hard to overcome tearing and improve the image. What didn't they come up with: they raised the monitor's "hertz" and turned on V-Sync. Nothing helped. And then the well-known video card maker NVIDIA presented G-Sync, a technology that can achieve "unreal" image smoothness without any artifacts. It sounds good, but there is one small yet very serious "but". To use this feature, you need a monitor that supports G-Sync. Monitor manufacturers had to work harder and push a couple of dozen such models onto the market. What's next? Let's look at the technology and try to figure out whether it is any good.

What is G-Sync?

G-Sync is a screen display technology from NVIDIA. It is characterized by smooth frame changes without any artifacts: no image tearing and no stuttering. For this technology to work properly, a fairly powerful computer is required, since processing the signal takes considerable horsepower. That is why only newer NVIDIA video card models support the technology. In addition, G-Sync is a proprietary NVIDIA feature, so owners of video cards from other manufacturers are out of luck.

A G-Sync monitor is also required, because such monitors are equipped with a board containing a digital signal converter. Owners of ordinary monitors will not be able to take advantage of this amazing feature. It is unfair, of course, but such is the policy of modern manufacturers: to siphon as much money as possible from the poor user. If your PC configuration allows you to use G-Sync and your monitor miraculously supports it, then you can fully appreciate all the delights of this technology.

How G-Sync works

Let's try to explain in a simplified way how G-Sync works. The thing is that an ordinary GPU (video card) simply sends a digital signal to the monitor without taking its refresh rate into account. That is why the signal looks "ragged" when displayed on the screen: it is chopped up by the monitor's refresh cycle, and the final picture looks unsightly even with the V-Sync option enabled.

With G-Sync, the GPU itself regulates the monitor's refresh rate, so frames reach the panel exactly when they are needed. Thanks to this, image tearing is avoided and the overall smoothness of the picture improves. Since ordinary monitors do not let the GPU control them, the G-Sync monitor was invented, carrying an NVIDIA board that regulates the refresh rate. That is why ordinary monitors cannot be used.
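In very rough terms the behavior can be modeled like this (a simplified sketch with assumed panel limits of 30-144 Hz, not NVIDIA's actual logic): the panel refreshes as soon as the frame is ready, clamped to what its electronics allow.

```cpp
// A simplified model (our assumption, not NVIDIA's actual logic): a G-Sync panel
// refreshes as soon as the GPU finishes a frame, instead of on a fixed schedule,
// limited to the range the panel electronics can handle.
#include <algorithm>
#include <cstdio>

int main() {
    const double min_interval_ms = 1000.0 / 144.0;  // panel's maximum refresh rate (144 Hz assumed)
    const double max_interval_ms = 1000.0 / 30.0;   // below ~30 FPS the panel has to refresh anyway
    const double render_times_ms[] = {5.0, 12.0, 25.0, 40.0};  // hypothetical GPU frame times

    for (double render_ms : render_times_ms) {
        // Refresh when the frame is ready, limited by what the panel can physically do.
        double refresh_after = std::clamp(render_ms, min_interval_ms, max_interval_ms);
        std::printf("frame ready after %.1f ms -> panel refreshes after %.1f ms\n",
                    render_ms, refresh_after);
    }
    return 0;
}
```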

Monitors that support this technology

Gone are the days when users ruined their eyesight by staring at ancient CRT monitors for hours. Current models are elegant and harmless, so why not give them some new technology? The first monitor with NVIDIA G-Sync support and 4K resolution was released by Acer, and the new product created quite a stir.

For now, quality monitors with G-Sync are quite rare. But manufacturers plan to make these devices standard. Most likely, within five years, monitors supporting this technology will become a standard solution even for office PCs. In the meantime, all that remains is to look at these new products and wait for their widespread distribution; that is when they will become cheaper.

After that, monitors with G-Sync support began to be churned out by everyone and their brother. Even budget models with this technology have appeared, although what use is it on a budget screen with a poor panel? Be that as it may, such models do exist. The best choice for this feature is still a monitor on which G-Sync can work at full strength.

The best monitors with G-Sync

Monitors with G-Sync technology stand out as a special line of devices. They must have the characteristics needed for this feature to work properly, and it is clear that not all screens can cope with the task. Several leaders in the production of such monitors have already emerged, and their models have turned out very successful.

For example, this G-Sync monitor is one of the brightest representatives of the line. It is a premium device. Why? Judge for yourself: a 34-inch screen, 4K resolution, 1000:1 contrast, a 100 Hz refresh rate and a 5 ms matrix response time. Many would like to get this "monster" for themselves. It is clear that it will handle G-Sync technology with flying colors. It has no analogues yet; you can safely call it the best in its class and not be mistaken.

In general, ASUS G-Sync monitors are now at the top of Olympus. Not a single manufacturer has yet been able to surpass this company. And it is unlikely that this will ever happen. ASUS can be called a pioneer in this regard. Their monitors that support G-Sync are selling like hotcakes.

The future of G-Sync

Manufacturers are now actively trying to bring G-Sync technology to laptops, and some have even released a couple of such models. Moreover, these can work without a G-Sync module in the display, which is understandable: a laptop has somewhat different design features, and a video card supporting the technology is quite sufficient there.

It is likely that NVIDIA G-Sync will soon occupy a significant place in the computer industry. Monitors with this technology should become cheaper. Eventually this option should become available everywhere. Otherwise, what's the point in developing it? In any case, everything is not so rosy yet. There are some problems with the implementation of G-Sync.

In the future, G-Sync technology may well become as commonplace as the VGA port for connecting a monitor once was for us. Against its background, all the flavors of "vertical synchronization" look like a blatant anachronism: not only do these outdated techniques fail to deliver satisfactory picture quality, they also consume a fair amount of system resources. With the advent of G-Sync, their place is definitely in the dustbin of history.

NVIDIA G-SYNC monitors feature revolutionary NVIDIA technology that eliminates screen tearing and the input lag associated with VSync frame synchronization, and improves the capabilities of modern monitors, resulting in the smoothest, most responsive gaming experience you have ever seen.

As a result, game scenes appear instantly, objects become clearer, and gameplay becomes smoother.

HOW DOES NVIDIA G-SYNC WORK?

NVIDIA® G-SYNC™ is an innovative solution that breaks old molds to create the most advanced and responsive computer displays in history. The NVIDIA G-SYNC module can be installed independently or purchased pre-installed in the latest gaming monitors. It helps you forget about screen tearing, input lag, and eye-strain-inducing jitter that were caused by older technologies that migrated from analog TVs to modern monitors.

PROBLEM: OLD MONITOR TECHNOLOGY

When televisions were invented, they used cathode ray tubes, which work by scanning a beam of electrons across the surface of a phosphor-coated tube. The beam makes the pixels glow, and when enough pixels are excited quickly enough, the cathode ray tube creates the impression of full-motion video. Believe it or not, these first TVs ran at a refresh rate of 60 Hz because the AC mains frequency in the USA is 60 Hz. Matching the TV's refresh rate to the mains frequency made early electronics easier to build and reduced on-screen noise.

By the time personal computers appeared in the early 1980s, cathode-ray TV technology was firmly established as the simplest and most cost-effective way to build computer monitors. 60 Hz and fixed refresh rates became the standard, and system builders learned to make the most of a less-than-ideal situation. Over the past three decades, even as CRT technology gave way to LCD and LED displays, no major company has dared to challenge this convention, and synchronizing the GPU to the monitor's refresh rate remains standard industry practice to this day.

The problem is that video cards do not render images at a constant rate. In fact, the GPU's frame rate varies significantly even within a single scene of a single game, depending on the current load. And if monitors have a fixed refresh rate, how do you get images from the GPU onto the screen? The first way is simply to ignore the monitor's refresh rate and update the image mid-cycle. We call this VSync-disabled mode, and it is how most gamers play by default. The downside is that when one monitor refresh cycle contains two images, a very noticeable "break line" appears, commonly referred to as screen tearing. A well-known way to combat screen tearing is to enable VSync, which makes the GPU delay presenting a frame until the start of a new monitor refresh cycle. This causes judder when the GPU frame rate is lower than the display refresh rate. It also increases latency, which leads to input lag: a noticeable delay between pressing a key and the result appearing on the screen.
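In your own OpenGL code this exact trade-off is exposed through the swap interval. Below is a bare-bones example using GLFW; the window setup is ordinary boilerplate and has nothing to do with G-SYNC itself: an interval of 1 waits for the refresh, 0 presents immediately and allows tearing.

```cpp
// Minimal GLFW loop showing the two modes described above via the swap interval.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1280, 720, "VSync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    glfwSwapInterval(1);  // 1 = wait for the next refresh (VSync on); 0 = present immediately (tearing possible)

    while (!glfwWindowShouldClose(window)) {
        // ... draw the frame here ...
        glfwSwapBuffers(window);  // with interval 1 this call waits for the refresh cycle
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```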

To make matters worse, many players suffer from eye strain due to image shaking, and others develop headaches and migraines. This led us to develop Adaptive VSync technology, an effective solution that was well received by critics. Despite the creation of this technology, the problem of input lag still remains, and this is unacceptable for many enthusiast gamers and is completely unsuitable for professional gamers (e-sports) who independently configure their video cards, monitors, keyboards and mice to minimize annoying delays between action and reaction.

SOLUTION: NVIDIA G-SYNC

Meet NVIDIA G-SYNC technology, which eliminates screen tearing, VSync display lag, and image jitter. To achieve this revolutionary capability, we created G-SYNC for monitors, which allows the monitor to synchronize with the frame rate of the GPU, rather than the other way around, resulting in faster, smoother, tear-free images that take gaming to the next level.

Industry gurus John Carmack, Tim Sweeney, Johan Andersson and Mark Rein were impressed by NVIDIA G-SYNC technology. eSports players and leagues are lining up to use NVIDIA G-SYNC, which will unleash their true skill by demanding even faster reactions, thanks to the imperceptible delay between on-screen action and keyboard commands. During internal testing, avid gamers spent their lunch hours playing LAN matches on G-SYNC-enabled monitors in order to win.

If you have a monitor that supports NVIDIA G-SYNC, you will have an undeniable advantage over other players in online games, provided you also have a low ping.

A REVOLUTIONARY SOLUTION IS HERE

In these times of technological wonders, few advances can be called truly “innovative” or “revolutionary.” However, NVIDIA G-SYNC is one of those few breakthroughs that revolutionizes outdated monitor technology with an innovative approach that has never been tried before.

G-SYNC eliminates input lag, tearing, and jitter for a stunning visual experience on any G-SYNC-enabled monitor; they are so good that you will never want to use a regular monitor again. Beyond the revolutionary visual improvements, multiplayer gamers get the added advantage of combining G-SYNC with a fast GeForce GTX graphics card and low-latency input devices. This will certainly interest fans of shooters. For eSports athletes, NVIDIA G-SYNC is a significant improvement: because it eliminates input lag, success or failure now depends entirely on the players, helping to separate the pros from the amateurs.

If you, like eSports athletes, want the sharpest, smoothest and most responsive gaming experience, then monitors with NVIDIA G-SYNC support are a breakthrough the likes of which cannot be found anywhere else. A true innovation in an era of incremental improvements, NVIDIA G-SYNC will change the way you play.

G-Sync Technology Overview | A Brief History of Fixed Refresh Rate

Once upon a time, monitors were bulky and contained cathode ray tubes and electron guns. The electron guns bombard the screen with electrons to light up the colored phosphor dots we call pixels, drawing each "scan" line from left to right, top to bottom. Varying the speed of the electron gun from one full refresh to the next was not very practical, and there was no particular need for it before the advent of 3D games. So CRTs and the related analog video standards were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the bodies responsible for standardizing video signals (led by VESA) did not move away from fixed refresh rates. Film and television still rely on an input signal with a constant frame rate, so once again switching to a variable refresh rate did not seem all that necessary.

Adjustable frame rates and fixed refresh rates are not the same

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The issue arose when we first got powerful GPUs: the rate at which a GPU renders individual frames (what we call the frame rate, usually expressed in FPS, or frames per second) is not constant. It changes over time. In heavy graphical scenes the card may deliver 30 FPS, and when looking at an empty sky, 60 FPS.

Disabling synchronization causes tearing

It turns out that the GPU's variable frame rate and the LCD panel's fixed refresh rate do not work very well together. In this configuration we encounter a graphical artifact called "tearing", which occurs when two or more partial frames are displayed within the same monitor refresh cycle. They are usually offset from one another, which produces a very unpleasant effect in motion.

The image above shows two well-known artifacts that are common but hard to capture. Because they are display artifacts, you will not see them in ordinary game screenshots, but our images show what you actually see while playing. To shoot them you need a camera with a high-speed capture mode, or, if you have a card that supports video capture, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the method we use for FCAT testing. However, it is best to observe the described effect with your own eyes.

The tearing effect is visible in both images. The top one was shot with a camera, the bottom one via the video capture function. The bottom picture is "cut" horizontally and looks offset. In the top two images, the left photo was taken on a Sharp screen with a 60 Hz refresh rate, the right one on an Asus display with a 120 Hz refresh rate. The tearing on the 120 Hz display is less pronounced because the refresh rate is twice as high, yet the effect is visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) disabled.

Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect visible in the BioShock: Infinite images is called ghosting. It is especially noticeable at the bottom left of the photo and is related to the delay in the screen's response. In short, individual pixels do not change color quickly enough, resulting in this kind of trailing glow. A single frame cannot convey the effect ghosting has on the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will end up producing a blurry image with any movement on the screen. This is why these displays are generally not recommended for first-person shooters.

V-sync: "wasted on soap"

Vertical sync, or V-sync, is a very old solution to the tearing problem. When this feature is enabled, the graphics card tries to match the screen's refresh rate, eliminating tearing completely. The problem is that if your graphics card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between integer fractions of the refresh rate (60, 30, 20, 15 FPS, and so on), which in turn leads to noticeable slowdowns.
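That ladder of values falls straight out of the arithmetic: each frame is held on screen for a whole number of refresh cycles, so the sustained rate can only be the refresh rate divided by an integer. A trivial illustration:

```cpp
// Illustrative only: with V-sync on a 60 Hz panel, every frame is shown for a whole
// number of refresh cycles, so the sustained frame rate can only be 60 divided by an integer.
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    for (int cycles = 1; cycles <= 4; ++cycles) {
        std::printf("frame held for %d refresh cycle(s) -> %.0f FPS\n",
                    cycles, refresh_hz / cycles);  // prints 60, 30, 20, 15
    }
    return 0;
}
```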

When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, since V-sync makes the graphics card wait, and rendering relies on a hidden back buffer, V-sync can add extra input lag to the render chain. Thus, V-sync can be both a blessing and a curse, solving some problems while creating other drawbacks. An informal survey of our staff found that gamers tend to leave V-sync disabled, turning it on only when tearing becomes unbearable.

Get Creative: Nvidia Unveils G-Sync

With the launch of the GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync, which tries to mitigate the problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance drops below the refresh rate. Although the technology did its job well, it was a workaround that did not eliminate tearing whenever the frame rate was lower than the monitor's refresh rate.
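In spirit (and only in spirit; this is our rough approximation, not Nvidia's driver code), the logic amounts to flipping the swap interval based on the measured frame rate, something like the function below, called once per frame from a render loop such as the GLFW sketch earlier:

```cpp
// Rough approximation of the idea, not Nvidia's driver code: keep V-sync on while
// the game sustains the refresh rate, drop it when the frame rate falls below, so a
// slowdown doesn't snap straight to 30 FPS. Call once per frame from the render loop.
#include <GLFW/glfw3.h>

void update_adaptive_vsync(double measured_fps, double refresh_hz, bool& vsync_on) {
    if (measured_fps >= refresh_hz && !vsync_on) {
        glfwSwapInterval(1);  // fast enough: sync to the refresh to avoid tearing
        vsync_on = true;
    } else if (measured_fps < refresh_hz && vsync_on) {
        glfwSwapInterval(0);  // too slow: stop waiting so the frame rate isn't quantized
        vsync_on = false;
    }
}
```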

The G-Sync implementation is much more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable one.

GPU frame rate determines the monitor's refresh rate, removing artifacts associated with enabling and disabling V-sync

The packet-based data transfer mechanism of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that handles variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the video card is outputting (within the monitor's refresh rate range). In practice, Nvidia got creative with these features of the DisplayPort interface and tried to kill two birds with one stone.
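To get a feel for the mechanism, here is a back-of-the-envelope sketch (all timing numbers are assumptions for illustration, not real panel timings): the refresh interval is the time needed to scan every line, active plus blanking, so holding the panel in vertical blanking for extra lines stretches the interval until the GPU's next frame arrives.

```cpp
// Back-of-the-envelope sketch; all numbers are assumptions, not a real panel's timings.
// The refresh interval is the time to scan every line, active plus blanking, so adding
// blanking lines stretches it until the next frame is ready.
#include <cstdio>

int main() {
    const double line_time_us   = 6.5;   // assumed time to transmit one line of the signal
    const int    active_lines   = 1080;  // visible lines
    const int    nominal_vblank = 45;    // assumed nominal blanking lines

    double fastest_refresh_ms = (active_lines + nominal_vblank) * line_time_us / 1000.0;
    std::printf("shortest refresh interval: %.2f ms (~%.0f Hz)\n",
                fastest_refresh_ms, 1000.0 / fastest_refresh_ms);

    // Suppose the GPU needs 20 ms for the next frame: extend the blanking so the
    // next scanout starts just as the frame becomes ready.
    double target_ms   = 20.0;
    int    extra_lines = static_cast<int>((target_ms - fastest_refresh_ms) * 1000.0 / line_time_us);
    std::printf("hold vertical blanking for ~%d extra lines to wait %.1f ms\n",
                extra_lines, target_ms);
    return 0;
}
```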

Even before testing begins, I would like to commend the team for their creative approach to solving a real problem affecting PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor in which the scaler has been replaced by a G-Sync module. We are already familiar with this display: it is the subject of the article "Asus VG248QE Review: 24-Inch 144Hz Gaming Monitor for $400", where it earned the Tom's Hardware Smart Buy award. Now it is time to find out how Nvidia's new technology will affect the most popular games.

G-Sync Technology Overview | 3D LightBoost, built-in memory, standards and 4K

As we reviewed Nvidia's press materials, we asked ourselves many questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed was that Nvidia sent an Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, pulsing the panel backlight to reduce ghosting (motion blur). Naturally, we were curious whether this technology is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be ideal, today strobing the backlight at a variable refresh rate leads to flicker and brightness issues. Solving them is incredibly difficult because you have to adjust the brightness and track the pulses. As a result, the choice now is between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does it take a frame to travel through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen; the memory performs other functions. Moreover, the processing time for G-Sync is noticeably less than one millisecond. In fact, this is almost the same delay we encounter with V-sync turned off, and it depends on the game, the video driver, the mouse, and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard, providing variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI or DVI. G-Sync already works over DisplayPort 1.2, so the standard does not need to be changed.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD resolutions

Nvidia promises monitors with G-Sync support at resolutions up to 3840x2160 pixels. However, the Asus model we are looking at today only supports 1920x1080. At the moment, Ultra HD monitors use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered whether the G-Sync module will support MST configurations.

In truth, 4K displays with variable refresh rates will still have to wait. There is not yet a single scaler that supports 4K resolution; the first should appear in the first quarter of 2014, and monitors equipped with it not until the second quarter. Since the G-Sync module replaces the scaler, compatible panels will begin to appear after that point. Fortunately, the module natively supports Ultra HD.