NVIDIA G-Sync technology

Test Methodology

The ASUS ROG SWIFT PG278Q has been tested with our new methodology. We decided to retire the slow and sometimes inaccurate Spyder4 Elite in favor of the faster and more accurate X-Rite i1Display Pro colorimeter, which will now be used to measure the display's main parameters in conjunction with the latest version of the Argyll CMS software package. All measurements are carried out under Windows 8, with the screen refresh rate set to 60 Hz during testing.

In accordance with the new methodology, we will measure the following monitor parameters:

  • White brightness at backlight power from 0 to 100% in 10% steps;
  • Black brightness at backlight power from 0 to 100% in 10% steps;
  • Display contrast at backlight power from 0 to 100% in 10% steps;
  • Color gamut;
  • Color temperature;
  • Gamma curves of the three primary RGB colors;
  • Gray-scale gamma curve;
  • Delta E (according to the CIEDE2000 standard).

Delta E is measured in DispcalGUI, a calibration and analysis front-end for Argyll CMS (the latest version at the time of writing). All of the measurements described above are carried out before and after calibration. During the tests we measure the monitor's main profiles: the default one, sRGB (if available) and Adobe RGB (if available). Calibration is carried out in the default profile, except in special cases that will be discussed separately. For monitors with a wide color gamut we select the hardware sRGB emulation mode, if available. In that case colors are converted by the monitor's internal LUTs (which can be up to 14 bits per channel) before being output to a 10-bit matrix, whereas trying to narrow the gamut to sRGB boundaries with the OS's color-correction tools reduces color-coding accuracy. Before testing begins, the monitor warms up for an hour and all of its settings are reset to factory defaults.
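
For readers who want to double-check Delta E figures like the ones reported below, here is a minimal sketch of a CIEDE2000 comparison in Python using the third-party colormath package. The package choice and the Lab values are our own illustration and not part of the lab's Argyll CMS/DispcalGUI workflow.

```python
# Hypothetical example: comparing a reference patch with a measured one using CIEDE2000.
# The Lab values below are made up for illustration; a real workflow would take the
# reference from the test chart and the measurement from the colorimeter.
from colormath.color_objects import LabColor
from colormath.color_diff import delta_e_cie2000

reference = LabColor(lab_l=50.0, lab_a=2.0, lab_b=-10.0)   # target patch color
measured = LabColor(lab_l=50.5, lab_a=3.1, lab_b=-9.2)     # what the colorimeter reported

de = delta_e_cie2000(reference, measured)
print(f"Delta E 2000 = {de:.2f}")   # values around 1 or below are practically invisible
```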

We will also continue our old practice of publishing calibration profiles for the monitors we have tested at the end of the article. At the same time, the 3DNews test lab warns that such a profile cannot fully correct the shortcomings of your particular monitor. The fact is that all monitors (even of the same model) inevitably differ from each other in their small color-reproduction errors; it is physically impossible to make two identical matrices, they are too complex. So any serious monitor calibration requires a colorimeter or spectrophotometer. Still, even a "universal" profile created for one unit can generally improve the situation on other devices of the same model, especially in the case of cheap displays with pronounced color-rendition defects.

Viewing angles, backlight uniformity

The first thing that interested us about the ASUS PG278Q was its viewing angles, because the monitor uses a TN matrix, and viewing angles have always been TN's biggest weakness. Luckily, things did not turn out so badly. IPS matrices naturally offer wider viewing angles, but the ASUS PG278Q rarely had to be repositioned to get rid of contrast and color distortions.

The developers of the ASUS PG278Q could not, however, avoid problems with the screen backlight. The monitor shows slight backlight bleed in all four corners and along the top edge. When a game is running on the display the bleed is hard to spot, but start a movie in a dark room (with the usual black bars above and below the picture) and the defect immediately becomes noticeable.

Testing without calibration

The maximum brightness of the ASUS PG278Q came to 404 cd/m², even more than the manufacturer promises. Such a high figure is justified by 3D support: with active shutter glasses the perceived brightness of the monitor can drop by half. The black field brightness at maximum backlight was 0.40 cd/m², which is also quite good. As a result, the static contrast ratio hovers around 1000:1 across the entire backlight range. An excellent result: such a contrast ratio is typical of high-quality IPS matrices, although MVA panels remain out of reach.
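
As a quick sanity check of the figures above, static contrast is simply the white luminance divided by the black luminance; the snippet below reproduces the roughly 1000:1 result from the measured values.

```python
# Static contrast from the measured luminances quoted above.
white_cd_m2 = 404.0   # maximum white brightness
black_cd_m2 = 0.40    # black field brightness at the same backlight setting
print(f"Static contrast ~ {white_cd_m2 / black_cd_m2:.0f}:1")   # ~1010:1
```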

Color gamut is just as good as it needs to be: the sRGB color space is covered by 107.1%, and the white point sits near the D65 reference.

For games the ASUS PG278Q offers a full color palette, but professional photo processing may run into slightly oversaturated colors caused by the gamut exceeding sRGB. However, the display in question is designed for gaming, so this shortcoming does not deserve much attention.

The color temperature of the ASUS PG278Q stayed around 6,000 K during the measurements, about 500 K below the norm, which means light shades can take on a slight warm tint.

Only the red gamma curve turned out close to the reference; the blue and green curves sagged, although they at least stayed close to each other. The gray-scale gamma, meanwhile, is almost fine: in dark tones it barely deviates from the reference curve, and in light tones it departs from it only slightly.

The average Delta E color-accuracy score was 2.08, with a maximum of 7.07. These are not the best results, but, firstly, the ASUS PG278Q is intended for games rather than photo processing, and secondly, for a TN matrix the figures we obtained are quite satisfactory.

Testing after calibration

Usually white brightness drops noticeably after calibration, by 10% or more even for fairly high-quality panels. In the case of the ASUS PG278Q it fell by only about 3%, to 391 cd/m². Calibration did not affect the black field brightness, so the static contrast ratio dropped to 970:1.

Calibration had practically no effect on the color gamut, but the white point returned to its proper place, even if it only had to move a little.

After calibration the color temperature rose slightly but did not reach the reference. The gap between the measured and reference values is now roughly 100-200 K instead of 500 K, which is quite tolerable.

The position of the three primary gamma curves unfortunately did not change much after calibration, while the gray scale began to look a little better.

Color accuracy benefited from calibration the most: the average Delta E dropped to 0.36, and the maximum to 1.26. Excellent results for any matrix, and for TN+Film simply fantastic.

G-Sync Testing: Methodology

NVIDIA's G-Sync reviewer's guide recommends game settings at which the frame rate hovers between 40 and 60 FPS. It is under exactly these conditions, at a 60 Hz refresh rate, that most "freezes" occur with V-Sync enabled. We will start by comparing three usage scenarios at 60 Hz: with V-Sync, without it, and with G-Sync.

Keep in mind, though, that raising the refresh rate from 60 to 120/144 Hz by itself makes tearing less noticeable without vertical sync, and with V-Sync it shortens the "freezes" from 13 to 8/7 ms respectively. Is there any real benefit to G-Sync over V-Sync at 144 Hz? We will check that too.

It should be emphasized that, going by the description, the very notion of a refresh rate loses its meaning with G-Sync. So it is not entirely correct to say that we compared, for example, V-Sync and G-Sync at 60 Hz: V-Sync really did run at 60 Hz, while G-Sync means the screen refreshes on demand rather than at a fixed interval. Even with G-Sync enabled, however, we can still choose a screen refresh rate in the driver control panel, and FRAPS shows that in games with G-Sync active exactly the same frame-rate ceiling applies as if V-Sync were working. It turns out that this setting regulates the minimum frame lifetime and, accordingly, the minimum screen refresh interval. Roughly speaking, it sets the frequency range in which the monitor operates: from 30 Hz up to 60-144 Hz.
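
To make the idea of "refresh on demand within a 30-144 Hz window" more concrete, below is a small, purely illustrative Python model of how a G-Sync panel might schedule refreshes for a stream of GPU frame times. The clamping logic and the numbers are our own assumptions based on the description above, not NVIDIA's actual implementation.

```python
# Illustrative model only: the panel redraws when a new frame is ready, but the interval
# between refreshes is clamped to the 30-144 Hz window exposed by the driver.
MIN_HZ, MAX_HZ = 30.0, 144.0
MIN_INTERVAL, MAX_INTERVAL = 1.0 / MAX_HZ, 1.0 / MIN_HZ

def refresh_times(frame_intervals):
    """Given GPU frame render times in seconds, return the moments the panel refreshes."""
    t, refreshes = 0.0, []
    for dt in frame_intervals:
        t += dt
        last = refreshes[-1] if refreshes else 0.0
        # the panel cannot refresh sooner than 1/144 s after the previous refresh...
        t_next = max(t, last + MIN_INTERVAL)
        # ...and if no frame arrives within 1/30 s, it re-shows the previous frame
        while t_next - last > MAX_INTERVAL:
            last += MAX_INTERVAL
            refreshes.append(last)          # repeated frame
        refreshes.append(t_next)
    return refreshes

# A frame slower than 1/30 s (the last one) forces one repeated refresh.
print(refresh_times([1 / 50, 1 / 40, 1 / 120, 1 / 20]))
```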

To enable G-Sync you need to open the NVIDIA control panel, find the corresponding item in the menu on the left and tick the single checkbox. The technology is supported by drivers for Windows 7 and 8.

Then make sure that G-Sync is also enabled in the "3D Settings" section - it can be found in the Vertical Sync drop-down.

That is all: G-Sync is now active for every game running in full-screen mode - the technology cannot yet work in windowed mode. For testing we used a bench with a GeForce GTX TITAN Black graphics card.

The tests were carried out in Assassin's Creed: Black Flag and Counter-Strike: Global Offensive. We tried the new technology in two ways: we simply played, and then we hunted for tearing using a script that smoothly panned the game camera, in effect "moving the mouse" horizontally. The first method let us judge how G-Sync feels "in combat", while the second made the difference between V-Sync on/off and G-Sync much easier to see.

G-Sync in Assassin's Creed: Black Flag, 60Hz

Without V-Sync or G-Sync at 60 Hz, tearing was clearly visible with almost any camera movement.

The tear is visible in the upper right part of the frame, near the ship's mast

With V-Sync enabled the tearing disappeared, but "freezes" appeared, which did the gameplay no favors.

The doubled mast of the ship in the photo is one of the telltale signs of a "freeze"

After enabling G-Sync, both tearing and "freezes" disappeared completely and the game ran noticeably more smoothly. Periodic dips in frame rate to 35-40 FPS were still noticeable, of course, but thanks to the synchronization of display and video card they did not cause the kind of obvious stutter you get with vertical sync.

However, as the saying goes, seeing once is better than hearing a hundred times, so we made a short video showing the new Assassin's Creed with V-Sync on and off, as well as with G-Sync. The video cannot fully convey the "live" impression, if only because it was shot at 30 frames per second. In addition, the camera "sees" the world differently from the human eye, so the video may show artifacts that are not visible in reality, such as ghosting. Nevertheless, we tried to make this video as illustrative as possible: at the very least, the presence or absence of tearing is quite noticeable in it.

Now let's run Assassin's Creed: Black Flag at minimum settings and see what changes. The frame rate in this mode did not exceed 60 FPS - the selected screen refresh rate. Without vertical sync, tearing was noticeable on the screen, but as soon as V-Sync was enabled the tears disappeared and the picture began to look almost the same as with G-Sync.

At maximum graphics settings the frame rate fluctuated around 25-35 FPS. Naturally, the tearing without V-Sync and the "freezes" with it immediately returned, and even enabling G-Sync could not fix the situation: at such a low frame rate the GPU itself is the source of the jerkiness.

G-Sync in Assassin's Creed: Black Flag, 144Hz

With V-Sync and G-Sync disabled you could still spot tearing on the screen, but thanks to the 144 Hz refresh rate there was far less of it than before. With V-Sync enabled the tearing disappeared, but "freezes" began to appear almost as often as at a 60 Hz refresh rate.

Enabling G-Sync, as before, corrected the situation, but the biggest improvement in the picture was only noticeable at high frame rates, 60 FPS and above. Without lowering the settings or adding a second video card of GeForce GTX Titan Black caliber, such frame rates were out of reach.

G-Sync in Counter-Strike: Global Offensive, 60 and 144 Hz

In online games, gameplay and image quality depend not only on the video card and monitor but also on ping: the higher it is, the longer the game's "response" delay. During our tests the ping stayed at 25-50 ms, and the frame rate hovered around 200 FPS.

Picture settings used in Counter-Strike: Global Offensive

Without G-Sync or V-Sync, CS showed tearing just like Assassin's Creed. With V-Sync enabled at 60 Hz the game became harder to play: the frame rate dropped to 60 FPS and the character's movement became jerky because of the large number of "freezes".

With G-Sync enabled the frame rate stayed at 60 frames per second, but there were far fewer "freezes". They did not disappear entirely, but they stopped spoiling the experience.

Now let's raise the screen refresh rate and see what changes. With G-Sync and V-Sync disabled at 144 Hz there was much less tearing than at 60 Hz, though it did not disappear completely. With V-Sync enabled the tearing vanished and the "freezes" became almost invisible: this mode is very comfortable to play in, and movement speed does not suffer. Enabling G-Sync turned the picture into pure eye candy: the gameplay became so smooth that even the 25 ms ping began to have a noticeable effect on it.

ULMB mode testing

Ultra Low Motion Blur is enabled from the monitor menu, but first you have to turn off G-Sync and set the screen refresh rate to 85, 100 or 120 Hz; other frequencies are not supported.

The practical use of this feature is obvious: text smears less while scrolling web pages, and in RTS and other strategy games moving units look more detailed.

ASUS ROG SWIFT PG278Q in 3D

The ASUS ROG SWIFT PG278Q is the world's first monitor capable of displaying a stereoscopic image at a resolution of 2560x1440, thanks to its DisplayPort 1.2 interface - no small achievement in itself. Unfortunately, the monitor does not have a built-in IR emitter, so we took the emitter from an NVIDIA 3D Vision kit and the glasses from a 3D Vision 2 kit. The combination worked without problems, and we were able to give stereoscopic 3D a proper test.

We did not notice any ghosting or other artifacts typical of stereoscopic output. Occasionally some objects in games appeared at the wrong depth, but that cannot be blamed on the monitor. The ASUS PG278Q is suitable both for watching stereo movies and for playing 3D games - as long as the video adapter can keep up.

Conclusions

Without wishing to belittle NVIDIA's achievement, it should be noted that G-Sync essentially boils down to getting rid of a long-standing and harmful atavism: the fixed refresh cycle of LCD panels, which never needed it in the first place. It turned out that small changes to the DisplayPort protocol were enough for this - changes that have already made their way into the 1.2a specification and, according to AMD's promises, will soon find their way into display controllers from many manufacturers.

So far, however, only the proprietary version of this idea is available in the form of G-Sync, which we had the pleasure of testing in the ASUS ROG SWIFT PG278Q. The irony is that this is exactly the kind of monitor on which the benefits of G-Sync are least noticeable. A 144 Hz refresh rate alone reduces the notorious tearing to the point where many will be willing to turn a blind eye to it, and with vertical sync the "freezes" and input lag are milder than on 60 Hz screens. In such a situation G-Sync can only polish the smoothness of the game to perfection.

Still, synchronizing screen updates with GPU frame rendering is a more elegant and economical solution than constantly refreshing at an ultra-high frequency. And let's not forget that G-Sync is not limited to 120/144 Hz matrices. 4K monitors come to mind first of all: they are still stuck at 60 Hz both by matrix specifications and by video-input bandwidth. Then there are IPS monitors, which also cannot reach 120/144 Hz because of the limitations of the technology itself.

At a 60 Hz refresh rate the effect of G-Sync is hard to overstate. If the frame rate consistently exceeds 60 FPS, plain vertical sync eliminates tearing just as well, but only G-Sync keeps the picture smooth when the frame rate drops below the refresh rate. In addition, G-Sync makes the 30-60 FPS range much more playable, either lowering the GPU performance requirements or allowing more aggressive quality settings. And again the thought returns to 4K monitors, which require extremely powerful hardware for gaming with good graphics.

It is also commendable that NVIDIA has kept the pulsed-backlight technology for reducing motion blur (ULMB), which we saw earlier in the EIZO Foris FG2421. It is a pity that it cannot yet work simultaneously with G-Sync.

The ASUS ROG SWIFT PG278Q itself is attractive above all for its combination of 2560x1440 resolution and a 144 Hz refresh rate. There were no devices with such parameters on the market before, and it is high time gaming monitors with such low response times and stereoscopic 3D support outgrew Full HD. The fact that the PG278Q uses a TN matrix is hardly worth complaining about: this is a genuinely good specimen with very high brightness, contrast and excellent color reproduction that after calibration would be the envy of many IPS displays. Only the limited viewing angles give the technology away. The design, befitting a product of this class, also deserves praise. The ASUS ROG SWIFT PG278Q receives a well-deserved Editors' Choice award - it really is that good.

Only the price of around 30 thousand rubles keeps us from recommending this gaming monitor without any hesitation. In addition, at the time of writing the ASUS ROG SWIFT PG278Q is still not on sale in Russia, so there is nowhere to see it, or G-Sync, with your own eyes. We hope ASUS and NVIDIA will fix this in the future - for example, by showing G-Sync at computer game exhibitions. Well, and the price will probably come down someday...

You can download the color profile we obtained for this monitor after calibration from our file server.

The editors of the site would like to thank Graphitech for providing the X-Rite i1Display Pro colorimeter.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of an Asus monitor with G-Sync technology, we were encouraged by the fact that Nvidia is working on a very real problem affecting games that had yet to be solved. Until now you could turn V-sync on or off as you pleased, and either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until image tearing becomes unbearable, you could say you are choosing the lesser of two evils.

G-Sync solves the problem by letting the monitor scan the screen at a variable frequency. Innovations like this are the only way to keep advancing our industry and to maintain the technical edge of personal computers over gaming consoles and platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt. However, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you have never seen in action can inspire anyone. But if your first experience of G-Sync is based on Nvidia's pendulum demo, you will wonder whether such a huge difference is even possible, or whether the test represents a special scenario that is too good to be true.

Naturally, when the technology is tested in real games the effect is not so clear-cut. On the one hand there were exclamations of "Wow!" and "Unbelievable!", on the other - "I think I can see a difference." The effect of enabling G-Sync is most noticeable when going from a 60 Hz to a 144 Hz display, but we also tried testing at 60 Hz with G-Sync to see what you will (hopefully) get with cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 and a 144 Hz refresh rate. The version without G-Sync has already received our Smart Buy award for outstanding performance, but for us personally the 6-bit TN panel is a drawback. We would really like to see 2560x1440 on an IPS matrix; we would even settle for a 60 Hz refresh rate if it helped keep the price down.

Although we are expecting a whole raft of announcements at CES, we have not heard official comments from Nvidia about other displays with G-Sync modules or about their prices. We are also not sure what the company's plans are for the upgrade kit that is supposed to let you fit a G-Sync module into an already purchased Asus VG248QE in about 20 minutes.

For now we can say it was worth the wait. You will see that in some games the influence of the new technology is unmistakable, while in others it is less pronounced. Either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is another interesting thought. Now that we have tested G-Sync, how much longer will AMD be able to dodge the question? The company teased our readers in an interview (in English), noting that it would soon decide on such a capability. What if it has something up its sleeve? The end of 2013 and the beginning of 2014 bring plenty of exciting news to discuss: the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, AMD's xDMA engine for CrossFire, and rumors of new dual-GPU graphics cards. Right now we are short of graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory costing less than $1,000...

Back in the good old days, when PC owners stared at huge CRT monitors and earned themselves astigmatism, there was no talk of image smoothness, and the technology of the time did not do much for 3D either. Poor users had to be content with what they had. But time goes on, technology develops, and many are no longer willing to put up with frame tearing in a fast-paced game. That is especially true of so-called cyber-athletes, for whom split seconds are everything. What to do?

Progress does not stand still, and what once seemed impossible can now be taken for granted. The same applies to image quality on a computer. Video card makers and other PC component manufacturers are hard at work on the problem of poor image output to monitors, and they have already come a long way; only a little remains before the picture on the monitor is perfect. But all this is a lyrical digression - let's return to our main topic.

A bit of history

Many have tried hard to overcome tearing and improve the image. What haven't they come up with: raising the monitor's refresh rate, turning on V-Sync - nothing really helped. Then one day the famous graphics card manufacturer NVIDIA presented G-Sync technology, which promises "unreal" image smoothness without any artifacts. Sounds good, but there is one small yet very serious "but": to use this option you need a monitor with G-Sync support. Monitor manufacturers had to get to work and push a couple of dozen such models onto the market. What next? Let's look at the technology and try to figure out whether it is really that good.

What is G-Sync?

G-Sync is a display technology from NVIDIA that delivers smooth frame changes without any artifacts: no tearing and no stuttering. For this technology to work properly a fairly powerful computer is needed, since processing the signal requires considerable processing power. That is why the technology is only supplied with newer NVIDIA video cards. In addition, G-Sync is proprietary to NVIDIA, so owners of video cards from other manufacturers get nothing.

A G-Sync monitor is also required, because such monitors are fitted with a board containing a signal converter. Owners of ordinary monitors cannot take advantage of this exciting option. Unfair, of course, but such is the policy of modern manufacturers: squeeze as much money as possible out of the poor user. If your PC configuration allows you to use G-Sync and your monitor miraculously supports it, you can fully appreciate all the delights of this technology.

How G-Sync works

Let's try to explain the principle of G-Sync in simplified terms. An ordinary GPU (video card) simply sends a digital signal to the monitor without taking the monitor's refresh rate into account. That is why the picture on the screen ends up "torn": frames from the GPU get chopped up by the monitor's fixed refresh cycle and look ragged in the end, even with V-Sync enabled.

With G-Sync, the GPU itself controls when the monitor refreshes, so new frames reach the matrix exactly when they are needed. This makes it possible to avoid tearing and improve the overall smoothness of the picture. Since ordinary monitors do not let the GPU control them, the G-Sync monitor was invented, carrying an NVIDIA board that regulates the refresh rate - which is why ordinary monitors cannot be used.

Monitors that support this technology

Gone are the days when users ruined their eyesight by staring at ancient CRT monitors for hours. Current models are elegant and harmless, so why not add some new technology to them? The first 4K monitor with NVIDIA G-Sync was released by Acer, and the novelty made quite a splash.

For now, high-quality monitors with G-Sync are fairly rare, but manufacturers plan to make these devices standard. Most likely, within five years monitors supporting this technology will become a standard solution even for office PCs. In the meantime, all that remains is to look at these new products and wait for them to spread - that is when they will get cheaper.

After that, everyone and their dog started churning out monitors with G-Sync support. Even budget models with this technology appeared, although what use is it on a cheap screen with a poor matrix? Be that as it may, such models exist. The best choice for this feature, though, is a model on which G-Sync can work at full strength.

The best monitors with G-Sync

Monitors with G-Sync technology form a special line of devices: they must have the characteristics needed for this option to work fully, and clearly not every screen is up to the task. Several leaders in the production of such monitors have already emerged, and their models are very successful.

For example, one G-Sync monitor is among the brightest representatives of this line. It belongs to the premium class. Why? Judge for yourself: a 34-inch diagonal, 4K resolution, 1000:1 contrast, a 100 Hz refresh rate and a 5 ms matrix response time. Many would love to get their hands on this "monster". It goes without saying that it handles G-Sync with flying colors; it has no analogues yet, and you can safely call it the best in its class without being wrong.

In general, ASUS G-Sync monitors are currently at the top of Olympus. No other manufacturer has managed to outdo this company, and it is unlikely anyone ever will. ASUS can be called a pioneer in this respect, and its G-Sync capable monitors are selling like hot cakes.

The Future of G-Sync

Attempts are now being made to bring G-Sync to laptops, and some manufacturers have even released a couple of such models. Moreover, they can work without a G-Sync board in the display, which is understandable: a laptop has a somewhat different design, and a video card that supports the technology is enough.

It is likely that NVIDIA G-Sync will soon occupy a solid place in the computer industry. Monitors with this technology should become cheaper, and eventually the option should become universally available - otherwise, what is the point of developing it? For now, though, things are not so rosy, and there are still problems with G-Sync's adoption.

In the future, G-Sync will become as everyday a thing as the VGA port for connecting a monitor once was. Next to it, the various forms of "vertical synchronization" look like a blatant anachronism: not only can these outdated techniques not provide satisfactory picture quality, they also "eat" a fair amount of system resources. With the arrival of G-Sync, their place is definitely in the dustbin of history.

G-Sync technology overview | A brief history of the fixed refresh rate

Once upon a time monitors were bulky and contained cathode-ray tubes and electron guns. The electron guns bombard the screen with electrons to light up the colored phosphor dots we call pixels, drawing each "scan line" from left to right, top to bottom. Varying the speed of the electron gun from one full refresh to the next was not really practiced, and there was no particular need for it before the advent of 3D games. So CRTs and the analog video standards associated with them were designed around a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the associations responsible for standardizing video signals (led by VESA) never moved away from the fixed refresh rate. Movies and television still rely on an input signal with a constant frame rate, so once again switching to a variable refresh rate did not seem necessary.

Adjustable frame rates and fixed refresh rates do not match

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem appeared when we first met powerful GPUs: the rate at which a GPU renders individual frames (what we call the frame rate, usually expressed in FPS, or frames per second) is not constant. It changes over time: in graphically heavy scenes a card might manage 30 FPS, and when you look at an empty sky, 60 FPS.


Disabling sync causes tearing

It turns out that a GPU with a variable frame rate and an LCD panel with a fixed refresh rate do not work well together. In this configuration we run into a graphical artifact called tearing: it occurs when two or more partial frames are drawn together within a single monitor refresh cycle. They are usually offset from each other, which produces a very unpleasant effect in motion.

The image above shows two well-known artifacts that are common but hard to capture. Since they are display artifacts you will not see them in ordinary game screenshots, but our pictures show what you actually see while playing. To shoot them you need a camera with a high-speed mode, or, if you have a video capture card, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the method we use for FCAT tests. It is best, however, to observe the effect with your own eyes.

The tearing effect is visible in both images. The top one was taken with a camera, the bottom one through the video capture function. The bottom image is "sliced" horizontally and looks misaligned. In the top two images, the left shot was taken on a Sharp screen at 60 Hz, the right one on an Asus display at 120 Hz. The tearing on the 120 Hz display is less pronounced, since the refresh rate is twice as high, yet the effect is visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) disabled.


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect, visible in the BioShock: Infinite shots, is called ghosting. It is especially noticeable at the bottom of the left image and is caused by the delay in pixel response: in short, individual pixels do not change color fast enough, which results in this kind of trailing glow. A single frame cannot convey how ghosting affects the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurred image with any motion on the screen, which is why such displays are generally not recommended for first-person shooters.

V-sync: trading one problem for another

Vertical synchronization, or V-sync, is a very old solution to tearing. When it is enabled, the graphics card tries to match the screen's refresh rate, eliminating tearing completely. The problem is that if your graphics card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between integer divisors of the refresh rate (60, 30, 20, 15 FPS and so on), which in turn leads to noticeable stuttering.


When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering
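
The jumps between 60, 30 and 20 FPS follow directly from the fact that, with classic double-buffered V-sync, a finished frame is held until the next refresh. The short sketch below (our own simplification, assuming a constant render time per frame) shows how the displayed rate snaps to integer divisors of the refresh rate.

```python
# Simplified model: with double-buffered V-sync each frame occupies a whole number of
# refresh cycles, so the displayed rate is the refresh rate divided by that integer.
import math

def vsync_effective_fps(render_fps, refresh_hz=60):
    scans_per_frame = math.ceil(refresh_hz / render_fps)   # refresh cycles each frame occupies
    return refresh_hz / scans_per_frame

for fps in (75, 59, 45, 31, 22):
    print(f"GPU at {fps:>3} FPS -> displayed at {vsync_effective_fps(fps):.0f} FPS")
```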

Moreover, because V-sync makes the graphics card wait, and sometimes relies on an extra invisible back buffer, it can add additional input latency to the rendering chain. V-sync can thus be both a salvation and a curse, solving some problems while creating other drawbacks. An informal survey of our staff found that gamers tend to leave V-sync off and only turn it on when tearing becomes unbearable.

Get Creative: Nvidia Introduces G-Sync

With the launch of the GeForce GTX 680, Nvidia included a driver mode called Adaptive V-sync, which tries to mitigate the problem by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance falls sharply below that rate. Although the technology did its job faithfully, it was only a workaround that did not eliminate tearing whenever the frame rate was lower than the monitor's refresh rate.

The implementation of G-Sync is much more interesting. Generally speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can force new monitors to run at a variable one.


The GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packet-based data-transfer mechanism of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that handles variable blanking, an LCD panel can operate at a variable refresh rate tied to the frame rate the video card is outputting (within the monitor's refresh-rate range). In practice, Nvidia has made creative use of these features of the DisplayPort interface and killed two birds with one stone.

Even before the tests begin, we would like to give credit for the creative approach to a real problem affecting PC gaming. This is innovation at its finest. But what does G-Sync deliver in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor with the scaler replaced by a G-Sync module. We are already familiar with this display: it is the subject of "Asus VG248QE review: a $400 24-inch 144 Hz gaming monitor", in which it earned the Tom's Hardware Smart Buy award. Now it is time to find out how Nvidia's new technology affects the most popular games.

G-Sync technology overview | 3D LightBoost, built-in memory, standards and 4K

As we browsed Nvidia's press releases, we asked ourselves quite a few questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed is that Nvidia sent the Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally developed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, where its pulsing backlight reduces ghosting (motion blur). Naturally, we wondered whether this technology is used in G-Sync.

Nvidia's answer was no. While using both technologies at once would be ideal, strobing the backlight at a variable refresh rate currently causes flicker and brightness problems. Solving them is incredibly difficult, since the brightness has to be adjusted and the pulses tracked. As a result, for now you have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to pass through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives it is displayed on the screen, and the memory performs other functions. Moreover, the processing time for G-Sync is noticeably less than one millisecond - essentially the same delay we experience with V-sync disabled, and it comes down to the particulars of the game, video driver, mouse and so on.

Will G-Sync be standardized?

A similar question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. We wanted to ask the developer directly whether Nvidia plans to bring the technology to an industry standard. In theory the company could offer G-Sync as an upgrade to the DisplayPort standard that provides variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI or DVI. G-Sync already works over DisplayPort 1.2, so the standard does not need to change.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to lower the cost of G-Sync modules and make them more widely available.

G-Sync at Ultra HD Resolutions

Nvidia promises monitors with G-Sync support at resolutions up to 3840x2160, but the Asus model we are reviewing today tops out at 1920x1080. At the moment Ultra HD monitors use the STMicro Athena controller, which has two scalers to build a tiled display, so we wondered whether the G-Sync module supports an MST configuration.

Truth be told, 4K displays with variable refresh rates will have to wait. There is no standalone 4K scaler yet; the nearest one should appear in the first quarter of 2014, and monitors equipped with it only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear after that point. Fortunately, the module natively supports Ultra HD.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. The explanation is that at very low refresh rates the image on an LCD panel begins to degrade, producing visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel automatically to avoid such problems. That means a single frame may be displayed more than once, but the effective floor is 30 Hz, which keeps image quality as high as possible.
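
A small illustration of that behaviour (the exact repeat logic is our assumption based on the description above): when the source drops below 30 FPS, each frame is shown enough times to keep the panel at or above 30 Hz.

```python
# Assumed repeat logic: re-show each frame just often enough to stay at or above 30 Hz.
import math

def panel_rate(source_fps, min_hz=30.0):
    repeats = max(1, math.ceil(min_hz / source_fps))   # how many times each frame is drawn
    return repeats, repeats * source_fps               # (repeat count, resulting panel rate)

for fps in (45, 25, 12):
    n, hz = panel_rate(fps)
    print(f"{fps} FPS source -> each frame shown {n}x, panel refreshes at {hz:.0f} Hz")
```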

G-Sync technology overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to high refresh rate panels only?

You will notice that the first G-Sync monitor has a very high refresh rate (well above what the technology requires) and a resolution of 1920x1080. But the Asus display has its own limitations, such as its 6-bit TN panel. We were curious: is G-Sync planned only for high-refresh-rate displays, or will we see it on the more common 60 Hz monitors? Besides, we want access to a 2560x1440 resolution as soon as possible.

Nvidia reiterated that the best G-Sync experience comes when the video card keeps the frame rate in the 30-60 FPS range. So the technology can genuinely benefit conventional 60 Hz monitors fitted with a G-Sync module.

Why use a 144 Hz monitor then? It seems many monitor manufacturers have decided to implement a low-motion-blur feature (3D LightBoost), which requires a high refresh rate. But those who forgo that feature (and why not, since it is not yet compatible with G-Sync) can build a G-Sync panel for much less money.

Speaking of resolutions, it is shaping up like this: QHD screens with refresh rates above 120 Hz could start shipping as early as the beginning of 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

Now, of course, you do not need to combine two graphics adapters to output an image at 1080p: even a mid-range Kepler-based graphics card provides the performance needed to play comfortably at this resolution. But there is also no way to run two cards in SLI driving three G-Sync monitors in Surround mode.

This limitation comes from the display outputs on current Nvidia cards, which typically have two DVI ports, one HDMI and one DisplayPort. G-Sync requires DisplayPort 1.2, and neither adapters nor MST hubs will work. The only option is to connect the three monitors in Surround mode to three cards, one card per monitor. Naturally, we expect Nvidia's partners to start releasing "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Active triple buffering used to be required for comfortable play with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls - it actually hurts G-Sync, adding an extra frame of latency with no performance gain. Unfortunately, games often force triple buffering on their own and it cannot always be overridden manually.
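
To put a number on that extra buffered frame: it costs roughly one refresh period of added latency, which the back-of-the-envelope snippet below (our own estimate, not a measurement) works out for the refresh rates discussed here.

```python
# One additional buffered frame ~ one refresh period of extra latency.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: extra buffered frame adds about {1000.0 / hz:.1f} ms")
```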

What about games that usually react badly when V-sync is disabled?

Games like Skyrim, which is part of our test suite, are designed to run with V-sync on a 60 Hz panel (although that sometimes makes life difficult for us because of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with games based on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they capped at 60 FPS?

Second, you need a monitor with an Nvidia G-Sync module, which replaces the screen's scaler; adding G-Sync to a tiled Ultra HD display, for example, is impossible. In today's review we use a prototype with a 1920x1080 resolution and a refresh rate of up to 144 Hz, but even it gives a good idea of the impact G-Sync will have if manufacturers start putting it into cheaper 60 Hz panels.

Third, a DisplayPort 1.2 connection is required; DVI and HDMI are not supported. In the short term this means the only way to run G-Sync on three monitors in Surround mode is to connect them to a triple-SLI setup, since each card has only one DisplayPort output and DVI-to-DisplayPort adapters do not work here. The same goes for MST hubs.

And finally, do not forget about driver support. The latest package, version 331.93 beta, is already compatible with G-Sync, and we expect future WHQL-certified versions to include it as well.

Test bench

Test bench configuration

CPU: Intel Core i7-3970X (Sandy Bridge-E), base frequency 3.5 GHz, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading enabled, power savings enabled
Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V
Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s
Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
Power supply: Corsair AX860i 860 W

System software and drivers

OS: Windows 8 Professional 64-bit
DirectX: DirectX 11
Video driver: Nvidia GeForce 331.93 beta

Now we need to figure out in which situations G-Sync has the biggest impact. Chances are you are already using a 60 Hz monitor; 120 and 144 Hz models are gaining popularity among gamers, but Nvidia rightly assumes that most enthusiasts on the market will still stick with 60 Hz.

With V-sync active on a 60 Hz monitor, the most noticeable artifacts appear when the card cannot sustain 60 FPS, resulting in annoying jumps between 30 and 60 FPS and clearly visible stutter. With V-sync disabled, tearing is most noticeable in scenes that require frequent camera rotation or contain a lot of movement. Some players find this so distracting that they simply turn V-sync on and put up with the stuttering and input lag.

At refresh rates of 120 and 144 Hz and correspondingly higher frame rates, the display refreshes more often, which shortens the time a single frame persists across multiple screen scans when performance falls short. The problems with vertical sync on or off remain, however. That is why we will test the Asus monitor at 60 Hz and 144 Hz, with G-Sync both on and off.

G-Sync technology overview | Testing G-Sync with V-Sync enabled

It's time to start testing G-sync. It remains only to install a video capture card, an array of several SSDs and proceed to the tests, right?

No, it's wrong.

Today we are measuring not performance but quality. In our case the tests can show only one thing: the frame rate at a given moment in time. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. So we will have to rely on our carefully considered and, we hope, eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let readers judge for themselves? The trouble is that the camera records at a fixed 60 Hz, and your monitor also plays video back at a constant 60 Hz refresh rate. Because G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-sync on, G-sync off, 60Hz, 120Hz, 144Hz, ... The list goes on and on. But we'll start with a 60Hz refresh rate and active vsync.

It is probably easiest to start with Nvidia's own demo utility, which swings a pendulum from side to side. The utility can simulate a frame rate of 60, 50 or 40 FPS, or let the rate fluctuate between 40 and 60 FPS, and you can then disable or enable V-sync and G-Sync. Although the test is synthetic, it demonstrates the technology's capabilities well. You can watch the scene at 50 FPS with V-sync enabled and think: "It's quite tolerable, I can live with the visible stutter." But after G-Sync is activated you immediately want to say: "What was I thinking? The difference is night and day. How could I have lived with this before?"

But let's not forget this is a tech demo; we want evidence based on real games. For that we need to run a title with high system requirements, such as Arma III.

For Arma III we can install a GeForce GTX 770 in the test machine and select ultra settings. With V-sync disabled the frame rate fluctuates between 40 and 50 FPS, and with V-sync enabled it drops to 30 FPS. The performance is not high enough to see constant jumps between 30 and 60 FPS; instead, the graphics card's frame rate simply decreases.

Since there was no stutter to begin with, there was no significant difference to notice when activating G-Sync, except that the actual frame rate now runs 10-20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We feel that Arma is generally less "jerky" than many other games, so no lag is felt.

In Metro: Last Light, on the other hand, the influence of G-Sync is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. From there you can step SSAA from 1x to 2x to 3x to gradually reduce the frame rate.

The game's environment also includes an antechamber where it is easy to strafe back and forth. Running the level with V-sync active at 60 Hz, we entered the city. Fraps showed that with 3x SSAA the frame rate was 30 FPS, and with anti-aliasing off it was 60 FPS. In the first case slowdowns and lag are noticeable; with SSAA disabled you get a completely smooth picture at 60 FPS. However, enabling 2x SSAA causes fluctuations between 60 and 30 FPS, where every duplicated frame is a nuisance. This is one of the games in which we would normally just disable V-sync and ignore the tearing - many people have already made a habit of it.

G-Sync, however, removes all the negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS as a cue to lower yet another graphics setting. On the contrary, you can raise some of them, because even if the game slows to 40-50 FPS there is no obvious stuttering. What happens if you turn off vertical sync? You will find out later.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this section are based on a Skype survey of Tom's Hardware authors and friends (in other words, a small sample of respondents), but almost all of them understand what vertical synchronization is and what trade-offs users have to accept with it. According to them, they resort to V-sync only when tearing caused by a large mismatch between frame rate and monitor refresh rate becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to miss, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest graphics settings, and since Crysis 3 is a first-person shooter with very dynamic gameplay, the tearing can be quite noticeable. In the example above, the FCAT output was captured between two frames; as you can see, the tree is sliced right through.

On the other hand, when we force V-sync off in Skyrim, the tearing is not that bad. Note that in this case the frame rate is very high and several frames appear on the screen within each scan, so the movement from one frame to the next is relatively small. Playing Skyrim in this configuration has its problems and may not be the optimal approach, but it shows that even with V-sync off the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a pretty clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when vsync is enabled.

The last graph shows that Metro: Last Light with G-sync at 144Hz generally delivers the same performance as with Vsync disabled. However, the graph does not show the absence of gaps. If you use technology with a 60 Hz screen, the frame rate will hit 60 FPS, but there will be no slowdowns or delays.

In any case, those of you (and us) who have spent countless hours on graphics benchmarks, watching the same benchmark over and over again, could get used to them and visually determine how good a particular result is. This is how we measure the absolute performance of video cards. Changes in the picture with the active G-sync immediately catch the eye, as there is a smoothness, as with V-sync turned on, but without the breaks characteristic of V-sync turned off. Too bad we can't show the difference in the video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games. Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite, Battlefield 4 visited the test bench. All of them, except Skyrim, have benefited from technology G-sync. The effect depended on competitive play. But if you saw him, you would immediately admit that you ignored the shortcomings that were present before.

Artifacts can still appear. For example, the creep effect associated with anti-aliasing is more noticeable with smooth motion. You will most likely want to set the anti-aliasing as high as possible in order to remove unpleasant bumps that were not so noticeable before.

Skyrim: Special Case

The Creation graphics engine that Skyrim is based on activates vertical sync by default. To test the game at a frame rate above 60 FPS, add the iPresentInterval=0 line to one of the game's .ini files.

Thus, Skyrim can be tested in three ways: in its original state, by allowing the Nvidia driver to "use application settings", enable G-sync in the driver and leave the Skyrim settings intact, and then enable G-sync and disable V-sync in the game's .ini file.

The first configuration, in which the experimental monitor is set to 60 Hz, showed a stable 60 FPS at ultra settings with a video card GeForce GTX 770. Consequently, we got a smooth and pleasant picture. However, user input still suffers from latency. In addition, the side-to-side strafe revealed noticeable motion blur. However, this is how most people play on PC. Of course, you can buy a screen with a 144Hz refresh rate and it will really eliminate blur. But since GeForce GTX 770 provides a refresh rate of around 90 - 100 fps, there will be noticeable stuttering when the engine fluctuates between 144 and 72 FPS.

At 60 Hz G-sync has a negative effect on the picture, this is probably due to active vertical sync, despite the fact that the technology should work with V-sync disabled. Now lateral strafe (especially closer to the walls) leads to pronounced braking. This is a potential problem for 60Hz panels with G-sync, at least in games like Skyrim. Fortunately, in the case of the Asus VG248Q monitor, you can switch to 144 Hz mode, and despite the active V-sync, G-sync will work at this frame rate without any complaints.

Disabling vertical sync completely in Skyrim results in much "sharper" mouse control. However, this does introduce tearing in the image (not to mention other artifacts such as shimmering water). Inclusion G-sync leaves the stuttering at 60Hz, but at 144Hz the situation improves significantly. Although we test the game with vsync disabled in our video card reviews, we wouldn't recommend playing without it.

For Skyrim, perhaps the best solution would be to disable G-sync and play at 60Hz, which will give you a consistent 60fps on your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of an Asus monitor with technology G-sync, we've already been encouraged by the fact that Nvidia is working on a very real problem affecting games that has yet to be addressed. Up until now, you have been able to turn V-sync on or off to your liking. At the same time, any decision was accompanied by compromises that negatively affect the gaming experience. If you prefer not to enable v-sync until image tearing becomes unbearable, then we can say that you are choosing the lesser of two evils.

G-sync solves the problem by allowing the monitor to scan the screen at a variable frequency. Such innovation is the only way to continue to advance our industry while maintaining the technical advantage of personal computers over gaming consoles and platforms. Nvidia will no doubt stand up to criticism for not developing a standard that competitors could apply. However, the company uses DisplayPort 1.2 for its solution. As a result, just two months after the announcement of the technology G-sync she was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you've never seen in action can inspire anyone. But if your first experience with G-sync based on Nvidia's pendulum demo test, you'll be wondering if such a huge difference is even possible, or if the test represents a special scenario that's too good to be true.

Naturally, when testing the technology in real games, the effect is not so unambiguous. On the one hand, there were exclamations of "Wow!" and "Go crazy!", on the other - "I think I see the difference." Best activation effect G-sync noticeable when changing the display refresh rate from 60 Hz to 144 Hz. But we also tried to test at 60Hz with G-sync to see what you get (hopefully) with cheaper displays in the future. In some cases, simply going from 60 to 144Hz will blow your mind, especially if your graphics card can handle high frame rates.

Today we know that Asus plans to implement support for G-sync in the model Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a refresh rate of 144Hz. Version without G-sync has already received our Smart Buy award for outstanding performance. But for us personally, a 6-bit TN panel is a disadvantage. I really want to see 2560x1440 pixels on an IPS matrix. We even settle for a 60Hz refresh rate if that helps keep the price down.

Although we are expecting a whole bunch of announcements at CES, Nvidia's official comments regarding other displays with modules G-sync and we have not heard their prices. Also, we're not sure what the company's plans are for an upgrade module that should allow you to implement the module. G-sync in an already purchased monitor Asus VG248QE in 20 minutes.

Now we can say it's worth the wait. You will see that in some games the impact of the new technology cannot be confused, while in others it is less pronounced. But anyway G-sync answers the "bearded" question whether to enable or not enable vertical sync.

There is another interesting idea. After we have tested G-sync, how much longer will AMD be able to evade comments? The company teased our readers in his interview(English), noting that she will soon decide on this possibility. What if she has something in mind? The end of 2013 and the beginning of 2014 bring us a lot of exciting news to discuss, including Battlefield 4 Mantle versions, the upcoming Nvidia Maxwell architecture, G-sync, an AMD xDMA engine with CrossFire support, and rumors of new dual-chip graphics cards. Right now we don't have enough graphics cards with more than 3GB (Nvidia) and 4GB (AMD) GDDR5 memory, but they cost less than $1000...

G-Sync technology overview | A Brief History of Fixed Refresh Rates

Once upon a time, monitors were bulky and contained cathode-ray tubes and electron guns. The guns bombard the screen with electrons, lighting up the colored phosphor dots we call pixels. They draw each "scan" line from left to right, working from the top of the screen to the bottom. Varying the speed of the electron gun from one full refresh to the next was never really practiced, and before the advent of 3D games there was no particular need for it. Therefore, CRTs and the related analog video standards were designed around a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the bodies responsible for standardizing video signals (led by VESA) did not move away from fixed refresh rates. Movies and television still rely on input at a constant frame rate, so once again, switching to a variable refresh rate did not seem necessary.

Variable frame rates and fixed refresh rates do not mix

Prior to the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem arose when we first encountered powerful GPUs: the rate at which the GPU renders individual frames (what we call frame rate, usually expressed in FPS, or frames per second) is not constant. It changes over time: in graphically heavy scenes a card may deliver 30 FPS, while looking at an empty sky it may hit 60 FPS.


Disabling sync causes tearing

It turns out that the variable frame rate of the GPU and the fixed refresh rate of the LCD panel do not work well together. In this configuration we run into a graphical artifact called "tearing". It occurs when parts of two or more frames are drawn to the screen during a single refresh cycle. The parts are usually offset from one another, which looks very unpleasant in motion.
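To make the mechanism concrete, here is a minimal illustrative sketch (our own, not from the article), assuming a 1080-line panel scanned top to bottom at a fixed 60 Hz; it estimates where the visible seam lands when a new frame is flipped partway through a scan.

```python
# Minimal sketch (illustrative numbers, not measurements from the article):
# on a fixed-refresh panel scanned top to bottom, a frame flipped mid-scan
# produces a visible seam at whatever row the scan-out had reached.
REFRESH_HZ = 60
SCAN_TIME_MS = 1000.0 / REFRESH_HZ     # ~16.7 ms to draw one full screen
PANEL_ROWS = 1080                      # assumed 1080-line panel

def tear_row(flip_time_ms: float) -> int:
    """Approximate row where the image switches to the newer frame."""
    fraction_scanned = (flip_time_ms % SCAN_TIME_MS) / SCAN_TIME_MS
    return int(fraction_scanned * PANEL_ROWS)

# A buffer flip 6.7 ms into the scan tears roughly 40% of the way down the screen:
print(tear_row(6.7))   # ~434
```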

The image above shows two well-known artifacts that are common but difficult to capture. Since these are display artifacts, you won't see them in normal game screenshots, but our shots show what you actually see while playing. To capture them, you need a camera with a high-speed shooting mode, or a video capture card that records an uncompressed video stream from the DVI port and clearly shows the transition from one frame to the next; this is the method we use for FCAT testing. However, the effect is best observed with your own eyes.

The tearing effect is visible in both images. The top one was taken with the camera, the bottom one through the video capture function. The bottom image is "sliced" horizontally and looks misaligned. In the top two images, the left shot was taken on a Sharp screen at 60 Hz, the right shot on an Asus display at 120 Hz. The tearing on the 120 Hz display is not as pronounced because the refresh rate is twice as high; however, the effect is still visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) disabled.


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect, seen in the BioShock: Infinite footage, is called ghosting. It is especially visible at the bottom of the left image and is caused by slow pixel response. In short, individual pixels don't change color fast enough, resulting in this kind of smearing. A single frame cannot convey how ghosting affects the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurry image with any movement on the screen. This is why such displays are generally not recommended for first-person shooters.
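As a rough back-of-the-envelope illustration (our own arithmetic, not a measurement from the article), you can compare an assumed 8 ms gray-to-gray transition with how often the screen is redrawn:

```python
# Back-of-the-envelope comparison (our own arithmetic): if a pixel needs ~8 ms to
# finish a gray-to-gray transition, it is still mid-transition when the next frame
# arrives on a fast panel, which shows up as ghosting/blur during motion.
GTG_RESPONSE_MS = 8.0   # assumed gray-to-gray response time

for refresh_hz in (60, 120, 144):
    frame_interval_ms = 1000.0 / refresh_hz
    headroom = frame_interval_ms - GTG_RESPONSE_MS
    print(f"{refresh_hz} Hz: {frame_interval_ms:.1f} ms per refresh, "
          f"{headroom:.1f} ms left after an 8 ms transition")
```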

V-sync: "an sew on the soap"

Vertical sync, or V-sync, is a very old solution to tearing. When this feature is activated, the graphics card tries to match the screen's refresh rate, completely eliminating tearing. The problem is that if your graphics card can't keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between integer fractions of the refresh rate (60, 30, 20, 15 FPS, and so on), which in turn leads to noticeable stuttering.
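A minimal sketch of that arithmetic, assuming classic double-buffered V-sync in which a finished frame always waits for the next refresh:

```python
# Illustrative sketch: with classic double-buffered V-sync, each frame waits for
# the next refresh, so the effective rate snaps to refresh_hz / n (60, 30, 20, 15...).
import math

def effective_fps_with_vsync(gpu_fps: float, refresh_hz: int = 60) -> float:
    frame_time = 1.0 / gpu_fps            # how long the GPU needs per frame
    refresh_interval = 1.0 / refresh_hz   # how often the panel can show a new frame
    # The frame is displayed on the first refresh after it is finished:
    refreshes_per_frame = math.ceil(frame_time / refresh_interval)
    return refresh_hz / refreshes_per_frame

for fps in (75, 59, 45, 31, 25):
    print(fps, "->", round(effective_fps_with_vsync(fps), 1))
# 75 -> 60.0, 59 -> 30.0, 45 -> 30.0, 31 -> 30.0, 25 -> 20.0
```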


When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, because V-sync makes the graphics card wait, and sometimes relies on an extra off-screen buffer, it can add input latency to the rendering chain. Thus, V-sync can be both a salvation and a curse, solving some problems while causing other drawbacks. An informal survey of our staff found that gamers tend to leave V-sync off and enable it only when tearing becomes unbearable.

Get Creative: Nvidia Introduces G-Sync

With the launch of the GeForce GTX 680, Nvidia added a driver mode called Adaptive V-sync, which tries to mitigate these problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance drops below that rate. While the technology did its job faithfully, it was only a workaround: it did not eliminate tearing whenever the frame rate fell below the monitor's refresh rate.
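A rough sketch of that policy as described above (our own simplification, not Nvidia's driver code):

```python
# Rough sketch of the Adaptive V-sync policy as described: keep V-sync on while the
# GPU can match the refresh rate (no tearing), and drop it the moment it cannot,
# avoiding the snap from 60 to 30 FPS at the cost of tearing during those dips.
def adaptive_vsync(current_fps: float, refresh_hz: int = 60) -> str:
    return "V-sync on" if current_fps >= refresh_hz else "V-sync off (tearing possible)"

for fps in (75, 61, 58, 40):
    print(fps, "FPS ->", adaptive_vsync(fps))
```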

The G-Sync implementation is much more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to adapt to a fixed display frequency, we can make new monitors run at a variable frequency.


The GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packet-based data transfer mechanism of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that handles variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the graphics card is outputting (within the monitor's supported range). In practice, Nvidia has used these features of the DisplayPort interface creatively, trying to kill two birds with one stone.
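Conceptually, the result can be sketched like this (a simplification under assumed panel limits of 30 to 144 Hz, not Nvidia's actual implementation): the refresh is driven by frame completion and clamped to what the panel supports.

```python
# Conceptual sketch only (not Nvidia's implementation): with a variable blanking
# interval, the panel refresh is driven by frame completion instead of a fixed
# clock, clamped to the range the panel supports (assumed here to be 30-144 Hz).
MIN_INTERVAL = 1.0 / 144   # fastest the panel can refresh
MAX_INTERVAL = 1.0 / 30    # slowest before the module must repeat a frame

def next_refresh_interval(frame_render_time: float) -> float:
    """Time until the next panel refresh, driven by when the frame is ready."""
    return min(max(frame_render_time, MIN_INTERVAL), MAX_INTERVAL)

# A 22 ms frame (about 45 FPS) is shown as soon as it is ready, 22 ms after the last:
print(round(next_refresh_interval(0.022) * 1000, 1))  # 22.0 ms
```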

Even before the tests begin, we want to give Nvidia credit for a creative approach to solving a real problem that affects PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor with the scaler replaced by a G-Sync module. We are already familiar with this display from the article "Asus VG248QE review: $400 24" 144Hz gaming monitor", in which it earned the Tom's Hardware Smart Buy award. Now it's time to find out how Nvidia's new technology affects the most popular games.

G-Sync technology overview | 3D LightBoost, built-in memory, standards and 4K

As we browsed Nvidia's press releases, we asked ourselves quite a few questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed is that Nvidia sent the Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, originally designed to boost the brightness of 3D displays but long used unofficially in 2D mode, where the strobed backlight reduces ghosting (motion blur). Naturally, we wondered whether this technique is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be the ideal solution, strobing the backlight at a variable refresh rate currently causes flickering and brightness problems. Solving them is incredibly difficult, since brightness has to be adjusted and the pulses tracked. As a result, for now you have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to pass through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory performs other functions. The G-Sync processing time is noticeably less than one millisecond; in fact, it is almost the same delay we encounter with V-sync turned off, and it depends on the game, the video driver, the mouse, and so on.

Will G-Sync be standardized?

A similar question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard providing variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications for DisplayPort, HDMI, or DVI are planned. G-Sync already works over DisplayPort 1.2, so the standard does not need to be changed.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD Resolutions

Nvidia promises G-Sync monitors with resolutions up to 3840x2160 pixels. However, the Asus model we are reviewing today only supports 1920x1080. Current Ultra HD monitors use the STMicro Athena controller, which has two scalers to create a tiled display, so we wondered whether the G-Sync module supports MST configurations.

Truth be told, 4K displays with variable refresh rates will have to wait. There is no single-chip 4K scaler yet; the first should appear in the first quarter of 2014, with monitors using it following in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear only after that point. Fortunately, the module natively supports Ultra HD resolutions.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. This is because at very low refresh rates the image on an LCD panel begins to degrade, leading to visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel on its own, avoiding these problems. This means a single image may be displayed more than once, but 30 Hz remains the lower threshold that guarantees the best image quality.
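A small sketch of the behavior described above, assuming a 30 Hz floor: when the source takes longer than about 33 ms per frame, the module re-scans the previous frame until the new one arrives.

```python
# Sketch of the described behaviour (illustrative, based on the stated 30 Hz floor):
# if the source takes longer than ~33 ms per frame (under 30 FPS), the module
# re-scans the previous frame so the panel never idles below 30 Hz.
FLOOR_HZ = 30
MAX_HOLD = 1.0 / FLOOR_HZ   # longest a single scan may be held (~33.3 ms)

def panel_scans_for_frame(frame_render_time: float) -> int:
    """How many times the panel shows one source frame before the next arrives."""
    scans = 1
    while frame_render_time > scans * MAX_HOLD:
        scans += 1
    return scans

print(panel_scans_for_frame(0.050))  # a 20 FPS source -> each frame is scanned twice
```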

G-Sync technology overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to high refresh rate panels only?

You will notice that the first G-Sync monitor has a very high refresh rate out of the box (above what the technology requires) and a resolution of 1920x1080 pixels. But the Asus display has its own limitations, such as a 6-bit TN panel. We were curious whether G-Sync is planned only for high-refresh-rate displays, or whether we will also see it on the more common 60 Hz monitors. In addition, we want access to a 2560x1440 resolution as soon as possible.

Nvidia reiterated that G-Sync delivers the best experience when your graphics card keeps the frame rate in the 30 to 60 FPS range. Thus, conventional 60 Hz monitors fitted with a G-Sync module can genuinely benefit from the technology.

Why, then, use a 144 Hz monitor at all? It seems many monitor manufacturers have decided to implement a low-motion-blur feature (3D LightBoost) that requires a high refresh rate. But those who skip this feature (and why not, since it is not yet compatible with G-Sync) can build a G-Sync panel for much less money.

Speaking of resolutions, it's shaping up like this: QHD screens with a refresh rate of more than 120Hz could start shipping as early as early 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

Now, of course, you don't need to combine two graphics adapters to display an image at 1080p. Even a mid-range Kepler-based graphics card can provide the performance needed to play comfortably at this resolution. But it is also impossible to drive three G-Sync monitors in Surround mode from two cards in SLI.

This limitation comes from the display outputs on current Nvidia cards, which typically have two DVI ports, one HDMI, and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect the three Surround monitors to three cards, one card per monitor. Naturally, we expect Nvidia's partners to start releasing "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Triple buffering used to be required for comfortable play with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls; triple buffering actually hurts G-Sync, because it adds an extra frame of latency with no performance gain. Unfortunately, triple buffering is often forced by the game itself and cannot be disabled manually.
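As a rough illustration of that latency cost (our own arithmetic, not a measurement): each extra buffered frame adds roughly one frame-time of delay without improving throughput.

```python
# Rough arithmetic for the claim above: one extra buffered frame adds roughly one
# frame-time of input latency without raising throughput.
def extra_latency_ms(fps: float, extra_buffered_frames: int = 1) -> float:
    return extra_buffered_frames * 1000.0 / fps

print(round(extra_latency_ms(60), 1))   # ~16.7 ms extra at 60 FPS
print(round(extra_latency_ms(144), 1))  # ~6.9 ms extra at 144 FPS
```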

What about games that usually react badly when V-sync is disabled?

Games like Skyrim, which is part of our test suite, are designed to run with V-sync on a 60 Hz panel (even though this sometimes makes life difficult for us because of input lag). Testing them requires editing certain .ini files. So how does G-Sync behave with games built on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they limited to 60 FPS?

Second, you need a monitor with an Nvidia G-Sync module. The module replaces the screen's scaler, so it is impossible, for example, to add G-Sync to a tiled Ultra HD display. In today's review we use a prototype with a 1920x1080 resolution and a refresh rate of up to 144 Hz, but even with it you can get an idea of the impact G-Sync will have if manufacturers start fitting it to cheaper 60 Hz panels.

Third, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term, this means the only way to run G-Sync on three monitors in Surround mode is to connect them to a triple-SLI setup, since each card has only one DisplayPort connector and DVI-to-DisplayPort adapters do not work in this case. The same goes for MST hubs.

And finally, do not forget driver support. The latest 331.93 beta package is already compatible with G-Sync, and we expect future WHQL-certified versions to support it as well.

Test bench

Test bench configuration:

  • CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base clock, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading enabled, power-saving features enabled;
  • Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5;
  • RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V;
  • Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s;
  • Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB;
  • Power supply: Corsair AX860i, 860 W.

System software and drivers:

  • OS: Windows 8 Professional 64-bit;
  • DirectX: DirectX 11;
  • Video driver: Nvidia GeForce 331.93 beta.

Now we need to figure out in which cases G-Sync has the biggest impact. Chances are you are already using a 60 Hz monitor. 120 and 144 Hz models are gaining popularity among gamers, but Nvidia rightly assumes that the majority of enthusiasts on the market will still be on 60 Hz.

With V-sync active on a 60 Hz monitor, the most noticeable artifacts appear when the card can't deliver 60 FPS, resulting in annoying jumps between 30 and 60 FPS and noticeable stuttering. With V-sync disabled, tearing is most noticeable in scenes where you rotate the camera frequently or where there is a lot of movement. For some players this is so distracting that they simply turn V-sync on and put up with the stuttering and input lag.

With refresh rates of 120 and 144 Hz and higher frame rates, the display refreshes more often, reducing the time a single frame persists across multiple screen scans when performance is poor. However, the problems of both active and inactive vertical sync remain. For this reason, we will test the Asus monitor in 60 Hz and 144 Hz modes, with G-Sync both on and off.

G-Sync technology overview | Testing G-Sync with V-Sync enabled

It's time to start testing G-Sync. All that's left is to install a video capture card and an array of SSDs, then proceed to the tests, right?

No, that's wrong.

Today we are measuring not performance but quality. In our case, the tests can show only one thing: the frame rate at a given moment in time. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. Therefore, we will have to rely on our carefully considered and, we hope, eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let readers judge for themselves? The problem is that the camera records video at a fixed 60 Hz, and your monitor also plays video at a constant 60 Hz refresh rate. Because G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz... the list goes on and on. But we'll start with a 60 Hz refresh rate and V-sync enabled.

It's probably easiest to start with Nvidia's own demo utility, which swings a pendulum from side to side. The utility can simulate a frame rate of 60, 50, or 40 FPS, or let the frame rate fluctuate between 40 and 60 FPS. You can then enable or disable V-sync and G-Sync. Although the test is synthetic, it demonstrates the capabilities of the technology well. You can watch a scene at 50 FPS with V-sync turned on and think: "Everything looks fine, and the visible stuttering can be tolerated." But after activating G-Sync you immediately want to say: "What was I thinking? The difference is obvious, like night and day. How could I live with this before?"

But let's not forget that this is a tech demo. We wanted evidence based on real games, which means running a title with high system requirements, such as Arma III.

For Arma III we installed a GeForce GTX 770 in the test machine and chose ultra settings. With V-sync disabled, the frame rate fluctuates between 40 and 50 FPS. If V-sync is enabled, it drops to 30 FPS. Performance is not high enough to see constant jumps between 30 and 60 FPS; instead, the graphics card's frame rate simply decreases.

Since there was no stuttering to begin with, activating G-Sync makes no dramatic difference here, except that the actual frame rate runs 10 to 20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We find Arma generally less "jerky" than many other games, so the lag is not really felt.

On the other hand, in Metro: Last Light the influence of G-Sync is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation, and motion blur. From there you can step SSAA from 1x to 2x to 3x to gradually lower the frame rate.

In addition, the game's environment includes an antechamber where it's easy to strafe back and forth. We ran the level with V-sync active at 60 Hz and headed into the city. Fraps showed the frame rate at 30 FPS with 3x SSAA and 60 FPS with anti-aliasing turned off. In the first case, stuttering and lag are noticeable; with SSAA disabled you get a completely smooth picture at 60 FPS. However, enabling 2x SSAA causes fluctuations between 60 and 30 FPS, where every duplicated frame is an annoyance. This is one of the games where we would definitely disable V-sync and simply ignore the tearing; many players have already developed that habit.

However, G-Sync removes all of these negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS before turning down yet another graphics setting. On the contrary, you can raise some of them, because even if the frame rate slips to 40 or 50 FPS there is no obvious stuttering. What happens if you turn vertical sync off entirely? You will find out later in this article.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical sync is and what trade-offs it forces users to accept. According to them, they turn V-sync on only when tearing, caused by the large mismatch between frame rate and monitor refresh rate, becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to mistake, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings, and since Crysis 3 is a first-person shooter with very dynamic gameplay, tearing can be quite noticeable. In the example above, the FCAT output was captured between two frames; as you can see, the tree is cut clean in two.

On the other hand, when we force V-sync off in Skyrim, the tearing isn't that bad. Note that in this case the frame rate is very high, and several frames appear on the screen with each scan, so the amount of movement from frame to frame is relatively small. Playing Skyrim in this configuration has its issues, and it may not be the most optimal setup, but it shows that even with V-sync turned off the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a fairly clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that lets you choose between double and triple buffering when V-sync is enabled.

The last graph shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with V-sync disabled. What the graph cannot show, however, is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate will top out at 60 FPS, but there will be no stuttering or lag.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over again, learn to judge visually how good a particular result is. This is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active are immediately obvious: you get the smoothness of V-sync turned on, but without the tearing characteristic of V-sync turned off. Too bad we can't show the difference in a video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games: Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite, and Battlefield 4 all visited the test bench. All of them except Skyrim benefited from G-Sync. The size of the effect depended on the particular game, but once you saw it, you immediately realized that you had simply been ignoring the shortcomings that were there before.

Artifacts can still appear. For example, the crawling effect associated with anti-aliasing becomes more noticeable once motion is smooth. You will most likely want to set anti-aliasing as high as possible to remove the unpleasant jaggedness that was not so noticeable before.

Skyrim: Special Case

The Creation graphics engine that Skyrim is based on activates vertical sync by default. To test the game at a frame rate above 60 FPS, add the iPresentInterval=0 line to one of the game's .ini files.
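For reference, here is a minimal sketch of that edit in Python. The article only says "one of the game's .ini files", so the file path and the [Display] section used here are assumptions; back up the file before changing it.

```python
# Minimal sketch, assuming the setting belongs in the [Display] section of Skyrim.ini
# (the article only says "one of the game's .ini files"); back up the file first.
from configparser import ConfigParser
from pathlib import Path

ini_path = Path.home() / "Documents/My Games/Skyrim/Skyrim.ini"  # assumed location

cfg = ConfigParser(interpolation=None)
cfg.optionxform = str                        # keep the original key capitalisation
cfg.read(ini_path)
if not cfg.has_section("Display"):
    cfg.add_section("Display")
cfg.set("Display", "iPresentInterval", "0")  # 0 = do not wait for vertical sync
with ini_path.open("w") as f:
    cfg.write(f)
```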

Thus, Skyrim can be tested in three ways: in its original state, allowing the Nvidia driver to "use application settings"; with G-Sync enabled in the driver and the Skyrim settings left intact; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the test monitor set to 60 Hz, showed a stable 60 FPS at ultra settings with a GeForce GTX 770, so we got a smooth and pleasant picture. However, user input still suffers from latency, and side-to-side strafing revealed noticeable motion blur. Then again, this is how most people play on PC. You can of course buy a 144 Hz screen, and it really will reduce the blur, but since the GeForce GTX 770 delivers around 90 to 100 FPS, there will be noticeable stuttering as the engine jumps between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture, probably because of the game's active vertical sync, even though the technology is supposed to work with V-sync disabled. Side strafing (especially close to walls) now leads to pronounced stuttering. This is a potential problem for 60 Hz G-Sync panels, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and despite the active V-sync, G-Sync works at that refresh rate without complaint.

Disabling vertical sync completely in Skyrim results in much "sharper" mouse control, but it also introduces tearing (not to mention other artifacts such as shimmering water). Enabling G-Sync still leaves stuttering at 60 Hz, but at 144 Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we wouldn't recommend playing it that way.

For Skyrim, perhaps the best solution is to disable G-Sync and play at 60 Hz, which gives you a consistent 60 FPS at your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of the Asus G-Sync monitor, we were encouraged by the fact that Nvidia is working on a very real, previously unaddressed problem affecting games. Until now, you could turn V-sync on or off as you pleased, but either choice came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until tearing becomes unbearable, you are merely choosing the lesser of two evils.

G-Sync solves the problem by letting the monitor scan out the screen at a variable frequency. Innovation like this is the only way to keep advancing our industry and maintain the technical advantage of PCs over gaming consoles and other platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt, but the company does use DisplayPort 1.2 for its solution. As a result, just two months after the technology was announced, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you've never seen in action can inspire anyone. But if your first experience of G-Sync is based on Nvidia's pendulum demo, you'll wonder whether such a huge difference is even possible, or whether the test represents a special scenario that's too good to be true.

Naturally, when testing the technology in real games the effect is not so clear-cut. Reactions ranged from "Wow!" and "That's insane!" to "I think I see the difference." The effect of activating G-Sync is most noticeable when switching the display from 60 Hz to 144 Hz, but we also tested at 60 Hz with G-Sync to see what you will (hopefully) get from cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to add G-Sync support to the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already received our Smart Buy award for outstanding performance, but for us personally the 6-bit TN panel is a drawback. We would really like to see 2560x1440 on an IPS panel; we would even settle for a 60 Hz refresh rate if that helped keep the price down.

Although we expect a whole raft of announcements at CES, we have not yet heard official comments from Nvidia about other G-Sync displays or their prices. We're also not sure what the company's plans are for the upgrade kit that should let you install a G-Sync module into an already purchased Asus VG248QE in about 20 minutes.

For now we can say it's worth the wait. In some games the impact of the new technology is unmistakable, while in others it is less pronounced. Either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is one more interesting thought. Now that we have tested G-Sync, how much longer will AMD be able to avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on such a capability. What if it has something in the works? The end of 2013 and the beginning of 2014 bring plenty of exciting news to discuss: the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, AMD's xDMA engine with CrossFire support, and rumors of new dual-GPU graphics cards. Right now there aren't enough graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory that cost less than $1000...
