G-Sync vs FreeSync: Which Is Better?

Is G-Sync better than FreeSync? What the hell is V-Sync? How do they differ from each other? These terms often confuse people hearing them for the first time, so we created this guide to address these questions, help you understand the technology, and let you make a more informed purchasing decision.

Frames Per Second and Refresh Rate

A few definitions need to be clarified before we can truly understand adaptive sync technology and the differences between G-Sync, FreeSync, and V-Sync. First, frames per second (FPS) refers to the number of frames, or images, that your graphics card can produce every second. The more frames output per second, the smoother the game you are playing will look. The refresh rate is tied to the monitor and refers to how many times your monitor can redraw the image every second; it is measured in hertz (Hz).

In general, if the FPS your graphics card is producing is equal to or slightly greater than your monitor’s refresh rate, you will have a smooth gameplay experience. For example, if you have a 120 Hz monitor and your graphics card is outputting a constant 120 FPS, you’re seeing the smoothest image your gear can possibly produce. Most monitors are still 60 Hz, while many gaming monitors nowadays are 120 Hz or higher. Simply put, the higher the refresh rate, the higher the maximum number of frames per second that can be displayed.
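To make that relationship concrete, you can compare the frame time (how long the graphics card spends on one frame) with the refresh interval (how long the monitor displays one frame). Below is a minimal Python sketch of that arithmetic, using illustrative numbers only:

```python
# Rough arithmetic relating FPS to refresh rate (illustrative numbers only).
fps = 120                                # frames the graphics card produces per second
refresh_hz = 120                         # times the monitor refreshes per second

frame_time_ms = 1000 / fps               # ~8.33 ms to render one frame
refresh_interval_ms = 1000 / refresh_hz  # ~8.33 ms per monitor refresh

if frame_time_ms <= refresh_interval_ms:
    print("GPU keeps up with the monitor: smooth output")
else:
    print("GPU falls behind: some refreshes repeat the previous frame")
```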

Frame Buffer

The frame buffer is temporary storage that holds the frames your graphics card generates during a game. These frames are then sent to your monitor, where they are displayed. Typically two buffered images are present at any moment: a primary buffer, which holds the image currently being shown on your screen, and a secondary buffer, which holds the image to be displayed next. When the buffers are swapped and the next frame is displayed, the secondary buffer takes the place of the primary buffer and is itself replaced by a newly generated frame.
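As a rough illustration of this double-buffering idea (a toy model, not any specific graphics API), the sketch below keeps a primary and a secondary buffer and swaps them each time a new frame is ready:

```python
# Toy double-buffering model: two buffers, swapped each time a new frame is ready.
class FrameBuffers:
    def __init__(self):
        self.primary = "frame 0"    # what the monitor is currently showing
        self.secondary = "frame 1"  # the next frame, waiting to be shown

    def swap(self, new_frame):
        # The secondary buffer becomes the image on screen,
        # and the slot it vacates receives the freshly rendered frame.
        self.primary, self.secondary = self.secondary, new_frame

buffers = FrameBuffers()
for i in range(2, 5):
    buffers.swap(f"frame {i}")
    print("on screen:", buffers.primary, "| queued:", buffers.secondary)
```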

Screen Tearing

Screen tearing usually occurs when you have a powerful graphics card or are running a game that isn’t graphically intensive, because your graphics card is producing more frames per second than your monitor’s refresh rate can display. If the frame buffer swap happens in the middle of a monitor refresh, the display is sent information from two or more frames at once, and that mismatch produces the characteristic misaligned, “torn” image.
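A simple timing sketch (with made-up numbers) shows why: when the graphics card finishes frames faster than the monitor can scan out a single refresh, more than one buffer swap lands inside that refresh, so different parts of the screen come from different frames.

```python
# Toy timing model: count how many buffer swaps land inside one monitor refresh.
refresh_hz = 60
fps = 200                                # GPU running far ahead of the display

refresh_interval_ms = 1000 / refresh_hz  # ~16.7 ms to scan out one refresh
frame_time_ms = 1000 / fps               # ~5 ms per rendered frame

swaps_per_refresh = refresh_interval_ms / frame_time_ms
print(f"~{swaps_per_refresh:.1f} buffer swaps per refresh")
if swaps_per_refresh > 1:
    print("Multiple frames are scanned out in one refresh -> visible tear lines")
```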

V-Sync

V-Sync, which stands for Vertical Synchronization, limits the number of frames your graphics card outputs to match the refresh rate of your gaming monitor. Its purpose is to stop screen tearing, and it does exactly what its name says: it vertically synchronizes the two. The method restricts your graphics card to outputting one frame per monitor refresh. Good in theory, but V-Sync is not bulletproof. For example, if your frame rate drops even slightly below your monitor’s refresh rate, each frame has to wait for the next refresh, so the effective frame rate is temporarily halved to stay in sync. If the graphics card hasn’t generated a new frame in time, the previous one is displayed again, resulting in stuttering.
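Here is a rough sketch of that effect, assuming a simple model in which V-Sync only lets a frame be presented on a refresh boundary: a frame that misses its refresh by even a millisecond waits for the next one, so the delivered frame rate drops to half the refresh rate while the previous frame is shown again.

```python
import math

# Toy V-Sync model: a frame can only be shown on a refresh boundary.
refresh_hz = 60
refresh_interval_ms = 1000 / refresh_hz              # ~16.7 ms per refresh

def effective_fps(render_time_ms):
    # Each frame occupies a whole number of refresh intervals;
    # missing a refresh means waiting for the next one.
    refreshes_used = math.ceil(render_time_ms / refresh_interval_ms)
    return refresh_hz / refreshes_used

print(effective_fps(16.0))  # finishes in time        -> 60.0 FPS
print(effective_fps(17.0))  # just misses one refresh -> 30.0 FPS (previous frame repeats)
```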

G-Sync

G-Sync is NVIDIA’s patented implementation of adaptive sync, which requires monitors to include a proprietary hardware module and therefore costs more. You are limited to using an NVIDIA graphics card with the monitor if you wish to use G-Sync. It works on the same principle as FreeSync, and while earlier G-Sync monitors fared better at lower frame rates, AMD’s introduction of Low Framerate Compensation has made the two mostly equal in terms of performance. G-Sync’s largest drawback remains its increased cost, which stems from the proprietary module and licensing fees. A feature exclusive to G-Sync that some might find useful is its ability to work in “borderless windowed” mode, which allows you to quickly Alt+Tab between your game and other programs.

FreeSync

With AMD’s FreeSync technology, the monitor automatically changes its refresh rate to match the frame rate of the graphics card. For a monitor to be officially labeled under the FreeSync brand, it must meet AMD’s FreeSync certification standards. If you want a FreeSync monitor and want to take advantage of its adaptive sync, you’ll need an AMD card. AMD has also announced that in 2016 FreeSync will work over HDMI as well as DisplayPort, whereas G-Sync requires DisplayPort.
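As a rough model of how adaptive sync behaves (hypothetical panel range, not AMD’s actual algorithm): the monitor’s refresh rate simply follows the GPU’s frame rate while it stays inside the panel’s supported range, and with Low Framerate Compensation a too-slow frame is shown more than once so the panel can stay within that range.

```python
# Toy adaptive-sync model with a hypothetical 48-144 Hz panel range.
PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 144

def refresh_for_fps(fps):
    if fps > PANEL_MAX_HZ:
        return PANEL_MAX_HZ          # can't refresh faster than the panel allows
    if fps >= PANEL_MIN_HZ:
        return fps                   # refresh rate follows the frame rate exactly
    # Low Framerate Compensation: repeat each frame so the panel stays in range.
    multiplier = 2
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return fps * multiplier

for fps in (160, 90, 30):
    print(f"{fps} FPS -> panel refreshes at {refresh_for_fps(fps)} Hz")
```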

The following table summarizes the differences between the two competing technologies:

Feature                       AMD FreeSync    NVIDIA G-Sync
Proprietary Module Required   No              Yes
Open Standards                Yes             No
Licensing Fees                No              Yes
Uses DisplayPort              Yes             Yes