There are a few key differences between the two graphics synchronization technologies. FreeSync is AMD's open approach: it's free for monitor manufacturers to adopt and needs no special hardware inside the display. G-SYNC is NVIDIA's equivalent, and in its standard form it relies on a dedicated module inside the monitor that NVIDIA certifies, which can add to the price. Here's how the two compare.
Different Name, Same Purpose
Both FreeSync and G-SYNC are variable refresh rate (VRR) technologies. They work in largely the same way and seek to solve the same problem in the gaming space: screen tearing.
Screen tearing occurs when there’s a mismatch between the monitor’s fixed refresh rate and the rate at which the GPU delivers frames. Under load, the GPU may finish a new frame while the monitor is partway through drawing the old one, so the display ends up showing parts of both frames at once, causing unsightly “tearing” artifacts.
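Before getting to the fix, here’s a quick Python sketch of that mismatch. It’s a toy model rather than anything a real display does internally, and the frame times in it are invented for illustration:

```python
# Toy model of a fixed 60Hz panel with no synchronization. Whenever the GPU
# finishes a frame partway through a refresh, the panel ends up scanning out
# parts of two different frames, which is the visible "tear".

REFRESH_INTERVAL_MS = 1000 / 60                 # ~16.7ms per refresh
frame_render_times_ms = [14, 22, 19, 15, 30]    # hypothetical GPU frame times

# Work out when each frame finishes rendering.
frame_ready_at = []
elapsed = 0.0
for render_time in frame_render_times_ms:
    elapsed += render_time
    frame_ready_at.append(elapsed)

# Check each refresh window for a frame that landed mid-scan.
for i in range(6):
    start = i * REFRESH_INTERVAL_MS
    end = start + REFRESH_INTERVAL_MS
    mid_scan = [t for t in frame_ready_at if start < t < end]
    status = "new frame arrived mid-scan -> tear" if mid_scan else "no tear"
    print(f"Refresh {i}: {status}")
```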
Variable refresh rate technology like FreeSync, G-SYNC, and VESA Adaptive-Sync solves the problem by opening a line of communication between the monitor and GPU. The GPU signals when a complete frame is ready, and the monitor waits to refresh until then, so partial frames are never displayed.
The result is that the monitor only refreshes to display a new frame when that frame is ready. This makes games look better and motion appear smoother, and these technologies are now common on modern monitors and televisions.
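Here’s the same kind of toy model with VRR in play, assuming an illustrative 48-144Hz range (the actual range depends on the monitor):

```python
# Toy model of a VRR panel. The panel refreshes only when the GPU reports a
# finished frame, so each frame is always shown whole. Range and frame times
# are illustrative examples, not from either vendor's spec.

MIN_VRR_HZ, MAX_VRR_HZ = 48, 144
MIN_INTERVAL_MS = 1000 / MAX_VRR_HZ    # can't refresh faster than ~6.9ms
MAX_INTERVAL_MS = 1000 / MIN_VRR_HZ    # must refresh at least every ~20.8ms

frame_times_ms = [10, 13, 8, 18, 25]   # hypothetical GPU frame times

for t in frame_times_ms:
    if t < MIN_INTERVAL_MS:
        print(f"Frame in {t}ms: faster than the panel, held briefly then shown whole")
    elif t <= MAX_INTERVAL_MS:
        print(f"Frame in {t}ms: panel refreshes the moment it's ready -> no tearing")
    else:
        print(f"Frame in {t}ms: below the VRR range, handled by LFC or shown late")
```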
RELATED: What is a Monitor’s Refresh Rate and How Do I Change It?
FreeSync Is an AMD Technology
FreeSync is AMD’s variable refresh rate implementation. It’s free for monitor manufacturers to implement and requires no special hardware inside of the monitor to work.
There are three tiers of FreeSync to choose from, with the base implementation simply titled FreeSync. This works over HDMI 1.4 or DisplayPort 1.2a and supports refresh rates of at least 60Hz. It’s found all over the place, even on bargain office-style monitors.
FreeSync Premium offers 120Hz gaming and adds Low Framerate Compensation (LFC) into the mix, which helps smooth out gameplay stutters caused by frame rate dips. Finally, there’s FreeSync Premium Pro, which adds HDR capabilities but must also be supported by the game you’re playing.
While FreeSync can work at 30Hz or lower (according to AMD), many FreeSync displays only support variable refresh down to 48Hz, below which the benefits drop off.
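To see why that lower limit matters, here’s a rough Python sketch of how frame repetition like LFC keeps the panel inside its range when the frame rate dips; the 48Hz floor and the multiplier logic are illustrative, not taken from AMD’s spec:

```python
# Rough sketch of Low Framerate Compensation on a display whose VRR range
# bottoms out at 48Hz. When the game drops below that, the driver repeats
# each frame enough times to keep the panel refreshing within its range.

MIN_VRR_HZ = 48

def lfc_panel_rate(fps: float) -> tuple[float, int]:
    """Return (panel refresh rate, frame repetition factor) under this toy LFC."""
    multiplier = 1
    while fps * multiplier < MIN_VRR_HZ:
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (60, 40, 25):
    rate, mult = lfc_panel_rate(fps)
    print(f"{fps} fps -> panel refreshes at {rate:.0f}Hz ({mult}x frame repetition)")
```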
RELATED: How Do Frame Rates Affect the Gaming Experience?
G-SYNC Is NVIDIA Technology
NVIDIA’s take on variable refresh rate technology is called G-SYNC, and it too has several tiers to choose from. G-SYNC usually requires a DisplayPort connection, but some televisions are adding support for the technology over HDMI (notably LG’s OLEDs).
At the bottom rung are G-SYNC Compatible displays, which have been certified by NVIDIA to be tear-free with no visible artifacts. Many FreeSync displays have been added to this list, since G-SYNC Compatible monitors don’t require additional hardware.
Standard G-SYNC displays do have a dedicated chip inside them, which may push up the price of the monitor. This allows NVIDIA to offer frame doubling below 30Hz on most models, which is a great benefit if you encounter choppy gameplay. G-SYNC also does a better job at eliminating blur than less capable alternatives.
At the upper end is G-SYNC Ultimate, which supports frame rates of 144Hz and higher for high-end PC gaming and competitive online play. This tier includes robust support for HDR, low-latency gameplay, and calibrated sRGB and P3 color.
RELATED: How to Enable Ultra-Low Latency Mode for NVIDIA Graphics
Variable Refresh Rate Is the Real Benefit
While higher tiers of VRR implementation have their benefits, tear-free gaming is the real draw here. There’s no longer a need to cap your frame rate at the monitor’s refresh rate (often 60 frames per second) as with older V-Sync implementations, which means you can take advantage of your hardware’s true potential.
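For a rough sense of the difference, this Python sketch compares a 60Hz panel using classic double-buffered V-Sync against VRR; the frame times are made up and the model is deliberately simplified:

```python
import math

# Back-of-the-envelope comparison on a 60Hz panel. With classic
# double-buffered V-Sync, a frame that misses a refresh waits for the next
# one, so the effective frame rate snaps to 60, 30, 20... With VRR the frame
# is shown as soon as it's done.

PANEL_HZ = 60
REFRESH_MS = 1000 / PANEL_HZ

def vsync_fps(frame_time_ms: float) -> float:
    intervals = math.ceil(frame_time_ms / REFRESH_MS)  # wait for a whole refresh
    return 1000 / (intervals * REFRESH_MS)

def vrr_fps(frame_time_ms: float) -> float:
    return 1000 / frame_time_ms

for ft in (15, 17, 25, 40):
    print(f"{ft}ms frame: V-Sync ~{vsync_fps(ft):.0f} fps, VRR ~{vrr_fps(ft):.0f} fps")
```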
Even the latest consoles have VRR—learn more about how Xbox and PlayStation use this technology.