Under The Hood for Displays: Custom Resolutions, Freesync Improvements, & Framerate Target Control

Continuing our look into Crimson’s new features, AMD has also implemented some new & improved functionality specifically targeted at displays. The company has been more aggressive about display technologies and features since embarking on their Freesync project, and that is reflected in some of the changes made here.

Custom Resolution Support

First and foremost, AMD has at long last implemented support for custom resolutions within their control panel. Custom resolution support is something of a niche feature – most users will never find it, let alone need it – however it’s extremely useful for those users who do need it. In our own case, for example, we use this feature with our Sharp PN-K321 4K monitor in order to run 1440p@60Hz on it, as the monitor doesn’t explicitly support that setting and Windows would rather upscale 1440p to 2160p@30Hz when left to its own devices.

Custom resolution support is another area where AMD is catching up with NVIDIA, as the latter has supported custom resolutions for several years now. In the meantime it’s been possible to use third-party utilities such as Custom Resolution Utility with AMD’s drivers to force the matter, but bringing support within AMD’s drivers is still a notable improvement.

AMD has never previously supported this feature in part due to the very low but nonetheless real risk of damage. If given video settings it can’t use, a properly behaving monitor should simply reject the input. However not all devices are perfect, and it is possible (however unlikely) that a monitor could damage itself trying to run with unsupported settings. This is why for both AMD and NVIDIA, custom resolutions come with a warning and are not covered under their respective warranties.

On a side note, it was interesting to find that this is one of the features not implemented in Radeon Settings. Rather, the custom resolution control panel is part of the pared-down Catalyst Control Center, now called Radeon Additional Settings. Considering that AMD never supported custom resolutions until now, it’s a bit surprising that they’d add it to CCC rather than Radeon Settings. But I suspect this has a lot to do with why CCC is still in use to begin with: not all of the necessary monitor controls are available in Radeon Settings at this time.

Freesync Improvements: Low Framerate Compensation

With Omega AMD included initial support for Freesync, and now with Crimson AMD is rolling out some new Freesync functionality that changes how the technology works at the GPU level.

NVIDIA, never one to shy away from throwing barbs at the competition, has in the past called out AMD for how Freesync handles minimum refresh rates. Specifically, when the framerate falls below the minimum refresh rate of the monitor, Freesync setups revert to non-Freesync operation, either locking into v-sync operation or exhibiting traditional v-sync off style tearing, depending on whether v-sync was enabled. Though in some ways this was better than what NVIDIA offered at the time (a choice of v-sync behavior), it also meant that the benefits of Freesync were lost whenever the framerate fell below the minimum. NVIDIA, meanwhile, though not publishing exactly what they do, would seem to use some form of frame repeating to keep G-Sync active, repeating frames to keep variable refresh going rather than dropping to the minimum refresh rate.

This is something AMD appears to have taken to heart, and while they don’t specifically name NVIDIA in their presentation, all signs point to it being a reaction to NVIDIA’s barbs and marketing angle. As a result the Crimson driver introduces a new technology for Freesync which AMD is calling Low Framerate Compensation (LFC). LFC is designed to directly address what the GPU and Freesync monitor do when the framerate falls below the minimum refresh rate.

In AMD’s slide above, they list out the five refresh scenarios, and the two scenarios that LFC specifically applies to. So long as the framerate is above the minimum refresh rate, Freesync is unchanged. However when the framerate falls below the minimum, AMD has instituted a series of changes to reduce judder. Unfortunately, not unlike NVIDIA, AMD is treating this as a “secret sauce” and isn’t disclosing what exactly they’re doing to alleviate the issue. However based on what we’re seeing and AMD’s description (along with practical solutions to the problem), our best guess is that AMD is implementing frame repeating to keep the instantaneous refresh rate above the monitor’s minimum.

Frame reuse is simple in concept but tricky in execution. Not unlike CrossFire, there’s a strong element of prediction here, as the GPU needs to guess when the next frame may be ready so that it can set the appropriate refresh rate and repeat a frame the appropriate number of times. Hence, in one of the few things they do say about the technology, AMD notes that they are implementing an “adaptive algorithm” to handle low framerate situations. Ultimately if AMD does this right, then it should reduce judder both when v-sync is enabled and when it is disabled, by aligning frame repeats and the refresh rate such that the next frame isn’t unnecessarily delayed.
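Since AMD hasn’t published the real algorithm, the following is only a minimal sketch of how frame repeating of this sort could work in principle – the function name and the simple ceiling-based repeat count are our own assumptions, not AMD’s implementation:

```python
import math

def lfc_refresh(frametime_s, min_hz, max_hz):
    """Pick a refresh rate for a single frame under a presumed
    LFC-style scheme (a sketch; AMD treats the real algorithm as
    secret sauce).

    Above the panel minimum, Freesync behaves normally: the refresh
    rate simply tracks the framerate. Below it, the frame is shown
    multiple times so the instantaneous refresh rate stays inside
    the panel's variable refresh window."""
    rate = 1.0 / frametime_s               # effective framerate of this frame
    if rate >= min_hz:
        return min(rate, max_hz), 1        # normal Freesync operation
    repeats = math.ceil(min_hz / rate)     # smallest multiple back in range
    return rate * repeats, repeats

# A 35fps frame on a hypothetical 40-144Hz panel: shown twice,
# with the panel refreshing at roughly 70Hz
refresh, repeats = lfc_refresh(1.0 / 35, 40, 144)
```

In practice the hard part isn’t this arithmetic but the prediction: the driver has to commit to a repeat schedule before it knows when the next frame will actually arrive, which is presumably what AMD’s “adaptive algorithm” is for.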

The good news here is that this is a GPU-side change, so it doesn’t require any changes to existing monitors – they simply receive new variable refresh timings. However in revealing a bit more about the technology, AMD does note that LFC is only enabled with monitors that have a maximum refresh rate greater than or equal to 2.5 times the minimum refresh rate (e.g. 30Hz to 75Hz), as AMD needs a wide enough variable refresh range to run at a multiple of framerates right on the edge of the minimum (e.g. 45fps). This means LFC can’t be used with Freesync monitors that have a narrow refresh rate range, such as the 48Hz to 75Hz models. Ultimately owners of those monitors don’t lose anything, but they also won’t gain anything with LFC.
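The range requirement itself reduces to a one-line check. A quick sketch (the function name is ours; the 2.5x ratio is the figure AMD disclosed):

```python
def lfc_capable(min_hz, max_hz):
    # AMD's stated requirement: max refresh >= 2.5x min refresh.
    # Doubling a frame just below the minimum needs up to ~2x headroom;
    # the extra margin presumably gives the adaptive algorithm room to work.
    return max_hz >= 2.5 * min_hz

print(lfc_capable(30, 75))    # True: 75/30 = 2.5, right at the cutoff
print(lfc_capable(48, 75))    # False: 75/48 ~ 1.56, window too narrow
print(lfc_capable(30, 144))   # True
```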

As it stands we’ve only had a very limited amount of time to toy with Freesync on the new drivers, but what we’re seeing so far looks solid. We’re definitely curious to see how daily Freesync users respond to this.

Finally, along with the LFC news, for the Crimson driver release AMD has offered a brief update on the status of Freesync-over-HDMI, reiterating that the company is still working on the technology. AMD first demonstrated the concept at Computex 2015 back in June, and while they still have a long way to go before it can make it into a retail product, the company continues to believe adaptive-synchronization is a viable and meaningful addition for HDMI.

Framerate Target Control: Wider Ranges

Back in June for the launch of the Radeon R9 Fury X, AMD introduced a new frame limiting feature called Framerate Target Control (FRTC). FRTC offered an alternative to v-sync, allowing users to cap the framerate of a game at an arbitrary framerate, selected via AMD’s control panel. While FRTC worked as advertised, it had one unfortunate limitation: it only operated over a very limited range – 55fps to 95fps. Though this was sufficient to cap the framerate right below 60fps or directly above it, users have been asking for wider ranges to support higher framerate monitors or to limit a game to even lower framerates such as 30fps.

For Crimson AMD has gone ahead and widened the range of FRTC. It can now cap a game at between 30fps and 200fps, a range over four times as wide. At the same time, AMD mentioned in our briefing that they’ve also done additional work to better restrict GPU clockspeeds when FRTC is in use, maximizing the power savings from using it to limit the amount of work the GPU does. Now the GPU will operate at a lower clockspeed more often, increasing the amount of power saved versus letting a video card run uncapped.
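The basic idea behind a frame limiter of this kind can be sketched in a few lines. This is only a toy illustration of the concept, not AMD’s driver-level implementation:

```python
import time

def run_capped(render_frame, target_fps, frames):
    """Toy frame limiter in the spirit of FRTC: if a frame finishes
    early, idle until the target frame time has elapsed. In a real
    driver the GPU can also drop to a lower clock state during that
    idle period, which is where the power savings come from."""
    target_fps = max(30, min(200, target_fps))   # Crimson's widened range
    frame_time = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                           # the actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)     # don't start the next frame yet

# e.g. cap a (trivial) workload at 60fps for 60 frames
run_capped(lambda: None, 60, 60)
```

A GPU-side limiter has the added advantage over an in-game cap that the driver knows the frame is done and can clock the hardware down for the remainder of the frame interval.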



Comments

  • looncraz - Wednesday, November 25, 2015 - link

    I only play BF4 with Mantle, and I've never noticed a single glitch (I did when it first came out (with colors), so I ran DX11 for a while).

    The resolution actually doesn't dictate how much RAM you need as much as people think. A 1080p frame buffer only weighs in at ~8MB, 4k is ~34MB. You need VRAM to store all of the textures and other game data. Your resolution has an effect on VRAM use only for certain features.
  • i_create_bugs - Wednesday, November 25, 2015 - link

    Except that you also need room for multiple render targets. Not just RGBA. Typically diffuse, normal, stencil, etc. Plus on top of that you need stencil/Z-buffer. Those buffers can also be 64 bits per pixel, if float16 pixel formats are used.

    Additionally sometimes frame buffer width is a bit more than actual resolution due to hardware limitations. So 1920 wide buffer might actually have room for 2048 pixels in real memory layout.

    Lower end guess for 1080P is 32 x 2^ceil(log2(1920)) x 1080. So at least 32 x 2048 x 1080 bytes. 67.5 MB per frame at 1080P. For 4k (3840x2160), 32 x 4096 x 2160 = 270 MB.

    Plus on top of that you need some RGBA frames for double / triple buffering.
  • looncraz - Thursday, November 26, 2015 - link

    You calculated it for BITS, not BYTES.

    Also, we usually end up aligning just a few pixels on the end (as in two or three).

    A 1920x1080 buffer will be allocated as a slightly wider, but no higher, buffer. FOUR bytes per pixel (not 32). That gives 7.91MB per frame buffer.

    As for the z coordinate, we usually use the last 8 bits of the above buffer. Why? Because 8-bits per color channel is what anyone usually ever uses. This is called D24S8.

    When you increase the resolution of your game yourself, you are increasing the size of the frame buffers, including the flip queue, post-processing buffers, and a few others. Basically, you can generally assume there are 15 frame-buffer linked sized buffers.

    So at 1080p, you need 118MB of VRAM for the buffers, and at 4k you need 475MB. This is why you can see VSR running so well on video cards with only 2GB of RAM. You do need more RAM, but it isn't drastic. What can make a more drastic difference is the game using resolution-specific textures. THAT can eat up an extra GB or so, depending on the game developer. Older games, or games meant for 1080p, however, will not have 4k texture packs.
  • The_Countess - Friday, November 27, 2015 - link

    you just described the game running in DirectX as well.
  • dsumanik - Tuesday, November 24, 2015 - link

    Nvidia drivers have been superior for the last decade, end of story. I suspect in many cases when the silicon battle was close, this is how NVIDIA kept the edge.

    If Crimson can fulfill its ambitious vision, things will get mighty interesting next year.

    Got my fingers crossed for ya AMD.
  • Dalamar6 - Wednesday, November 25, 2015 - link

    NVidia's superior drivers are why AMD was bargain binned even during the times when their performance:price ratio was actually significantly better.

    Of course we're talking Windows, AMD literally has NO linux presence at all, and literally cripples rolling distributions, and this rebadged driver won't change that.
  • Gigaplex - Wednesday, November 25, 2015 - link

    AMD has their open source driver presence. For a lot of their hardware, it's very stable and performs well. It's pretty slow to support brand new hardware though.
  • Fallen Kell - Wednesday, November 25, 2015 - link

    You mean 2D support is pretty stable and performs well. 3D is abysmal performance. There is a reason why not a single Steam Machine configuration out there has an AMD graphic card as an option, and it is because they all have HORRIBLE 3D performance.
  • Beany2013 - Wednesday, November 25, 2015 - link

    As a user of Ubuntu and Debian, and AMD GPUs, I have to agree with Fallen Kell; it's not as bad as it was, but major updates (such as GCC updates, as happened with Ubuntu 15.10) utterly, utterly break things.

    It's working now on wily-proposed, but jesus, what a pain in the arse.

    I'm hoping that this, and other pressure (like not having any realistic Steam Machine presence), will force them to up their game. Majorly.

    Performance when it works though, is fine - in some cases though, it's just that you have to force it to work. With hard liquor. And swearing. And Fire.
  • FourEyedGeek - Wednesday, November 25, 2015 - link

    There is one area AMD beats NVIDIA in drivers, old cards. NVIDIA haven't paid as much attention to older cards as AMD has, though it could be because AMD use the same architecture for longer periods of time. At release the NVIDIA 680 was faster than the 7970, but on modern games with new drivers the 7950 can even beat the 680 in some games.
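The buffer arithmetic traded back and forth in the thread above is easy to check. A quick sketch using looncraz's stated assumptions (4 bytes per pixel, roughly 15 frame-buffer-sized allocations per game):

```python
def buffer_mib(width, height, bytes_per_pixel=4, buffers=1):
    """Frame buffer footprint in MiB for the given resolution."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(round(buffer_mib(1920, 1080), 2))           # 7.91  (one 1080p buffer)
print(round(buffer_mib(3840, 2160), 2))           # 31.64 (one 4K buffer)
print(round(buffer_mib(1920, 1080, buffers=15)))  # 119, close to the ~118MB figure
print(round(buffer_mib(3840, 2160, buffers=15)))  # 475
```

The numbers line up with looncraz's figures once bytes and bits are kept straight, which is the crux of the disagreement in the thread.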
