This is something that caught me off-guard when I first realized it, but AMD historically hasn’t liked to talk about their GPU plans much in advance. On the CPU side we’ve heard about Carrizo and Zen years in advance, and AMD’s competitor in the world of GPUs, NVIDIA, releases basic architectural information over a year ahead of time as well. With AMD’s GPU technology, however, we typically don’t hear about it until the first products implementing the new technology are launched.

With AMD’s GPU assets having been reorganized under the Radeon Technologies Group (RTG), led by Raja Koduri, RTG has recognized this as well. As a result the new RTG is looking to chart a somewhat different course, aiming to be more transparent and more forthcoming than the group has been in the past. The end result isn’t quite like what AMD has done with their CPU division or what their competition has done with GPU architectures – RTG will say more or less depending on the subject – but among the several major shifts in appearance, development, and branding we’ve seen since the formation of the RTG, this is another way in which RTG is trying to set itself apart from AMD’s earlier GPU groups.

As part of AMD’s RTG technology summit, I had the chance to sit down and hear about RTG’s plans for their visual technologies (displays) group for 2016. Though RTG isn’t announcing any new architecture or chips at this time, the company has put together a roadmap for what they want to do with both hardware and software for the rest of 2015 and into 2016. Much of what follows isn’t likely to surprise regular observers of the GPU world, but it nonetheless sets some clear expectations for what is in RTG’s future over much of the next year.

DisplayPort 1.3 & HDMI 2.0a: Support Coming In 2016

First and foremost then, let’s start with RTG’s hardware plans. As I mentioned before, RTG isn’t announcing any new architectures, but they are announcing some of the features that the 2016 Radeon GPUs will support. Among these changes is a new display controller block, upgrading the display I/O functionality that has been fixed in AMD’s GPU designs since GCN 1.1 first launched in 2013.

The first addition here is that RTG’s 2016 GPUs will include support for DisplayPort 1.3. We’ve covered DisplayPort 1.3 separately in the past, when the VESA announced the release of the 1.3 standard in 2014. DisplayPort 1.3 will introduce a faster signaling mode for DisplayPort – High Bit Rate 3 (HBR3) – which in turn will allow DisplayPort 1.3 to offer 50% more bandwidth than the current DisplayPort 1.2 and HBR2, boosting DisplayPort’s bandwidth to 32.4 Gbps before overhead.
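To put those figures in context, here is a quick back-of-envelope sketch in Python of the raw and effective link rates for each DisplayPort signaling mode, assuming the standard 4-lane configuration and 8b/10b line encoding used by DP1.1 through 1.3:

```python
# Back-of-envelope DisplayPort link rates: 4 lanes, 8b/10b line encoding.
# Per-lane rates are the spec figures: HBR1 = 2.7, HBR2 = 5.4, HBR3 = 8.1 Gbps.
LANES = 4
ENCODING_EFFICIENCY = 0.8  # 8b/10b: 8 data bits carried per 10 bits on the wire

per_lane_gbps = {"HBR1": 2.7, "HBR2": 5.4, "HBR3": 8.1}

for mode, rate in per_lane_gbps.items():
    raw = rate * LANES
    effective = raw * ENCODING_EFFICIENCY
    print(f"{mode}: {raw:4.1f} Gbps raw, {effective:5.2f} Gbps effective")

# HBR3 -> 8.1 x 4 = 32.4 Gbps raw, a 50% step up from HBR2's 21.6 Gbps,
# leaving roughly 25.92 Gbps of actual data after encoding overhead.
```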

DisplayPort Supported Resolutions

Standard                  Max Resolution (RGB/4:4:4, 60Hz)   Max Resolution (4:2:0, 60Hz)
DisplayPort 1.1 (HBR1)    2560x1600                          N/A
DisplayPort 1.2 (HBR2)    3840x2160                          N/A
DisplayPort 1.3 (HBR3)    5120x2880                          7680x4320

The purpose of DisplayPort 1.3 is to offer the additional bandwidth necessary to support higher resolution and higher refresh rate monitors than the 4K@60Hz limit of DP1.2. This includes supporting higher refresh rate 4K monitors (120Hz), 5K@60Hz monitors, and 4K@60Hz with color depths greater than 8 bits per channel (necessary for a good HDR implementation). DisplayPort’s scalability via tiling has meant that some of these monitor configurations have been possible even with DP1.2 by utilizing MST over multiple cables; with DP1.3, however, it will be possible to drive those configurations in a simpler SST setup over a single cable.
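As a rough illustration of why these modes call for HBR3, the sketch below estimates the link bandwidth each mode requires and compares it against the effective (post-8b/10b) link capacities worked out above. The ~8% blanking factor is my own loose stand-in for CVT-R2 timing overhead, not a spec figure:

```python
# Rough link-bandwidth check: does a given mode fit a given DisplayPort link?
# Effective capacities = per-lane rate x 4 lanes x 0.8 (8b/10b encoding).
# BLANKING is a loose ~8% stand-in for CVT-R2 timing overhead, not a spec value.
BLANKING = 1.08
capacity_gbps = {"HBR2 (DP1.2)": 17.28, "HBR3 (DP1.3)": 25.92}

modes = [
    ("4K@60Hz, 8bpc",  3840, 2160,  60, 24),
    ("4K@120Hz, 8bpc", 3840, 2160, 120, 24),
    ("5K@60Hz, 8bpc",  5120, 2880,  60, 24),
]

for name, w, h, hz, bpp in modes:
    need = w * h * hz * bpp * BLANKING / 1e9  # Gbps on the wire
    fits = [link for link, cap in capacity_gbps.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbps needed; fits {', '.join(fits) or 'neither'}")
```

Run it and 4K@60 comes in around 13 Gbps (comfortably within HBR2), while 4K@120 and 5K@60 land at roughly 26 and 23 Gbps respectively, within reach of HBR3 alone.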

For RTG this is important on several levels. The first is very much pride – the company has always been the first GPU vendor to implement new DisplayPort standards. But at the same time DP1.3 is the cornerstone of multiple other efforts for the company. The additional bandwidth is necessary for the company’s HDR plans, and it’s also necessary to support the wider range of refresh rates at 4K required by RTG’s FreeSync Low Framerate Compensation (LFC) tech, which needs a maximum refresh rate at least 2.5 times the minimum to function. That in turn has meant that while RTG has been able to apply LFC to 1080p and 1440p monitors today, they won’t be able to do so with 4K monitors until DP1.3 gives them the bandwidth necessary to support 75Hz+ operation.
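To make that constraint concrete, here is a minimal sketch of the LFC eligibility check as described; the 30Hz panel floor in the examples is an illustrative assumption on my part, not an RTG figure:

```python
# LFC needs the panel's maximum refresh rate to be at least 2.5x its minimum,
# so the driver always has headroom to insert duplicated frames inside the
# variable refresh window when the game's framerate drops below the minimum.
LFC_RATIO = 2.5

def lfc_capable(min_hz: float, max_hz: float) -> bool:
    return max_hz / min_hz >= LFC_RATIO

# Assuming an illustrative 30Hz variable-refresh floor:
print(lfc_capable(30, 60))   # False: a DP1.2-limited 4K@60 panel is only 2.0x
print(lfc_capable(30, 75))   # True: 75Hz is exactly 2.5x a 30Hz floor
print(lfc_capable(30, 144))  # True: a 144Hz panel qualifies comfortably (4.8x)
```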

Meanwhile DisplayPort 1.3 isn’t the only I/O standard planned for RTG’s 2016 GPUs. Also scheduled for 2016 is support for HDMI 2.0a, the latest generation of the HDMI standard. HDMI 2.0 was launched in 2013 as an update to HDMI, significantly increasing the standard’s bandwidth to support 4Kp60 TVs and bringing it roughly on par with DisplayPort 1.2 in terms of total bandwidth. Along with the increase in bandwidth, HDMI 2.0/2.0a also introduced support for other new features in the HDMI specification, such as the next-generation BT.2020 color space, 4:2:0 chroma sampling, and HDR video.
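On that last point, 4:2:0 chroma subsampling is how both standards stretch their bandwidth at very high resolutions: luma stays at full resolution while the two chroma channels are shared across each 2x2 block of pixels. A quick sketch of the average bits per pixel at 8 bits per channel shows where the savings come from:

```python
# Average bits per pixel under common chroma subsampling schemes at 8 bits
# per channel. In 4:2:0, each 2x2 block of 4 pixels carries 4 luma (Y)
# samples but only one Cb and one Cr sample, so the chroma cost is shared.
BITS_PER_SAMPLE = 8

def avg_bpp(luma_samples: int, chroma_pairs: int, pixels: int) -> float:
    # chroma_pairs counts Cb+Cr sample pairs shared by the pixel group
    return (luma_samples + 2 * chroma_pairs) * BITS_PER_SAMPLE / pixels

print("4:4:4:", avg_bpp(4, 4, 4), "bpp")  # 24.0 - full chroma, the desktop target
print("4:2:2:", avg_bpp(4, 2, 4), "bpp")  # 16.0 - chroma halved horizontally
print("4:2:0:", avg_bpp(4, 1, 4), "bpp")  # 12.0 - half of 4:4:4, which is why the
                                          # table above gains an 8K entry at 4:2:0
```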

That HDMI has only recently caught up to DisplayPort 1.2 in bandwidth at a time when DisplayPort 1.3 is right around the corner is one of those consistent oddities in how the two standards are developed, but nonetheless this is important for RTG. HDMI is not only the outright standard for TVs, but the de facto standard for PC monitors as well; while you can find DisplayPort on many monitors, you would be hard pressed not to find HDMI. So as 4K monitors become increasingly cheap – and likely start dropping DisplayPort in the process – supporting HDMI 2.0 will be important for RTG on monitors just as much as on TVs.

Unfortunately for RTG, they’re playing a bit of catch-up here, as the HDMI 2.0 standard is already more than two years old and has been supported by NVIDIA since the Maxwell 2 architecture in 2014. Though they didn’t go into detail, I was told that AMD/RTG’s plans for HDMI 2.0 support were impacted by the cancellation of the company’s 20nm planar GPUs, and as a result HDMI 2.0 support was pushed back to the company’s 2016 GPUs. The one bit of good news here for RTG is that HDMI 2.0 is still a bit of a mess – not all HDMI 2.0 TVs actually support 4Kp60 with full chroma sampling (4:4:4) – but that is quickly changing.

Comments

  • Nintendo Maniac 64 - Wednesday, December 9, 2015 - link

    I wouldn't be surprised if you could do 4k HDR at 90 or 100Hz.
  • Frenetic Pony - Tuesday, December 8, 2015 - link

    RE HDR rendering: HDR rendering for games is, in some ways, really easy. Any halfway decent high-end game today already tonemaps from 16-bit-per-channel HDR, though some hacks like 10-bit render targets for less important steps may have to go (tests will need to be done).

    No, the hard part will be content creation. Textures will need to be authored in a higher colorspace, meaning capture equipment like cameras etc. will need to support it, then monitors will need to support it, then art tools will need to support it, then the game engine will need to support it. Considering how long high-end games take to make, it's going to take quite a while before it shows up, unfortunately.
  • Mr Perfect - Wednesday, December 9, 2015 - link

    Actually, I think the graphics professionals are already ahead of us here. If you look at high-end displays meant for graphics work (think $2K IPS screens from NEC), they're all 10-bit with internal color processing at 14 bits. It's just us plebeians who are stuck with 8-bit panels.
  • HollyDOL - Wednesday, December 9, 2015 - link

    Got an Eizo CX271 myself and can confirm more bits really make a difference, especially when my wife works with raw photos. Alas, for movies/games so far there is no difference I could see (especially for movies with their 16-235 range). The few apps that more-or-less support rendering in HDR (like 10 bits per channel) seem to have an impact especially with darker tones, which look more like you would expect compared to 8 bits per channel. I don't expect 10+ bits per channel to be a mature standard in games or general desktop apps for at least 2 more years... Except for professional software built directly for it, the implementations look more like school/PoC/experiment/learning projects, and there is so far no real consistency between the results. I'd expect it to become a generally supported standard in about 5 years, give or take.
  • Oxford Guy - Wednesday, December 9, 2015 - link

    1000+ nit brightness? Get ready for flashing ads to sear your eyeballs. Commercials are going to be even more irritating.
  • Oxford Guy - Wednesday, December 9, 2015 - link

    "Meanwhile RTG is also working on the software side of matters as well in conjunction with Microsoft. At this time it’s possible for RTG to render to HDR, but only in an exclusive fullscreen context, bypassing the OS’s color management. Windows itself isn’t capable of HDR rendering, and this is something that Microsoft and its partners are coming together to solve."

    What about OS X?
  • Razdroid - Wednesday, December 9, 2015 - link

    So, 'balkanization' is actually a word in US/UK!?
  • Oxford Guy - Wednesday, December 9, 2015 - link

    yes, if you capitalize it
  • Midwayman - Wednesday, December 9, 2015 - link

    This might finally be the year I go 4K. Been waiting for 120Hz monitors, and it looks like the pieces are falling into place.
  • StevoLincolnite - Wednesday, December 9, 2015 - link

    All I ask of you, AMD, is to release something amazing on 14nm at an affordable price. (So I can get two.)

    Also update your legacy drivers so that old cards (that are more than capable enough for their intended needs) have Windows 10 support. (Like the Radeon 2000, 3000, and 4000 series.)
