Day 11: 25 Insights in 25 Days Holiday Marathon
Getting Set Up For HDR Grading Is Surprisingly Easy
Earlier in this year’s holiday marathon, I interviewed Bram Desmet & Juan Salvo about HDR.
I got a lot out of both of those chats, including some clarification on HDR standards and aesthetic considerations for HDR grading.
But the way I learn is by trying things for myself – making mistakes and trying different approaches.
As I mentioned in that previous Insight, I’m just starting my exploration of HDR. This week and next, I have a top-of-the-line Sony BVM-X300 to test HDR workflows in Resolve & Premiere Pro – the two principal pieces of software I use every day.
While I’m still figuring out what HDR means for the way I approach grading, I thought I’d share some of the technical/workflow details of getting set up to grade HDR in Resolve.
Later in the Holiday Marathon, I’ll explore the same subject in Premiere Pro, where things are a bit more… interesting.
RCM & ST. 2084
The DaVinci Resolve development team – in large part due to Resolve being used by some of the top grading houses in the world – has been ahead of the curve (pardon the pun) with HDR.
Resolve supports HDR grading (rather easily) via SMPTE ST. 2084 and the PQ EOTF developed by Dolby.
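Since the PQ EOTF is just math, it’s easy to peek at what these transforms are doing under the hood. Here’s a minimal Python sketch of the ST. 2084 curve – the constants come straight from the SMPTE spec, while the function names are my own:

```python
# PQ (ST. 2084) constants, taken directly from the SMPTE spec
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (0-10,000 nits) -> signal (0-1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(signal: float) -> float:
    """EOTF: signal (0-1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
```

The key difference from a traditional gamma curve: PQ maps signal to absolute light output. A 1000-nit monitor simply clips (or rolls off) whatever the curve places above its peak.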
The basic HDR grading workflow in Resolve 12 uses Resolve Color Management (RCM). With RCM, the proper mathematical transforms for the input, timeline, and output color spaces are applied for you.
I’ll show this setup in the movie below, but the nice thing about using RCM is that you can simply choose an ST. 2084 option based on your monitor’s capabilities – 500 nits, 1000 nits, etc.
While I so far prefer the color managed approach that RCM provides, it should be pointed out that ST. 2084 transforms are also available as LUTs in Resolve that can be applied at the node level or timeline level just like any other LUT.
As of yet, I haven’t done a detailed analysis of using RCM vs. an ST. 2084 LUT, but it’s nice to know there’s more than one way to handle an HDR workflow in Resolve.
One important note: right now in Resolve 12.1 there seems to be a small issue where some of the ST. 2084 transforms don’t quite hit peak level and show slight banding. I’ve only noticed this in my testing once or twice, and I’ve been assured by the Resolve development team that these small issues will be addressed in a future update. UPDATE: This issue has been fixed in subsequent versions of Resolve.
Bits, Monitoring, and Rendering
Bit depth is very important for HDR.
It’s generally accepted that 10-bit footage is the minimum for HDR work. In my testing so far, I’ve been working with R3D and ARRIRAW files. I’ve also (for reasons I’ll explain in a later Insight) transcoded these files into 12-bit JPEG 2000 (J2K) MXFs using a PQ curve, as well as OpenEXR sequences.
My point is that so far I’ve limited my testing to this high-quality footage. While H.264s from your iPhone will probably never be ideal, I’m sure someone will figure out how to integrate low-bit-depth footage into an HDR workflow – I’m just not there yet.
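One way to see why 10-bit is treated as the floor: work out how big a luminance jump a single code value represents on the PQ curve. This is my own back-of-the-envelope Python sketch (constants from the ST. 2084 spec), not anything built into Resolve:

```python
# ST. 2084 (PQ) constants from the SMPTE spec
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_decode(signal: float) -> float:
    """ST. 2084 EOTF: signal (0-1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def step_in_nits(code: int, bit_depth: int) -> float:
    """Luminance difference between two adjacent code values at a given bit depth."""
    levels = 2 ** bit_depth - 1
    return pq_decode((code + 1) / levels) - pq_decode(code / levels)

# Around 100 nits (roughly 51% PQ signal):
print(step_in_nits(130, 8))    # ~4 nits per 8-bit code -> visible banding
print(step_in_nits(520, 10))   # ~1 nit per 10-bit code -> much safer
```

In other words, at a comfortable midtone level an 8-bit PQ signal jumps roughly four times as far in light output per code value as a 10-bit signal does – which is where banding comes from.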
Also, it’s important to note that an HDR capable monitor is a requirement for proper HDR grading – not an option.
There is currently no way to view HDR video content on a standard computer monitor or in a window within an application, so the viewer in Resolve becomes essentially useless.
Finally, codec choice matters for rendering – both for final delivery and, if you need it, for render caching. Choosing a 12-bit+ option like ProRes 4444 XQ, or 12/16-bit DPX for final renders, is pretty much a necessity.
For these super high-quality files (especially at UHD+ resolutions) you’ll need a ton of very fast, high capacity storage.
HLG, Scopes & Metadata

In my discussions with Bram & Juan, we also talked about the proposed HLG (Hybrid Log-Gamma) standard.
Because HLG is quite a bit behind ST. 2084 and Dolby Vision in terms of practical development, right now there is no support for HLG in Resolve – or anywhere else that I know of. (UPDATE: Resolve 12.2 and later now support HLG.) I’m sure as things progress, HLG will be integrated into Resolve much like ST. 2084.
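For a sense of how different HLG is from PQ: instead of mapping signal to absolute luminance, HLG defines a relative, scene-referred OETF designed for backward compatibility with SDR displays. Here’s a sketch of that curve in Python – constants are from ARIB STD-B67, where HLG was first standardized; the function name is my own:

```python
import math

# HLG OETF constants (ARIB STD-B67)
A = 0.17883277
B = 0.28466892   # = 1 - 4*A
C = 0.55991073   # = 0.5 - A * ln(4*A)

def hlg_oetf(e: float) -> float:
    """Scene-linear light (0-1) -> HLG signal (0-1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root (camera-style) in the shadows
    return A * math.log(12 * e - B) + C  # logarithmic in the highlights
```

Notice there are no nit values anywhere in the formula – the curve is purely relative, which is a big part of why HLG targets broadcast rather than the mastering-display workflow PQ assumes.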
One area for improvement – in Resolve and elsewhere – is video scopes, specifically the Luma Waveform.
Since HDR is so tied to light output, I would love to see an HDR mode integrated into Resolve’s Luma Waveform, with a scale showing nit levels on the 0–10,000 nit range defined by ST. 2084.
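As a rough illustration of what such a scale would look like, here’s where some common luminance levels land as waveform percentages under PQ. This is my own Python sketch using the ST. 2084 constants, not an existing scope feature:

```python
# ST. 2084 (PQ) constants from the SMPTE spec
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_percent(nits: float) -> float:
    """ST. 2084 inverse EOTF, expressed as a waveform percentage."""
    y = (nits / 10000.0) ** M1
    return 100.0 * ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 500, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> {pq_percent(nits):5.1f}%")
```

Note how compressed the top of the curve is – everything from 1,000 to 10,000 nits lives in roughly the top quarter of the scale – which is exactly why a plain 0–100% waveform tells you so little about actual light output.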
The Vectorscope (again, in Resolve and elsewhere) will probably need updating too if Rec. 2020 is indeed to be the colorimetry standard for HDR.
Finally, one area I’m very sketchy on right now is how to assign the proper Dolby Vision/ST. 2084 sidecar metadata (SMPTE ST. 2086). Dolby has a proprietary hardware unit, but there doesn’t seem to be widespread access to it. I need to do more research to see if there are 3rd-party tools for creating & attaching the proper metadata.
As I’ve said, all of this is new to me! But… I haven’t had this much fun testing and learning something new in a long time.
Questions or other thoughts? Please use the comments below.
UPDATE: As Steve points out below in the comments, I’m guilty of a technical mistake that’s made quite often when it comes to HDR. While ST. 2084 and Dolby Vision share the same PQ EOTF curve (developed by Dolby and later adopted by SMPTE), they are not the same thing. ST. 2084 defines the PQ EOTF, which anyone can use; Dolby Vision is a proprietary system from Dolby that relies on Dolby Vision metadata, and authoring that metadata requires a hardware unit from Dolby.