The Game Of Thrones ‘Too Dark’ Kerfuffle

May 28, 2019

We take a look at Game Of Thrones Season 8, Episode 3 and ask, as creative professionals, was it too dark and where did it go wrong?


Too Dark? Get A Better TV! GoT Blows Up The Internet

Game Of Thrones, by any measure, is a cultural phenomenon.

For the past 8 years, the show (which, let’s be honest, probably appealed to the Dungeons & Dragons set before it appealed to Hollywood Instagram influencers) grew in popularity until, by the very end of its run, it seemed like literally everyone you knew was watching the series.

Well, everyone except for me.

 

Sex, violence, and dragons!  What’s not to love about Game Of Thrones?  The show became a somewhat surprising phenomenon – one I only recently watched, due mainly to the controversy that one of the final-season episodes caused.

 

That’s right! I hadn’t watched Game Of Thrones since the first couple of episodes in season 1, and I wasn’t planning on it until I woke up a few Mondays ago to find literally every social, news, and even sports site that I read talking about Game Of Thrones Season 8, Episode 3, ‘The Long Night’.

With headlines like ‘What happened in GOT? Dunno? Neither do we!’

As I dug a little deeper, it became apparent that the Internet was exploding because nearly everyone thought ‘The Long Night’ was way too dark, and that it was nearly impossible to follow the action or actually tell what was going on.  Longtime watchers of the series, in particular, appeared to be on the verge of revolution and ready to storm HBO’s headquarters.

 

Zing! Mocking memes and complaints were widespread after the world watched 'The Long Night'

In the following days, it seemed that everyone who ever viewed anything on a TV wanted to chime in.  Consumer Reports even put out emergency triage advice to tune your TV so you could see the action in the episode!

My curiosity piqued, I decided to sit down and watch the episode. What followed was nearly a dozen viewings of the episode on nearly every device I could view it on, and ultimately the past 3-4 weeks spent watching the entire series.

I’ve been thinking a lot about that ‘is it too dark’ episode – how it impacts what we do as colorists & creatives, but also how it speaks to larger industry issues facing our craft. Pat, Dan and I have talked about things a bit, and I thought in this Insight I’d share some opinions on all of it.

Transmission/Encoding Matters

In my opinion, and in my viewing of the GOT episode, perhaps the biggest contributing factor to viewers’ complaints about not being able to see things – and about the episode being ‘blurry’, ‘noisy’, etc. – was HBO’s compression on its streaming platforms. Note: I did not see a traditional ‘cable’ version of the episode.

I polled about a dozen friends and colleagues who watched the GOT episode via HBO Now and HBO Go right at 9pm, when the show goes ‘live’ on traditional HBO.  All of them said that visible macroblocking and other motion artifacts made things tough to view, and made the scenes with dragons in the air and snow falling almost unwatchable. And before you blame bad Wi-Fi and slow connectivity, 7 of the 12 people polled were watching via wired connections to their device or TV, and all of them had connectivity of 100 Mbps or greater.

I didn’t see the episode ‘live’ but this confirms experiences I have had with HBO and their streaming services when attempting to watch ‘live’.

I almost stopped watching Westworld because I couldn’t stand the compression problems.  As a consumer who might not understand some of the underlying technology, I can understand why seeing images with tons of macro blocking and other image artifacts would be frustrating – especially when you’re paying $15 a month!  But more to the point, I can understand how the average consumer confuses poor compression with poor artistic intent.

 

Obviously, a gross exaggeration of macroblocking/compression artifacts, but in my opinion, poor transmission – at least in the ‘live’ version – played a huge role in viewers not being able to ‘see’ image detail in Game Of Thrones Season 8, Episode 3.

 

I’m not a compression expert, but if HBO is streaming at a subpar bitrate to control/manage bandwidth for the millions of viewers who tune in the moment the stream becomes available, I can sort of understand that – but it doesn’t make things any less frustrating for the end viewer.  Furthermore, there is anecdotal evidence that post-premiere viewers are getting considerably better quality streams, even just hours later.

In my own viewing of the GOT episode in question, I watched on various devices via the HBO channel subscription on Amazon.  I saw some macroblocking and other compression artifacts, but nothing crazy.  When I showed a couple of diehard GOT fans what I was seeing, the immediate response was ‘that’s not what it looked like when I watched the night of’.

To be honest, I have no idea if HBO is adjusting stream quality between the initial release and the post-‘live’ window, and they probably wouldn’t say even if they did. But the point is that bad transmission can play a major part in how the average viewer perceives content. Often, this results in accusations that bad creative choices were made – which can happen – but I think for many viewers of this episode, the stream was a bigger issue than the creative choice to keep the episode naturally lit and dim.

It seems like H.265 has been coming for a long time, and I wonder if some of the reasons for offering low-bitrate streams could be addressed by H.265’s much higher efficiency/image quality at the same bitrate.
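To put some rough numbers on that, here’s a quick back-of-envelope sketch in Python. The per-stream bitrate, the concurrent viewer count, and the ~40% bitrate savings often cited for HEVC at similar quality are all illustrative assumptions on my part – not HBO’s actual figures.

```python
# Back-of-envelope: aggregate delivery bandwidth for a big 'live' premiere.
# Every number below is an illustrative assumption, not HBO's actual data.

avc_bitrate_mbps = 5.0           # assumed H.264 bitrate per stream
concurrent_viewers = 10_000_000  # assumed simultaneous streams at 9pm
hevc_savings = 0.40              # ballpark HEVC efficiency gain at similar quality

avc_total_gbps = avc_bitrate_mbps * concurrent_viewers / 1000
hevc_total_gbps = avc_total_gbps * (1 - hevc_savings)

print(f"H.264: {avc_total_gbps:,.0f} Gbps aggregate")
print(f"H.265 at similar quality: {hevc_total_gbps:,.0f} Gbps aggregate")
```

Even with generous assumptions, the aggregate numbers are enormous – which is probably why bitrates get squeezed on premiere night in the first place.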

Standards + Calibration In The Living Room

Looking through various forums after ‘The Long Night’ aired, professionals and home theater enthusiasts were screaming that there was nothing wrong with the episode – it was viewers and their crappy TVs with crappy calibration!

They are not wrong!

As colorists, we spend so much time obsessing about monitor calibration, viewing environment and making sure signal flow is perfect. As an industry, we’ve come to an agreement about color spaces, EOTFs and even viewing conditions.  Home theater enthusiasts care about these things too, but way too often the average consumer has no idea about proper TV settings or calibration.

The discussion around Game Of Thrones is not the first time in recent memory that consumer TVs have been criticized.  Prior to season 2 of Stranger Things, showrunners Matt and Ross Duffer shouted from the rooftops for consumers to turn off any sort of motion smoothing that their set employs (I agree, you should).  The pair went on to say ‘We and everyone in Hollywood puts so much time and effort and money into getting things to look just right, and when you see it in someone’s home, it looks like it was shot on an iPhone.’

It’s always been baffling that there is such a disconnect between content creation and home viewing. Motion smoothing, excessive noise reduction, pull-down cadence problems, and overall calibration are just some of the problems with consumer sets.  In addition, a plethora of inaccurate picture modes, hard-to-understand controls, and poor UI design make it almost impossible for the average consumer to view content the way it was intended.

You could forgive TV makers for some of these ‘features’ – things like MPEG noise reduction seem like a good idea – but generally, all the helpful stuff on TVs is so poorly implemented that it does more harm than good.

As a colorist, the hardest thing for me to stomach is poor calibration.  High-end TVs have become much better in this regard, but it’s amazing how even very good TVs are wildly inaccurate!  For example, after choosing the most accurate mode with the proper white point and gamma, and turning off all the ‘auto’ stuff, consumer sets can still have average deltaE values of 10 to 15. Compared to the professional standard of under 2, that’s a very noticeable difference in image reproduction!
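If you’ve never run the numbers yourself, here’s a minimal Python sketch of what that comparison looks like, using the simple CIE76 formula and made-up Lab values purely for illustration (calibration software typically uses more sophisticated metrics like dE2000, measured with a probe):

```python
import math

def delta_e_cie76(lab_ref, lab_measured):
    """CIE76 delta-E: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_measured)))

# Hypothetical values: the Lab coordinate a gray patch *should* hit vs.
# what a consumer TV in its most 'accurate' mode might actually measure.
reference = (50.0, 0.0, 0.0)   # neutral mid-gray target
measured  = (53.0, 4.0, -9.0)  # slightly bright with a blue push

print(f"deltaE76 = {delta_e_cie76(reference, measured):.1f}")  # ~10.3
```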

 

This guy is clearly upset about consumer TV calibration too!  I doubt he even has a bias light!  Consumer TVs continue to get better, but poor calibration, motion issues, and confusing CMS controls all contribute to consumers being frustrated with how content looks and content creators frustrated with TV makers!

 

Much progress has been made in consumer TV calibration.  For example, the work that SpectraCal (Portrait Displays) has done with its ‘AutoCal’ integration with TV manufacturers like Samsung and LG has made it simple for consumers to get the best possible image out of their TVs. But not many consumers are going to spend money on a good meter.

Industry giants like Dolby & Netflix are working on solutions to make this situation better and more seamless for consumers.  Dolby, for example, already approves Dolby Vision-capable TVs, so it’s not a long way off before choosing a Dolby mode could adjust all of a TV’s settings (or possibly, at some point in the future, auto-calibrate the TV).

So why do TV makers continue to make TVs that are inaccurate and have a bunch of superfluous processing? Simple – cost & production time.

Making accurate panels is hard and consumers always want more for less.  Those two things are at odds – you’re not going to get a reference quality panel for $500.00.  Furthermore, outside of panel-level calibration and default profiles, TV makers can’t spend (potentially) a couple of hours calibrating individual sets.  Another ironic thing is that an accurate TV is not what a lot of consumers want!

A 100-nit SDR image in a very bright living room with lots of ambient light is hard to see.  Pump that TV’s light output to 300-400 nits and things are much more viewable, even if black levels suffer!
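Here’s a small sketch of why that’s true: room light reflecting off the panel adds to both the blacks and the whites, so the contrast you actually see depends heavily on peak brightness. The luminance and reflection figures below are illustrative guesses, not measurements of any particular TV or room.

```python
def effective_contrast(peak_nits, black_nits, reflected_nits):
    """Contrast seen in the room: reflected room light adds to both
    the brightest and the darkest parts of the image."""
    return (peak_nits + reflected_nits) / (black_nits + reflected_nits)

reflected = 2.0  # assumed room light bouncing off the screen (nits)

# 100-nit SDR setup vs. the same LCD with the backlight cranked
# (raising the backlight raises the panel's native black level too).
print(f"100-nit setup: {effective_contrast(100, 0.05, reflected):.0f}:1")  # ~50:1
print(f"350-nit setup: {effective_contrast(350, 0.17, reflected):.0f}:1")  # ~162:1
```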

I can understand consumers lashing out at content creators, but it’s not their fault (most of the time). I can also understand content creators complaining about consumer TVs’ capabilities.  But if content creators think that viewers are going to start calibrating their TVs en masse and setting up their viewing environments to match reference conditions, they will suffocate while holding their breath.

Change for more accuracy in the living room is going to come from content creators, studios and production companies who need to put pressure on consumer electronics companies.

TV Tech Matters

OLED and zone-backlit LCD/LED TVs represent the majority of TVs in use today (though I’m guessing someone out there watched GoT on a CRT!).

When I first started looking into the GOT dark controversy it seemed like the people complaining the loudest were videophiles using OLED displays – like those from LG, Sony and Panasonic.  This reading caused me to have a flashback to a recent project where my client was convinced that their movie (and thus my grade) looked terrible!

In the project I worked on, there were a lot of really, really slow fades from black and a lot of dark scenes where light was shaped quite a bit – meaning there was a lot of gradation in the shadows.  While the grade looked amazing on my FSI XM310K, on the LG OLED I was using as a client monitor there were a ton of banding problems and what looked like large ‘blobs’ on screen.

The issue was not in the grade, but rather in how the OLED panel comes out of black.  Small changes in brightness levels (code values) are not handled very well by these OLED panels (WRGB being the technology used in large-format OLEDs).  The result is chunky-looking fades and banding in areas of subtle dark gradation.
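To see why the shadow region is such a minefield, here’s a simple Python sketch of how much luminance a single code-value step represents near black, assuming a plain gamma 2.4 EOTF on a 100-nit SDR display (BT.1886’s black lift and any panel processing are ignored for simplicity). It’s not the whole story of WRGB near-black handling, but it shows how little room for error those subtle gradations leave:

```python
def gamma24_nits(code, bit_depth=10, peak_nits=100.0):
    """Luminance of a video code value, assuming a simple gamma 2.4 EOTF."""
    max_code = 2 ** bit_depth - 1
    return peak_nits * (code / max_code) ** 2.4

# Relative size of one code-value step near black vs. in the midtones.
for code in (4, 8, 16, 512):
    step = gamma24_nits(code + 1) - gamma24_nits(code)
    print(f"code {code:>3}: {gamma24_nits(code):.5f} nits, "
          f"next step is +{step / gamma24_nits(code):.1%}")
```

In the midtones a one-code step is a fraction of a percent brighter; just above black it’s a 15-70% jump in luminance – and that’s exactly the range ‘The Long Night’ lives in.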

 

Different TV technologies can impact (as you know) how content is seen. I noticed different issues with ‘The Long Night’ on different display technologies – especially with how WRGB OLED panels displayed the subtle shadow gradations that were pervasive in the episode.

 

In watching GOT – not just Season 8, Episode 3, but during my entire epic binge-watch of the series – this issue was pervasive on my LG C8 OLED in my home theater.

It’s not just Game Of Thrones. I’ve experienced similar issues on countless movies and shows – again, the problem always manifests itself coming slowly out of black or in scenes with lots of dark gradation.  The compression issue definitely exacerbated this behavior, in my opinion, but the consistent, subtle dark gradations of ‘The Long Night’ brought this ‘feature’ of the WRGB OLED panel out in spades.

My viewing on LCD/LED TVs – as well as on computer monitors and my iPad – didn’t show the banding issues I was seeing on OLED. Not at all. But it did illustrate two other issues.

The first is that on many LCD/LED panels there is a point where shadows just fall off a cliff.  Why? Because of the relatively high black levels of LCD/LED, shadow detail that was evident on an OLED was lost to muddiness on the LCD/LED.  This was most obvious to me watching on a Dell computer monitor in my home suite.

The subtlety of the grade also ran into problems on a TV I watched with a low number of backlight zones (a TCL 4 series).  Watching on this set was actually pretty comical, because the zoned backlight system just didn’t know how to handle things at all!  The dark scenes were consistently muddy, and you could see the backlight system trying to figure out all the shadows while also handling bright flashes from torches or a burst of dragon fire.
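For the curious, here’s a toy Python model of why a low zone count struggles with this material. It uses a deliberately naive algorithm (each zone’s backlight is driven by the brightest pixel in that zone), which is not how any real TV’s dimming algorithm works, and all the numbers are made up – but it shows how a single torch flash can drag the black floor of its entire zone up relative to its neighbors:

```python
import numpy as np

# Toy frame: a near-black battle scene (0.01-nit target) with one small
# 120-nit torch flash in the upper-left. Values are purely illustrative.
frame = np.full((90, 160), 0.01)
frame[10:14, 20:24] = 120.0

zones_v, zones_h = 3, 5   # a very low zone count, like a budget LCD
leakage = 1 / 3000        # assumed LCD leakage: black = zone backlight / 3000

for i, rows in enumerate(np.array_split(np.arange(frame.shape[0]), zones_v)):
    for j, cols in enumerate(np.array_split(np.arange(frame.shape[1]), zones_h)):
        zone = frame[np.ix_(rows, cols)]
        backlight = zone.max()             # naive: drive zone to its brightest pixel
        black_floor = backlight * leakage  # what 'black' becomes in that zone
        print(f"zone ({i},{j}): backlight {backlight:7.2f} nits, "
              f"black floor {black_floor:.5f} nits")
```

In this toy model, the zone containing the torch can’t render anything darker than roughly 0.04 nits – four times the intended black level – while the zones around it stay essentially black. That difference, moving around the frame as torches and dragon fire do, reads as the blooming and shifting blockiness you see on a low-zone-count set.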

But as you probably already know, technology issues like these are nothing new, and there is no such thing as a perfect monitor technology. One can argue it’s a content creator’s responsibility to QC content on a wide variety of displays (discussed next). But making changes based on the wildly varied results of consumer TVs vs. a proper reference is very dangerous.

Furthermore, until there is one monitor technology to rule them all, how different displays show content will continue to be a concern.

Making A Case For A Crappy TV In The Suite

I think everyone can agree that reference environments and standards are vital to evaluating the ‘truth’ about what our content looks like. Furthermore, I think professionals can agree that the closer consumer sets can get to reference the better it is for everyone. With better quality, higher bit rate streaming/transmission both content creators and viewers will be happier.

But there will never be technical perfection. As we near perfection, any gains made become smaller and smaller. In other words, we’re a long way from the majority of viewers having the quality of a grade 1 reference monitor + reference environment at a price they’re willing to pay.

As a professional, I don’t worry about the high-end.  A videophile who’s invested tens of thousands of dollars in a home setup is way more likely to invest in the best gear and also understand the importance of calibration and viewing environment.

It’s the ‘vivid’ mode, mount-the-TV-6-feet-high-on-the-wall-with-no-cable-management consumers that I worry about – and there are a lot of those folks around the world!

For now, even though things are getting better, the world’s living rooms are largely dominated by bad technology, horrible calibration, and challenging viewing environments.  While I would NEVER suggest making decisions based solely on the worst-case scenario, I do see the benefit of viewing/listening to content on a worst-case-scenario TV.

 

While I probably wouldn’t recommend this particular TV, I do see the value of an off-the-shelf, low-grade consumer TV in the suite – not as something to base critical decisions on, but as a data point to see how well your grade translates to less capable displays.

 

For decades, recording engineers have listened to their mixes in a car or used the Avantones in their suite to mimic consumer situations.  I think the same approach can be used in the color suite.

For a long time, I’ve had a 32-inch Toshiba HDTV that I picked up at Target for $99 when it was new. I’ve never changed a setting on this display other than setting it to ‘movie’ mode.  This display only goes on when everyone is happy with the grade, and I simply use what I see on it as a data point – I DO NOT make grade changes based solely on this display, but rather use it as a gut check, especially for very contrasty images and/or very colorful/saturated projects.

This cheap TV has helped me develop better grades that translate more consistently across different consumer TVs (and devices) of varying capabilities.

Like the GOT episode, many projects I’ve worked on have narratives that call for very dark scenes. On those projects, I make my initial decisions on my reference monitor, and after we’re done grading, the crappy TV often gives me a gut check on how far I’ve pushed things.

I’m not saying that the finishing team for GOT should have made different overall creative decisions based on a low-performance consumer monitor, but they may have backed off the intensity of the very dark grade a click or two if they had auditioned things on a lower quality consumer display. Maybe…

Making A Case For Basic QC In The Living Room

While I fully support videophile home theater OCD (you know who you are!), the simple fact is that the majority of content is not consumed in a reference environment – or anything close to it.  From ambient lighting to horrible viewing angles (above the fireplace!), the average consumer viewing environment is far from ideal.

Similar to using a consumer TV as a gut check, I’m a big fan of mimicking the average or even worst-case viewing environment for QC purposes – again as a data point, not as the sole means for making decisions.

 

Viewing conditions in a typical living room can be challenging – lots of ambient light, bad viewing angles, not to mention poor TV setup, can all play a role in how content is perceived. I try to QC content in a similar environment – not with the goal of making changes, but to make sure my grade translates.

 

Here are a couple of ways I do that:

  • Ambient Lighting – I’ll often pipe a project to the consumer TV that we have in our conference room (a Vizio, in this case).  This room has large windows and gets a lot of light, which lets me see how well the grade translates to the amount of ambient light a lot of typical consumer setups will have.  Don’t have a room with ambient lighting?  Turn the overheads on in your room – you can do this even if you’re watching on a reference monitor.
  • Backlight/Brightness adjustments – Many consumers set up their TVs in great rooms or living rooms with a ton of ambient lighting, and those sets are not configured for reference peak luminance. Even if they’re not in ‘vivid’ mode, it’s not uncommon to see consumer TVs set to 200, 300, even 400 nits.  Then, those consumers often drop the brightness level (blacks) to get more contrast – without realizing they’re losing black detail. To mimic this, I have a preset on the conference room TV that applies these settings.  I also have a preset that raises the brightness control, which is typically the consumer’s first line of defense if something is too dark. Never mind that this isn’t the control you actually want for more light output – the control is named Brightness, so that’s what they adjust!
  • Mixed Lighting – In a reference environment, a lot of work goes into having proper lighting (bias lighting, D65 ambient lighting, etc).  But try searching a home theater forum and you’ll see people putting neon pink lighting around their TVs; that’s just one of the dangers of mixed lighting in a typical viewing environment.  To simulate this all you really need is a couple of tungsten table lamps in your suite.

If you can’t mimic these things in your color correction suite, an easy solution is to render out an H.264 file, put it on a USB flash drive, and take it home – plug it into your own TV!
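If it helps, here’s the kind of quick-and-dirty render I mean, sketched as a small Python wrapper around ffmpeg. The filenames and the CRF/bitrate choices are just placeholders – any reasonably high-quality H.264 export from your grading or editing software does the same job.

```python
import subprocess

# Hypothetical filenames – swap in your own graded master and output path.
source = "graded_master.mov"
output = "living_room_check.mp4"

# A straightforward H.264 encode that most TVs and USB media players accept:
# high-quality CRF, 8-bit 4:2:0, stereo AAC audio.
subprocess.run([
    "ffmpeg", "-i", source,
    "-c:v", "libx264", "-crf", "18", "-preset", "slow",
    "-pix_fmt", "yuv420p",
    "-c:a", "aac", "-b:a", "192k",
    output,
], check=True)
```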

Final Thoughts

So was ‘The Long Night’ too dark?

In the color bay, on their calibrated display and in a reference environment, I have no doubt that the episode looked amazing! And to be honest, who are we (as viewers) to tell the show’s creatives that they are wrong – by definition, it’s their prerogative. However, it’s important to understand that no home experience can replicate the viewing experience in a color suite. And when you combine poor stream/transmission quality, inferior TV technology, lack of calibration, and improper viewing environments, these factors undoubtedly all contributed to the near riots that the episode caused!

So, again, I ask was ‘The Long Night’ too dark?

No, I don’t think so.  Even if you subscribe to some of the concepts I’ve outlined here, like checking a grade on a consumer set and mimicking a consumer viewing environment, I think any adjustments the finishing team could have made would either have been too severe – or, more likely, wouldn’t have fixed anything!  As we’ve discussed on Mixing Light before, making substantial changes based on the worst-case viewing scenario is dangerous and will likely cause other problems.

My hope is that this kerfuffle helps push changes in all areas of the home viewing experience, and that consumers start to understand where the problems really do exist – helping content creators avoid these flame-ups in the future.

If you’d like to add something to the conversation or have any questions please use the comments below!

-Robbie
