Both TV shows and films are embracing the
darkness like never before, and there have been several notable cases in recent years
that have had people wondering why. Why are these movies and shows so dark to the point
where I can’t see them? It’s a fair question, and by now, there has been plenty of feedback from
filmmakers, cinematographers, industry insiders, fans, and armchair critics over what exactly
is going on here. I wanted to take the time to break down just how dark some of these shows
are, how they compare to older material that had similar cinematography, and what might be driving
these creative choices, especially in TV shows. There’s a ton to cover here, from the reasons they display poorly on most people’s screens to the lackluster compression used on
streaming and broadcast. Later on, I’ll show some basic ways that TV shows can achieve
these dark color grades, and what those grades might have looked like on shows of the past. In
other words, why it doesn’t have to be this way. For those unaware, movies and TV shows are getting
darker. This has been a trend for years now, but probably the most hysterical reaction came from Game of Thrones’ Season 8 episode - The Long Night - back in 2019. As the episode aired, many people were lost in the dark, searching for details obscured by bad displays
and even worse video compression. Game of Thrones in general was a pretty dark show when
it came to lighting, but it was never this dark. The fallout of the episode was so intense that its cinematographer, Fabian Wagner, came out and defended his work. ‘Wagner says that the majority of the darkness
in the episode is thanks to the night-time shoot, with the rest of the atmosphere produced on-set
through lighting choices. “Another look would have been wrong,” he says. “Everything
we wanted people to see is there.”’ The lighting choices were deliberate, and I
have to imagine he sat in on the color grading session as well. It goes without saying that
these are pros working at the highest level, and the budgets for adequate lighting are
certainly there. These aren’t independent productions with limited budgets that have
to make the most of a small lighting package. Their decisions are well-informed and artistically intentional. Still, the producers probably took this
to heart going forward, right? Right? A few years later, House of the Dragon featured a
similarly dark episode, though not for exactly the same reasons. The issue here was largely due to
a bad day-for-night color grade. It’s impossible to know for sure what production choices or
realities went into this look. It could have been decided fairly late to convert these shots,
or the budget simply wasn’t there to shoot this at night. I don’t want to come down too harshly
on the work of these professionals, but I don’t think the result convinced anyone. On top of that, even a lot of the interior shots are graded to painfully dark levels, so I think it’s still fair to question some of the decisions made here. But why? Why are these shows so
dark? It’s happening in movies too, though the effect isn’t quite as bad because movie
theaters usually have controlled ambient lighting, not to mention the films are shown in pristine quality. Theater projection is also more consistently calibrated than home displays, though even in theaters,
you can find projectors that are too dim. Anyway, the broadest explanation for the
darkness of modern images is that there have been several huge technological changes
in filmmaking in the last 15 years or so, most notably, the wide availability of
extremely light-sensitive digital cameras. It cannot be overstated how digital capture has
changed the industry. It’s an entire conversation unto itself, but the short version is that
cinematography’s embrace of digital took a few years to really take off. The overall aesthetic wasn’t there, dynamic range was questionable compared to film, color science was still being worked out, and the resolutions were, relatively speaking, quite low. Some would probably argue that
until the debut of the ARRI Alexa in 2010, digital was hit or miss (examples: the Star Wars prequels, District 9, The Social Network). One strength digital has long had, though, is a far
higher sensitivity to light relative to film. Notably, Michael Mann’s Collateral was shot
on the Viper FilmStream camera back in 2004, and the interior driving shots were minimally lit, relying on the way the camera exposed shadows. This is something that simply
wouldn’t have been possible with film. Fast forward to 2010 and you have The Social
Network shot on the Red One. Around this time, you really start to see broader, more
diffused light sources that played to the strengths - and weaknesses -
of digital cameras at the time. But it’s not just that digital cameras are
more sensitive in lower light. At this time, digital cameras still didn’t quite reach the
dynamic range of film, and even more critically, highlight rolloff, or the way that the brightest
parts of the image fade into pure white, was often harsh and to be avoided at all costs.
This is why softer, more diffused lights were preferred with digital, and today, combined
with a desire for extremely naturalistic, almost exclusively motivated lighting,
softer light sources are everywhere. Digital cameras nowadays have, on the high
end, exceeded film’s dynamic range, but the behavior of their capture can still be
less forgiving to highlights than film is, and their ability to save an
underexposed image is hard to beat. This demo from 2019 by Bill Lawson shows what
happens when you push both film and digital either up or down in exposure. Now, these are
still images to be fair, but as you can see, the film version of this shot only really
survives to about 2 stops of underexposure. -3 stops and below have absolutely crushed,
milky shadows that offer no detail. Even when he goes all the way to -10 stops, the digital
image, while unacceptably dark and noisy, still has a recognizable image, while
the film image is simply a grey blob. In the opposite direction though, increasing the
exposure immediately shows how forgiving film is in the sky details. The digital image here can
handle almost 3 stops of overexposure, but even by 2 stops, things are getting a little harsh.
By the time he pushes 4 stops, the sky is almost completely white, and the highlight details of
the fabric are lost. Skipping ahead to +10 stops, and the film image has definitely lost sky detail,
but the result is more of a hazy, faded look, whereas the digital image is totally white.
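If it helps to see the math, here’s a tiny Python sketch of what ‘pushing’ exposure by N stops means - a stop is just a doubling or halving of linear scene light. The response curves are toy stand-ins I made up to mimic a hard digital clip versus a film-style toe and shoulder; they’re not measured curves from any real sensor or film stock.

```python
import numpy as np

def push_exposure(linear, stops):
    """Pushing by N stops multiplies linear scene light by 2^N."""
    return linear * (2.0 ** stops)

def digital_response(linear):
    """Toy digital sensor: linear until saturation, then a hard clip to white."""
    return np.clip(linear, 0.0, 1.0)

def film_response(linear):
    """Toy film curve: a toe that swallows deep shadows and a shoulder that
    eases into white instead of clipping (the hazy, faded look)."""
    return (1.0 - np.exp(-2.0 * linear)) ** 1.5

scene = np.array([0.01, 0.1, 0.5, 0.9])  # shadow, midtone, and highlight samples
for stops in (-3, 0, 3):
    pushed = push_exposure(scene, stops)
    print(f"{stops:+d} stops | digital: {digital_response(pushed).round(3)} "
          f"| film: {film_response(pushed).round(3)}")
```

Run it and you can see the asymmetry in miniature: pushed down, the toy digital curve keeps separating the dark values while the film toe collapses them; pushed up, the digital curve slams into 1.0 while the film shoulder keeps compressing.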
I’m skipping the details behind this because it can go on for hours, and I don’t really
have time to discuss photographic exposure science or theory, but the basic idea
is that digital’s ability to capture shadows has given cinematographers a previously
unavailable way to minimally light their sets, and to produce an image with subtle shadow
detail. Digital delivery and projection of these images - as opposed to film prints - also means
that the resulting color grades can be very dark. But just how dark? We can acknowledge that many recent
movies and TV shows look darker, but can we measure this beyond our subjective
sense of how it looks to us? Absolutely. Let’s take a look at some scenes from The
Long Night in DaVinci Resolve and bring up the waveform monitors. Without getting into it too much, waveform monitors are a fundamental tool used to gauge brightness and exposure when shooting and grading video. For those who’ve used image editors, the basic idea is similar to a histogram, but you get a better spatial view of where the brightness values lie within the image. I’ve set the viewer to show a 0-100 percentage scale instead of luminance values, and some people out there might still refer to this as IRE. I’m one of them. Darker images tend to have most of their information in the lower regions of the monitor, and brighter images tend to have it in the upper regions. A very contrasty image with lots of shadow and highlight detail will fill out the monitor from top to bottom. There’s no real rule as to where exposures must lie, but simplistically speaking, anything below 0 is lost, and anything above 100 is also lost.
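For the curious, here’s roughly what the scope is computing under the hood - a minimal, luma-only sketch in Python. Real scopes also trace the individual color channels, and Resolve’s internals will certainly differ, but the idea is the same.

```python
import numpy as np

def luma_waveform(rgb, bins=101):
    """A rough luma-only waveform: for every column of the image, build a
    histogram of luma values on a 0-100 scale (the 'IRE' reading above).
    rgb: float array of shape (height, width, 3), values in 0.0-1.0."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma weights
    ire = np.clip(luma * 100.0, 0.0, 100.0)
    waveform = np.zeros((bins, ire.shape[1]))
    for col in range(ire.shape[1]):
        counts, _ = np.histogram(ire[:, col], bins=bins, range=(0.0, 100.0))
        waveform[:, col] = counts
    return waveform[::-1]  # flip so bright values sit at the top, like a scope

# A frame that lives almost entirely under 25 IRE will light up only the
# bottom quarter of this array - which is exactly what we're about to see.
```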
In many of the battle scenes here, there is very little information above 25 IRE or so, which means, essentially, that all the
shadows and midtones have been crammed into this tiny fraction of the total available
dynamic range. If you look at shots where bright light sources like flames appear, they do peak into the upper regions a little bit, but only faintly, and the way they overexpose into pure white is, honestly, excellently controlled. I’m
going to get back to this in a bit. Looking at episode 7 of House of the
Dragon, you can see pretty much the exact same thing going on. The darkest shots have a
densely packed lower region of the waveform, with many shots dipping below
20 IRE in the entire frame. If you look closely enough at these images,
you can still determine where the light sources are coming from. Beyond the
obvious things like fire and explosions, you have giant light sources highlighting the
backsides of characters in almost every shot, and there’s clearly enough light to ambiently
fill out faces and the shadows themselves. In other words, we can see
the effect of the lighting, so the bulk of what we perceive as
darkness here is the result of the grade. Where do I begin with this? Very generally
speaking, HDTVs and now HDR TVs allow for a darker picture to be visible overall. Bigger sizes, higher resolutions, and brighter panels simply let viewers see more than before, so filmmakers work within those expectations. Older SD broadcasts could lose lots of
shadow detail in the lower resolution transfers of film images, and consumer CRTs,
while capable of offering excellent black levels, were wildly inconsistent in quality.
TV shows had to accommodate this. Now ironically, a lot of LCD screens are
absolutely terrible for shows and movies with a lot of dark scenes. Backlight bleed can
ruin some scenes as I’m sure many of you have noticed before. When The Long Night first
aired, it’s probably safe to say that the vast majority of viewers were watching on an LCD
screen. Factor in that many people were probably viewing on low to mid-tier LCD screens that didn’t
feature multiple dimming zones or VA panels, and you have one of the worst ways to view it to begin with. OLED screens can alleviate a lot of this with their ability
to display true black, but even on an OLED, House of the Dragon episode 7 was
still difficult for me to watch. Both broadcast and streaming compression are simply not up to the task of delivering this kind of content, especially in 2019. In the US
at least, most linear cable broadcasts still follow the ATSC standard, which at the time of
the Long Night’s broadcast, likely meant a 1080i broadcast. The interlaced nature of this isn’t
necessarily a problem, as most consumer TVs can handle that properly; the real problem was the
bitrate and codec. OTA ATSC using MPEG-2 - yes, the same MPEG-2 from 1996 - can reach around
18 megabits per second, but cable broadcasts are often much lower than this, potentially reaching somewhere around 12 megabits depending on your provider. This is horrendously bad for anything
at that resolution, but it’s flat-out unacceptable for something this dark. Darkness in general is really difficult for codecs - especially in 8-bit - to render correctly. Those incredibly small details and subtle gradations in tone get smoothed out into murky nothingness, or blocky messes that are impossible to decipher.
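To see why, consider how few code values 8-bit video even has for a dark scene, before the codec’s own quantization makes things worse. A quick sketch with made-up numbers:

```python
import numpy as np

# A smooth, very dark gradient, like torchlight falling off into blackness.
# In float it's a clean ramp; 8-bit video has only 256 code values total,
# and legal "video range" luma is narrower still, from 16 to 235.
gradient = np.linspace(0.00, 0.05, 1920)        # the bottom 5% of the signal
video_levels = np.round(16 + gradient * (235 - 16))

print(len(np.unique(video_levels)), "distinct code values for the whole ramp")
# Roughly a dozen steps to describe an entire smooth falloff - and the
# encoder's quantization then smears or blocks even these at low bitrates.
```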
On the streaming end of things, it likely wasn’t much better for most viewers. While most streaming at that point
was H.264-based, bitrates for streaming were still far too low to render these scenes
correctly. The resulting artifacts were different from their cable counterparts,
but probably not much better to look at. So, combine all of these factors, and the viewing environment is just the
perfect storm for ruining the image. So if viewers can’t see these images, why grade
them this dark? First off, everyone wants to blame someone for this. Yes, these are conscious choices
made in production, but they’re not without merit. The biggest creative reason is probably the
desire for a more cinematic experience in the home. The golden age of television
has ushered in more compelling, more dramatic storytelling, and with that come more cinematic, stylized approaches to shooting those images.
Darkness is a narrative and atmospheric tool. Could you imagine a modern Batman film that isn’t steeped in darkness? Or a horror film? Now, the change in styles has manifested in quite
a few ways over the years; cinematic trends come and go, but most of the push for this has come
from technology. If you look at any show in the mid-2000s that made the SD to HDTV switch, it’s
probably most apparent there. Bigger, brighter screens, combined with digital broadcasting meant
that cinematographers could start to light with more shadows in mind. Still though, most prime
time dramas of that era were shot on film, and so the lighting practices had to expose for film
first and foremost. Once digital capture became the norm though, we started to see this change
towards ambitiously darker lighting and grading. In my opinion - and I can’t stress enough that it’s an opinion - it’s unrealistic for a production to expect that most viewers will have a properly calibrated display. When The Long Night aired, there were more than a few think pieces about the importance of display calibration, or using neutral settings on your TV to see the episode as it was intended. There was also talk about watching it in a darker environment, which is not wrong, but,
good luck convincing people. Now, it's true that most consumer TV screens
are poorly calibrated, if at all. Many people leave their settings as they are out of
the box, and anyone who's visited their parents for the holidays only to find motion
smoothing enabled knows exactly what I mean. But this isn't a fair expectation. For one,
calibration goes above most people's heads, and it's unrealistic to expect them
to understand things like white point, gamma curves, black level, color spaces,
and so many other variables at play. More importantly, a good color grade intended
for home viewing should render well across multiple screens. In music production, a mix
isn't considered good until it sounds decent on a wide range of devices - especially the bad
ones like ear buds and cell phones. We generally don't ask people to calibrate their headphones
or to enable special EQ to compensate for bad sound reproduction, and we shouldn't ask the equivalent of video viewers. But let's say we could ask people to do this. It still doesn’t change the fact that
poor compression, bad viewing environments, and older display technology put a limit
on how well we can render these scenes. You might be thinking to yourself that plenty of movies, especially older ones, embrace darkness while maintaining visibility. It’s true: darkness doesn’t have to literally be dark. I tried to find examples of shows with
similar set design and cinematography to demonstrate this, and it just so happens
that I recently finished watching HBO’s Rome. Now most of it is shot in brighter
environments, but every so often you get dark scenes that render cleanly. Shooting
on film is the biggest reason for this, because it naturally enforces brighter lighting practices, and given the television production lineage it shares with Game of Thrones just a few years later, it’s hard not to notice the difference. Part of this is admittedly changing trends
as well, but we discussed that before. On the cinema side of things, one
example I’ve always found fascinating is Army of Darkness. You have dark battle
scenes at night with fire, explosives, smoke, everything. It’s a castle siege with
supernatural forces at play, but I always find this battle to look, well, easy to see.
To be fair, cinematography has obviously changed a lot since then, so let’s look at Evil Dead
Rise, which is the latest entry in the series, and of course is going to feature
lots of dark cinematography. You can see the change in trends here, with dark
shots featuring much of that same low IRE that The Long Night has. There’s nothing wrong with this,
but in a movie theater, this is more acceptable. Also, in some of the shots, even though it’s
dark, you see more light sources spreading across background details, separating the subjects out
just enough. In Game of Thrones, it’s harder to make this work in what is essentially a void
of darkness. And that was kind of the point. So trust me, this is all intentional. But
what if our intentions were different? The beauty of the way light and photographic principles work is that the ideas behind what makes a grade work or not apply to both the old and the new. We’ve already established that these dark scenes are lit, and oftentimes with far more light than you realize. What happens if we grab scenes that are roughly similar in nature and grade them to match one another? I gave
this a very quick shot in DaVinci Resolve to see if I could approximate the looks, and
believe me I’m not a pro at this. I also don’t have access to the original files to really get
this right, but I hope it gets the point across. Let’s take a look at those scopes again
for Game of Thrones. Ouch. Many of these shots barely register above 20-30 IRE and
that’s for all the information. What if we tried to make it look similar to a shot
like this from Army of Darkness? They’re fairly alike - both shot at night, both with
sharp rim lights highlighting the subjects, and smoke and fire across the frame.
They’re obviously not exactly the same, but despite how bright this shot is - look at it, the scope shows a much softer, denser spread of information, almost hitting 50 IRE - we still register it in our minds as nighttime. I’m going to mainly use my eye, but the
waveform is our friend here for getting a sense of where specular highlight detail and
shadows need to be. Despite being extremely dark, Game of Thrones still keeps its highlights pretty high. If we increase the gain and bring the gamma and lift down a bit to get the contrast back, not only do we come closer to Army of Darkness’ look, but we can see that there’s actually quite a lot of detail hidden in Game of Thrones’ grade.
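If you’re wondering what those controls actually do to the signal, here’s one common formulation of lift, gamma, and gain in Python. Resolve’s exact math differs, and the numbers below are invented purely for illustration, not pulled from my actual grade.

```python
import numpy as np

def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    """One common formulation of the lift/gamma/gain controls.
    x: video levels normalized to 0.0-1.0. Lift mostly moves the blacks,
    gain scales the whites, and gamma bends the midtones between them
    (values above 1.0 brighten the mids in this formulation)."""
    x = np.clip(x, 0.0, 1.0)
    out = gain * (x + lift * (1.0 - x))           # lift pulls shadows hardest
    return np.clip(out, 0.0, 1.0) ** (1.0 / gamma)

# Roughly the move described above: push gain up to recover brightness,
# then ease gamma and lift back down to keep the shadows from going grey.
frame = np.array([0.02, 0.08, 0.15, 0.25])        # a very dark, GoT-like tonal range
print(lift_gamma_gain(frame, lift=-0.01, gamma=0.9, gain=2.2).round(3))
```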
This is exactly the kind of flexibility you get with digital capture, and it’s something any cinematographer will tell
you has changed their lives, but as I said before, this flexibility doesn’t carry over well
on low bitrate streaming and broadcast. These hints of detail barely register to
the codecs, so they simply discard it. But what about the other way around? What if
we grade Army of Darkness to look more like Game of Thrones, using the same idea but in reverse? I start with the offset controls to get the overall balance into a similar place.
This alone gives a huge boost to contrast, and I honestly quite like the look. But the
waveform is still spread much more than Game of Thrones. So then I play with gain, and adjust
the gamma and lift a bit to make sure the black levels aren’t absolutely crushed. I use the fire to reference where my highlights should be, and the result is what I would call a claustrophobic use of shadow detail.
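A quick note on that offset move, since it does a lot of the work here: unlike lift, gamma, and gain, offset shifts every pixel value by the same amount, and on log-encoded footage that reads like a global exposure change. A minimal sketch, with an invented shift value:

```python
import numpy as np

def offset(log_frame, shift):
    """Offset moves the whole signal uniformly; applied to a log encoding,
    a constant shift behaves like stopping the exposure up or down."""
    return np.clip(log_frame + shift, 0.0, 1.0)

# e.g. pulling the whole frame down before touching contrast:
# darker = offset(log_frame, -0.08)   # illustrative value only
```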
Huh… in my mind at least, what’s fascinating about this grade is that it kind of modernizes the image a bit, don’t you think? In a
sense, what you’re seeing is how a remaster, for example, can take on a radically new
look in an attempt to modernize an older film for modern viewing expectations. That’s
a whole ‘nother conversation that I’ve touched upon in the past, so we’re not gonna talk
about it today, but it does make you wonder how exactly we are to reconcile artistic
intent with something like image quality. Let’s try another experiment. Looking at House
of the Dragon, this is a day-for-night shoot that really suffered in the grade. Now, I’m not
sure if it was always intended to be night, given that HBO had promoted the show with
BTS shots of this episode in full daylight, so that’s where it gets tricky. It might
have narratively had to become a night scene in order to fit into the story, but that
doesn’t change the fact that this grade is questionable. That’s before we get into things
like fire that doesn’t cast light on the actors… Even many shots that aren’t in obvious daylight have been graded so dark that the video levels are crammed down into obnoxious territory. But again, it’s an intentional decision. Later shots
in the episode show delicate shadow detail that is dimly illuminated by sunlight
that does fill out the dynamic range. So what if I grabbed a similarly backlit shot from Rome, where light is pouring in from a window behind a character and onto the earthy interiors and lush fabrics? This is just a great image: dead-on exposure with a healthy
waveform that shows a naturally warm image. The highlights sit just right on things like
hair, skin, shoulders, but the window in the back is a warm white. It literally only touches
the top of the waveform in the red channel, letting it blow out in a believable, pleasing
way. I really wish we could go back to this. Like before, I brought everything down, but spent more time dialing in the color offsets and using the color warper to get the cool tones close enough to House of the Dragon. I threw a little bloom on the shot for good measure. Given that I’m working with an 8-bit compressed source, I could only push this so far, but I think it gets the idea across. I don’t like this grade, but I could also see it airing like this today.
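For the gist of that chain as code: a crude day-for-night move is essentially pull the exposure down, bias the balance toward blue, and desaturate. This toy version with made-up numbers obviously skips the color warper, the bloom, and everything else that makes a real grade work:

```python
import numpy as np

def day_for_night(rgb, stops_down=2.5, cool=0.12, desat=0.4):
    """Toy day-for-night grade: darken, cool, and desaturate.
    rgb: linear float array of shape (h, w, 3) in 0.0-1.0.
    All three defaults are illustrative, not taken from the show."""
    out = rgb * (2.0 ** -stops_down)                     # pull exposure down
    out = out * np.array([1.0 - cool, 1.0, 1.0 + cool])  # push balance toward blue
    gray = out.mean(axis=-1, keepdims=True)
    out = gray + (out - gray) * (1.0 - desat)            # night reads less colorful
    return np.clip(out, 0.0, 1.0)
```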
Doing this little test revealed a few things that I suspected. In general, these days, there seems to be some desire to protect highlights at all
costs. I kinda get it. Digital highlights can be garish, and this is total conjecture, but
I think blown out highlights have acquired a reputation for being cheap, low budget, or simply
outdated. It’s not like you don’t occasionally see intentionally blown highlights in modern
shows, but they’re not that common. This was actually something that really surprised me when
I watched Severance last year. That show does try to evoke an older aesthetic, but it’s shot
digitally, and on more than a few occasions, shows some ambitiously bright shots that even clip at
times. Honestly, I was kind of happy to see that. The other thing I noticed in doing this comparison
is the change in film transfers over the years. As I mentioned before, the technology
behind film transfers has improved a lot, and the end result for some remastered movies
can be drastically different color grades. It really can be a struggle to know what is merely being revealed by the new transfer versus what is being deliberately changed as part of some new creative intent. But what’s interesting is that there is a sort of intersection between these
technological improvements and changing tastes in viewing. Let’s look at Rome again for a second.
Rome is from 2005, right as TV was shifting to HD, and while the show was presumably finished
in HD, the age of its transfer shows us a few things. There’s bad highlight detail in some
shots, where you can see the image clipping in an unnatural way, and the black levels
are definitely too crushed at times. The shot I used was impossible to fully match with House of the Dragon because the shadows on the foreground figure were completely crushed. But the
overall contrast is very pleasing. It’s visible, it’s neutral, and it makes me wonder what
it would look like if it was remastered today. Looking at Army of Darkness, which I
happened to grab from this new 4K version, you get the sense that there’s a desire
to preserve every bit of information. Even the darkest shots just touch the bottom of the
waveform monitor, never really losing anything, and this is great. But I notice that a lot of modern shows that try to emulate film looks have an almost complete aversion to crushing the black levels, especially in dark scenes. I think there’s a sort of confluence of
things happening here. That protection of highlights kind of goes both ways. I
would actually say it’s a refusal to allow any detail to clip except for the
most obvious of specular highlights. High-end cameras can now capture 14-16 stops of information depending on what you’re using, with some, like the ARRI Alexa 35, even boasting 17 stops. Cramming all that detail into standard dynamic range video is doable, and colorists work very hard to get the most out of their images, but the result can often be flatter and, in my opinion, sometimes lifeless video.
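The arithmetic behind that cramming is worth spelling out, using ballpark figures rather than any particular camera or display:

```python
import math

camera_stops = 16                       # high-end digital capture (ballpark)
scene_contrast = 2 ** camera_stops      # ~65,536:1 from darkest to brightest
sdr_display_contrast = 1000             # rough figure for an SDR home display
display_stops = math.log2(sdr_display_contrast)   # ~10 stops

print(f"{scene_contrast:,}:1 captured, ~{display_stops:.1f} stops displayable")
# Keeping every captured stop visible means compressing several stops'
# worth of scene contrast into the picture - hence the flat look.
```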
So where do we go from here? Well, trends come and go. Maybe one day we’ll see higher-contrast images again that aren’t afraid to absolutely
contrasty image. I’m sure we will at some point. In the meantime though, there are things you
can do to make the viewing experience better. Filmmaker Mode promises a standardized approach to TV calibration, and this is one way to set at least a baseline expectation in the grading suite for what a home consumer TV can display. It maintains the intended D65 white point of all common broadcast formats and, most importantly, turns off all post-processing of the image: no motion interpolation, no noise reduction, and no oversharpening. But
honestly, the viewing environment is probably going to be a bigger factor for most people at
home. Brighter settings might be less accurate, but they also might be necessary
for brighter ambient lighting. Not everyone wants to watch in
total darkness, and I get that. Most LCD screens are simply not up to the
task. The growing mainstream spread of OLED screens should hopefully mitigate this issue
in the coming years, but it'll take time. Until then, I think filmmakers need to be at least
somewhat more realistic about their audiences’ viewing capabilities. I feel there’s a similar
argument to be made for cinema and TV sound mixing these days, but I’ll let someone else handle that
one. The neverending pursuit of a more cinematic image is understandable, and the change in
technology available to both creators and viewers in the last 15 years is sometimes hard to believe.
Back in 2002, a standard definition MiniDV camera with 24fps recording was revolutionary. 20 or
so years later - good lord - and now there are phones that can record 4K video at 120fps. The
speed of this change means that filmmakers’ ambitions have grown and grown, while viewers
are, in many ways, always playing catch up. I’ve always loved talking about the
technical production side of things, so if you liked this video and want to see
more, let me know in the comments below.