Movies have dark scenes nowadays mainly because it's a trend. On top of that, dark scenes can have practical advantages: set building, VFX, lighting, etc. can be reduced or become much simpler, which directly translates into money saved during shooting.
If I had to guess, the trend of dark scenes is a direct result of digital sensors getting good enough, over the past two decades, to actually shoot in such low-light environments.
Before that, film crews were typically shooting day-for-night, which meant waiting for a specific set of weather conditions and then hoping that grading things blue-ish would sell the footage as having been shot at night.
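For anyone curious what that kind of grade amounts to, here's a toy sketch (the function name, exposure factor, and channel gains are all made up for illustration, not taken from any real grading pipeline): darken the frame and tilt the color balance toward blue.

```python
import numpy as np

def day_for_night(rgb, exposure=0.25, gains=(0.6, 0.8, 1.2)):
    """Crude day-for-night grade: underexpose and tint toward blue.

    rgb:      float array in [0, 1], shape (H, W, 3)
    exposure: global darkening factor
    gains:    per-channel multipliers suppressing red, boosting blue
    """
    graded = rgb * exposure * np.asarray(gains)
    return np.clip(graded, 0.0, 1.0)

# A midday-gray pixel turns into a dark, blue-leaning "night" pixel.
frame = np.full((1, 1, 3), 0.7)
print(day_for_night(frame))  # roughly [[[0.105 0.14  0.21 ]]]
```

Real day-for-night grades are of course far more involved (gamma, saturation, sky handling), but the basic trick is exactly this: less light, more blue.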
Another aspect is the much higher brightness and contrast (= dynamic range) of today's displays and projectors. Back in the day you had to literally use the whitest white and the darkest dark available to create a readable picture (and you had to do a ton of lighting to squash the extreme dynamic range of a real-world environment into the small dynamic range of your target medium). As that dynamic range grew, it became possible to not use the whole range and still get good-looking results. So, in that line of thinking, pictures became darker because they could be.
Not that I defend the whole thing: sometimes dark pictures with high contrast can be good and very readable, and sometimes darkness is used as an excuse to not do the proper work.
> what's your take on the dynamic range in over-exposing film vs under-exposing digital?
Depends on what you're after and which digital camera you are using (and even which ISO you are using on that specific camera). But more broadly, IMO digital has reached dynamic ranges that are "good enough", so you don't have to worry about it in most situations. And if shot correctly, digital can look exactly like film.
The major difference for me is the way you work with the material. Actors and everybody else tend to be a tad more concentrated when they know how many €/s are running through that camera, unless you have an infinite amount of film stock and the funds to develop and digitize it.
But whether that is truly worth the stress of changing a roll of exposed film in the wild during bad weather and praying the material survives is another question.
>what's your take on the dynamic range in over-exposing film vs under-exposing digital?
You should actually overexpose digital (without blowing any highlights) to maximize dynamic range.
Our eyes perceive brightness logarithmically — given something emitting P photons perceived as brightness B, something emitting 2P photons will be perceived as brightness B+1, while something emitting 0.5P photons will be perceived as brightness B-1.
Image sensors are linear and discrete. So going from R,G,B = 1,1,1 to 2,2,2 represents a doubling of photons captured, and is thus perceived by the eye as going from B to B+1. But 2,2,2 -> 4,4,4 will go to B+2, 4,4,4 -> 8,8,8 to B+3, etc.
Thus, there is only one code value covering the stop from B to B+1, two code values from B+1 to B+2, four from B+2 to B+3, and 2^(N-1) from B+N-1 to B+N: the brightest stop alone contains half of all available code values. That's why you want as much brightness information as possible close to saturating the sensor, since that's where the most tonal resolution is.
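To make the arithmetic concrete, here's a toy calculation (ignoring noise and sensor nonlinearity) of how an ideal N-bit linear sensor distributes its raw code values across the stops below saturation:

```python
def codes_per_stop(bits):
    """For an ideal linear sensor with 2**bits raw code values,
    return how many distinct codes land in each successive stop
    below saturation: stop k covers [2**(bits-k-1), 2**(bits-k))."""
    return [2 ** (bits - k - 1) for k in range(bits)]

# A 12-bit sensor: the brightest stop alone gets half of all codes,
# the deepest stop a single code value.
print(codes_per_stop(12))  # [2048, 1024, 512, ..., 4, 2, 1]
```

Half the tonal information lives in the top stop, which is exactly why exposing to the right (without clipping) preserves the most gradation.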
While it’s true you should in theory expose to the right, it’s realistically a little riskier unless your environment is well controlled or you’re willing to potentially clip some highlights. So for landscapes that you spend a while composing and metering, it makes sense, but for street photography or something where there’s a ton of contrast, I wouldn’t recommend it because you’ll overexpose more often than not and unlike with film, it’s harder (and maybe impossible) to recover detail from highlights.
Since storage is cheap, I’d rather just bracket my shots that need it than expose to the right, reducing the potential of losing a good shot by losing some highlight detail.
> While a modern "night" scene might look what a dim area looks like the first 10 seconds before your eyes adapt: dim, gray-ish.
And that's if you're watching it in a theater-like setting. If you're in a living room with any lights on, the screen is so dark that little comes through but the reflection of the room. After all these years I still do not understand why glossy screens are so popular.
> if its glossy, you can move so the reflection is completely out of the screen
What? That's really not how reflective surfaces work. When the screen is dark, I can see the walls, the ceiling or the floor, the bookshelves... moving around does not help, it only changes what's in the reflection.
So the same reason basically as for the blue LEDs on every new device since 2010? Something that used to be prohibitively expensive before, and so now communicates high value (until people get tired of it eventually).
Sure. But that is a result of capitalism and the modern "3 simple steps to make a good movie"-approach.
In my experience the best movies are made by people with passion. But those movies cannot be made if these people don't get the money, or if they have to bend over backwards and betray their ideals to get it.
If it were just about making money I would expect film-makers to go the way of least resistance. What perplexes me is that often viewer-hostile decisions get made despite being more work.
For example, in movie adaptations of known franchises, changes are made just for change's sake, so that the writers can show off how smart they are. It would be LESS work to just follow the source material. To be clear: I am NOT talking about changes to better fit a specific medium. I am talking about the "look how smart I am, you couldn't predict this twist" kind of writing that does not improve the story. Still, ego-driven writing seems to be very much accepted?
Same with lighting. No viewer has ever complained about unnatural lighting. This stuff is only made to impress peers.
Why is this kind of behavior so common in the industry?
I wouldn't bet on ego-stroking as the main driver: my impression is that much of this stuff is marketing strategy, to drive internet discussion. Remember "Lost"? That strategy might have a stronger effect on the next installment than on the current one, but that won't stop them.
I explicitly write "internet" because that's what can be measured. And those studios probably have a metric for that and writers go full Campbell's Law on it. I write Campbell's, not Goodhart's, because I suspect that it still works just fine, economically.
(I think I know which kind of movies you mean and I don't feel drawn to those at all. Perhaps in part because of all the "clever change" drama that I read about and that I know would be completely lost to me, so maybe it's not that far from Goodhart's after all?)
>It would be LESS work to just follow the source material.
But the source material might be less marketable and therefore less profitable. So it might be less work, but it would also mean less money, hence the changes.
I'm not sure that such a path of least resistance will necessarily be obvious from the outside.
Likewise, I'm not sure that (from the audience's perspective) one can so easily infer simple and exact reasons for why an adaptation differs from the source material.
I'm sure many of the people hired to work on a film, adaptation or otherwise, do want to have an impact on the production, or 'leave a mark'. Deviating from the source material might even be the path of least resistance.
>"look how smart I am, you couldn't predict this twist"-kind of writing that does not improve the story
This can improve engagement, fandoms/audiences picking over little details trying to "beat" the next twist. Keeps people talking about your movie/series which helps drive more eyes to it.
> movie adaptations of known franchises changes are made just for change sake
Franchise movies are cash grab paint-by-number productions. You can already rule out any kind of passion from anyone involved. Writing changes are made to appeal to international audience not to look smart.
It is possible the writers know what they are doing. In the end, what matters is if the audience shows up. Who wants to see the same thing again without any new twists or ideas?
Apparently marketing departments think it's everyone... so many of the movies, games, etc. that studios have been pushing over the past few years have been rehashed IP, over and over again.
The studios seem to think "everybody", judging from the hundreds of sequels, prequels, reboots, and nth installments of movies, where everything is a "franchise"...
The top ten highest-grossing films are all franchise movies, so that looks like a good indication of what the public wants to watch. If you look at the rest, there's a decent number of non-franchise films that studios financed but that audiences evidently didn't show up for in such large numbers.
I can see how we might come to this conclusion, but we would also need to look at which films were marketed the most, which films had wider distribution, which films had better casting budgets, etc...
If franchise and rehashed-to-death-IP films all had more marketing, wider distribution, and more famous actors/crew, of course they’re going to get a higher viewership.
Is it also possibly a bit of protest against people watching movies at home on a television? That's assuming the dark scenes are easier to see in a theater.
Yeah, just wondering. I know another trend, indecipherable sound, is also odd. Because, for example, I had to walk out of "Public Enemy" in a theater...I couldn't hear the dialogue. But it was watchable at home where I could mess with settings until I could hear.
Danny Boyle's commentary on The Beach specifically includes a discussion of the technique mentioned by OP of shooting in high light and playing with it later. So much to learn from him, Cameron, heck, Bad Santa pointed out that the Coen Brothers were extensively involved early on. Everything from Thurman Merman now, to me, sounds like a Coen Brothers character. I kinda think he is.
Correct me if I'm wrong, but I'm pretty sure film has a higher dynamic range than digital SDR video? From a quick Google search: 13 stops for film vs. 6 stops for SDR. Obviously the stock used for final delivery to theaters was much worse than the stock movies were shot on, but still.
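Taking those quoted figures at face value (real numbers vary a lot by stock, sensor, and how you define usable range), stops are a log2 scale, so the difference in contrast ratio is dramatic:

```python
def stops_to_contrast(stops):
    """Each stop doubles the light, so N stops = 2**N : 1 contrast."""
    return 2 ** stops

print(f"film: {stops_to_contrast(13)}:1")  # film: 8192:1
print(f"SDR:  {stops_to_contrast(6)}:1")   # SDR:  64:1
```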
It's fundamental to a story - I have to see the actors. I have to hear them say their lines.
When they shoot dark scenes that leave me wondering who is talking, what they're doing, and why they said that, it's a profound failure of the director to make a minimally functional story.
The other failure is mumbling actors. They whisper when they don't understand what emotion to show. They rush their lines to show passion but instead show lack of fundamental acting skill. They turn from the camera so I can't see their face, say something to the wall, leaving me without necessary facial clues, leaving me to puzzle out their accent or whatnot.
I blame it on falling standards of professionalism, driven by an insatiable demand for content.
> I have to see the actors. I have to hear them say their lines.
There are entire scenes where, if you watch them in the average person's living room, the only thing you get from the movie is the subtitles (which you turned on because you can't hear anyone talking).
We re-watched The Dark Knight (2008) a few days ago.
There's darkness when it's appropriate. There's light to see what Nolan wants you to see. The fights are both well-choreographed and recorded and cut well -- not the flurry of a thousand ultra-tight cuts that waste the talent of everyone involved.
But the dialogue is about 9 dB under what it should be, and only subtitling makes any of it understandable except for a few scenes. My sound system is well-calibrated and powerful: it's the movie's fault.
Nolan is such a bad offender that the audio in Interstellar was, I am not kidding, painful to my ears. I saw it in a theater and I'll never watch it again. It's... just... wrong on so many levels.
Christopher Nolan is to cinematic audio what Chris Gaines is to rock and roll.
pro-tip: wear ear plugs to theaters. makes the experience ten times more pleasant for me and eliminates all of the sudden-surprise-audio volume explosions that hurt your ears. it's never impeded my ability to hear dialog in theaters.
He chose it. In the happy moments the speech is roughened up; in the sad moments it's smooth and legible. He also plays with Dr. Brand mishearing Cooper re-entering the spacecraft: "I'm sorry / Mann was lying".
It is becoming like the anime dubs of the '90s, pre-Cowboy Bebop. Voice actors would not learn the nuance of the original delivery; the pay was too low to spend the time, so everyone was either yelling or just rushing through their lines. This is happening now: shows are rushed to deliver "content" and fill the airwaves, not to actually do something good.
Honest question: have the (Japanese) voices of female anime characters in general increased in pitch over the last years, or am I imagining things?
> This is happening now, shows are rushed to deliver “content” and fill the airwaves, not actually do something good.
This may be so, but shows used to have 22 episodes per season in the US; now series have 8 or 10. So far, far less work yearly, less cramming to get shooting/processing/writing done.
You'd think this would give more time for better takes, directing, acting.
Hmm, at least in those I watch, if they have fewer episodes then they're often 40-60 minutes each instead of 20. Or maybe it's a different type of show, dunno. I think the cost and complexity of shooting has skyrocketed: before, for a scene you needed a few good actors and a very primitive generic setting. Now the hunt for perfection is much bigger on both sides of the screen, plus the required digital changes afterwards.
It stunned me to see, e.g., what actors wore in Game of Thrones in the later seasons, when the budget was high. A robe which was on screen for 1 minute max, and which I didn't notice at all because I focused on the actress's face and the scene, had such details and beautiful ornaments sewn in by hand. A crazy amount of work to create one unique piece just for that scene; multiply by X actors there. It must have cost a small fortune to make, even for such a well-oiled machine as Hollywood. Compare with, e.g., The Streets of San Francisco: guys wearing the same basic suit outfit over the whole series.
Game of Thrones is a very rare series. Almost all other series are not that complex.
You're comparing one of the most expensive series ever, with low cost series.
And even action series, in the old days, would film 60-minute episodes with live action, explosions, and car scenes. Really, things are very cheap now with CGI.
Imagine having to retake a scene with real explosions?
Generally speaking it just means more shows with fewer or singular writers, and without ensemble casts.
A big part of making shows with 22-24-episode seasons was having a cast of circa nine main characters, so you could film three of them at once.
Audiences generally respond better now that there is no network pressure to deliver half a year's worth of new episodes per annum, with shows where their favourite characters are the main characters every week.
Maybe, just maybe, movies remastered for home consumption should assume that people are sitting in their living rooms with a normal setup rather than in a large cinema? And that maybe they don't want to play the movie so loud that the whole building/house hears the explosions?
I imagine that can contribute. But then other videos don't have the problem. Old movies don't have the problem. Just this new shabby stuff. Does my sound setup know somehow?
Bottom line a good sound engineer with time and access to the raw audio could mix it to work well with stereo systems. But that ain't happening.
> But then other videos don't have the problem.
They likely would if they were intended for playback on a cinema system and just flattened to mono or stereo.
Christopher Nolan doesn't direct/produce movies thinking about how they'll look on a phone on Netflix. And the sound engineer he hired doesn't care either.
How much time and money was dedicated to the sound engineering to convert that 20 speaker maybe 100 virtual channel system to your 2 speaker system? Then they need to do it for 2.1, 5.1, Atmos... Whatever else is the new thing these days.
And if you have 5.1 did you consider carefully the layout? Are there multipath effects you haven't dampened? Did you consider the path to each speaker relative to listener?
> Does my sound setup know somehow?
Well yes, kind of. Your TV, if it is at all modern, will do some processing to the audio. If it's connected to a 5.1 system it might just pass through 5.1, but what does it do with stereo audio? Or 2.1? If it gets 7.1 it must decide what to do; likely you can control this in the settings.
Let's say you have a 5.1 setup at home, and your TV sends it stereo. What does it do then? How does it make 6 channels from 2? The subwoofer is easy: it's relatively non-directional, so a low-pass filter sends all the low end to it and a high-pass filter sends the rest to the regular speakers.
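As a toy sketch of that bass-management idea (not how any real receiver implements its crossover; the cutoff constant is arbitrary), you can split a signal with a complementary filter pair: a one-pole low-pass feeds the subwoofer, and the remainder goes to the mains:

```python
def bass_split(samples, alpha=0.05):
    """Split a mono signal into (low, high) bands.

    alpha sets the crossover point of a one-pole low-pass
    (exponential smoothing); high = input - low, so the two
    bands sum back to the original signal by construction.
    """
    low, lows, highs = 0.0, [], []
    for x in samples:
        low += alpha * (x - low)   # one-pole low-pass -> subwoofer
        lows.append(low)
        highs.append(x - low)      # complementary high-pass -> mains
    return lows, highs

# The split is (near-)lossless: low + high reconstructs the input.
sig = [0.0, 1.0, 0.5, -0.25, -1.0]
lo, hi = bass_split(sig)
print([round(l + h, 6) for l, h in zip(lo, hi)])  # [0.0, 1.0, 0.5, -0.25, -1.0]
```

Real bass management uses steeper crossovers (e.g. fourth-order at ~80 Hz), but the principle is the same: route by frequency, not by channel.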
My TV has a decent "clear speech" setting which picks out speech, boosts it, and then suppresses non-speech audio around it. Usually it works perfectly.
Playing stereo audio on a stereo system will work fine.
I would buy your argument if it were true that the sound wasn't muffled and unintelligible in theaters. But that's not the case. The recent Nolan films are almost as unintelligible in theaters as they are on a stereo home system, even in IMAX.
Nolan himself has tried to make this argument, invoking sub-bass frequencies and claiming that his goal was never for the movies to be fully understandable, and that it was an artistic choice: maybe the audience doesn't have to understand everything.
I'm all for artistic integrity and him being able to do whatever he wants, but I'm also OK with calling Nolan wrong on this one, hand-wavy artistic stuff aside. I have no interest in seeing his films if audio clarity is not one of his goals in film-making. It breaks immersion (real life isn't unintelligible and mumbled), it isn't an enjoyable experience for me, and frankly I think it reflects poor decision-making on his part.
Interestingly in the case of Tenet the claim is that cinemas intentionally were playing the movie too quiet.
It brings up the usual issue that audio mixing for cinema and for home is vastly different, but puts Tenet down to cinemas turning it down to avoid loud explosions.
So is this a case of cinemas turning down well-mixed movies because movies are coming out too loud?
But really this is just the same problem over again: the sound engineer mixes audio intending it for a high-end cinema, some cinemas play it poorly, and people watching at home have no chance to play it properly...
Sure, that may be the case that they turned it down. It seemed to be a widespread issue though across a lot of reviewers and opinions.
The sound engineer interviews in that article are interesting though, and they seem to be blaming Nolan and the mixing as well.
I also get wanting to turn them down. I've been in theaters so loud that my ears were ringing after the movie, or were actively hurting during certain scenes. The Dark Knight was one of those. It could also just be poor settings at the local cinema in that case as well though.
Given that the vast majority of movies are understandable, I just wish Nolan would mix with that goal in mind; I would actually enjoy his movies most of the time if it weren't for the muffled speech and sound.
I don't buy it; much recent content is nearly unintelligible on my pretty good 5.1 setup with a real AVR and real speakers, set up and tuned reasonably well. Older movies from the '80s and '90s don't have this problem, just newer content. And often it's not even a level problem; it's like the dialogue is deliberately muffled by an EQ or something. I can boost my center channel or use the AVR's speech-enhancement control and it's still hard to understand. I kind of buy the common complaint that actors and actresses don't know how to project their voices any more.
Atmos’ big feature is to place audio properly no matter how many - or how few - speakers you have.
As long as the receiver has identified the positions of your speakers accurately (something receivers have been capable of with a small microphone since the early 00’s) it should place sounds appropriately.
That’s the thing, Atmos doesn’t care about channels, except for how to use them to represent a sound at render time. And that render is not embedded in the tracks, it’s calculated.
One would think that movies would be produced for the viewers but apparently the monopolistic landscape of modern entertainment allows film-makers to make insane, ego-driven choices without consequences.
People need to be able to see and hear (the audio side is arguably getting even worse) what is happening. That is the absolute baseline for making a good product.
This is the equivalent of making a website where people struggle to read the text and the designer yelling that it is not made to be viewed on cheap mobile phones. Even in our current world of user-hostile web design, most professional designs try very hard to be accessible on a very large range of devices.
And no, "naturalism" has nothing to do with it. I don't think unnatural lighting is such an immersion breaker compared to the horrible CGI that people are forced to make on crunch time. But sure, if you want to only use natural lighting, go ahead, BUT you still need to make sure the scenes are well lit. There are lots of avant-garde movies with natural lighting that are well lit.
Audio is great if you have a soundbar or a good set of headphones. The problem is that a decent portion of the audience doesn't, and there is no fallback to more classic audio mixing.
This means that you either have to create for the lowest common denominator or accept that what they get is shit. As you said, we would expect to be chewed out if we did that, but for some reason films are different.
To be honest, I can see it both ways. I don't have a soundbar or an OLED display, but at the same time, why should that hold back those who have paid 4 or 5 times the cost of my setup? If what I have is crap, shouldn't I expect that movies won't look very good on it?
You can do some truly amazing things with just audio[0], but it does require a certain dynamic range on the receiver.
I don't believe actors mumble in general. I am ESL, but I can follow along just fine without subtitles, as long as I use my headphones. The same show with just my MacBook's built-in speakers, and the actors were hard to understand.
Of course there are a couple of actors that make it harder: Marlon Brando, or people who speak with a pronounced Southern accent. But in general I keep hearing this complaint from people who are using built-in speakers.
I think dynamic range is so foreign to people nowadays since music hasn’t had any in over 20 years. People aren’t used to it anymore.
And you’re right, many people don’t have the things you need to make it sound good. It’s almost like they should have a mastering for bad TVs with crap audio.
Of course they should. There’s lots of people out there who watch their movies on TV with the built-in speakers, or on their tablet or laptop. A TV mix used to be normal, not everybody has the money, space, or desire for a dedicated movie room with an Atmos setup.
>One would think that movies would be produced for the viewers but apparently the monopolistic landscape of modern entertainment allows film-makers to make insane, ego-driven choices without consequences.
(And yes I am aware that technically they would be an oligopoly. I am purposely using the broader, more practical definition of monopoly that is useful for describing the real world.)
As a consumer, I currently have the option of consuming video media from ~9 large companies (Disney, Netflix, Amazon, Apple, Comcast, Paramount, WarnerBros Discovery, Sony), and then you have smaller ones like AMC Networks and whatever film studios there are, and then you can go all the way down to individuals who upload to Youtube or whatever website they want to host their media. This is just US media too, I am sure there are many more large companies around the world.
And there is zero or near-zero friction in consuming pretty much all of this media, thanks to broadband internet and capable computing devices in everyone's pocket, other than the price you are able and willing to pay.
I would not classify it as a "monopoly" or an "oligopoly", at least in the sense that anyone is restricting viewers from watching what they want or anyone is restricting media creators from making what they want.
The "where is the light coming from" complaints in the Scream scene seem completely ignorant of the fact that our eyes have such a staggering amount of fidelity when compared with a camera that the filmmaker must compensate for that by overlighting the scene. You would be able to pick out details of a character's face in a dimly-lit room that only has external light sources because your eyes are incredible devices. To me, it's mind-bendingly tedious for someone to pick this apart and ask where the light sources are coming from when what they are going for is a mood as opposed to objective realism.
And yet somehow that is not at all obvious. It's a form of art. In my high school memory of sneaking into my girlfriend's pitch black house at night, I could see her face and the room every bit this well. That's the point. It doesn't have to perfectly reflect reality; it just has to evoke a feeling of a situation. And this does.
The author didn't question the artistic aspect. He says:
> It’s a purely stylistic choice, employed for that one moment to cast doubt on Billy’s trustworthiness in the audience’s mind. It’s an extremely stagey choice that fits neatly within the larger series’ heightened, melodramatic style. Scream wouldn’t really be Scream without it.
It doesn't mean that he's ignorant of the fact that a camera sees differently than our eyes do, as you wrote above. The scene is not bright because of this fact. It's bright for artistic reasons, and you could describe it as overcompensation, since it wouldn't even have to be that bright to deliver a facial expression.
In my opinion, the scene is bright because if we were actually in that scene, our incredible eyeballs would be able to discern a lot of detail, and in order to show us all that detail, the filmmaker must make it bright. If you make it dark to make it more "realistic", you are counterintuitively making it less realistic by making it harder to see what we would be able to see if we were actually there. This balance is part of filmmaking.
I’m glad cinema is mostly past its fling with quick cuts and shaky cam, but the current obsession with drab low-contrast color palette is just as bad.
Another one that irks me is shallow depth of field.
I appreciate the deep-focus cinematography by the likes of Kurosawa and Welles all the more when I see modern filmmakers making 80% of their frame blurry on purpose.
Fads are cyclical, so I was hoping that after Zack Snyder brought the style to its logical unwatchable conclusion with his “Army of the Bokeh”, cinematographers and directors would move on, but I guess it’s not that time yet.
> Another one that irks me is shallow depth of field.
This is even more annoying in YouTube videos. People are so obsessed with shallow depth of field that even in product reviews they focus on the face and leave the product blurry.
I can't find it right now, but there's a video by Tony Northrup where he's reviewing a camera and he's so obsessed with filming himself at f1.4 that his face is barely in focus half the time because of the failed focus tracking.
Personally I don't agree with the take of the article in two ways. The first one is that it gives way too much default credit to realism as a stylistic choice. Movies are works of fiction and the creator has total artistic freedom in what aesthetic to go for, and being 'realistic' isn't necessarily a good thing, you still need to make the case for it.
Secondly I think the bigger reason is a trend towards grittiness, bleakness or a sort of 'scandi-noir crime drama' look. People go for that mood not just visually but also in terms of writing, muted mumbly dialogue, minimalism, and so on.
Villeneuve's Dune, which is mentioned in the article, is to me, despite its apparent popular appeal, a negative example of this trend. The movie is overly bleak and oppressive, cold and distant in a way that many of his films are, and heavy on visual stereotypes (the hairless, pale, black-dressed 'brutish' Harkonnens, a caricature that the books deliberately avoided).
Yes, the article assumes the directors know their film will be unwatchable in certain contexts and have chosen to keep it that way. Is there any evidence for this? Have they ever actually watched it on one of those cheap laptop screens or TVs where the black shows up as bright blue, brighter than other colors? Because a lot of people have those screens.
They don't have to do it. The movie is supposed to be re-cut and re-color-graded and a thousand other things before shipping to home media. That's been the case since forever, and it was never the director's job to do that.
These days for some reason studios forego that process and "meh, good enough, stream it".
I liked Baron Harkonnen's depiction in the movie. The first time I read the book, the image I had in my head of him floating around in suspenders just seemed silly to me. I'm hoping they make Feyd-Rautha less silly as well; Sting in a metal speedo always seemed a perfectly faithful depiction to me, but it's just so hard to take seriously. I'm personally more disappointed by the de-Islamification of the Fremen than by any of the stylistic choices.
I think the stark colors made it easier to follow what you were looking at. The Sardaukar wore white in the movie, so it was a lot easier to tell that they were imperial forces and not Harkonnen. Anything they can do to make the plot more transparent, the better.
Completely agreed on Dune. Part of it is the barren production design, but the cinematography is drab and ugly as well. Compare and contrast with Villeneuve’s beautiful Blade Runner 2049 shot by Roger Deakins. The difference is night and (murky) day.
> It first materialized in a big way back during the late seasons of Game of Thrones. Episode after episode, people furiously tweeted about how hard it was to see, well, anything going on on screen.
From an enthusiastic GoT fan back then: I don't really know where the author got that "episode after episode" from. I remember exactly one episode, where overdark scenes were an issue, S8E3 "The Long Night" - and in the episode, darkness was a plot device and a deliberate stylistic choice.
There was an unfortunate scene in which the characters were unable to make out whether the enemy was approaching or not, due to intense darkness. The filmmakers chose to visualise this by making the scene almost pitch black, with literally not enough information in the pixels to see the characters. I'm sure that must have looked impressive in a test screening in a cinema, but when streamed, it caused lots of viewers to be distracted by their own reflections and turn up their TVs to maximum brightness - because the scene sort of looked as if you should be able to see something, even though really you weren't.
I think that was a visual experiment by the filmmakers which, frankly, failed - but it was at least a deliberate choice and not blindly following some trend.
There is a segment of the new Dune movie where I feel this happens. I believe it is Paul and Jessica's crossing and their encounter with the worm. The contrast feels so washed out and it distracts me because my eyes start straining for the light... but it seems 100% intentional. It's pre-dawn and the world feels grayscale. I'm complaining about it, but maybe I would miss it if they didn't do that.
With Dune, even though they sent a full crew and cast into the Wadi Rum desert, they also ended up shooting a bunch of desert scenes on a sound stage under a projection dome.
Yeah, the whole point of that was that "it's dark as shit, no-one can see anything, everyone is everywhere, no-one can see what's going on, everyone is scared, no-one has a scoob what's happening even a foot away".
I thought it played nicely with the use of light to stylise the rest of the world - Westeros with its bright but slightly murky look, like a busy city in Central Europe, Winterfell, The Wall, and the north with its flat grey shadowless light from an overcast sky, and the searing oranges and yellows of the desert-y parts of Essos that looked and felt hot to look at.
Another factor: if you watch a film in the cinema, the surroundings are almost completely dark, which makes it easier to see what's going on on a dark screen. However, we increasingly consume content at home, where there is typically more ambient light, especially if we don't turn off all the lights just to watch TV, as other family members may be doing other things and find it annoying.
It's not just films, a lot of TV programmes are heading that way too.
One example is the recently released adaptation of Great Expectations[1]. It's not just the overall darkness, but general lack of colour which is striking to me. I think this is just an artistic style and the next stage of colour grading from the Teal & Orange[2] trends of a few years ago.
You get used to it when watching a programme and forget what full colour actually looks like. Watching old films highlights it: something like the 1962 film Lawrence of Arabia[3] really shows how much colour has been drained from modern film & TV productions.
> Even broad, big-budget blockbusters like Harry Potter and the Deathly Hallows – Part 1 embraced a look torn straight from indie cinema. Not only are the lights in that film always motivated, they’re realistic.
Have you ever tried to see anything at night in a winter forest? It is impossible. Maybe a full Moon can help you see something, I don't really know, but my experience tells me that the only thing you can see without artificial light is darkness and (if you are lucky) a few stars blinking through the branches of trees (which are themselves invisible, because they are totally black against a black background).
I always laughed at the scene where Harry Potter follows the silver doe to a river and Ron Weasley finds him without any warning (Harry didn't see any light approaching), while Severus Snape secretly watches them both. OK, I can write off Snape; he is probably using some magic to see in the dark, he is a very powerful wizard after all. But all seven books say not a word about Harry Potter or even Hermione Granger knowing anything like that. When they face darkness they use Lumos, which works like a flashlight. And the film shows a completely unrealistic forest where you can see the scene - trees, for example. The screen doesn't turn into a black rectangle, but to be realistic it should.
When I visit my parents we sometimes walk the dog after dinner, and we go one of two routes. In the summer seeing is no problem, but in the fall or winter, as soon as we turn off the street with its street lights and past the lit church tower, the moon makes a massive difference.
You are right that if we walk on a moonless night it is near impossible to see anything, but on a full moon night the entire world is lit in ethereal silver. I won't say it is as lit as during the day, but I would have zero issue navigating at all.
It also takes only one look to understand why werewolves, silver and the moon are connected in folktales.
That scene had a full moon as a plot point. It's pretty easy to see around you in a sparse forest on a moonlit night. The only real issue is how quickly your eyes can adjust to darkness (e.g. they fire off spells and patronus charms that should be ruining their night vision).
Watching the bright and beautiful trailer for Wes Anderson's Asteroid City reminded me how wonderful it is to have a director who isn't interested in a pointless pursuit of "realism" in an inherently unreal medium, and instead joyfully and unashamedly embraces its artificiality and creates his own style, instead of following the herd. I would take him at his quirkiest and most cliched over another dark and colourless movie.
Sadly, this is a trend too. Lush, orchestral music with memorable themes, harmonic development, etc. is being phased out in favor of minimalist, repetitive, electronic sounds. To understand what that gives up, I recommend the "Listening In" channel e.g. https://youtu.be/iGN_5oNla_8
It's not just movies. There was an episode of The Mandalorian with cave scenes so dark that they were literally unwatchable for me with the shades open. You'd think people would have learned after the mess with Game of Thrones.
The new live-action Star Trek shows have taken to this madness as well. Star Trek: The Next Generation and Voyager were both bright, open shows. But you'd think with Discovery, Strange New Worlds, and Picard that we hadn't invented indoor lighting. Babylon 5 in its heyday was visually brighter than any of these three and no one can say that B5 was going for an "optimistic" plot worthy of such luminosity.
It's frustrating to watch and I empathize with people who feel like their cherished shows of before are being "ruined" by spin-offs or reboots.
When the Next Generation switched to movies, Generations did have that odd yellow filter though, which apparently made it more cinematic. While realism clearly doesn't have anything to do with this decision, the science fiction set of the bridge might just look fake when it's well lit. I miss the 90s look too.
As for people who say their shows are being ruined: the studios are still chasing younger audiences and mainstream popularity. A Starfleet Academy show was announced the other day, for example, so teenage drama here we come.
>While realism clearly doesn't have anything to do with this decision, the science fiction set of the bridge might just look fake when it's well lit.
How so? We are already used to Federation ships having a lot of lighting, so now, with the new cool style, they also need to somehow explain why Starfleet prefers to walk around in semi-darkness on the bridge, in the corridors, and in working spaces. I can maybe understand the "Game of Thrones or Mandalorian caves need to be dark for realism" excuse, but for Star Trek it is obviously a stylistic choice to adopt the cool new style and shit on the bright, optimistic one from the past.
I think the thing you're missing here is that Picard isn't about the optimistic Starfleet of the past; it's a lot darker and more insular than the old one. A key plot line throughout the first season was Picard's failure to prevent that from happening. Strange New Worlds' Enterprise is brightly lit, with a clear nod to the original series' design.
Season 3 is still dark on the bridge. It is a Starfleet bridge, not some alternative timeline, and they are bringing all the old crew back. I think it is OK so far, but I wish I could see the bridge.
This is based on no evidence whatsoever, but I think they just focus grouped the decision of how much lighting to use. The neon displays do pop a lot more in dim lighting so making things look futuristic and glowy is an important part of the shows.
Not sure; in Discovery, for example, when you go to the far future, Starfleet command is very well lit, and dark lighting was used before in Trek for Klingon ships or other shady aliens. IMO it's the people in charge who impose their aesthetic, and maybe it looks good on their giant home cinema. Anyway, the issue is that you can't see things; if they want to emphasize something, they need to do it without hurting accessibility. We also have Strange New Worlds and Lower Decks, where they manage to appear futuristic and still be bright.
> You'd think people would have learned after the mess with Game of Thrones.
Given how ridiculously dark House of the Dragon was in some episodes, with the entire season only set in candlelight, I don't think they've learned anything.
Thank you! I wish I knew one tenth of what mpv does! Every time I launch mpv I'm afraid to press any buttons, lest I inadvertently summon up an overlay showing air particle pollution in Yokohama.
You are right, and it works quite well in the dark months to just watch it later when it's dark outside, but that won't work for everybody.
I wonder how many people turn up the brightness on their TVs for that reason alone, ruining the experience for all the genuinely bright content, just because they have to see something in that darkness.
It's like they filmed it with no light source and figured they could fix it post production (and failed).
Or they filmed it in a bright setting and tried to use post-production to simulate a dark scene.
Same as you, I couldn't see what was happening.
I find I just automatically disengage when it's a "lots of scary/violent sounds and unidentifiable people committing violent acts against other unidentifiable people" scene and I feel like I'm somehow expected to be keeping track of who's hitting who and who's got the upper hand. I end up in this "ok, it's an indecipherable skirmish, can we just skip to what the results are" attitude.
So dark, I can no longer view them on an Apple iPad.
Couple that with some streaming services (looking at you, Netflix) always defaulting to a different audio track: while it is often in the film's original language, the audio gets bastardized with a commentary overlay, ruining the original movie experience. Often you cannot default to "original movie sound".
Oh, did I mention that the closed captioning has started ignoring the hard-of-hearing and deaf folks? They only transcribe the foreign language into open-captions. I mean, WTF? We folks kinda want ALL languages captioned.
Just the three kinds of darkness we are descending into … nowadays.
> They only transcribe the foreign language into open-captions. I mean, WTF? We folks kinda want ALL languages captioned.
Do you mean that when there's foreign language being spoken, the CC track just sits unused and only the on-screen translation for hearing audiences remains?
I assume that plays havoc with accessibility equipment? And from what I remember, open captions are often styled not to clash with the picture, so they're sometimes hard to read too.
Placement of closed captioning does not interfere with the viewing experience of folks without hearing loss. They didn't ask for it, and it always defaulted to "uncaptioned", as it should.
With closed captioning turned ON and set to English, the loss is the captioning of the English lines (in a non-English-language movie) when a foreign actor sporadically speaks English. A "normal" hearing English viewer would enjoy it just the same; the deaf and hard-of-hearing, not so much.
One thing that always gets me is kind of specific to the science-fiction and space-opera genres: why would anyone who builds a spaceship with a power plant that can light up a planet (and sometimes does) skimp on the lighting?
I get it. It's style, but it's the opposite of realistic, or motivated, lighting - one would imagine a workplace would be lit properly, using light-coloured walls to reflect more direct light and provide diffuse illumination.
In a similar vein, it seems these days that every time a character walks into a dark room to check something out (not just wandering around aimlessly), they fail to turn on the lights, which is something I think most people would do first. And not just horror films trying to scare you - for example the recent movie "Tár" has at least one scene just like this.
A bit like cellphones taking a long time to become integral to plots. Now I just tell Alexa to turn on the lights well before I get to where I want to look for something.
On a more fundamental level it seems like the reason that movies are dark [or long, or shaky-cammed, or mixed with near-inaudible dialogue, or…] is the same reason movies are such strong and specific cultural touchpoints: they are largely the vision and decisions of the people in charge, relatively unencumbered by prosaic concerns such as accessibility or even profit (at the very least, this last one is true per movie: a director whose films flop quickly stops being able to find funding, so the profit concern does assert itself, but even that director mostly gets to make the movie they want to make while on set and in the editing booth).
The proliferation of sequels trading on nostalgia, cinematic universes like Marvel etc., may seem like counter-examples but I’d argue they represent a directing team whose specific vision is making a lot of profit. Profit is an internal motive and the directing team’s wide latitude lets them pursue it, rather than profit being an external concern that steps in scene by scene and tells them to add more lights. (Also note that Marvel’s MCU is well-known for the very particular creative vision driving the entire franchise.)
When the decision-makers have a unique vision, we get cult classics, new schools like “lighting realism”, and also awful movies like The Room. When the decision-makers are influenced by trends, we get waves of shaky cam or dark lighting.
It's also the problem of "we'll fix it in post". Applies to sound, too.
Too many movie makers these days assume that any issues can be removed by CGI in post-production. But you can only do so much with the material you're given.
Oh, and post-production time and budget are often an afterthought, too.
You can’t leave projection technology out of it. Frequently the bulb in the projector is nearly burned out, or they left the 3D polarizer on, or the system is otherwise misconfigured. As for “home theater”, even people with a nice TV and sound system might not have perfectly controlled light. If you’re a guest in people’s homes, you’d better act like it.
I started using mpv, VLC, or Daum PotPlayer to manually adjust the brightness and contrast of TV shows and movies I'm watching. I stopped valuing the "filmmaker's choice" when I saw the analysis of how dark that beach scene in House of the Dragon was. [1] It's not just that consumer equipment is calibrated differently from color-grading equipment; that specific scene was graded to 1 nit of brightness. HBO later tweeted that it will not be fixed and is absolutely intentional. [2] The official recommended viewing condition for HDR according to the ITU and Dolby is an ambient 5 nits.
The artists aren't respecting, or grading with, consumer equipment in mind. I'm not getting an optimal viewing experience, or even the artist's creative intent, by trying to recreate the intended viewing settings. Dolby Vision still isn't completely supported when playing back a file on desktop. Certain DV profiles can be passed through to TVs that support it, but that's not possible for Blu-ray Dolby Vision files. Neither the Xbox nor the PS5 supports Dolby Vision disc playback. Many solutions for playing back Dolby Vision from a PC throw away the dynamic metadata, defeating the point of caring about DV over HDR10. This also ignores how colors can shift due to streaming compression, and some shows and movies are only available via streaming.
My TV can detect ambient light, and Dolby Vision IQ already exists, but in my experience it fails too often. Some would argue DV IQ also moves away from the filmmaker's intent.
This is something I noticed when I was younger, and I guess I’ve just adapted to it over time.
As a 10 year old, trying to watch a dark movie where the main characters whisper or speak in a low volume was basically impossible. I don’t know why, maybe my senses just weren’t that developed yet, but I struggled so much. I remember watching Harry Potter, and in that scene where Harry stands in front of the mirror with Dumbledore, I could barely see or understand anything, despite watching this in the cinema.
As I got older, I was able to perceive darker scenes and listen to whispered or hushed speech better. But it made movie watching way less enjoyable when I was younger. There were large chunks of movies where I basically just zoned out because they were too dark and or quiet. I’m curious if anyone else experienced something similar.
As an adult who probably has some hearing damage, I always watch with closed captions enabled now. There is SO much material that I have lost from mishearing a word here and there. Reading the script along with the audio helps me follow the storyline, as that "whispered or hushed speech" remains a cognitive problem for me.
Happy to hear it’s a stylistic choice and not a result of the change in recording, lighting, and projection equipment. I can’t stand it. At first I thought my theatre needed new projectors. But as I started to see this at home as well I thought maybe it was me. I verified with others that everything seemed dim and washed out. I started to wonder if it was the lighting - the big, heavy and hot lights have been replaced by LEDs. Could it be that? Are we doomed forever?
I think this style looks great in photography but it’s distracting when trying to watch a movie. I hope they find a new aesthetic.
I feel like it’s because tv sets have more dynamic range now, and cameras are more sensitive, so there’s less reason to compress the brightness values into a narrow, bright range. After all if the tv set has really dark darks and really bright brights you want those to pop. You want to use that if it’s available. If cameras are crappy and tv signals are noisy, then the lighting has to be really bright and the picture has to be really bright too in order to see it at all. We’re just used to a very low quality signal.
Why do streaming services like Netflix compress the fuck out of dark colours? I actually don’t have a bad TV, so I can watch these dark movies and TV shows, but the blacks are blocky, compressed messes. I have a good internet connection and all the rest of the video feed looks fine, but any dark areas just look awful.
Few people know that mastering gamma changed from 2.2 to BT.1886 (2.4) in or around 2018. All movies are now mastered for 2.4. This is a big reason; most people unfamiliar with production don't realize it.
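As a rough numeric illustration (a sketch, not a claim about any particular mastering pipeline): the same signal value produces less light under an effective display gamma of 2.4 than under 2.2, and the gap is proportionally largest in the shadows.

```python
# Compare the relative light output of the same normalized video
# signal under display gamma 2.2 vs. 2.4 (the BT.1886 exponent).
# Simplified: real BT.1886 also accounts for the display's black level.

def displayed_luminance(signal: float, gamma: float) -> float:
    """Map a normalized signal value (0..1) to relative light output."""
    return signal ** gamma

for signal in (0.1, 0.25, 0.5, 0.75):
    g22 = displayed_luminance(signal, 2.2)
    g24 = displayed_luminance(signal, 2.4)
    print(f"signal {signal:.2f}: gamma 2.2 -> {g22:.4f}, "
          f"gamma 2.4 -> {g24:.4f}")
```

Content graded on a 2.4 reference display and then viewed on a display behaving closer to 2.2, or in a bright room, no longer matches what the colorist saw, which is one way the mismatch shows up at home.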
Surprisingly, no mention of Barry Lyndon, for which Kubrick had special cameras/lenses designed so he could film scenes lit by candlelight without extra lighting. That movie plays like a series of paintings.
I’ve often fantasized about building a movie database that averages the lightness of all pixels in all frames to create a sortable score. I hate watching movies where for 120+ minutes it’s night or dark.
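The scoring part of that fantasy database is simple to prototype. A minimal sketch in pure Python, assuming frames have already been decoded (by e.g. ffmpeg or OpenCV, which is left out here) into rows of (R, G, B) tuples with 0..255 channels:

```python
def frame_mean_luma(frame):
    """Mean Rec. 709 luma of one frame (rows of (R, G, B) pixels)."""
    total = 0.0
    count = 0
    for row in frame:
        for r, g, b in row:
            total += 0.2126 * r + 0.7152 * g + 0.0722 * b
            count += 1
    return total / count

def movie_lightness_score(frames):
    """Average luma over all frames, scaled to a sortable 0..100."""
    lumas = [frame_mean_luma(f) for f in frames]
    return 100.0 * sum(lumas) / (len(lumas) * 255.0)

# Two toy "frames": one near-black, one mid-gray.
dark = [[(10, 10, 10)] * 4] * 3
gray = [[(128, 128, 128)] * 4] * 3
print(movie_lightness_score([dark, gray]))
```

Sorting a catalogue by this score would let you filter out the 120-minute night movies up front; sampling one frame per second rather than every frame would keep it fast.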
There's a solution here and I don't know why nobody's talking about implementing it.
First, some context: the problem of dark scenes is the same problem as unintelligible dialog is the same problem as a lot of classical music being quieter than pop music.
All of these creators are striving for greater range. Dark scenes work fantastically if you watch them on a bright screen in a pitch-black room. Dialog is perfectly intelligible in surround sound at full volume, and marvelously expressive. And classical music is delicately soft and then powerfully loud in an otherwise perfectly quiet room, to incredible effect.
But as soon as you're watching on a screen in a normally lit room, or listening on your TV speakers while people are talking in the kitchen, or trying to listen to classical in your car with the sounds of traffic... it all falls apart.
The solution is what audio engineers have known about for decades, which is called compression. Compression is: make the quiet parts of the classical music almost as loud as the loud parts. Boost the dialog channel so the mumbling parts are almost as loud as regular conversation. And boost the brightness of dark scenes. Compress the range -- go back to less range.
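A minimal sketch of the audio side of this, assuming a normalized signal in -1..1 (a static, per-sample compressor; real compressors add attack/release smoothing):

```python
def compress(samples, threshold=0.25, ratio=4.0):
    """Scale down the part of each sample's level that exceeds
    `threshold`, shrinking the distance between loud and quiet."""
    out = []
    for s in samples:
        level = abs(s)
        if level > threshold:
            # Only the portion above the threshold is divided by the ratio.
            level = threshold + (level - threshold) / ratio
        out.append(level if s >= 0 else -level)
    return out

quiet_then_loud = [0.05, -0.1, 0.2, 0.8, -1.0]
print(compress(quiet_then_loud))
```

After compression, the whole signal can be boosted (makeup gain) so the formerly quiet parts sit almost as loud as the peaks, which is exactly the "less range" effect described above.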
But we don't want to mess with the source material when viewing/listening in ideal conditions, because we want to keep that awesomeness. So we need dynamic compression. Which isn't really that hard.
A television can have an ambient light sensor just like your MacBook does, to boost dark scenes when you watch during the day, but not at night with the lights off.
A television can have a cheap microphone to detect ambient noise, and also be aware of its own volume setting, and so boost dialog at low volume settings and when it's noisy around, using the surround sound signal. (And a MacBook or iPad or iPhone can do this too.)
And your car radio or phone music player can apply dynamic audio compression to your classical music as well, similarly using a microphone to detect the need to compensate for the rumble of traffic or conversation in the coffee shop.
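For the video side, the ambient-light idea can be sketched the same way. The lux breakpoints and gamma values below are invented for illustration, not taken from any real TV:

```python
def shadow_lift_gamma(ambient_lux: float) -> float:
    """Choose a gamma < 1 to lift shadows as the room gets brighter."""
    if ambient_lux < 5:       # dark, home-theater-like room
        return 1.0            # leave the picture alone
    if ambient_lux < 100:     # dim living room
        return 0.9
    return 0.8                # daylight: lift shadows the most

def adjust_pixel(value: float, ambient_lux: float) -> float:
    """Apply the lift to one normalized pixel value (0..1)."""
    return value ** shadow_lift_gamma(ambient_lux)

# A 5%-signal shadow detail, in a dark room vs. daylight:
print(adjust_pixel(0.05, 2))    # unchanged
print(adjust_pixel(0.05, 500))  # visibly lifted
```

A gamma below 1 raises small values much more than large ones, so shadow detail becomes visible without blowing out the highlights.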
What baffles me is that all of this is possible right now, and it's actually trivial to implement, relatively speaking. (There's a bunch of tuning involved to make it work well and feel perceptually natural, but it's not like we need to invent new types of signal processing or anything.)
But nobody's even talking about dynamic compression as the solution. And I just don't get it. Why not?
Dolby Vision IQ is supposed to do this. No idea if it works or not, but I did just read a comment above here from someone complaining that it doesn't work well.
My AVR has "loudness management" which compresses the audio. It just doesn't work transparently so I find it irritating because it's obvious when it's changing the volume. My partner does prefer it though, but I'm more of a purist.
So the answer to your question seems to be that it doesn't work well as a solution.
I wonder how far away we are from AI real-time user-defined filters? Enhance the lighting and dialog, reduce shouting, and especially a shot stabilizer to remove the "Blair Witch" camera jitter that every single movie / TV show seems to have used in the last 10 years.
Low light might help with designing digital backgrounds, since low dynamic range makes for simpler lighting design, i.e. you can get further with a few diffuse sources in the computer model. That speeds up rendering versus high dynamic range and more sources.
So it's cheaper and faster and movie making is a business.
The entire movie making business has become a race to the bottom, but in a bad way. I always assumed that with the advancements in CGI, we'd get better and better looking movies, released more often. Boy, was I wrong.
Instead, CGI became a lazy way out for quick and sloppy production quality, even in established franchises from studios with deep pockets. In the first Iron Man and Spider-Man movies, the heroes actually wore physical suits painstakingly made by artisans in the business. In the latest Marvel movies the costumes are just lazy CGI that looks like textures over a spandex suit instead of behaving like physical items over a human body. Terminator 1 & 2 had practical models and they still look great.
"Oh, we don't need to finalize the real costumes and models now, just dress the actors in gray spandex with triangles, put them in front of a green screen, and we'll CGI everything later on before release". "It's OK, we don't need to bother having correct lighting and shadows in the studio, the CGI monkeys will just fix it in post with their computer thingies."
CGI now somehow looks worse than VFX 10-30 years ago despite the insanely better tech, everything just feels rushed with no creative direction behind it.
> CGI now somehow looks worse than VFX 10-30 years ago despite the insanely better tech
"Toupees always look so fake. I can spot them instantly. I can't imagine why anyone wears them."
Maybe that example is less pointed now that baldness seems more socially acceptable, but the point stands: Of course you see bad CGI, in that you notice that the CGI you're intended to notice is sometimes flawed. You don't see CGI the moviemaker doesn't want you to see, CGI used to clean up shots, CGI used to replace what would have been matte shots decades ago, and whatever else CGI is used for these days. You see a scene and it registers as a scene, not a shot with some obvious CGI composited in.
Recursing, some matte shots were horrible and obvious, whereas others, like Dana Barret's apartment building in Ghostbusters, were pretty seamless.
I'm gonna be honest. I still think that Star Wars 4/5/6 looks so much more authentic than 7/8/9 because it didn't have as much CGI capability, so the use of models etc. was more important.
1/2/3 are the worst examples, as the CGI quality was particularly bad. Let's not even start on Jar Jar.
Something that Joss Whedon said in the commentary on Firefly and Serenity was that they deliberately took care to make the CGI bits a bit crappy - no perfect tracking shots of ships in space, but a bit of wild panning to catch the action, not quite getting everyone, a crash zoom out. The whole "cinema verité, cameraman caught on the back foot" thing. Even a lot of the "exterior mounted cameras" were done to look a bit wobbly and grainy, because it would be hard to shoot if it was real so you'd model it like you'd just sucker-mounted a VX2000 to the side of your ship and shot on DV as best you could.
It's surprising how it makes it more immersive than perfection.
Things like the new Marvel movies leave me a bit cold because there's no sense of "how did they do that", because the answer is always going to be "green screen and Blender".
> Things like the new Marvel movies leave me a bit cold because there's no sense of "how did they do that", because the answer is always going to be "green screen and Blender".
Interesting comment, because while watching Avatar: The Way of Water, I was constantly asking myself "how did they do that?" Technically, the answer is of course "green screen and computers", but that's about as useful as watching a great performance in acting or sports and saying "it's just technique and practice". No, it's that, but while trying to make new things that you haven't seen before, paying attention to every detail - and in the case of Avatar, also doing that for 3 hours straight.
I think the modern Marvel movies leave me cold because of the same issues that the story for Avatar left me cold: it's all easy and digestible, executed with perfect competence but little ability or desire for surprise, subtlety or discomfort.
You can see this with the Lord of the Rings (Fellowship of the Ring, Two Towers and Return of the King).
The Hobbit is much newer but doesn't look so great in comparison. Sure, it has some amazing special effects, but the cinematic footage doesn't hold up as well next to the older films.
The LotR trilogy from over 20 years ago, even with its CGI usage, still feels real today, and all that realism lets you immerse yourself in the movies - it doesn't even have to be the extended edition. It somehow strikes a good balance, even if the effects feel dated nowadays.
But with the Hobbit trilogy, the whole time I had the feeling I was watching it from behind some plastic screen, or perhaps playing an action game. It's not bad (though the book was "spread" into 3 movies), but the excessive use of CGI and post-processing is clearly visible. Hell, An Unexpected Journey has moments where Gandalf's face is unnaturally lightened up as if with a dodge tool - and that does give me unpleasant feelings.
I still believe that computers in the movie industry should be tools to enhance camera work, not fully replace it. You can deliver an exciting story without overfilling it with CGI or... lens flares and bokeh effects in every scene.
Thank you for writing that. I loved the LoTR films and as you say they felt real and natural whereas I've only seen about 15 minutes of the recent Hobbit and I couldn't watch any more. I couldn't put my finger on it, but I just couldn't get into how it looked.
Very heavy on the old man nostalgia glasses. Yes, the very best VFX in the past stands its ground. Taste is still not optional.
If you are however claiming that, on average, visual fidelity has declined in comparable product tiers, well, no. In the past a lot of folks would just not do much FX at all, because it was prohibitively expensive.
> CGI now somehow looks worse than VFX 10-30 years ago
It's somewhat incorrect. We now have significantly more CGI, and most of it significantly better. See, e.g., Why C.G. Sucks (Except It Doesn't) https://youtu.be/bL6hp8BKB24
Movies have dark scenes nowadays mainly because it is a trend. On top of that dark scenes can have practical advantages (set building, VFX, lighting, etc. can be reduced or become much simpler to do which directly translates into money saved during shooting).
If I had to guess, the trend of dark scenes is a direct result of the fact that in the past two decades our digital sensors got good enough to actually shoot in such low-light environments.
Before that, film crews were typically shooting day-for-night, which meant waiting for a set of specific weather conditions and then hoping that grading things blue-ish would sell the thing as being shot at night.
Another aspect is the much higher brightness and contrast (=dynamic range) of today's displays and projectors. Back in the day you had to literally use the whitest white and the darkest dark available to create a readable picture (and you had to do a ton of lighting to squash the extreme dynamic range of a real world environment into the small dynamic range of your target medium). As this dynamic range became bigger it became possible to not use the whole range and still have good looking results. So in that line of thinking pictures became darker because they can be.
Not that I defend the whole thing; sometimes dark pictures with high contrast can be good and very readable, and sometimes it is used as an excuse to not do the proper work.
Thing is, the eye adapts to dim light very well. So an old-school "night" scene may actually better represent what it feels like to be there.
While a modern "night" scene might look like what a dim area looks like in the first 10 seconds before your eyes adapt: dim, gray-ish.
Curve-ball:
what's your take on the dynamic range in over-exposing film vs under-exposing digital?
Depends what you're after and what digital camera you are using (and even which ISO you are using on that specific camera). But more broadly IMO digital has reached dynamic ranges that are "good enough" so you don't have to worry about it in most situations. And if shot correctly digital can look exactly like film.
The major difference for me is the way you work with the material. Actors and everybody else tend to be a tad bit more concentrated when they know how many €/s are running through that camera, unless you have an infinite amount of film stock and the funds to develop and digitize it.
But whether that is truly worth the stress of changing a roll of exposed film in the wild during bad weather and praying the material survives is another question.
You should actually overexpose digital (without blowing any highlights) to maximize dynamic range.
Our eyes perceive brightness logarithmically — given something emitting P photons perceived as brightness B, something emitting 2P photons will be perceived as brightness B+1, while something emitting 0.5P photons will be perceived as brightness B-1.
Image sensors are linear and discrete. So going from R,G,B=1,1,1 to 2,2,2 represents a doubling of photons captured, and thus will be perceived by the eye as going from B to B+1. But 2,2,2->4,4,4 will go to B+2, 4,4,4->8,8,8 to B+3, etc.
Thus there is only one code value covering the stop from B to B+1, two code values from B+1 to B+2, four from B+2 to B+3, and 2^(N-1) from B+N-1 to B+N. That's why you want as much brightness information as possible close to saturating the sensor, since that's where the most code values - the most effective bits of precision - are.
This is called “exposing to the right” [0].
[0] https://digital-photography-school.com/exposing-to-the-right...
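To make the arithmetic concrete, here is a sketch assuming an idealized linear 14-bit sensor (the bit depth is an arbitrary example): the brightest stop alone covers half of all code values, which is why pushing exposure toward saturation preserves the most tonal precision.

```python
SENSOR_BITS = 14
FULL_SCALE = 2 ** SENSOR_BITS  # 16384 linear code values

def code_values_in_stop(stops_below_saturation: int) -> int:
    """Number of linear code values covering a given stop, counting
    down from sensor saturation (stop 0 is the brightest stop)."""
    return FULL_SCALE // (2 ** (stops_below_saturation + 1))

for stop in range(6):
    print(f"{stop} stops below saturation: "
          f"{code_values_in_stop(stop)} code values")
```

The bottom stops share only a handful of values between them, which is where shadow banding and noise come from when underexposed footage is pushed in post.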
Since storage is cheap, I’d rather just bracket the shots that need it than expose to the right and risk ruining a good shot by losing some highlight detail.
And that's if you're watching it in a theater-like setting. If you're in a living room with any lights on, the screen is so dark that little comes through but the reflection of the room. After all these years I still do not understand why glossy screens are so popular.
if it's glossy, you can move so the reflection is completely out of the screen
if it's matte, it'll always look a little frosty, right?
reasonable tradeoff either way, but for home theater, i can see why sleek black sells (not to mention how garbage matte would look in a bright store)
What? That's really not how reflective surfaces work. When the screen is dark, I can see the walls, the ceiling or the floor, the bookshelves... moving around does not help, it only changes what's in the reflection.
In my experience the best movies are made by people with passion. But those cannot be made if these people don't get the money, or have to bend over backwards and betray their ideals to get it.
For example, in movie adaptations of known franchises, changes are made just for change's sake so that the writers can show off how smart they are. It would be LESS work to just follow the source material. To be clear: I am NOT talking about changes to better fit a specific medium. I am talking about "look how smart I am, you couldn't predict this twist"-kind of writing that does not improve the story. Still, ego-driven writing seems to be very much accepted?
Same with lighting. No viewer has ever complained about unnatural lighting. This stuff is only made to impress peers.
Why is this kind of behavior so common in the industry?
The way of least resistance in an industry controlled by capital is to cave to the interest of the capital.
I explicitly write "internet" because that's what can be measured. And those studios probably have a metric for that and writers go full Campbell's Law on it. I write Campbell's, not Goodhart's, because I suspect that it still works just fine, economically.
(I think I know which kind of movies you mean and I don't feel drawn to those at all. Perhaps in part because of all the "clever change" drama that I read about and that I know would be completely lost to me, so maybe it's not that far from Goodhart's after all?)
But the source material might be less marketable and therefore less profitable. So it might be less work, but it would also mean less money, hence the changes.
Likewise, I'm not sure that (from the audience's perspective) one can so easily infer simple and exact reasons for why an adaptation differs from the source material.
I'm sure many of the people hired to work on a film, adaptation or otherwise, do want to have an impact on the production, or 'leave a mark'. Deviating from the source material might even be the path of least resistance.
This can improve engagement, with fandoms/audiences picking over little details trying to "beat" the next twist. It keeps people talking about your movie/series, which helps drive more eyes to it.
Franchise movies are cash-grab paint-by-numbers productions. You can already rule out any kind of passion from anyone involved. Writing changes are made to appeal to an international audience, not to look smart.
apparently marketing departments seem to think it’s everyone… so many of the movies, games, etc studios have been pushing over the past few years have been rehashed IP over and over again.
The top ten highest grossing films are all franchise movies. So that looks like a good indication of what the public wants to watch. If you look at the rest, there's a decent number of non-franchise films that studios financed but audiences evidently didn't show up for in such large numbers.
If franchise and rehashed-to-death-IP films all had more marketing, wider distribution, and more famous actors/crew, of course they’re going to get a higher viewership.
That's hilarious considering how bad the average writer seems to be these days.
That aside: Any commonly used digital cinema camera today has 13+ stops of dynamic range without even considering any special HDR modes.
When they shoot dark scenes that leave me wondering who is talking, what they're doing, why they said that, it's a profound failure of the director to tell a minimally functional story.
The other failure is mumbling actors. They whisper when they don't understand what emotion to show. They rush their lines to show passion but instead show lack of fundamental acting skill. They turn from the camera so I can't see their face, say something to the wall, leaving me without necessary facial clues, leaving me to puzzle out their accent or whatnot.
I blame it on falling standards of professionalism, driven by an insatiable demand for content.
There are entire scenes where, if you watch it in the average person's living room, the only thing you get from the movie are the subtitles (that you set because you can't hear them talking).
There's darkness when it's appropriate. There's light to see what Nolan wants you to see. The fights are both well-choreographed and recorded and cut well -- not the flurry of a thousand ultra-tight cuts that waste the talent of everyone involved.
But the dialogue is about 9 dB under what it should be, and only subtitling makes any of it understandable except for a few scenes. My sound system is well-calibrated and powerful: it's the movie's fault.
Christopher Nolan is to cinematic audio what Chris Gaines is to rock and roll.
This may be so, but shows used to have 22 episodes in the US, now series have 8 or 10. So far, far less work yearly, less cramming to get shooting/processing/writing done.
You'd think this would give more time for better takes, directing, acting.
Guess not.
It stunned me to see, e.g., what actors wore in Game of Thrones in later seasons, when the budget was high. A robe that was on screen for one minute max, and which I didn't notice at all because I was focused on the actress's face and the scene, had such details and beautiful ornaments sewn in by hand, a crazy amount of work to create one unique piece just for that scene. Multiply by X actors there. It must have cost a small fortune to make, even for such a well-oiled machine as Hollywood. Compare that to, e.g., The Streets of San Francisco, with the guys wearing the same basic suit over the whole series.
You're comparing one of the most expensive series ever with a low-cost series.
And even action series, in the old days, would film 60 min. episodes with live action, explosions, and car scenes; really, things are very cheap now with CGI.
Imagine having to retake a scene with real explosions?
A big part of making shows with 22-24 episode seasons was having a cast of circa nine main characters, so you could film three of them at once.
Audiences generally respond better now that there is no network pressure to deliver half a year's worth of new episodes per annum, with shows where their favourite characters are the main characters every week.
Cinemas have enough speakers and processing to create dozens of virtual speakers.
The result is that the speech is given its own location and segment of frequencies.
The brain can easily pick it out in this setting. Think hearing your partner speak at a dinner party.
Then people watch a movie on a thin TV with two backwards-facing speakers and wonder why they can't hear anything!?
It's like watching a 3D movie with no glasses and wondering why it's blurry.
> But then other videos don't have the problem.
They likely would if they were intended for playback on a cinema system and just flattened to mono or stereo.
Christopher Nolan doesn't direct/produce movies thinking about how they'll look on a phone on Netflix. And the sound engineer he hired doesn't care either.
How much time and money was dedicated to the sound engineering to convert that 20 speaker maybe 100 virtual channel system to your 2 speaker system? Then they need to do it for 2.1, 5.1, Atmos... Whatever else is the new thing these days.
And if you have 5.1 did you consider carefully the layout? Are there multipath effects you haven't dampened? Did you consider the path to each speaker relative to listener?
> Does my sound setup know somehow?
Well yes, kind of. Your TV, if it is at all modern, will do some processing to the audio. If it's connected to a 5.1 system it might just pass 5.1 through, but what does it do with stereo audio? Or 2.1? If it gets 7.1 it must decide what to do; likely you can control this in the settings.
Let's say you have a 5.1 setup at home, and your TV sends it stereo. What does it do then, how does it make 6 channels from 2? The subwoofer is easy, since it's relatively non-directional: a low-pass filter sends all the low end to it, and a high-pass sends the rest to the regular speakers.
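That bass-management split can be sketched in a few lines. This is a toy illustration only; the 80 Hz crossover point and 4th-order Butterworth filters are my assumptions (common defaults), while real receivers use steeper, phase-matched filter banks:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def bass_manage(stereo, fs=48000, crossover_hz=80.0):
    """Split a stereo signal into high-passed mains and a subwoofer feed."""
    lp = butter(4, crossover_hz, btype="low", fs=fs, output="sos")
    hp = butter(4, crossover_hz, btype="high", fs=fs, output="sos")
    sub = sosfilt(lp, stereo.mean(axis=1))   # summed lows -> the ".1"
    mains = sosfilt(hp, stereo, axis=0)      # everything else -> L/R
    return mains, sub
```

Deriving the surround channels from 2 is the genuinely hard (and lossy) part; the sub is the easy one.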
My TV has a decent "clear speech" setting which picks out speech, boosts it, and then suppresses the non-speech audio around it. Usually it works perfectly.
Playing stereo audio on a stereo system will work fine.
Nolan himself has tried to make this argument, saying that he's going for sub-bass frequencies, that his goal was never for every word to be understandable, and that it was an artistic choice that maybe the audience doesn't have to understand everything.
I'm all for artistic integrity and him being able to do whatever he wants, but I'm also ok with calling Nolan wrong on this one. I have no interest in seeing his films if audio clarity is not one of his goals in film-making. It breaks immersion (real life isn't unintelligible and mumbled), it isn't an enjoyable experience for me, and frankly I think is poor decision making on his part. I'm ok with calling Nolan wrong, hand-wavy artistic stuff aside.
So I looked one up
https://www.theguardian.com/film/2020/sep/03/tenet-dialogue-...
Interestingly in the case of Tenet the claim is that cinemas intentionally were playing the movie too quiet.
It brings up the usual point that audio mixing for cinema and for home is vastly different, but puts Tenet down to cinemas turning it down to avoid loud explosions.
So is this a case of cinemas turning down well mixed movies because movies are coming out too loud?
But really this is just the same problem over again. The sound engineer mixes audio intending it for a high-end cinema. Some cinemas play it poorly, and people watching at home have no chance to play it properly...
The sound engineer interviews in that article are interesting though, and they seem to be blaming Nolan and the mixing as well.
I also get wanting to turn them down. I've been in theaters so loud that my ears were ringing after the movie, or were actively hurting during certain scenes. The Dark Knight was one of those. It could also just be poor settings at the local cinema in that case as well though.
Given the vast majority of movies are understandable, I just wish Nolan would mix with that goal in mind given I would actually enjoy his movies most of the time if it just weren't for the muffled speech and sound.
In the 90s, cutting-edge cinema audio was 5.1 systems (cinema digital audio); before that, 2.1. You own a 5.1 system, you say.
Today cutting edge is systems like Dolby Atmos Cinema which is a 128 virtual, 64 real speaker system.
Are you really saying you can't grasp that cinemas have moved on in audio system since the 80s?
As long as the receiver has identified the positions of your speakers accurately (something receivers have been capable of with a small microphone since the early 00’s) it should place sounds appropriately.
Perhaps suddenly the very quiet speech the sound engineer carefully made space for in location and frequency is suddenly muddied?
Must I have my own IMAX theater?
Actually, I saw some shitty Nolan movie in the cinema and the audio sucked there as well.
People need to be able to see and hear (the audio side is arguably getting even worse) what is happening. That is the absolute baseline for making a good product.
This is the equivalent of making a website where people struggle to read the text and the designer yelling that it is not made to be viewed on cheap mobile phones. Even in our current world of user-hostile web design, most professional designs try very hard to be accessible on a very large range of devices.
And no, "naturalism" has nothing to do with it. I don't think unnatural lighting is such an immersion breaker compared to the horrible CGI that people are forced to make on crunch time. But sure, if you want to only use natural lighting, go ahead, BUT you still need to make sure the scenes are well lit. There are lots of avant-garde movies with natural lighting that are well lit.
This means that you either have to create for the lowest common denominator or accept that what they get is shit. As you said, we would expect to be chewed out if we did that, but for some reason films are different.
To be honest I can see it both ways, as I don't have a soundbar or an oled display, but at the same time why should that hold back those who have paid 4 or 5 times the cost of my setup? If what I have is crap, shouldn't I expect that movies are not looking very good?
You can do some truly amazing things with just audio[0], but it does require a certain dynamic range on the receiver.
As an example: https://darkerprojects.com/lostfrontier/st-lf-season-1-ep-00..., which is a podcast radio play.
In some cases, yes, the sound setup helps. In other cases, such as actors "mumbling", the sound setup does nothing.
Gibberish is just gibberish.
Of course there are a couple of actors that make it harder: Marlon Brando, or people who speak with a pronounced Southern accent. But in general I keep hearing this complaint from people who are using the built-in speakers.
And you’re right, many people don’t have the things you need to make it sound good. It’s almost like they should have a mastering for bad TVs with crap audio.
Where is the monopoly?
Disney, Netflix, Amazon and so on.
(And yes I am aware that technically they would be an oligopoly. I am purposely using the broader, more practical definition of monopoly that is useful for describing the real world.)
And there is zero or near-zero friction in consuming pretty much all of this media, thanks to broadband internet and capable computing devices in everyone's pocket, other than the price you are able and willing to pay.
I would not classify it as a "monopoly" or an "oligopoly", at least in the sense that anyone is restricting viewers from watching what they want or anyone is restricting media creators from making what they want.
Isn't that their job description? Which filmmaker doesn't fit this? The best art is often made by an 'insane' person.
http://rubenrevecoarte.blogspot.com/2016/02/un-poco-mas-de-r...
EEEEELÉH!... WHERE DID YOU EVER SEE A REVOLVER FIRE SO MANY BULLETS WITHOUT RELOADING? A LITTLE MORE REALISM, COME ON!
WELL, IN THAT CASE IT'S NOT TIME TO BE SHOOTING EACH OTHER IN SOME SUPPOSED ARIZONA GORGE EITHER, BUT TO GO DRINK OUR MILK
REALISM, I SAID, NOT REALITY
> heeey! where have you seen a revolver fire so many bullets without a reload? a bit more realism, c'mon!
> well, in that case this isn't time to be shooting each other in a supposed "Arizona gorge" either, but to have tea[1] instead
> realism, I said, not reality...
[1] "time to drink milk" to kids means it's time for an afternoon snack
It was so genre savvy and ahead of its time imo.
You can’t make too many movies like Scream though. It has to be the oddity that exists apart from the rest to really shine.
The frame displayed there is obviously too bright, and this is the point.
> It’s a purely stylistic choice, employed for that one moment to cast doubt on Billy’s trustworthiness in the audience’s mind. It’s an extremely stagey choice that fits neatly within the larger series’ heightened, melodramatic style. Scream wouldn’t really be Scream without it.
It doesn't mean that he's ignorant of the fact that a camera sees differently than our eyes do, as you wrote above. The scene is not bright because of this fact. It's bright for artistic reasons, and you could describe it as overcompensation, since it wouldn't even have to be that bright to deliver a facial expression.
Another one that irks me is shallow depth of field.
I appreciate the deep-focus cinematography by the likes of Kurosawa and Welles all the more when I see modern filmmakers making 80% of their frame blurry on purpose.
Fads are cyclical, so I was hoping that after Zack Snyder brought the style to its logical unwatchable conclusion with his “Army of the Bokeh”, cinematographers and directors would move on, but I guess it’s not that time yet.
This is even more annoying in YouTube videos. People are so obsessed with shallow depth of field that even in product reviews they focus on the face and leave the product blurry.
I can't find it right now, but there's a video by Tony Northrup where he's reviewing a camera and he's so obsessed with filming himself at f1.4 that his face is barely in focus half the time because of the failed focus tracking.
Personally I don't agree with the take of the article in two ways. The first one is that it gives way too much default credit to realism as a stylistic choice. Movies are works of fiction and the creator has total artistic freedom in what aesthetic to go for, and being 'realistic' isn't necessarily a good thing, you still need to make the case for it.
Secondly I think the bigger reason is a trend towards grittiness, bleakness or a sort of 'scandi-noir crime drama' look. People go for that mood not just visually but also in terms of writing, muted mumbly dialogue, minimalism, and so on.
Villeneuve's Dune, which is mentioned in the article, is to me, despite its apparent popular appeal, a negative example of this trend. The movie is overly bleak and oppressive, cold and distant in a way that many of his films are, and heavy on visual stereotypes (the hairless, pale, black-dressed 'brutish' Harkonnens, a caricature that the books deliberately avoided).
These days for some reason studios forego that process and "meh, good enough, stream it".
From an enthusiastic GoT fan back then: I don't really know where the author got that "episode after episode" from. I remember exactly one episode, where overdark scenes were an issue, S8E3 "The Long Night" - and in the episode, darkness was a plot device and a deliberate stylistic choice.
There was an unfortunate scene in which the characters were unable to make out whether the enemy was approaching or not, due to intense darkness. The filmmakers chose to visualise this by making the scene almost pitch black, with literally not enough information in the pixels to see the characters. I'm sure that must have looked impressive in a test screening in a cinema, but when streamed, it caused lots of viewers to be distracted by their own reflections and turn up their TVs to maximum brightness - because the scene sort of looked as if you should be able to see something, even though really you weren't.
I think that was a visual experiment by the filmmakers which, frankly, failed - but it was at least a deliberate choice and not blindly following some trend.
I thought it played nicely with the use of light to stylise the rest of the world - Westeros with its bright but slightly murky look, like a busy city in Central Europe, Winterfell, The Wall, and the north with its flat grey shadowless light from an overcast sky, and the searing oranges and yellows of the desert-y parts of Essos that looked and felt hot to look at.
One example is the recently released adaptation of Great Expectations[1]. It's not just the overall darkness, but general lack of colour which is striking to me. I think this is just an artistic style and the next stage of colour grading from the Teal & Orange[2] trends of a few years ago.
You get used to it when watching a programme and forget what full colour actually looks like. Watching old films, something like the 1962 film Lawrence of Arabia[3], highlighted it for me and really shows how much colour has been drained from modern film & TV productions.
[1] https://www.bbc.co.uk/mediacentre/2023/great-expectations-ai...
[2] https://theabyssgazes.blogspot.com/2010/03/teal-and-orange-h...
[3] I think the recent remaster releases may have some colour grading in them too, though; just looking at some scenes, they are darker!
Have you ever tried to see anything at night in a winter forest? It is impossible. Maybe a full Moon can help you see something, I don't know really, but my experience tells me that the only thing you can see without an artificial light is darkness and (if you are lucky) a few stars blinking through the branches of trees (which are invisible by themselves because they are totally black against a black background).
I always laughed at the scene where Harry Potter follows the silver doe to a river and Ron Weasley finds him without Harry noticing (Harry didn't see any lights approaching), while Severus Snape secretly watches them. Ok, I can write off Snape, he is probably using some magic to see in the dark, he is a very powerful wizard after all. But all seven books say not a word about Harry Potter or even Hermione Granger knowing something like that. When they face darkness they use Lumos, which works like a flashlight. And the film shows a completely unrealistic forest where you can see the scene. Trees, for example. The screen doesn't turn into a black rectangle, but to be realistic it should.
You are right that if we walk on a moonless night it is near impossible to see anything, but on a full moon night the entire world is lit in ethereal silver. I won't say it is as lit as during the day, but I would have zero issue navigating at all.
It also takes only one look to understand why werewolves, silver and the moon are connected in folktales.
https://en.wikipedia.org/wiki/Dogme_95
See rule 2.
Dialog being mumbly and video being dark doesn't follow from actual realism, it's a stylistic choice.
It's frustrating to watch and I empathize with people who feel like their cherished shows of before are being "ruined" by spin-offs or reboots.
As far as people who say their shows are being ruined, the studios are still chasing younger audiences and mainstream popularity. A starfleet academy show was announced the other day for example, so teenage drama here we come.
How so? We are already used to Federation ships having a lot of lighting, so now with the cool new style they also need to somehow explain why Starfleet prefers to walk in semi-darkness on the bridge, in corridors and working spaces. I can maybe understand the "Game of Thrones or Mandalorian caves need to be dark for realism" excuse, but for Star Trek it is obviously a stylistic choice to use the cool new styles and shit on the bright, optimistic one from the past.
Given how ridiculously dark House of the Dragon was in some episodes, with the entire season only set in candlelight, I don't think they've learned anything.
I wonder how many people pull up brightness on their TVs only for that reason and ruin the experience for all that actually bright content just because they have to see something in that darkness.
Couple that with some video streamers (looking at you, Netflix) always defaulting to a different audio track: even when it is in the film's original language, the audio gets bastardized with a commentary overlay on top, ruining the original movie experience. Often you cannot default to "original movie sound".
Oh, did I mention that the closed captioning has started ignoring the hard-of-hearing and deaf folks? They only transcribe the foreign language into open-captions. I mean, WTF? We folks kinda want ALL languages captioned.
Just the three kinds of darkness we are descending into … nowadays.
Do you mean that when there’s a foreign language being spoken the CC track just sits unused and only the on-screen translation for hearing audiences remains?
I assume that plays havoc on accessibility equipment? And from what I remember, open captions are often styled to not clash with the picture, so they’re sometimes hard to read too.
With closed captioning turned ON and set to English, it is the loss of English captioning, in a non-English-language movie, when a foreign actor sporadically speaks in English. A "normal" English viewer would enjoy it just the same; the deaf and hard-of-hearing, not so much.
I get it. It's style, but it's the opposite of realistic, or motivated, lighting. One would imagine a workplace would be lit properly, using light walls to reflect more direct light and provide diffuse lighting.
The proliferation of sequels trading on nostalgia, cinematic universes like Marvel etc., may seem like counter-examples but I’d argue they represent a directing team whose specific vision is making a lot of profit. Profit is an internal motive and the directing team’s wide latitude lets them pursue it, rather than profit being an external concern that steps in scene by scene and tells them to add more lights. (Also note that Marvel’s MCU is well-known for the very particular creative vision driving the entire franchise.)
When the decision-makers have a unique vision, we get cult classics, new schools like “lighting realism”, and also awful movies like The Room. When the decision-makers are influenced by trends, we get waves of shaky cam or dark lighting.
Too many movie makers these days assume that any issues can be removed by CGI in post-production. But you can only do so much with the material you're given.
Oh, and post-production time and budget are often an afterthought, too.
The artists aren't respecting, or grading with, consumer equipment in mind. I'm not getting an optimal viewing experience, or even the artist's creative intent, by trying to recreate the intended viewing settings. Dolby Vision still isn't completely supported when playing back a file on desktop. Certain DV profiles can be passed through to TVs that support it, but that's not possible for Blu-ray Dolby Vision files. Neither the Xbox nor the PS5 supports Dolby Vision disc playback. Many PC playback solutions for Dolby Vision throw away the dynamic metadata, defeating the point of caring about DV over HDR10. This is also ignoring how colors can change due to streaming compression, and some shows and movies are only available via streaming.
My TV can detect ambient light, and Dolby Vision IQ already exists, but in my experience it fails too often. Some would argue DV IQ also moves away from the filmmaker's intent.
https://www.youtube.com/watch?v=D83SXcguwBU
https://twitter.com/HBOMaxHelp/status/1576793465010135040
https://www.trustedreviews.com/explainer/what-is-dolby-visio...
As a 10 year old, trying to watch a dark movie where the main characters whisper or speak in a low volume was basically impossible. I don’t know why, maybe my senses just weren’t that developed yet, but I struggled so much. I remember watching Harry Potter, and in that scene where Harry stands in front of the mirror with Dumbledore, I could barely see or understand anything, despite watching this in the cinema.
As I got older, I was able to perceive darker scenes and listen to whispered or hushed speech better. But it made movie watching way less enjoyable when I was younger. There were large chunks of movies where I basically just zoned out because they were too dark and or quiet. I’m curious if anyone else experienced something similar.
"The same place as the music"
https://twitter.com/siracusa/status/1122834984228806656?t=_H...
I think this style looks great in photography but it’s distracting when trying to watch a movie. I hope they find a new aesthetic.
See also: https://en.wikipedia.org/wiki/Loudness_war
The article just rules this out without any citations. CGI is outsourced to lowest bidder and it looks as it looks.
The Batman scene shown in article reeks of cheap CGI, probably on a green screen.
(It's magnified in a living room that I can't truly darken during the day).
OLED has such dark blacks that what used to be dim is now no-light-at-all-black.
HDR also seems to affect things, but I'm not certain if and how much it contributes to the problem (or is a solution).
First, some context: the problem of dark scenes is the same problem as unintelligible dialog is the same problem as a lot of classical music being quieter than pop music.
All of these creators are striving for greater range. Dark scenes work fantastically if you watch them on a bright screen in a pitch-black room. Dialog is perfectly intelligible in surround sound at full volume, and marvelously expressive. And classical music is delicately soft and then powerfully loud in an otherwise perfectly quiet room, to incredible effect.
But as soon as you're watching on a screen in a normally lit room, or listening on your TV speakers while people are talking in the kitchen, or trying to listen to classical in your car with the sounds of traffic... it all falls apart.
The solution is what audio engineers have known about for decades, which is called compression. Compression is: make the quiet parts of the classical music almost as loud as the loud parts. Boost the dialog channel so the mumbling parts are almost as loud as regular conversation. And boost the brightness of dark scenes. Compress the range -- go back to less range.
But we don't want to mess with the source material when viewing/listening in ideal conditions, because we want to keep that awesomeness. So we need dynamic compression. Which isn't really that hard.
A television can have an ambient light sensor just like your MacBook does, to boost dark scenes when you watch during the day, but not at night with the lights off.
A television can have a cheap microphone to detect ambient noise, and also be aware of its own volume setting, and so boost dialog at low volume settings and when it's noisy around, using the surround sound signal. (And a MacBook or iPad or iPhone can do this too.)
And your car radio or phone music player can apply dynamic audio compression to your classical music as well, similarly using a microphone to detect the need to compensate for the rumble of traffic or conversation in the coffee shop.
What baffles me is that all of this is possible right now, and it's actually trivial to implement, relatively speaking. (There's a bunch of tuning involved to make it work well and feel perceptually natural, but it's not like we need to invent new types of signal processing or anything.)
But nobody's even talking about dynamic compression as the solution. And I just don't get it. Why not?
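For what it's worth, the static core of such a compressor really is only a few lines; the "dynamic" part is just driving its parameters from an ambient-noise or ambient-light estimate. A minimal sketch (the threshold and ratio values are arbitrary assumptions, and a real compressor would add attack/release smoothing so the gain changes aren't audible):

```python
import numpy as np

def compress(samples, threshold_db=-30.0, ratio=4.0):
    """Feed-forward dynamic range compressor: attenuate anything above
    the threshold so loud parts come down toward the quiet parts."""
    eps = 1e-12  # avoid log10(0)
    level_db = 20.0 * np.log10(np.abs(samples) + eps)
    over_db = np.maximum(level_db - threshold_db, 0.0)
    gain_db = -over_db * (1.0 - 1.0 / ratio)  # shrink the overshoot by `ratio`
    return samples * 10.0 ** (gain_db / 20.0)
```

An adaptive version would simply raise the threshold or lower the ratio as the room gets quieter, e.g. from a microphone's noise-floor estimate, leaving the source untouched in ideal conditions.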
My AVR has "loudness management" which compresses the audio. It just doesn't work transparently so I find it irritating because it's obvious when it's changing the volume. My partner does prefer it though, but I'm more of a purist.
So the answer to your question seems to be that it doesn't work well as a solution.
Yes, yes it is. Black Panther's CGI was notoriously awful even by 1990's standards.
So it's cheaper and faster and movie making is a business.
Instead, CGI became a lazy way out for quick and sloppy production quality, even in established franchises from studios with deep pockets. In the first Iron Man and Spider-Man movies, the heroes actually wore physical suits painstakingly made by artisans in the business. In the latest Marvel movies the costumes are just lazy CGI that looks like textures over a spandex suit instead of behaving like physical items over a human body. Terminator 1 & 2 had practical models and they still look great.
"Oh, we don't need to finalize the real costumes and models now, just dress the actors in gray spandex with triangles, put them in front of a green screen and we'll CGI everything later on before release." "It's OK, we don't need to bother having correct lighting and shadows in the studio, the CGI monkeys will just fix it in post with their computer thingies."
CGI now somehow looks worse than VFX 10-30 years ago despite the insanely better tech, everything just feels rushed with no creative direction behind it.
"Toupees always look so fake. I can spot them instantly. I can't imagine why anyone wears them."
Maybe that example is less pointed now that baldness seems more socially acceptable, but the point stands: Of course you see bad CGI, in that you notice that the CGI you're intended to notice is sometimes flawed. You don't see CGI the moviemaker doesn't want you to see, CGI used to clean up shots, CGI used to replace what would have been matte shots decades ago, and whatever else CGI is used for these days. You see a scene and it registers as a scene, not a shot with some obvious CGI composited in.
Recursing: some matte shots were horrible and obvious, whereas others, like Dana Barrett's apartment building in Ghostbusters, were pretty seamless.
Episodes 1/2/3 are the worst examples, as the CGI quality was particularly bad. Let's not even start on Jar Jar.
It's surprising how that imperfection can make a scene more immersive than perfection would.
Things like the new Marvel movies leave me a bit cold because there's no sense of "how did they do that", because the answer is always going to be "green screen and Blender".
Interesting comment, because while watching Avatar: The Way of Water, I was constantly asking myself "how did they do that?" Technically, the answer is of course "green screen and computers", but that's about as useful as watching a great performance in acting or sports and saying "it's just technique and practice". No, it's that, but while trying to make new things you haven't seen before, paying attention to every detail, and in the case of Avatar, doing that for three hours straight.
I think the modern Marvel movies leave me cold for the same reasons the story of Avatar left me cold: it's all easy and digestible, executed with perfect competence but little ability or desire for surprise, subtlety or discomfort.
The Hobbit is much newer but doesn't look so great in comparison. Sure, it has some amazing special effects, but the cinematic footage doesn't hold up as well next to the older films.
But with the Hobbit trilogy, the whole time I had the feeling I was watching it from behind some plastic screen, or perhaps watching an action game. It's not bad (though the book was stretched into three movies), but the excessive use of CGI and post-processing is clearly visible. Hell, An Unexpected Journey has moments where Gandalf's face is unnaturally brightened as if with a dodge tool, and that does give me an unpleasant feeling.
I still believe that computers in the movie industry should be tools that enhance camera work, not fully replace it. You can deliver an exciting story without overfilling it with CGI or... lens flares and bokeh effects in every scene.
If, however, you are claiming that, on average, visual fidelity has declined within comparable product tiers: well, no. In the past a lot of folks would just not do much FX at all, because it was prohibitively expensive.
That's somewhat incorrect. We now have significantly more CGI, and most of it is significantly better. See, e.g., "Why C.G. Sucks (Except It Doesn't)": https://youtu.be/bL6hp8BKB24
However, the over-reliance on CGI is a problem.