Dipping my toes into the “grain reduction” controversy

OK, dear readers, let me lay some truth on you. In the 20th century, most moving images were stored on film. Video tape was pretty common, but it was used mostly for live events and low-budget sitcoms. The prevailing feeling at the time was that using film imparted a look that said “quality.” It’s a good thing that a lot of TV shows and movies were shot on film, though. It means the film can be scanned into high definition or 4K, giving new life to old content. There’s a reason that you can find Friends and The Office at the top of streaming lists while similarly popular fare like Will & Grace never makes the top 10. Friends and The Office, both of which ran well into the 21st century, were shot on film, then edited on video tape. Will & Grace used standard-definition video for most of its run, leading to blurry, dark images. People just don’t want to see that.

What’s all this “grain reduction” about?

No one chose the film used for television production because it was grainy. TV producers chose it because it worked in low light. The fact that it had a low price didn’t hurt either. But fast, low-light film stock is generally grainier than the stock used for movie production. It didn’t matter at the time — scanning that film to standard-definition video eliminated all evidence of that grain. But take that same film and scan it to HD or 4K, and the grain comes through loud and clear. In fact it can be very annoying, giving the picture a snowy look, especially if you turn up your TV’s sharpness control in search of a crisper image.

In the 2010s, when a lot of shows started to show up in HD versions, the people doing those transfers commonly used computer filters to remove the grain. These filters correct for the slight frame-to-frame differences in brightness and color that come from using traditional film. The result is an image that looks as crisp and clear as anything mastered on digital HD video today. In fact, I would put Friends up against anything shot today in terms of quality and watchability. It’s far more watchable than the HD feeds of most HBO shows today.
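
For the technically curious, here’s the principle those cleanup filters rely on, boiled down to a toy Python sketch. To be clear, this reflects no actual restoration tool — real pipelines use motion-compensated and, these days, AI-based denoisers — but the core statistical idea is the same: grain is random from one frame to the next, so combining several frames of a static shot cancels the noise while the underlying image survives.

```python
import random
import statistics

# Toy sketch only -- NOT how any real show was restored. It demonstrates
# the idea behind grain reduction: grain is random per frame, so a
# per-pixel median across frames of a static shot cancels most of it.

def add_grain(frame, strength, rng):
    """Simulate film grain as random per-pixel brightness jitter."""
    return [[px + rng.uniform(-strength, strength) for px in row]
            for row in frame]

def temporal_median(frames):
    """Per-pixel median across a stack of same-sized frames."""
    height, width = len(frames[0]), len(frames[0][0])
    return [[statistics.median(f[y][x] for f in frames)
             for x in range(width)]
            for y in range(height)]

def mean_abs_error(a, b):
    """Average per-pixel difference between two frames."""
    diffs = [abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb)]
    return sum(diffs) / len(diffs)

rng = random.Random(42)
clean = [[128.0] * 8 for _ in range(8)]                   # flat gray "scene"
grainy = [add_grain(clean, 20.0, rng) for _ in range(9)]  # nine noisy frames
denoised = temporal_median(grainy)

# The median frame sits much closer to the clean image than any single
# grainy frame does.
print(round(mean_abs_error(grainy[0], clean), 1))
print(round(mean_abs_error(denoised, clean), 1))
```

Real footage has motion, of course, which is why production tools align frames before combining them; the simple median here only works because the simulated scene is static.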

Things have changed

However, it’s recently become popular to use absolutely no grain reduction when scanning old TV show negatives. I wasn’t able to get a screen capture to show you, but compare any episode of Seinfeld on Netflix with the same episode on Comedy Central. Comedy Central uses the original HD scans, which use modest grain reduction. Netflix uses an all-new 4K HDR scan that’s a lot grainier and darker. The Comedy Central version might seem a little flatter and lower in general quality, but the truth is it’s a lot more watchable.

Recently, Paramount+ released the classic Frasier to streaming in HD. It’s really, really grainy. And that’s what led me to write this article.

“The filmmaker’s intent” and the argument against grain reduction

The argument against doing grain reduction usually boils down to one of two things. The first is the idea that grain reduction destroys detail. This isn’t really true with today’s AI-based grain reduction. That leaves the other argument, that grain reduction “goes against the filmmaker’s intent.” What purists will tell you is that the makers of these shows could have shot them on standard-definition video but chose not to. There was something about the look of film that they liked. So, we should honor that choice by watching the shows as the originator intended.

I call that hogwash.

The makers of TV shows shot on film because they liked the look of it. That’s true. But what they liked isn’t the grain. They liked the crushed blacks and muted colors that film brought to the party. Film doesn’t capture as much detail or color in the shadow areas as video tape, and for whatever reason that’s what they wanted. The feeling at the time was that a show shot on film tended to have higher-quality writing and directing just because it was more expensive to shoot on film. That may have been true then. Of course now everything gets shot digitally. So the whole film argument really has no place in the 21st century.

When it came to the grain of the film, the makers of those shows chose film stocks with a lot of grain because it was cheap and because they thought no one would know. Show me one person who was making TV shows in the 1980s who thought that their work product would have a life in 4K on 80″ televisions. I’m guessing you can’t find one.

My take on all of this

I agree that a little bit of film grain is a good thing if you’re watching a theatrical movie. In cases like that, you did sometimes have cinematographers who consciously decided to make things grainier because they wanted that look. On the other hand, there were cases like the original Star Wars where the filmmakers admit they hated the amount of grain but the technology at the time gave them no choice.

I hope the current trend toward preserving film grain in scanned TV shows just dies. As I said earlier, Friends is a much more watchable show because the overall image quality is fairly modern. The same is true with the recent HD scan of Moonlighting, which also has very little inherent grain. If people worry so much about watching these shows as intended, they should find old copies on VHS. Then they should watch them on a 19″ tube TV. If you’re not doing that, then my suggestion is to stop crowing about watching things as intended.

The post Dipping my toes into the “grain reduction” controversy appeared first on The Solid Signal Blog.
