Don't you dislike feeds with cranked-down video camera frame rates?

N5XZS

I find it annoying to see video cameras cranked down from the normal 30i or 60p video frame rates on the raw feeds! :p

This applies to North America, Central America and parts of South America only.

European feeds are excused, since their natural video frame rates are 25i and 50p; anything below 25i or 50p, though, is still fair game.

How do you feel about it, especially on live sports and music concerts?

Thanks!:hatsoff
 
Victoria FTA, yes, there are FTA feeds on the 97°W bird: Kuwait TV Networks 1, 2 and 3 are transmitting in 1920x1080i at 25 fps (a 50 Hz broadcast system), and you will see slight frame-rate judder in the motion.

It depends on your satellite receiver converting that back to the 30i or 60p NA video formats, in the 480i/p, 720p, 1080i/p and 2160p video systems that we have here. :hatsoff
 
I don't know as I never watch that channel. However, 25 FPS is the standard framerate for TV broadcasts in Europe, the Middle East and Australia, so that feed is being uplinked to 97W at its native framerate.

Additionally, all my displays support 25p, so there's no issue watching 25/50 content on them; I simply have my device output a 25p or 50p signal when watching European content, and I see no judder.

I will say however that 25/50p is an awful standard that shouldn't exist and that Euros are just too proud to admit that they screwed up when choosing a standard framerate for their region all those years ago. With the transition to HD and now 4K they have had two opportunities to correct this problem and they haven't.

Europeans have it much worse than we do in regards to American content being uplinked over there, because on the European satellites, the backhauls for American native 29.97/59.94 content are molested down to 25 fps. WWE, the NBA, the Super Bowl, the various awards shows that originate here, you name it. They molest the feed down to 1080i25 when they uplink it on a European bird. And then of course all the American taped content that Euros watch is produced at 23.976 or 29.97, so they screw it up when they broadcast it on their 25 fps channels.
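
To put a rough number on that 59.94 -> 50 conversion, here is a naive sketch in Python (nearest-field resampling only; real standards converters use motion-compensated interpolation, so treat this purely as a way of counting what gets thrown away):

```python
from fractions import Fraction

SRC = Fraction(60000, 1001)  # ~59.94 fields/s (US field rate)
DST = Fraction(50, 1)        # 50 fields/s (European field rate)

def nearest_source_fields(n_out):
    """Map each output field to the nearest source field index."""
    return [round(i * SRC / DST) for i in range(n_out)]

used = set(nearest_source_fields(50))             # one second of 50 Hz output
skipped = [i for i in range(60) if i not in used]
print(len(skipped), "of every ~60 source fields never get shown:", skipped)
```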

Occasionally you might see a backhaul for a live European event uplinked to an American satellite that gets its framerate messed with from 25 -> 29.97, or maybe an American TV channel will air something produced in Europe, but the reverse scenario is much more common as there just isn't that much European content that makes it over here. Americans produce more stuff than anyone else in the world.
 
Actually, you know, British TV is heavily shown on PBS's feeds, and some Euro soccer feeds are also shown on NBC, Telemundo and beIN Sports Network.

NBC and Telemundo have better professional video converters that make the video look much smoother; however, PBS, beIN Sports Network and other networks, and especially your receiver and mine, have poor handling when converting from the 50 Hz to the 60 Hz world.

Of course, that depends on the software; the chips themselves do the complex work of the video conversion. :cool::hatsoff
 
I challenge anyone to tell the difference between video shot and displayed in progressive 25 vs 30 FRAMES per second. Unless there is extremely fast motion, most viewers will be unable to. Conversion of progressive-scan formats provides significantly better transfers than transfers from interlaced formats. Fortunately, most production now originates in progressive formats. Crappy format conversions are typically due to the choice of conversion solution.

How many complain when watching current-release movies on our 4K 30 fps home theaters without pulldown? Few, because the correct solutions are used for the transfer; most movies are shot at 24 fps. Personally, I prefer watching footage shot at 24 vs 30 fps, as the image has fluidity rather than the sharpness of "reality". Sure, there are a few film connoisseurs who have issues with the big-screen film transfers to video, but most do not.
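
For anyone unfamiliar with the pulldown being referred to, here is a minimal sketch of the classic 2:3 cadence that maps 24 fps film onto 60 interlaced fields per second (the function name and frame labels are purely illustrative):

```python
def pulldown_2_3(film_frames):
    """2:3 pulldown: alternate 2 and 3 video fields per film frame,
    so 24 film frames become 60 fields (30/29.97 fps interlaced video)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] -> 4 film frames into 10 fields
```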

The world operates on the 50 Hz standard; only a few countries are based on the 60 Hz standard. This is not simply a poor choice made by Europeans. The decision goes back before the invention of television, to the choice of frequency used for AC power distribution. The decision on the refresh standard was primarily based on synchronization. Imagine attempting to produce moving images at fractional exposure rates like 29.97 or 23.976 per second while synching with lighting fixtures running off the mains. This is a production nightmare that I often dealt with when shooting film.
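
The fractional rates mentioned throughout this thread are exact ratios, and the sync headache is easy to quantify; a quick arithmetic sketch (plain Python, assuming 60 Hz mains lighting):

```python
from fractions import Fraction

field_rate = Fraction(60000, 1001)   # ~59.94 fields/s (NTSC colour era)
frame_rate = Fraction(30000, 1001)   # ~29.97 fps
telecine   = Fraction(24000, 1001)   # ~23.976 fps (film slowed slightly for NTSC)

print(float(frame_rate), float(telecine))   # 29.97002997... 23.976023976...
beat = 60 - field_rate                      # drift against 60 Hz mains lighting
print(float(beat), float(1 / beat))         # ~0.06 Hz, a full cycle every ~16.7 s
```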

Most TVs are now multi-standard. Maybe the awful look is due to your conversion process. Is your TV capable of syncing to 25 and 30 Hz? If so, why not keep the 25 fps format native? Change the output to display in the native format rather than converting. In native mode, switching channels will take a moment to sync and handshake between the STB and the monitor, but then there will not be any conversion, as you will view the source in the format determined by the uplinker.
 
I can certainly tell a difference between something filmed at 25 fps and something that's 29.97 fps. I've watched many a live event captured from satellite, either over North America or on the European satellites (it pays to have European friends who are into FTA satellite), and the American ones always look a bit smoother since they run at a 20% higher framerate. As an example of similar content, the annual MTV Video Music Awards backhaul always looks smoother in motion than the MTV Europe Music Awards backhaul because of that 20% higher framerate that America uses.
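
For what it's worth, that "20% higher" figure is just the ratio of the two rates; a one-line check (assuming the exact broadcast rates):

```python
from fractions import Fraction

us, eur = Fraction(30000, 1001), Fraction(25)   # 29.97 fps vs 25 fps
print(float((us / eur - 1) * 100))              # ~19.88 % more frames per second
```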

25 fps is only a marginally higher framerate than 23.976, which is what films and scripted TV shows are shot at. Surely you can tell a visual difference between a scripted TV show/movie filmed at 23.976 and live action content filmed at 29.97 fps, right? I often see people whining about the 'soap opera effect' and that's because they've become used to seeing scripted material at 23.976 and find it jarring when it has a higher framerate similar to unscripted material.

And it's not just pre-recorded video that the 25/50 Hz European standard has been screwing up. For a long time, it was video games. PAL versions of video games were notoriously worse than their NTSC counterparts because in video games higher framerates are essential. Thankfully for Europeans, video game developers saw fit to discard the limitations of 50 Hz years ago with the transition away from analog outputs to HD and the HDMI video standard in consoles, and now European versions of video games run at the same superior higher framerate as the American and Japanese versions. Why Europeans insist on continuing to use the inferior lower PAL framerate for live action video material when it's no longer a technical necessity remains a mystery to me, as higher framerates benefit both video games and linear video.

Games ported to PAL have historically been known for having game speed and frame rates inferior to their NTSC counterparts. Since the NTSC standard is 60 fields/30 frames per second but PAL is 50 fields/25 frames per second, games were typically slowed by approximately 16.7% in order to avoid timing problems or unfeasible code changes. Full motion video rendered and encoded at 30 frames per second by the Japanese/US (NTSC) developers was often down-sampled to 25 frames per second or considered to be 50 frames per second video for PAL release—usually by means of 3:2 pull-down, resulting in motion judder. In addition to this, PAL's increased resolution was not utilised during conversion, creating a pseudo letterbox effect with borders top and bottom, which looks similar to a 14:9 letterbox, and leaving the graphics with a slightly squashed look due to an incorrect aspect ratio caused by the borders. This was especially prevalent during the 8-bit and 16-bit generations when 2D graphics were used almost exclusively. The gameplay of many games with an emphasis on speed, such as the original Sonic the Hedgehog for the Sega Genesis/Mega Drive, suffered in their PAL incarnations.
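
The 16.7% slowdown and the letterbox borders described above both fall straight out of the numbers; a small sketch (assuming game logic locked to the display refresh, and 240 active NTSC lines versus 288 active PAL lines):

```python
from fractions import Fraction

# Game speed when logic is tied to the field rate: 50 Hz vs 60 Hz
speed_factor = Fraction(50, 60)             # 5/6 of the NTSC speed
print(float((1 - speed_factor) * 100))      # ~16.7 % slower

# Unconverted 2D graphics: 240 NTSC lines centred in a 288-line PAL frame
ntsc_lines, pal_lines = 240, 288
border = (pal_lines - ntsc_lines) // 2
print(border, "blank lines top and bottom") # 24 each, hence the letterboxed look
```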

Despite the possibility and popularity of 60 Hz PAL games, many high-profile games, particularly for the PlayStation 2 console, were released in 50 Hz-only versions. Square Enix have long been criticised by PAL gamers for their poor PAL conversions. Final Fantasy X, for example, runs in 50 Hz mode only, meaning it runs 16.7% slower than the NTSC release and features top and bottom borders; while this practice was common in previous generations, it was considered inexcusable by contemporary consumers at the time of release. In contrast, the Dreamcast was the first system to feature PAL60, and the overwhelming majority of PAL games offered 50 and 60 Hz modes with no slow speeds. The PAL GameCube also offered 60 Hz on almost every title released. The Xbox featured a system-wide PAL60 option in the Dashboard, with almost every game supporting PAL60. Seventh-generation PAL consoles Xbox 360, PlayStation 3 and Wii also feature system-wide 60 Hz support.

As of the eighth generation, consoles such as the Wii U, PlayStation 4, Xbox One and Nintendo Switch have all games solely in 60 Hz, with 50 Hz only being used for video playback and, in the Wii U's case, backwards compatibility with Wii and Virtual Console games.
 
It's interesting to see the newer games now pushing up to 120 fps at 120 Hz, and heck, some are even pushing up to 240 fps at 240 Hz; both are progressive modes, with no interlaced video formats.

Meanwhile, our new HEVC-based broadcasting system mostly specifies 25p, 30p and 60p, and there's a new 120p mode, which might be rarely used except for sports feeds in the future. :hatsoff
 

Oh yes. I can't wait to see more high-framerate non-video-game material. My current monitor is 5120x1440 @ 240 Hz, and there is definitely a huge benefit in going from 30 to 60 FPS in video games, and a rather noticeable benefit going even further from 60 to 120 FPS. Beyond 120 FPS, though, I start having a hard time perceiving even higher framerates.

Over the years I've amassed a fair bit of non-video-game content as well with framerates of 60 FPS. You have things like ESPN's Monday Night Football games, whose backhauls are actually uplinked at a native 1080p59.94, and the various 4K feeds and channels uplinked around the world are usually 2160p59.94 or 2160p50 as well. I don't recall recording any 4K content broadcast via satellite at a framerate lower than 50, actually. Since they are often producing brand-new content to fill out these new 4K channels, they are producing it at the higher framerate too.

Anyways, I've now watched high FPS content so much between video games and these feeds that it's actually messed with my perception in a way that most normies haven't been re-wired to appreciate yet as they haven't been exposed to enough high framerate material. I actually now quite like the "soap opera effect" motion interpolation feature found on 4K TVs and I actually find 23.976 content pretty difficult to watch without using a frame interpolation method to increase its perceived framerate.

I'm pretty excited to see 4K in television broadcasting become more mainstream, because I think we will see a lot more programs produced and released at 50 and 60 FPS than we have been seeing until recently. There's no reason why every live event and non-scripted TV series shouldn't be produced and distributed at 50/60 FPS instead of just 25/30 once channels start offering a 4K feed, and ideally Hollywood would gradually start producing more scripted movies and TV series at higher framerates too. The streaming platforms have offered 4K streams for years, but for some reason Netflix doesn't like to produce their original series at framerates higher than 23.976. Even their reality shows and stand-up comedy specials are usually filmed at 2160p23.98 instead of 2160p29.97 or higher, which I find quite frustrating. You don't see this hesitance about producing content at higher framerates when the TV networks produce reality-based content.

Additionally, if you've never seen Billy Lynn's Long Halftime Walk or Gemini Man on 4K Blu-ray, I would highly advise checking them out to see the potential we're missing out on when Hollywood keeps using 23.976 for scripted productions. There hasn't been a technical necessity to shoot productions at that low framerate for a long time, and it's only stuck around for so long because "that's how it was always done." It's going to take a concentrated effort to convince people to give higher framerates an honest try in scripted productions instead of immediately writing them off just because they're not accustomed to them like they are to 23.976.
 
