Rant: High-Def TVs are Lies

CowboyDren

Now that 1080p TVs are on the market and taking over the shelves, this isn't as much of an issue, but...

My TV, an older ViewSonic 37" LCD, is the common "720p/1080i" format. A lot of people don't know (most don't care) that the LCD panel itself is actually 768*1366. If you wanted to be a jerk about it, you'd say that the panel is technically a 768p model. Most LCDs labeled "1080i" are just like mine in this regard. This is a problem for me because I paid for a 768p panel to display a 720p or 1080i image. Here's the grid:

480p = 480h * 640v * 1 (307,200 pixels, 60 times per second)
720p = 720h * 1280v * 1 (921,600 pixels, 60 times per second)
768p = 768h * 1366v * 1 (1,049,088 pixels, 60 times per second)
1080i = 540h * 1920v * 2 (1,036,800 pixels per field, 60 fields per second)
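
A quick sanity check of that grid in Python (assuming the standard 60 Hz ATSC rates, i.e. 60 progressive frames or 60 interlaced fields per second):

```python
# Pixels per frame/field and per second for each format,
# assuming 60 Hz ATSC rates (60 frames/s progressive, 60 fields/s interlaced).
formats = {
    "480p":  (480, 640, 1),   # (lines, columns, fields per image)
    "720p":  (720, 1280, 1),
    "768p":  (768, 1366, 1),
    "1080i": (540, 1920, 2),  # 540 lines per field, two fields per frame
}

for name, (lines, cols, fields) in formats.items():
    per_image = lines * cols
    per_second = per_image * 60  # 60 progressive frames or 60 interlaced fields
    print(f"{name}: {per_image:,} pixels per {'field' if fields == 2 else 'frame'}, "
          f"{per_second:,} pixels/s")
```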

ViewSonic had to insert an image scaler to take my 720p input frame, scale it up to 768p by "creating" roughly 6.7% more vertical AND horizontal data that just wasn't there to start with, filter it through an anti-aliaser, and then display it. In the case of a 1080i input frame, the scaler discards 554 of the 1920 horizontal samples (about 29%), but still has to interpolate 228 extra lines onto each 540-line field (about 42% more). How the heck do you tell a chip to interpolate 1 out of every 16 pixels? Better still, how do you tell it to throw away 2 out of every 7?
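
Here's a minimal sketch of where those workload numbers come from, assuming a straight 720p-to-768p stretch and a naive per-field 1080i mapping:

```python
from fractions import Fraction

# 720p frame (1280x720) stretched onto a 1366x768 panel
print(Fraction(768 - 720, 720))     # 1/15: ~6.7% more lines to invent
print(Fraction(1366 - 1280, 1280))  # 43/640: ~6.7% more columns to invent
print(768 / 48)                     # 16.0: one interpolated line per 16 output lines

# one 1080i field (1920x540) mapped onto the same panel
print((1920 - 1366) / 1920)         # ~0.29 (~2 of every 7): samples thrown away
print((768 - 540) / 540)            # ~0.42: extra lines interpolated per field
```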

If my TV had a true 720*1280 pixel panel, none of the interpolation shenanigans would be necessary for 720p input sources. For a 1080i source, it would interpolate 25% of each field vertically (for every 4 lines the panel lights up, only one of them is interpolated) and straight-up discard 33% of the horizontal samples (one pixel out of 3) to display the image dot-for-dot. You still need a scaler and an anti-aliaser, but it seems to me that neither would have to work as hard to get the same result. Let the satellite tuner, cable box, or DVD player worry about that stuff, and just display the stupid image.

When you bring in Blu-ray or some other "TrueHD" 1080p source, the extra circuitry looks even more ridiculous. If you're talking about watching a cinematic aspect ratio source (read: a movie) from a Blu-ray player, you're sending a 1080h * 1920v image to a screen that's going to discard 33% of the horizontal data AND is roughly 25% black bars on the top and bottom...the end result is about a 540h * 1280v picture, which happens to be beautifully simple math on a REAL 720p display!
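
The "beautifully simple math" checks out: 1920x1080 onto 1280x720 is an exact 2:3 reduction in both directions. A quick sketch, assuming a ~2.35:1 letterboxed movie:

```python
from fractions import Fraction

# 1080p frame onto a true 720p panel: exact 2:3 in both axes
print(Fraction(1280, 1920), Fraction(720, 1080))  # 2/3 and 2/3

# a ~2.35:1 movie letterboxed inside a 16:9 frame
active_1080 = round(1920 / 2.35)   # ~817 picture lines out of 1080
print(1 - active_1080 / 1080)      # ~0.24: share of the frame that is black bars
print(round(active_1080 * 2 / 3))  # ~545 lines on the 720p panel, i.e. ~540h * 1280v
```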
 
There are no LCDs or DLPs made that are 1080i. They don't make 1280x720 TVs anymore either; the ones they did make were CRT-based, generally tube displays.
Don't confuse the signal a TV accepts with what it actually displays. Most TVs have better conversion electronics than STBs, whether cable or satellite. I would blame the brand of TV before I would go off on the technology.
 
I wasn't aware there were any CRT TVs that were 1280x720. Most of the original DLP, LCoS, and LCD rear-projection sets, as well as the LCD and plasma HD flat panel TVs, were 1280x720.
If memory serves, 480i or 480p is about 340,000 pixels (709x480). :D
 
I wasn't aware there were any CRT TVs that were 1280x720. Most of the original DLP, LCoS, and LCD rear-projection sets, as well as the LCD and plasma HD flat panel TVs, were 1280x720.
If memory serves, 480i or 480p is about 340,000 pixels (709x480). :D

The big single CRT/tube TVs were all 720p. Most rear-projection CRTs were 1080i, and you are correct that RP DLP, LCD, and LCoS sets are 720p. Flat panels are 768p.
 
There are no LCDs or DLPs made that are 1080i. They don't make 1280x720 TVs anymore either; the ones they did make were CRT-based, generally tube displays.
Don't confuse the signal a TV accepts with what it actually displays. Most TVs have better conversion electronics than STBs, whether cable or satellite. I would blame the brand of TV before I would go off on the technology.

Better not let Greg Loween hear you, Kevin. Last time I made such a blasphemous statement, he came at me full force. :)

(BTW, I agree with you)
 
Uh, I still have a monster Sony 36XBR85 CRT. That set is definitely 1080i.

The 34" Toshibia direct view CRT I just sold was 1080i as well.

I couldn't be happier with the PQ on my 32" 768p Samsung as a bedroom TV. I've heard that it's actually better to feed it a 1080p/i signal and let it downscale rather than upscale a 720p signal.
 
The 34" Toshibia direct view CRT I just sold was 1080i as well.

I couldn't be happier with the PQ on my 32" 768p Samsung as a bedroom TV. I've heard that it's actually better to feed it a 1080p/i signal and let it downscale rather than upscale a 720p signal.

That has been my experience with reasonable 720p sets as well.
 
ViewSonic had to insert an image scaler to take my 720p input frame, scale it up to 768p by "creating" roughly 6.7% more vertical AND horizontal data that just wasn't there to start with, filter it through an anti-aliaser, and then display it. (1) In the case of a 1080i input frame, the scaler discards 554 of the 1920 horizontal samples (about 29%), but still has to interpolate 228 extra lines onto each 540-line field (about 42% more). (2) How the heck do you tell a chip to interpolate 1 out of every 16 pixels? Better still, how do you tell it to throw away 2 out of every 7?


1) For a fixed-pixel device, any non-native resolution must be scaled to fit the panel if you want the screen to be full. Otherwise, part of the screen stays blank. Most people kvetch about not having a screen with every pixel lit.

As for the choice of 1366x768 resolution, this is WXGA in a 16:9 ratio. Using the common format lets manufacturing gain economies of scale. There are many, many more computer monitors sold than HD displays.

Welcome to the world of video scaling. It works quite well mathematically.
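
For the curious, here's a minimal sketch of what a separable linear scaler does, using NumPy. This is a toy; real scaler chips use fancier multi-tap filters, but the idea is the same:

```python
import numpy as np

def scale_linear_1d(line: np.ndarray, out_len: int) -> np.ndarray:
    """Resample one row/column of pixels to out_len samples by linear interpolation."""
    in_len = len(line)
    pos = np.linspace(0, in_len - 1, out_len)  # each output sample's input position
    left = np.floor(pos).astype(int)
    right = np.minimum(left + 1, in_len - 1)
    frac = pos - left
    return (1 - frac) * line[left] + frac * line[right]

def scale_frame(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Scale a 2-D grayscale frame separably: columns first, then rows."""
    tmp = np.stack([scale_linear_1d(frame[:, x], out_h)
                    for x in range(frame.shape[1])], axis=1)
    return np.stack([scale_linear_1d(tmp[y, :], out_w)
                     for y in range(out_h)], axis=0)

frame_720p = np.random.rand(720, 1280)
print(scale_frame(frame_720p, 768, 1366).shape)  # (768, 1366)
```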

2) For 1080i you've assumed that there's no de-interlacing going on. The smarter way to scale is to deinterlace the 1080i fields into 1080p frames, then scale to 768p and frame-double. This works very well on film-based content (most content is film-based), but not so well without motion-adaptive deinterlacing for content that was natively shot interlaced (like live sports).
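
A rough sketch of that pipeline for film-based content (real deinterlacers also detect the 3:2 pulldown cadence, which is skipped here):

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Recover a 1080p frame by interleaving two matching 540-line fields."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines came from the top field
    frame[1::2] = bottom_field  # odd lines came from the bottom field
    return frame

# Two 540-line fields from the same film frame weave into one 1080p frame;
# the display then scales it to 768p and shows it twice (frame doubling)
# to keep the 60 Hz refresh rate.
top = np.random.rand(540, 1920)
bottom = np.random.rand(540, 1920)
print(weave(top, bottom).shape)  # (1080, 1920) -> scale to (768, 1366), show twice
```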


If my TV had a true 720*1280 pixel panel, none of the interpolation shenanigans would be necessary for 720p input sources.

Great, but see my previous re: economy of scale. Do you know what to look for as scaling errors?

For a 1080i source, it would interpolate 25% of each field vertically (for every 4 lines the panel lights up, only one of them is interpolated) and straight-up discard 33% of the horizontal samples (one pixel out of 3) to display the image dot-for-dot. You still need a scaler and an anti-aliaser, but it seems to me that neither would have to work as hard to get the same result. Let the satellite tuner, cable box, or DVD player worry about that stuff, and just display the stupid image.

See my previous re: 1080i to 1080p prior to the scaling. That's how the displays usually do this.


When you bring in Blu-ray or some other "TrueHD" 1080p source, the extra circuitry looks even more ridiculous. If you're talking about watching a cinematic aspect ratio source (read: a movie) from a Blu-ray player, you're sending a 1080h * 1920v image to a screen that's going to discard 33% of the horizontal data AND is roughly 25% black bars on the top and bottom...the end result is about a 540h * 1280v picture, which happens to be beautifully simple math on a REAL 720p display!

The math for 720p and 768p is not dramatically different.
 
1) As for the choice of 1366x768 resolution, this is WXGA in a 16:9 ratio. Using the common format lets manufacturing gain economies of scale. There are many, many more computer monitors sold than HD displays.
Except that 32" is as big as most PC/Kiosk displays go. Why did they (manufacturers) continue to add all of the extra pixels in consumer TV devices that are primarily designed for TV viewing? I use my 37" panel as a PC monitor, and that's great, but most people in this size class don't need WXGA resolution, so why did we have to pay for it?
2) For 1080i you've assumed that there's no de-interlacing going on. The smarter way to scale is to deinterlace the 1080i fields into 1080p frames, then scale to 768p and frame-double.
So does the scaler take a 720p image, find the lowest common denominator to 768p, scale it back down, filter it, and display it? Or does it just dumbly scale up to 768p and then filter and display the result? I still have a hard time wrapping my head around frame doubling and scan rates, but I can't see that 1080i content is any easier to deal with on a 768p display.
Great, but see my previous re: economy of scale. Do you know what to look for as scaling errors?
Honestly, I don't. I sit about 10' away from a screen that's barely 3' across. A 480p EDTV set would likely look almost as good from my viewing distance. That said, I can pick out the compression artifacts that Dish adds to the signal, and I can clearly see a difference between OTA and the Dish feed of the same channel.
The math for 720p and 768p is not dramatically different.
Except for all of the floating points, yeah. Math is math, but on a 720p panel, you only need 2 decimal places of precision. I think what's most aggravating is that they spent all of this money and R&D making the scaler magical, when I'd much rather have a perfect 720p panel and a better set of image filters for the same money. Or a 720p panel and the same set of image filters for less money.
 
How about instead of worrying about scalers and math, you just get a display that looks good? There are plenty out there, no matter what they are "natively".

I can understand your frustration with the picture from Dish Network because they downrez the poop out of their HD signals. Try DirecTV on a good display and you will be very happy!
 
Except that 32" is as big as most PC/Kiosk displays go. Why did they (manufacturers) continue to add all of the extra pixels in consumer TV devices that are primarily designed for TV viewing? I use my 37" panel as a PC monitor, and that's great, but most people in this size class don't need WXGA resolution, so why did we have to pay for it?

The mask for the LCD pixels scales to the size of the glass substrate it's being created on. It's the same mask over and over again for 768p.

So does the scaler take a 720p image, find the lowest common denominator to 768p, scale it back down, filter it, and display it?

No, that's throwing away resolution.

Or does it just dumbly scale up to 768p and then filter and display the result?

Why is it a "dumb way"? If we take the 1280x720 image as the edges of the display and fill in lines, we have to add 1 new line for every 15 existing lines. That's barely anything at all.

So they calculate the intermediates and do a filter to smooth any jagged edges left over. I wouldn't call that dumb.
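
To make "1 line for every 15" concrete, here's a toy version of that fill-in approach. The simple neighbor averaging is an assumption for illustration, not what any particular chip does:

```python
import numpy as np

def stretch_720_to_768(frame: np.ndarray) -> np.ndarray:
    """Insert one interpolated line after every 15 original lines (720 -> 768).

    A toy version of the 'fill in lines' approach described above; a real
    scaler would follow it with a smoothing/anti-aliasing filter.
    """
    out = np.empty((768, frame.shape[1]), dtype=frame.dtype)
    for y in range(768):
        group, offset = divmod(y, 16)
        if offset < 15:
            out[y] = frame[15 * group + offset]   # 15 of 16 lines: untouched copies
        else:
            lo = 15 * group + 14
            hi = min(lo + 1, 719)                 # clamp at the bottom edge
            out[y] = (frame[lo] + frame[hi]) / 2  # 1 of 16 lines: interpolated
    return out

out = stretch_720_to_768(np.random.rand(720, 1280))
print(out.shape)  # (768, 1280); only 48 of the 768 lines are invented
```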

I still have a hard time wrapping my head around frame doubling and scan rates, but I can't see that 1080i content is any easier to deal with on a 768p display.

Assuming we're talking about native interlaced content, it isn't easy to deal with on any progressive display. To work well, the deinterlacing has to be motion-compensated and directionally correlated.

For native progressive (film-based) content, it's a simple matter of recovering the original frames from the fields and scaling to fit as needed.
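
A minimal sketch of that motion-adaptive idea, assuming grayscale fields: weave where nothing moved, interpolate within the field (bob) where something did:

```python
import numpy as np

def motion_adaptive_deinterlace(prev_top, top, bottom, threshold=0.1):
    """Build a frame from a top/bottom field pair.

    Where the top field barely changed since the previous top field, weave
    (keep the bottom field's real lines); where it changed, bob (interpolate
    the missing lines from the top field) to avoid combing artifacts.
    """
    h, w = top.shape
    frame = np.empty((2 * h, w), dtype=top.dtype)
    frame[0::2] = top
    # Bob estimate for the missing (odd) lines: average of neighboring top
    # lines (np.roll wraps at the bottom edge; good enough for a toy).
    bob = (top + np.roll(top, -1, axis=0)) / 2
    moving = np.abs(top - prev_top) > threshold  # crude per-pixel motion detector
    frame[1::2] = np.where(moving, bob, bottom)
    return frame

prev_top = np.random.rand(540, 1920)
top, bottom = np.random.rand(540, 1920), np.random.rand(540, 1920)
print(motion_adaptive_deinterlace(prev_top, top, bottom).shape)  # (1080, 1920)
```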

Honestly, I don't. I sit about 10' away from a screen that's barely 3' across. A 480p EDTV set would likely look almost as good from my viewing distance. That said, I can pick out the compression artifacts that Dish adds to the signal, and I can clearly see a difference between OTA and the Dish feed of the same channel.

At 3+ screen widths you can't even hit 720p resolution, assuming 20/20 vision.
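
The arithmetic behind that claim, assuming 20/20 vision resolves roughly 1 arcminute of detail:

```python
import math

screen_widths = 3     # viewing distance measured in screen widths
pixels_across = 1280  # 720p horizontal resolution
acuity_arcmin = 1.0   # ~20/20 vision resolves about 1 arcminute

# Angle subtended by one pixel, independent of the actual screen size
pixel_arcmin = math.degrees(math.atan((1 / pixels_across) / screen_widths)) * 60
print(f"{pixel_arcmin:.2f} arcmin per pixel")  # ~0.90: below the ~1.0 acuity limit
```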


Except for all of the floating points, yeah. Math is math, but on a 720p panel, you only need 2 decimal places of precision. I think what's most aggravating is that they spent all of this money and R&D making the scaler magical, when I'd much rather have a perfect 720p panel and a better set of image filters for the same money. Or a 720p panel and the same set of image filters for less money.

Your assumption is that scaling to 720p rather than 768p is better because the math is simpler. Also, the cost of the processing engine is relatively low in relation to the cost of the entire set.

It's a tremendously complex field and there are any number of tutorials available on the web if you're sincerely interested.
 
