Now that 1080p TVs are on the market and taking over the shelves, this isn't as much of an issue, but...
My TV, an older ViewSonic 37" LCD, is the common "720p/1080i" format. A lot of people don't know (and most don't care) that the LCD panel itself is actually 768*1366. If you wanted to be a jerk about it, you'd say the panel is technically a 768p model. Most LCDs labeled "1080i" are just like mine in this regard. This is a problem for me because I paid for a 768p panel that has to rescale every 720p or 1080i image it's handed. Here's the grid:
480p = 480 lines * 640 pixels per line * 1 field (307,200 pixels 30 times per second)
720p = 720 lines * 1280 pixels per line * 1 field (921,600 pixels 30 times per second)
768p = 768 lines * 1366 pixels per line * 1 field (1,049,088 pixels 30 times per second)
1080i = 540 lines * 1920 pixels per line * 2 fields (1,036,800 pixels per field, 60 fields per second)
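(If you want to sanity-check those pixel counts, here's a quick Python scratchpad of my own that just multiplies out the grid above; it's obviously not anything running inside the TV.)

    # Pixels per frame (or per field, for interlaced) from the grid above
    formats = {
        "480p":  (480, 640, 1),    # lines, pixels per line, fields per frame
        "720p":  (720, 1280, 1),
        "768p":  (768, 1366, 1),
        "1080i": (540, 1920, 2),   # each field carries half the frame's lines
    }
    for name, (lines, cols, fields) in formats.items():
        unit = "field" if fields == 2 else "frame"
        print(f"{name}: {lines * cols:,} pixels per {unit}")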
ViewSonic had to insert an image scaler to take my 720p input frame, scale it up to 768p by "creating" roughly 6.7% more vertical AND horizontal data that just wasn't there to start with, filter it through an anti-aliaser, and then display it. In the case of a 1080i input field, the scaler throws away about 29% of the horizontal data (the source packs roughly 40% more samples per line than the panel can show), yet still has to interpolate about 42% more vertical data. How the heck do you tell a chip to interpolate 1 out of every 16 lines? Better still, how do you tell it to keep exactly 683 of every 960 pixels in a line?
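Here's that ratio math spelled out in Python, purely as a sketch of what the scaler is being asked to do (the real chip does filtered resampling, not literal pixel duplication or dropping, and the percentages here are mine, rounded):

    # My TV's actual panel: 768 lines x 1366 pixels per line
    PANEL_LINES, PANEL_COLS = 768, 1366

    def scale_report(label, src_lines, src_cols):
        # Positive = data the scaler has to invent; negative = data it throws away
        line_pct = (PANEL_LINES - src_lines) / src_lines * 100
        col_pct = (PANEL_COLS - src_cols) / src_cols * 100
        print(f"{label}: {line_pct:+.1f}% lines, {col_pct:+.1f}% pixels per line")

    scale_report("720p frame", 720, 1280)    # +6.7% lines, +6.7% pixels per line
    scale_report("1080i field", 540, 1920)   # +42.2% lines, -28.9% pixels per line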
If my TV had a true 720*1280 pixel panel, none of the interpolation shenanigans would be necessary for 720p input sources. For a 1080i source, it would interpolate 25% of the lines in each field (for every 4 lines the panel lights up, only one of them is interpolated) and straight-up discard 33% of the horizontal data (one pixel out of every 3 in a line) to fill the panel. You'd still need a scaler and an anti-aliaser, but it seems to me that neither would have to work as hard to get the same result. Let the satellite tuner, cable box, or DVD player worry about that stuff, and just display the stupid image.
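Same sketch, pointed at that hypothetical true 1280*720 panel instead (again, just the arithmetic, not a claim about how any real scaler is implemented):

    # Hypothetical true 720p panel: 720 lines x 1280 pixels per line
    PANEL_LINES, PANEL_COLS = 720, 1280

    field_lines, field_cols = 540, 1920            # one 1080i field
    invented_lines = PANEL_LINES - field_lines     # 180 lines have to be interpolated
    dropped_cols = field_cols - PANEL_COLS         # 640 pixels per line get discarded

    print(f"{invented_lines / PANEL_LINES:.0%} of the displayed lines are interpolated")  # 25%
    print(f"{dropped_cols / field_cols:.0%} of the source pixels per line are dropped")   # 33%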
When you bring in Blu-ray or some other "TrueHD" 1080p source, the extra circuitry looks even more ridiculous. If you're talking about watching a cinematic aspect ratio source (read: a movie) from a Blu-ray player, you're sending a 1080h * 1920v image to a screen that's going to discard 33% of the data in each direction AND that's roughly a quarter black bars on the top and bottom (for a typical 2.39:1 film)...the active picture that's left is roughly 540h * 1280v, which happens to be beautifully simple math on a REAL 720p display!
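To put rough numbers on that (my assumptions: a ~2.39:1 "scope" film letterboxed inside the 1920*1080 frame, and a true 1280*720 panel; the exact bar height depends on the movie's aspect ratio):

    SRC_COLS, SRC_LINES = 1920, 1080
    PANEL_COLS, PANEL_LINES = 1280, 720
    ASPECT = 2.39                                   # typical "scope" movie

    active_src_lines = round(SRC_COLS / ASPECT)     # ~803 lines of actual picture
    black_bar_lines = SRC_LINES - active_src_lines  # ~277 lines of black bars (~25%)

    # Scale the active picture down to the 720-line panel (x 2/3 in both directions)
    active_panel_lines = round(active_src_lines * PANEL_LINES / SRC_LINES)
    print(f"active picture on the panel: ~{active_panel_lines} x {PANEL_COLS}")  # ~535 x 1280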