Hammerdown: Rather, my suggestion is for you to go back and review the basics. Maybe I'm just not reading you right, but there are too many flaws in what you state. Joe Kane's "HDTV Basics" is an excellent starter. It's on Blu-ray disc. You'll enjoy it.
1080i x 1440 has been around since the Sony HDCAM standard was set back in the early '90s. I recognize that many consumers, ignorant of the engineering basics, don't agree with the HDCAM standard, and have pull-quoted an excerpt from the ATSC spec and run with it because it was a bigger number (1920). If we dumped all the content that was produced with HDCAM, we'd have about 2% left. Simply stated, 1080i x 1440 is not "HD Lite"; it is the HDCAM standard. Personally, I have always felt the slang term "HD Lite" is a catch-all for telling an engineer that you don't understand HDTV production variables.
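As back-of-the-envelope arithmetic (my numbers, not from the thread above): HDCAM records 1440 luma samples per line, which are stretched horizontally to fill the same 16:9 raster as a 1920-sample signal on playback, a 4:3 horizontal scale.

```python
# Illustrative arithmetic only: HDCAM's 1440-sample line vs. the
# ATSC 1920-sample line, both displayed at the same 16:9 width.
hdcam_width, full_width, height = 1440, 1920, 1080

stretch = full_width / hdcam_width   # horizontal stretch applied on playback
print(stretch)                       # 1.3333... (a 4:3 ratio)

# Luma samples per 1080-line frame:
print(hdcam_width * height)          # 1555200
print(full_width * height)           # 2073600
```

So the HDCAM raster carries 75% of the horizontal luma samples of a full 1920x1080 raster while occupying the same screen width.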
On HDMI with fixed-pixel displays you would be getting 1920x1080i. Since 1440 is not a real standard, I think any display would choke on a 1440x1080i signal fed into the HDMI port and give an error message like "no signal."
Not necessarily. All digital displays will internally convert any incoming pixel format to the native pixel grid (assuming it doesn't exceed the upper frequency limit of the circuit design). The actual resolution is a variable dependent on more than frequency and the imager's native design. Plasmas and JVC D-ILAs are excellent examples: they have some rather odd design translations, yet handle signals below or above the native display resolution just fine. Monitors don't choke; they just suffer quality loss when deviating from the native design specs.