And yet another thread of ignorant, goofy explanations vs. the science.
Folks, the original purpose of the 1080i HDTV signal was that it was specced to accommodate the then-popular CRT, which used electron beam scanning. The horizontal "pixel" or line resolution spec was derived mathematically as the maximum permissible for a broadcast signal to fill the licensed 6 MHz channel bandwidth for broadcast TV. It was not arbitrarily decided on just to have a cool-looking HDTV image that had to be achieved in order to be known as HDTV spec. The 720p x 1280 signal was designed to also fill the channel bandwidth, as a progressive HDTV signal, to accommodate the digital display devices that paint a solid image instantly on the screen, as opposed to the interlaced scanning method that used phosphor persistence to paint the full two-field image.

Cross technologies allowed an interlaced signal to display on digital progressive monitors using de-interlacing methods (some worked better than others), and vice versa for progressive images on an interlaced display, but both were tainted with artifacting. The recent progressive display technology has been dominated by 720 x 1280 pixel array imagers that match the 720p x 1280 HDTV spec, so that a good match-up of signal to "native" resolution could be achieved. Other devices, as was already mentioned, needed some interpolation to achieve a native non-standard HD resolution for display, with varying degrees of artifacting.

Among 1080i CRT devices, most are not able to achieve full line resolution detail in the display, especially the horizontal spec of 1920. Sony once published a white paper on a lab test of the G90 where they actually measured the G90 at 1920 line resolution under laboratory conditions. It was their only projector capable of doing this, but the image was not considered usable for home theater viewing: the beam current had to be reduced so far to achieve beam focus that the image was quite dark, and the image then had to be shrunk to a rather small screen size not typical of G90 installations. Using a typical 72" diagonal screen, the maximum horizontal resolution achieved was a tad over 1100 lines. Surprised? If you understand the physics involved, you should not be.

But that was then. Today the science of HDTV has evolved, not in the CRT camp but in the digital one, with several leading progressive technologies. Not until mid-2006 did 1080p x 1920 displays begin to appear on the market, and then only in very limited production. Nearing the end of the 2006 holiday season the offering was quite expanded, but most of the sets I saw in the stores were still in the 720p camp, not 1080p. What Scott and others stated is essentially correct: many monitors can receive a 1080p signal but can't image that signal, due to the lesser native resolution of the screen. During 2007 this should no longer be the case, as I believe 1080p x 1920 imagers will become the main production line.
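For those who like to see the numbers, here's a quick back-of-envelope sketch of my own (an illustration, not from any spec document) comparing the raw pixel rates of the two formats against the roughly 19.39 Mb/s payload that ATSC 8-VSB delivers in a 6 MHz channel. The 12 bits/pixel figure assumes 8-bit 4:2:0 video; the point is simply that it's MPEG-2 compression, not the raw signal, that lets either format fit the channel:

```python
# Back-of-envelope: raw HDTV pixel rates vs. the payload of one 6 MHz
# ATSC channel. 19.39 Mb/s is the nominal 8-VSB payload; 12 bits/pixel
# assumes 8-bit 4:2:0 chroma subsampling. Illustrative figures only.

ATSC_PAYLOAD_MBPS = 19.39
BITS_PER_PIXEL = 12

formats = {
    "1080i x 1920": (1920, 1080, 30),  # 60 fields/s = 30 full frames/s
    "720p x 1280": (1280, 720, 60),    # 60 full frames/s
}

for name, (w, h, fps) in formats.items():
    raw_mbps = w * h * fps * BITS_PER_PIXEL / 1e6
    ratio = raw_mbps / ATSC_PAYLOAD_MBPS
    print(f"{name}: raw {raw_mbps:.0f} Mb/s -> needs ~{ratio:.0f}:1 compression")
```

Both formats land in the same ~35:1 compression ballpark, which is why both could be designed to fill the same 6 MHz channel.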
But what about signal? What about program source? If you want a true 1080p x 1920 program source, better stick with the latest HD disc formats, both HD DVD and Blu-ray, as no broadcast TV station will be allowed to transmit this. It is possible that a closed-circuit system may be capable of sending you a 1080p x 1920 signal, but don't look to the traditional acquisition and distribution channels for this. Until recently, ALL field acquisition for broadcast was restricted to 1080i x 1440 pixels or less. Recent advancements in equipment have allowed recording at the full 1920 horizontal pixels for 1080i acquisition. Unfortunately, the acquisition industry's HDTV infrastructure just doesn't upgrade to the latest technology as soon as it is released en masse. For years to come, the majority of acquisition will be limited to 1440 horizontal pixels maximum. Dumbing down from there, other "limiting" circumstances may reduce this resolution even more, for both business and artistic reasons. So what this means is that if you insist on only watching HDTV when it is a full 1920 pixels of horizontal resolution, and you do not have one of the new HD disc formats, you may as well just shut off your TV and go play checkers, since you are a long way off from ever seeing that resolution on a broadcast, DBS, or cable programming source.
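To make that 1440-pixel point concrete, here's a minimal sketch of my own (assuming plain linear interpolation; real broadcast scalers use fancier polyphase filters) of stretching one 1440-sample scanline to the 1920 samples a full-resolution display needs. Roughly one of every four output samples is interpolated from its neighbors rather than acquired by the camera:

```python
# Minimal sketch: stretch one 1440-sample scanline to 1920 samples with
# linear interpolation. Real broadcast scalers use better filters
# (polyphase, Lanczos, etc.); the point is only that the extra samples
# are interpolated, not acquired.

def upscale_line(line, out_width):
    in_width = len(line)
    out = []
    for x in range(out_width):
        src = x * (in_width - 1) / (out_width - 1)  # map back to input coords
        left = int(src)
        right = min(left + 1, in_width - 1)
        frac = src - left
        out.append(line[left] * (1 - frac) + line[right] * frac)
    return out

camera_line = [float(v % 256) for v in range(1440)]  # stand-in for acquired pixels
display_line = upscale_line(camera_line, 1920)
print(len(display_line))  # 1920 samples, but only 1440 carried real detail
```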
Possible sources of true 1080i x 1920 could come from companies like HBO, Showtime, and others that convert film to D5 tape and distribute those D5 tapes to providers that have the modified D5 playback for transmission and allocate the full 19.4 Mb/s bandwidth to that signal. Will that happen? I doubt it (it just doesn't make good business sense, as so few people would be able to view it anyway), so again, if you want this level of quality, get an HD-level DVD technology for viewing.
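For a sense of scale on that 19.4 Mb/s figure, here's a rough arithmetic sketch of my own (actual encoder bit allocation varies scene by scene, and the 12 Mb/s shared-mux number is purely hypothetical) showing the average bit budget per 1080i frame with and without the full channel:

```python
# Rough arithmetic: average bit budget per 1080i frame (30 frames/s) when
# the full ~19.4 Mb/s channel carries one program, versus a hypothetical
# mux that leaves the HD stream only 12 Mb/s next to SD subchannels.
# Illustrative numbers, not any provider's actual allocation.

FRAMES_PER_SEC = 30

for label, mbps in [("full 19.4 Mb/s channel", 19.4),
                    ("hypothetical shared mux", 12.0)]:
    kbits_per_frame = mbps * 1e6 / FRAMES_PER_SEC / 1e3
    print(f"{label}: ~{kbits_per_frame:.0f} kbits per frame on average")
```

Every subchannel a provider squeezes into the mux comes straight out of that per-frame budget, which is where the "dumbing down" shows up on screen.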
The game plan should be to have all levels of program sources, from highly compressed SD to Blu-ray and HD DVD. But don't insist that all HD signals meet one standard, that of the highest limit allowed under the spec. Instead, enjoy the programming for its content and have your home theater capable of displaying whatever is being made available for that program. Continue to complain about providers who are dumbing down the programming below the current popular viewing capability of the public. My opinion is that today this is 720p x 1280, but maybe in two years it will be 1080p x 1920. Remember, it's not what was pervasive at CES 2007, nor what is pervasive in the stores now in 2007, but what is common in homes across America for the HDTV viewing audience. In fairness, that is the level I expect the majority of HDTV programming to be offered at within the law, but I also want cutting-edge technologies available, like HD DVD and Blu-ray, for those who want to witness all that we can have today.