We Got A Problem!!!

caam1 said:
There are articles published in the late 1930s / early 1940s that referred to the 525-scan-line system (480i) as "high definition", because the early electronic television systems used in the 1930s had fewer than 400 scan lines.

Although the only ATSC broadcast resolutions currently in use are 1280 x 720p and 1920 x 1080i, several other resolutions are used to acquire and record HD video. Sony HDCAM (the non-SD version) is 1440 x 1080i and Panasonic DVCPRO HD is 1280 x 1080i. The broadcast industry considers these formats "high definition" and not "HD-Lite".
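For a rough sense of how much picture information each of these formats carries, here's a small Python sketch. The format names and dimensions are the ones mentioned above; using the full 1920x1080 raster as the reference point is my own choice for comparison:

```python
# Width x height in pixels for the formats mentioned above.
formats = {
    "ATSC 720p": (1280, 720),
    "ATSC 1080i": (1920, 1080),
    "Sony HDCAM": (1440, 1080),
    "Panasonic DVCPRO HD": (1280, 1080),
}

full_raster = 1920 * 1080  # reference: full 1080-line raster

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} px "
          f"({pixels / full_raster:.0%} of the full 1080 raster)")
```

So HDCAM's 1440-wide recording carries 75% of the pixels of a full 1920x1080 frame, and DVCPRO HD's 1280-wide recording about 67%, yet both are accepted in the industry as HD, which is the point being made here.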

On the reception end there are many different resolutions that are considered "high definition". LCD and plasma displays with resolutions of 1280 x 768, 1024 x 768, 1024 x 1024, etc. are all considered "high definition" and not "HD-Lite". The only ATSC requirement for a display to be called HD is that it must support 720p.

I don't like the fact that both D* and E* down-convert channels from their original format, because each conversion produces artifacts. I also don't like it when OTA stations that broadcast 1080i start adding sub-channels that eat up the bandwidth, because that creates more artifacts. Even though the quality is reduced in both cases, the result is still technically "high definition".
The issue we're dealing with here is not what resolutions can or can't be used. It's fairly obvious that the FCC has left that end completely open on purpose, and pointed to the ATSC for "standards" so manufacturers would have at least something to shoot for from a compatibility standpoint. Let's be clear here, the ATSC standards are guidelines, not law or dictates.

The issue is the definition of the term "High Definition". The FCC has done very little to define this term itself, deferring instead to the ATSC publications by inclusion by reference. The ATSC publications, notably A/54, define HD as 1920x1080 (interlaced or progressive) or 1280x720 progressive. Of this there can be no argument.

All this is fine and good, but it doesn't mean a thing if the common perception departs from this standard. It all comes down to what the consumer can reasonably expect when he/she purchases a high definition product.

I contend that at this point in time, the reasonable expectation of the consumer is still in line with the ATSC publication. There hasn't been enough HD-Lite for it to become synonymous with HD in the public's eye, but it looks like E* and D* are trying to push it that way.

That's what all the hand-wringing is about. If the FCC comes out and says "This is our definition of HD", then it will be cemented as such in public perception, IMO. But they've only done it in a back-handed way that leaves some wiggle room for those who want to change the definition.

-sc
 
Television as we know it, minus color, has been around since the 1940s. I recently saw an item about an FDR speech that was televised. Of course, only a handful of folks had TVs, and only those in a couple of east coast cities saw the broadcast, but it was the same TV technology that powered the 17-inch B&W set my father bought for the family in 1954.

Having watched TV on and off since 1954, I can say that, excluding the Japanese analog HDTV, I never heard anyone refer to any TV as High Definition until the mid-1990s. The term being used and misused to sell new color TVs in the late 1980s and early 1990s was "high resolution". The claim was that a certain high-priced TV had maybe 800 or so scan lines. Folks didn't understand that this referred to capability only, since the broadcast received had only 480 visible scan lines.
 
