1080i uses more bandwidth?

rgarbonzo

SatelliteGuys Family
Original poster
Jun 16, 2004
Oregon, OH
I was talking to an HD salesman and he said something that didn't sound right: that 1080i uses more bandwidth than 720p. Is this true? Because to me it sounds like total BS.
 
rgarbonzo said:
I was talking to an HD salesman and he said something that didn't sound right: that 1080i uses more bandwidth than 720p. Is this true? Because to me it sounds like total BS.
It's true, but what makes you think otherwise? :confused:
 
I disagree... please note that I have no idea what I'm talking about... but I think that bandwidth is the Mbps used to send the signal on a particular frequency, while 1080i is the number of lines sent in 1/30th of a second (1080i being 540 lines in one 1/60th-of-a-second field and the other 540 lines in the next field, interlaced) and 720p is the number of lines sent in 1/30th of a second (no interlacing)... if I'm wrong, could someone help me understand...
 
voomvoom said:
I disagree... please note that I have no idea what I'm talking about... but I think that bandwidth is the Mbps used to send the signal on a particular frequency, while 1080i is the number of lines sent in 1/30th of a second (1080i being 540 lines in one 1/60th-of-a-second field and the other 540 lines in the next field, interlaced) and 720p is the number of lines sent in 1/30th of a second (no interlacing)... if I'm wrong, could someone help me understand...
The guy writing this web page knows what he is talking about:
http://www.alvyray.com/DigitalTV/DTV_Bandwidths.htm
 
First of all, when we are talking about bandwidth, we need to look at the compressed signal, and 720p may not necessarily have the same compression rate as 1080i.

Second, if we look at uncompressed signals and count the number of pixels sent in each 1/60th of a second:

720p is 1280x720 = 921,600 pixels (one full frame)
1080i is 1920x1080/2 = 1,036,800 pixels (one field)

As you can see, there are more pixels with 1080i than with 720p (which, by the way, does not necessarily mean that 1080i is always better than 720p ;))
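
If you want to check the arithmetic yourself, here's a quick Python sketch of the numbers above (nothing official, just the per-1/60th-second pixel counts):

Code:
# Uncompressed pixel counts per 1/60th of a second, as discussed above.
# 720p sends a full 1280x720 frame in that interval; 1080i sends one
# 1920x540 field (half of a 1920x1080 frame).

p720_frame  = 1280 * 720          # full progressive frame
i1080_field = 1920 * 1080 // 2    # one interlaced field

print(f"720p frame:  {p720_frame:,} pixels")    # 921,600
print(f"1080i field: {i1080_field:,} pixels")   # 1,036,800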
 
Conjuror said:
He is right, rgarbonzo.
I'm not so sure.

If the provider of HD content uses compression to fill a fixed maximum bandwidth, the correct thing to do is to measure bandwidth AFTER compression, right? So it would appear to me that unless the provider uses different compression levels, and ultimately different bandwidth pipe sizes, a 1080i channel is likely being broadcast with the SAME amount of bandwidth as a 720P channel.

Also, I've heard that some/many/all current HD providers are broadcasting a lower-quality, smaller-size version of the "1080i" format with only 1035 lines vertically by only 1440 pixels horizontally (which differs measurably from the defined HDTV 1080i standard of 1920 pixels by 1080 lines). If this is true, then BEFORE compression, actual signals claiming to be "1080i" would be pumping out fewer pixels per second than a "720P" signal. Here's the math:

1080i (defined HDTV standard): 1920 pixels x 1080 lines x 30 frames per second = 62,208,000 pixels per second

1080i (if broadcast at 1035i): 1440 pixels x 1035 lines x 30 frames per second = 44,712,000 pixels per second

720P: 1280 pixels x 720 lines x 60 frames per second = 55,296,000 pixels per second

I honestly don't know if VOOM or anyone else is broadcasting their "1080i" as 1035 x 1440 but I'd be interested to find out.
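
If anyone wants to verify those figures, here's the same math as a small Python sketch (the 1440x1035 entry is the hypothetical cut-down format I described above, not a confirmed broadcast format):

Code:
# Uncompressed pixel rates for the three cases above.
formats = {
    "1080i (full 1920x1080)":         (1920, 1080, 30),  # 30 frames/sec = 60 fields
    "1035i (hypothetical 1440x1035)": (1440, 1035, 30),
    "720p (1280x720)":                (1280,  720, 60),  # 60 full frames/sec
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps:,} pixels per second")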
 
subdude212 said:
I'm not so sure.

If the provider of HD content uses compression to fill a fixed maximum bandwidth, the correct thing to do is to measure bandwidth AFTER compression, right? So it would appear to me that unless the provider uses different compression levels, and ultimately different bandwidth pipe sizes, a 1080i channel is likely being broadcast with the SAME amount of bandwidth as a 720P channel.

Also, I've heard that some/many/all current HD providers are broadcasting a lower-quality, smaller-size version of the "1080i" format with only 1035 lines vertically by only 1440 pixels horizontally (which differs measurably from the defined HDTV 1080i standard of 1920 pixels by 1080 lines). If this is true, then BEFORE compression, actual signals claiming to be "1080i" would be pumping out fewer pixels per second than a "720P" signal. Here's the math:

1080i (defined HDTV standard): 1920 pixels x 1080 lines x 30 frames per second = 62,208,000 pixels per second

1080i (if broadcast at 1035i): 1440 pixels x 1035 lines x 30 frames per second = 44,712,000 pixels per second

720P: 1280 pixels x 720 lines x 60 frames per second = 55,296,000 pixels per second

I honestly don't know if VOOM or anyone else is broadcasting their "1080i" as 1035 x 1440 but I'd be interested to find out.
Neither Voom nor any other provider will ever give out this information. The standard line is that "we constantly adjust compression depending upon the material." I have never believed this, because I don't think any provider has the ability or desire to exercise this level of quality control except in very rare circumstances. I can say this: Voom is sending zero channels in 720p right now. ESPN-HD has been stuck at 1080i for a couple of months now. So even if someone had software on their PC to test data rates, they couldn't accurately do a comparison with Voom.
 
andrzej said:
It's true, but what makes you think otherwise?:confused:
I thought that 1080i and 720p were so close bandwidth-wise that the difference would be negligible. If that is the case, then why doesn't Voom send everything out in 720p and save bandwidth to be redistributed among the channels, increasing PQ?
 
rgarbonzo said:
I thought that 1080i and 720p were so close bandwidth-wise that the difference would be negligible. If that is the case, then why doesn't Voom send everything out in 720p and save bandwidth to be redistributed among the channels, increasing PQ?
I think it is because there are still many people with older analog-type displays, i.e., rear-projection CRT displays. They complain loudly because many of those sets do not scale from 720p to 1080i very well. Most newer progressive-scan displays (plasma, LCD, rear-projection LCD, DLP, etc.) do scale from 1080i to 720p well enough to reduce the number of complainers. 720p does use less bandwidth and often provides a better picture than 1080i because it is not constantly wasting bandwidth on the interlace motion differences.
 
I think you can compress 720p more; that's why it uses less bandwidth. 1080i will pixelate, etc., and doesn't tolerate compression as well.
 
rgarbonzo said:
I thought that 1080i and 720p were so close bandwidth-wise that the difference would be negligible. If that is the case, then why doesn't Voom send everything out in 720p and save bandwidth to be redistributed among the channels, increasing PQ?
NOOOOOOOOOOOOOOOOOOOOOOO, don't ruin the picture. 1080i has more "wow" than 720p on still or slow-moving images.
 
subdude212 said:
I'm not so sure.

If the provider of HD content uses compression to fill a fixed maximum bandwidth, the correct thing to do is to measure bandwidth AFTER compression, right? So it would appear to me that unless the provider uses different compression levels, and ultimately different bandwidth pipe sizes, a 1080i channel is likely being broadcast with the SAME amount of bandwidth as a 720P channel.

Also, I've heard that some/many/all current HD providers are broadcasting a lower-quality, smaller-size version of the "1080i" format with only 1035 lines vertically by only 1440 pixels horizontally (which differs measurably from the defined HDTV 1080i standard of 1920 pixels by 1080 lines). If this is true, then BEFORE compression, actual signals claiming to be "1080i" would be pumping out fewer pixels per second than a "720P" signal. Here's the math:

1080i (defined HDTV standard): 1920 pixels x 1080 lines x 30 frames per second = 62,208,000 pixels per second

1080i (if broadcast at 1035i): 1440 pixels x 1035 lines x 30 frames per second = 44,712,000 pixels per second

720P: 1280 pixels x 720 lines x 60 frames per second = 55,296,000 pixels per second

I honestly don't know if VOOM or anyone else is broadcasting their "1080i" as 1035 x 1440 but I'd be interested to find out.

Nobody broadcasts 1035i. There isn't even an option for it. It is always 1080i. Besides, if you did that, many display devices would go crazy. You can't do tricks with vertical resolution the way you can with horizontal. What many channels do is filter the horizontal resolution. Lots of equipment limits it to 1440 pixels too; for example, Sony HDCAM records 1440x1080i. Lowering the horizontal resolution yields a better compressed picture with fewer artifacts.
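
To put a rough number on that filtering (just the raw arithmetic; actual encoder savings depend on the material):

Code:
# Effect of filtering 1080i horizontal resolution from 1920 down to 1440
# pixels (as Sony HDCAM does): 25% fewer raw pixels per second going
# into the encoder.

full_raster = 1920 * 1080 * 30   # pixels/sec, full 1080i raster
filtered    = 1440 * 1080 * 30   # pixels/sec, 1440-wide 1080i

saving = 1 - filtered / full_raster
print(f"{full_raster:,} vs {filtered:,} px/s ({saving:.0%} fewer)")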
 
What Voom should do is have certain channels like WorldSport or ESPN-HD in 720p; the full-screen refresh bodes well for action. Other than that, 1080i has a higher resolution and, as mentioned here already, a nicer WOW factor (with enough b/w) than 720p for certain sources. This point is moot, however, when we go to a bigger dish and WM9. I think there will be enough b/w then.
 
CKNA said:
Nobody broadcasts 1035i. There isn't even an option for it. It is always 1080i. Besides, if you did that, many display devices would go crazy.
I googled this info from a number of sources:

1035i is detailed in SMPTE 260M which describes 1035i as an HDTV format with 1125 total lines per frame, 1035 active lines per frame, 2:1 interlaced with a 30Hz or 30Hz/1.001 (NTSC-friendly) frame rate.

Do you know any way to determine whether someone is broadcasting 1035i versus 1080i? I checked the manufacturer's site for my set but I couldn't find any specs which listed which HDTV formats the set could decipher.
 
subdude212 said:
I googled this info from a number of sources:

1035i is detailed in SMPTE 260M which describes 1035i as an HDTV format with 1125 total lines per frame, 1035 active lines per frame, 2:1 interlaced with a 30Hz or 30Hz/1.001 (NTSC-friendly) frame rate.

Do you know any way to determine whether someone is broadcasting 1035i versus 1080i? I checked the manufacturer's site for my set but I couldn't find any specs which listed which HDTV formats the set could decipher.

1035i is detailed in SMPTE 260M, but it is not part of the ATSC HD spec. The Japanese analog MUSE HD system used 1035i, but it has now been phased out.

Using stream analyzers like TSreader, you can determine the resolution of the signal. I have never seen 1035i. It is either 1080i, 1088i, or 720p (1088 is just 1080 rounded up to a multiple of 16 so it fits whole MPEG macroblocks).
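
For the curious, here's a hypothetical Python sketch of the kind of check those analyzers do: scan a captured MPEG-2 video elementary stream for the sequence header and read out the coded picture size. Real tools like TSreader parse the whole transport stream properly; this only shows the idea:

Code:
# Hypothetical sketch: find the MPEG-2 sequence header (start code
# 0x000001B3) in a demuxed video elementary stream and decode the
# 12-bit horizontal and vertical size fields that follow it.

import sys

def find_resolution(data: bytes):
    i = data.find(b"\x00\x00\x01\xb3")      # sequence_header start code
    if i < 0 or i + 7 > len(data):
        return None
    b0, b1, b2 = data[i + 4 : i + 7]
    width  = (b0 << 4) | (b1 >> 4)          # horizontal_size_value
    height = ((b1 & 0x0F) << 8) | b2        # vertical_size_value
    return width, height

with open(sys.argv[1], "rb") as f:
    print(find_resolution(f.read()))        # e.g. (1920, 1080) or (1280, 720)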

I have seen some Japanese analog 1035i converted to 1080i digital, but it did not look as good as anything shot digitally in 1080i nowadays.
 
OK, assuming that nobody out there is sending 1035i signals, the uncompressed numbers would look like this:


1080i (defined HDTV standard): 1920 pixels x 1080 lines x 30 frames per second = 62,208,000 pixels per second

1080i (if broadcast at 1440 pixels wide): 1440 pixels x 1080 lines x 30 frames per second = 46,656,000 pixels per second

720P: 1280 pixels x 720 lines x 60 frames per second = 55,296,000 pixels per second
So whether a provider trims the horizontal width from 1920 to 1440 pixels determines whether their flavor of 1080i (uncompressed) requires more bandwidth than 720p. If they use fixed pipes, the channels pumping the most pixels per second would require the most compression and therefore could suffer the worst PQ - right? If the compression is dynamic, I suppose that would lessen the detrimental effect. Thanks for engaging in this discussion; I'm just trying to understand the mechanics of what is going on behind the scenes.
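
Here's that fixed-pipe reasoning as a Python sketch. The bit depth and pipe size are my assumptions (8-bit 4:2:0 video, roughly 12 bits per raw pixel, into a 19.39 Mbps pipe, about one ATSC channel); the point is only that the format with the highest raw pixel rate needs the highest compression ratio:

Code:
# Fixed-pipe comparison: same output bitrate for every format, so the
# required compression ratio scales with the raw pixel rate.
# Assumptions (mine, not from this thread): 8-bit 4:2:0 video, i.e.
# 12 bits per pixel on average, and a 19.39 Mbps pipe.

PIPE_BPS = 19_390_000
BITS_PER_PIXEL = 12

formats = {
    "1080i @ 1920 wide": 1920 * 1080 * 30,
    "1080i @ 1440 wide": 1440 * 1080 * 30,
    "720p":              1280 * 720 * 60,
}

for name, pixel_rate in formats.items():
    raw_bps = pixel_rate * BITS_PER_PIXEL
    print(f"{name}: raw {raw_bps / 1e6:.0f} Mbps -> "
          f"needs about {raw_bps / PIPE_BPS:.0f}:1 compression")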
 
