Is my Dish Network broadcasting 1080p?

Short Circuit, filmed in 1986, was shot on 35mm film. That is higher resolution than even 1080p, so it can be made to look just as good as any new or old movie. I have Blu-ray copies of Ben Hur (1959) and Casablanca (1942) and they are stunning to see...
A lot depends on the condition of the original negatives. If they are in not-so-great shape, the next step is a very expensive frame-by-frame restoration, and some films just don't warrant that. A movie like Jaws is another story, and worth it. But it was a shame Universal allowed it to deteriorate so.
 
Bit rate makes a far more obvious, even jarringly noticeable, difference than resolution. The reason Blu-ray easily blows away what can be seen on cable, satellite, etc., or local HDTV fed directly into the TV is BIT RATE! Presuming an OTA digital HDTV source at full resolution, ATSC can only provide a maximum effective ~19 Mbps for all video, but keep in mind that most broadcasters have sub-channels on that same bandwidth, so the main HD channel actually gets LESS than the best-case 19 Mbps, perhaps only about 10 Mbps (or even less in some extreme cases). That is about ONE THIRD the bit rate of the best Blu-ray discs out there, which average about 30 Mbps and can even peak around 40 Mbps. This is also why many consumer HD cameras never provide an image as good as Blu-ray: while providing full 1920x1080 resolution, most consumer cameras are limited to about 10 Mbps.

The best any channel sends to MVPDs is 19 Mbps, as those channels have adopted the ATSC effective 19 Mbps as their own limit (they could send a higher bit rate if they wanted to, but it would cost them bandwidth for their place on the satellite they use to feed MVPDs), and they can and do in a number of cases send LESS than 19 Mbps. So, essentially, 19 Mbps is the top-end "standard" bit rate for any TV source, versus about 30 Mbps for the best Blu-rays. That's a difference we can all see like night and day; resolution matters less, as long as it isn't reduced too far. This is also why a lot of people have a very hard time telling 720p and 1080i apart: it is the BIT RATE that has the far more noticeable effect.
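
Just to put rough numbers on that, here's a back-of-the-envelope sketch in Python. The ATSC payload figure is the real-world ~19.4 Mbps; the sub-channel allocations are hypothetical, since actual splits vary by station:

```python
# Back-of-the-envelope comparison of video bit rates (all figures approximate;
# the sub-channel allocations are assumed for illustration).

ATSC_MUX_MBPS = 19.4          # effective ATSC payload for one 6 MHz channel
SUBCHANNEL_MBPS = [3.0, 2.0]  # hypothetical SD sub-channels sharing the mux
BLURAY_AVG_MBPS = 30.0        # typical average for a well-mastered Blu-ray
BLURAY_PEAK_MBPS = 40.0       # typical peak

main_hd_mbps = ATSC_MUX_MBPS - sum(SUBCHANNEL_MBPS)
print(f"Main HD channel gets roughly {main_hd_mbps:.1f} Mbps")
print(f"Blu-ray average is about {BLURAY_AVG_MBPS / main_hd_mbps:.1f}x that")
print(f"Blu-ray peaks can hit {BLURAY_PEAK_MBPS / main_hd_mbps:.1f}x that")
```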

Here is a response I posted in another thread with virtually the same question/concern:

I think he is referring to the fact that for 1080i channels, Dish re-encodes at the approved 1440x1080 HD standard, not the 1920x1080 resolution that the channel content providers send to MVPDs. Some folks grumble about this slightly lower resolution, but it is among the approved HD standards (some OTA broadcasters are doing this, too). However, it is often bit rate, not the slightly lower resolution, that has a much bigger impact on HD PQ (DirecTV's Ka sats used for its HD channels have transponders with greater bandwidth than the DBS Ku sats Dish uses for its HD channels, so DirecTV has an advantage there). In other words, full 1920x1080 at a lower bit rate can look WORSE than 1440x1080 at a higher bit rate. In addition, the Dish STBs can compensate for low bit rates and resolutions with software processing before outputting to your HDTV, providing a PQ that looks like it was sent at a higher bit rate and resolution. There are also other ways of reducing the data needed on a transponder or OTA channel that can hurt PQ more than slightly lower resolution does, such as reduced chrominance or luminance data; however, the Turbo Coding that Dish uses for HD channels allows higher throughput (more data) in the same limited bandwidth, so Dish doesn't have to cut other picture data or compress as heavily as it would without it.
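
To make the "lower resolution at a higher bit rate can win" point concrete, here's a quick bits-per-pixel sketch. The bit rates below are hypothetical illustrations, not actual Dish or DirecTV figures:

```python
# Rough bits-per-pixel comparison: a lower resolution at a higher bit rate can
# leave MORE coded bits per pixel than full resolution at a starved bit rate.
# The bit rates are assumptions for illustration only.

def bits_per_pixel(width, height, mbps, fps=29.97):
    """Average coded bits available per pixel per frame."""
    return (mbps * 1_000_000) / (width * height * fps)

full_res_starved = bits_per_pixel(1920, 1080, 8.0)   # full res, low bit rate
reduced_res_fed  = bits_per_pixel(1440, 1080, 10.0)  # 1440x1080, higher rate

print(f"1920x1080 @ 8 Mbps : {full_res_starved:.3f} bits/pixel")
print(f"1440x1080 @ 10 Mbps: {reduced_res_fed:.3f} bits/pixel")
```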

Also, one has only to view some of our LA OTA digitals to see how inferior the HD PQ is when local broadcasters run an HD channel at about half that bit rate OR LESS, even at a presumed full resolution, to accommodate other multiplexed channels (sometimes referred to as sub-channels) on the same stream. KABC shoving TWO HD channels (at 720p, with the second HD channel at a noticeably inferior PQ that is probably a lower bit rate, and possibly a lower horizontal resolution) plus an SD channel into one stream takes its toll, especially since OTA broadcasters are stuck using MPEG-2 (instead of the far more efficient MPEG-4 that Dish and DirecTV use: more channels in less bandwidth with superior PQ) and an effective maximum 19 Mbps of bandwidth in which to cram all this.

I have found Dish's 1440x1080 to be often superior to some of our big-network locals I have viewed LIVE via OTA antenna through a TiVo or DIRECTLY into my HDTV. Yes, FiOS would be noticeably superior at the full 1920x1080 at a generous bit rate, and I would love that in a perfect world (though FiOS can't necessarily improve an inferior OTA feed sent to it), but I've found that Dish HD running directly into my TV can be pretty impressive, and even more so with my AV receiver's processing chip and a DVDO on another HDTV. I find it an acceptable compromise. However, I understand why some do not and are irked by it, and I agree it ought to be full resolution for those fancy HDTVs we buy; but except for FiOS, nobody achieves that, not even most broadcasters. I will say that KCBS seems noticeably superior to ANY other LA OTA, and I wouldn't be surprised if they are running full resolution at full bit rate. Keep in mind that while Blu-ray can provide around 30 Mbps, I don't think any HDTV channel content provider sends more than 19 Mbps per content "channel" to MVPDs. Once you start watching enough Blu-ray, even the best full-resolution, full-bit-rate OTA starts to look pretty diminished and not so impressive after all.
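
To see why the MPEG-2 vs MPEG-4 point matters so much for that crammed 19 Mbps mux, here's a rough sketch using the common rule of thumb that H.264 (MPEG-4 AVC) needs about half the bits of MPEG-2 for the same quality. The per-channel quality target is an assumption for illustration:

```python
# Sketch: how codec efficiency changes what fits in the same ~19.4 Mbps mux.
# The ~2x efficiency factor for H.264 vs MPEG-2 is a rule of thumb; the
# per-channel HD quality target below is an assumed figure.

MUX_MBPS = 19.4
HD_TARGET_MPEG2 = 12.0                      # Mbps for decent HD in MPEG-2 (assumed)
H264_EFFICIENCY = 2.0                       # H.264 needs ~half the bits of MPEG-2
HD_TARGET_H264 = HD_TARGET_MPEG2 / H264_EFFICIENCY

print(f"HD channels per mux (MPEG-2): {MUX_MBPS // HD_TARGET_MPEG2:.0f}")
print(f"HD channels per mux (H.264) : {MUX_MBPS // HD_TARGET_H264:.0f}")
```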
 
Re: BlackBerry vs Apple vs Android

I'll bet the OP found out more than he imagined or even wanted to know...

And his HD probably looks the worse for it.

 
;)
After I read all this, I realized I'm out of Tylenol. Road Trip!
 
I have my 722k set to 1080i, and my Logitech Revue up-converts the resolution to 1080p (unless the power goes out in my house, after which it down-converts to 720p when the power comes back on; the resolution on my Logitech Revue is set to Auto, which is why that happens).
 
I just bought "Lord of War" on Blu-ray and the PQ is horrible. So many artifacts that up close it looks like old-school snow.
 

I have my Google Revue set to 1080p and it stays at 1080p no matter if the power goes out and comes back. Set it to 1080p and see for yourself.
 
I have noticed that on talk shows in HD, the networks blur the faces of the hosts so you don't see their signs of aging and imperfections like acne, etc. Watch and see if you agree. How vain our stars are.
 
Regular series shows too. NCIS comes to mind.

NCIS is one of the most out-of-focus-looking HD shows I've ever seen. Other HD shows on CBS don't look anything like NCIS. If I were watching the show strictly for picture quality, I would have stopped a long time ago. Luckily the show's writing and characters are great, so they make up for the out-of-focus, soft picture.
 
It's known as soft focus. It has been standard operating procedure ever since HD came out. Talent gets all freaked out about showing that they are human and age.


There is nothing wrong with people showing their age, especially men. It shows character, and it looks like you might actually know what you are talking about when you have a little gray on top and some crow's feet, etc. It makes the males look like they actually could be the veteran military officers they are supposed to play. I don't even mind the older women on TV. But when I see a Vaseline-blurred face when they show a woman like Barbara Walters, it kind of pisses me off. I want to see true, real-life images with HD, not an out-of-focus, soft blur.
 
1. All things being equal, i.e. no additional compression in the stream, 1080i == 1080p for film-based content. With film-based content, each frame is split into two fields, and on the display side the two fields are reassembled (weave deinterlacing, if we're being particular). This can be either 48i/24p or 60i/30p. However, the 60i case requires inverse telecine to get back to 24p, which takes more effort; it reverses the 2:3 cadence used to get 24p into 60i. Certainly feasible, though (see the sketch below).
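
Here's a toy sketch of that 2:3 cadence and its reversal, in Python. The frame labels and field bookkeeping are simplified for illustration; real inverse telecine has to detect the cadence from the field data itself:

```python
# Sketch of the 2:3 ("3:2") pulldown that turns 24p film into 60i video, and
# its reversal (inverse telecine). Four film frames A,B,C,D become 10 fields:
# A gets 2 fields, B gets 3, C gets 2, D gets 3.

def telecine_2_3(frames):
    """Expand 24p frames into a 60i field sequence using 2:3 pulldown."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 2 if i % 2 == 0 else 3          # alternate 2 and 3 fields
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Recover the unique 24p frames by dropping the duplicated fields."""
    seen, frames = set(), []
    for frame, _parity in fields:
        if frame not in seen:
            seen.add(frame)
            frames.append(frame)
    return frames

fields = telecine_2_3(["A", "B", "C", "D"])
print(fields)                    # 10 fields from 4 frames
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D']
```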

1080i from an interlaced source is not quite == 1080p. Here the two fields are from different moments in time (the odd and even lines are separated by 1/60th of a second), and this can't be properly reassembled with a simple weave. You need motion compensation and interpolation algorithms to get top-notch results.
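
For reference, here's what the simple weave itself looks like, as a minimal NumPy sketch. The field sizes are the standard 1080i ones; everything else is illustrative:

```python
import numpy as np

# Minimal weave deinterlace: interleave two 540-line fields back into one
# 1080-line frame. This is lossless ONLY when both fields came from the same
# film frame; for true interlaced video the fields are 1/60 s apart and a
# plain weave shows combing on motion (hence motion-compensated deinterlacers).

def weave(top_field, bottom_field):
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 1, 3, 5, ... (top field)
    frame[1::2] = bottom_field   # lines 2, 4, 6, ... (bottom field)
    return frame

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920)
```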

2. You can't simply compare bit rates and say that higher == better. Newer codecs (H.264 and VC-1) are roughly twice as efficient as MPEG-2. Comparing, as an example, a 20 Mbit/second MPEG-2 stream vs a 15 Mbit/second H.264 stream, the H.264 would almost always win. I qualify this because there could be some pathological content that makes the newer codecs look worse than the older codec.

If the codec is the same, higher is better, up to the point where the gains are no longer visible to the naked eye. Keep in mind I'm just throwing these bit rates out, but you might not see any visible gain going from 40 Mbit/second to 50 Mbit/second with H.264 or VC-1.
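
Putting point 2 into a quick sketch: normalize each stream to an "MPEG-2 equivalent" rate before comparing. The ~2x efficiency factors are the rule of thumb above, not measured values:

```python
# Rough "MPEG-2 equivalent" comparison using the ~2x efficiency rule of thumb
# for newer codecs. The factors are approximate and content-dependent.

def mpeg2_equivalent_mbps(mbps, codec):
    factors = {"mpeg2": 1.0, "h264": 2.0, "vc1": 2.0}  # assumed efficiency
    return mbps * factors[codec]

streams = [("MPEG-2", 20.0, "mpeg2"), ("H.264", 15.0, "h264")]
for name, rate, codec in streams:
    print(f"{name} at {rate} Mbps is worth roughly "
          f"{mpeg2_equivalent_mbps(rate, codec):.0f} Mbps of MPEG-2 quality")
# The 15 Mbps H.264 stream comes out ahead of the 20 Mbps MPEG-2 stream.
```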

3. Every broadcaster is dealing with the same type of issues -- the available delivery bandwidth is finite and they are all trying to deliver as many channels as possible within their available bandwidth. Newer codecs help out with that.

 
Soft focus has been going on since before HD. Baba Wawa has been in soft focus for a couple of decades now ;) It's just painfully obvious in HD.

 
The day before the Olympics I saw Meredith Vieira on Today, and let's face the facts, she's not a spring chicken any more: she looked normal, but you could see a lot of age wrinkles. She must have come in late.

That night she had on her HD makeup. No wrinkles visible.
 
Plus, if the Logitech Revue can upconvert a signal, I would also believe that if DISH wanted to put 1080p upconversion in their receivers, they could. Heck, my Samsung Blu-ray player upconverts my regular DVDs to 1080p and it looks real close to Blu-ray quality.
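
For what it's worth, "upconversion" is just interpolation, and no new detail is created. Here's a toy nearest-neighbor sketch (real players use much better filters like bicubic or edge-adaptive scaling; the frame sizes are illustrative):

```python
import numpy as np

# Toy upconverter: nearest-neighbor scaling of a 480-line frame to 1080 lines.
# It shows what "upconversion" means: new pixels are interpolated from the
# source, so no detail beyond the original is added.

def upscale_nearest(frame, out_h, out_w):
    in_h, in_w = frame.shape
    rows = np.arange(out_h) * in_h // out_h   # map each output row to a source row
    cols = np.arange(out_w) * in_w // out_w   # map each output col to a source col
    return frame[rows][:, cols]

sd = np.random.randint(0, 256, (480, 720), dtype=np.uint8)  # DVD-ish frame
hd = upscale_nearest(sd, 1080, 1920)
print(hd.shape)  # (1080, 1920)
```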
 
Plus if the Logitech Revue can upconvert a signal,I would also believe if DISH wanted to put in 1080p upconversion in their receivers they could,heck my Samsung Bluray player upconverts my regular DVDs to 1080p and it looks real close to Bluray quality.

I don't see why DISH doesn't allow this option to upconvert their signal to 1080p. It would be a nice perk for those of us who would like to see 1080p on everything.
 
