Does signal size really matter?


trainerman
New Member · Original poster · Sep 12, 2007 · Missouri
Opinions wanted...

If the signal you are receiving from the satellite is digital, then as with other digital TV signals, you are either getting it or you are not... there is no in-between, fuzzy, barely-picking-it-up state as there is with analog signals.

Therefore, if you are receiving the signal from the satellites and the picture on your TV is not pixelating, the actual strength of the signal beyond the point of receiving it is meaningless except for overcoming weather... rain fade, trees, other obstacles... correct?

In other words, if the cutoff between getting the picture on your TV and not receiving it (or it pixelating) is, say, 60, and you are getting a constant 65, which is considered low, the quality of the picture you see is just as good as that of the guy who gets, say, 95s across the board. It's just that 95 is better for riding out the weather, clouds, trees, etc.

Agree or disagree?

I'm asking this out of curiosity as to whether my HD picture quality would be as good with a signal average of 80 as that of a guy whose signal average is 95 in optimum conditions, setting aside differences between TVs.


Thanks for your time!
 
IMHO, going from a solid signal of 65 into the 90s will not improve the picture quality. As you mentioned, a higher signal reading will help keep a signal lock when the weather causes problems.
 
If your picture is not pixelating, that is as good as it is going to get (your picture won't look any better even with a stronger signal).
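To put rough numbers on that cliff effect, here's a minimal sketch. The 0-100 meter scale, the lock threshold of 60, and the rain-fade loss of 20 are made-up values for illustration; real receivers use their own scales and thresholds.

```python
LOCK_THRESHOLD = 60  # hypothetical meter reading needed to hold a lock

def picture_quality(signal: float) -> str:
    """Above the lock threshold the picture is identical; below it, no picture."""
    return "perfect picture" if signal >= LOCK_THRESHOLD else "no lock / pixelation"

RAIN_FADE = 20  # hypothetical meter loss during a heavy storm

for reading in (65, 95):
    clear = picture_quality(reading)
    storm = picture_quality(reading - RAIN_FADE)
    print(f"meter {reading}: clear sky -> {clear}, heavy rain -> {storm}")

# meter 65: clear sky -> perfect picture, heavy rain -> no lock / pixelation
# meter 95: clear sky -> perfect picture, heavy rain -> perfect picture
```

Both readings give the exact same picture in clear weather; the higher one just has more margin before a storm pushes it under the threshold.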
 

Agree, because you asked ...

Jimbo
 

