bhelms said:
Since coax cable is passive and "lossy", every bit of it causes signal "degradation".
That's not true for two reasons:
If you look at the expression for system noise temperature, which sets the carrier-to-noise ratio (CNR) available for detection, it is a sum of many additive terms, and cable loss only appears in some of them. So it only materially degrades the CNR when the loss is so large that those terms dominate; up to that threshold it has virtually no effect on CNR (see the first sketch below).
And then the bit error rate (BER), which determines whether or not the errors are fully correctable, i.e. invisible, rises abruptly once the CNR falls below a specific threshold, and that threshold is normally somewhat worse than the CNR a normal install actually delivers (see the second sketch below).
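To put rough numbers on the first point, here is a minimal sketch of that cascade, assuming a generic satellite-style chain (dish/LNB, then the coax run, then the tuner) and the standard Friis noise formula. Every component value below (antenna temperature, LNB noise figure and gain, tuner noise figure, the 3 dB reference run) is an illustrative assumption, not a figure from the post or from any particular system:

```python
# A minimal numerical sketch (not from the original post) of the cascaded
# noise-temperature argument, assuming a generic satellite-style chain:
# dish/LNB -> coax -> receiver.  All component values are assumptions
# chosen for illustration, not measurements of any particular install.

import math

T0 = 290.0          # assumed physical temperature of the coax/receiver, K

def db_to_lin(db):
    return 10 ** (db / 10)

def noise_temp_from_nf(nf_db):
    """Equivalent noise temperature of a stage with noise figure nf_db."""
    return T0 * (db_to_lin(nf_db) - 1)

def system_noise_temp(coax_loss_db,
                      t_antenna=50.0,      # assumed antenna noise temp, K
                      lnb_nf_db=0.7,       # assumed LNB noise figure, dB
                      lnb_gain_db=55.0,    # assumed LNB gain, dB
                      rx_nf_db=8.0):       # assumed tuner noise figure, dB
    """Friis cascade referred to the LNB input.

    T_sys = T_ant + T_lnb + T_coax/G_lnb + T_rx/(G_lnb/L_coax)
    A passive cable's noise temperature is T0*(L - 1) and its "gain" is 1/L.
    """
    g_lnb = db_to_lin(lnb_gain_db)
    l_coax = db_to_lin(coax_loss_db)          # linear loss factor (>= 1)
    t_lnb = noise_temp_from_nf(lnb_nf_db)
    t_coax = T0 * (l_coax - 1)
    t_rx = noise_temp_from_nf(rx_nf_db)
    return (t_antenna
            + t_lnb
            + t_coax / g_lnb
            + t_rx / (g_lnb / l_coax))

ref = system_noise_temp(coax_loss_db=3.0)     # short, ordinary run as baseline
for loss_db in (3.0, 10.0, 20.0, 40.0, 60.0):
    t_sys = system_noise_temp(loss_db)
    delta_cnr_db = 10 * math.log10(t_sys / ref)
    print(f"coax loss {loss_db:5.1f} dB -> T_sys {t_sys:10.2f} K, "
          f"CNR penalty vs 3 dB run: {delta_cnr_db:6.3f} dB")
```

With tens of dB of gain ahead of the coax, the cable's noise contribution is divided down by that gain, so under these assumptions the CNR penalty stays in the hundredths of a dB until the loss is large enough for the post-cable terms to dominate.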
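And for the second point, a sketch of the digital "cliff": textbook uncoded QPSK bit-error rate on an additive-white-Gaussian-noise channel, with an assumed, purely illustrative pre-FEC BER threshold standing in for the point at which the error correction can no longer deliver a clean output:

```python
# A sketch of the "cliff" effect (not from the original post): uncoded QPSK
# bit-error rate vs. carrier-to-noise ratio on an AWGN channel, plus an
# assumed, purely illustrative pre-FEC BER threshold standing in for what
# the error correction can fully clean up.

import math

def qpsk_ber(cnr_db):
    """Textbook uncoded, Gray-coded QPSK BER.

    Assumes the noise bandwidth equals the symbol rate, so C/N ~= Es/N0 and
    Eb/N0 = (C/N)/2; then BER = 0.5*erfc(sqrt(Eb/N0)).
    """
    ebn0 = 10 ** (cnr_db / 10) / 2
    return 0.5 * math.erfc(math.sqrt(ebn0))

ASSUMED_FEC_THRESHOLD = 2e-4   # illustrative pre-FEC BER the decoder can absorb

for cnr_db in range(6, 16):
    ber = qpsk_ber(cnr_db)
    verdict = "clean picture" if ber < ASSUMED_FEC_THRESHOLD else "breaks up"
    print(f"C/N {cnr_db:2d} dB -> raw BER {ber:8.1e}  ({verdict})")
```

The raw BER falls by orders of magnitude over a couple of dB, so the output flips between error-free and unwatchable across a very narrow CNR range; as long as the install sits above that knee with some margin, shaving a fraction of a dB of cable loss changes nothing visible.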
So two things have to happen for cable loss to have ANY effect:
It has to be so large that the CNR is degraded, and
The CNR degradation has to be big enough to affect BER.
Below that point, reducing cable loss has no measurable effect on the picture, even to compulsive videophile standards. It's just physics.