Length of cable wire before degrade of signal loss

That question is worded in a way that makes it difficult to answer easily. Since coax cable is passive and "lossy", every bit of it causes signal "degradation". The longer the cable, the more degradation you have, and that is also a function of frequency - higher frequencies attenuate faster. For example, one type of standard RG-6 has 6.11 dB of attenuation per 100 ft at 1 GHz and 9.53 dB at 2.2 GHz.
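To put rough numbers on it: loss in dB scales roughly linearly with length, so you can estimate a run with a couple of lines of code. This is just a sketch using the RG-6 figures above - substitute the numbers from your own cable's datasheet.

```python
# Sketch: coax loss (in dB) scales roughly linearly with length,
# so total loss = (dB per 100 ft) * (length / 100).
# The per-100-ft figures are the RG-6 numbers quoted above;
# your cable's datasheet values may differ.

ATTEN_DB_PER_100FT = {
    1.0e9: 6.11,   # dB per 100 ft at 1 GHz
    2.2e9: 9.53,   # dB per 100 ft at 2.2 GHz
}

def cable_loss_db(length_ft: float, freq_hz: float) -> float:
    """Total cable loss in dB for a given run length and frequency."""
    return ATTEN_DB_PER_100FT[freq_hz] * (length_ft / 100.0)

if __name__ == "__main__":
    for length in (50, 100, 200):
        for freq in (1.0e9, 2.2e9):
            print(f"{length:>4} ft @ {freq/1e9:.1f} GHz: "
                  f"{cable_loss_db(length, freq):.2f} dB loss")
```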

I think what you are trying to ask is how long a cable you can run before the signal you're trying to send through it is unusable. To answer that we need to know more about your application...
 
bhelms said:
Since coax cable is passive and "lossy", every bit of it causes signal "degradation".

That's not true for two reasons:

The expression for system noise temperature, which sets the Carrier-to-Noise Ratio (CNR) available for detection, is a sum of many terms, and cable loss only appears in the denominator of some of them. It therefore only materially affects the CNR when the loss is so large that those terms dominate, so up to a threshold point it has virtually no effect on CNR.
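To illustrate the kind of cascade being described, here's a sketch using the standard Friis noise formula with made-up but plausible numbers for the antenna, LNB, and receiver (not anyone's actual link budget). The cable sits after the LNB, so its terms are divided by the LNB gain, and the total barely moves until the loss is huge.

```python
import math

def db_to_lin(db: float) -> float:
    return 10 ** (db / 10.0)

def system_noise_temp_k(cable_loss_db: float,
                        t_antenna_k: float = 50.0,    # assumed antenna noise temp
                        t_lnb_k: float = 75.0,        # assumed LNB noise temp
                        lnb_gain_db: float = 55.0,    # assumed LNB gain
                        t_receiver_k: float = 1000.0  # assumed receiver noise temp
                        ) -> float:
    """Friis cascade: antenna -> LNB -> cable -> receiver.
    The cable-loss terms are divided by the LNB gain, which is why
    modest loss barely moves the total noise temperature."""
    g_lnb = db_to_lin(lnb_gain_db)
    loss = db_to_lin(cable_loss_db)          # linear loss factor (>= 1)
    t_cable = 290.0 * (loss - 1.0)           # noise added by a lossy passive line
    return (t_antenna_k + t_lnb_k
            + t_cable / g_lnb                # cable term, divided by LNB gain
            + t_receiver_k * loss / g_lnb)   # receiver term, also divided

if __name__ == "__main__":
    t_ref = system_noise_temp_k(0.0)
    for loss_db in (3, 10, 20, 40, 60):
        t = system_noise_temp_k(loss_db)
        cnr_delta = 10 * math.log10(t_ref / t)
        print(f"cable loss {loss_db:>2} dB -> T_sys {t:8.1f} K, "
              f"CNR change {cnr_delta:+.2f} dB")
```

With these assumed numbers, even 20 dB of cable loss changes the CNR by only a few hundredths of a dB; it takes tens of dB of loss before the cable terms start to dominate.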

And then the Bit Error Rate, which determines whether or not the errors are fully correctable (i.e. invisible), rises abruptly once the CNR drops below a specific threshold, and that threshold is normally somewhat worse than the CNR you'd see in a normal install.

So two things have to happen for cable loss to have ANY effect:
It has to be so large the CNR is degraded, and
The CNR degradation has to be big enough to affect BER.

Below that point reducing cable loss has no measurable effect on the picture, even to compulsive videophile standards. It's just physics.
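If you want to see the two conditions put together, here's a toy check with illustrative numbers - the expected CNR and the FEC threshold below are assumptions, not a real receiver spec, and the "cliff" is modeled as a simple threshold comparison because post-FEC errors go from invisible to unwatchable over a very narrow CNR range.

```python
def picture_ok(expected_cnr_db: float,
               cnr_loss_from_cable_db: float,
               threshold_cnr_db: float) -> bool:
    """Toy model of the digital 'cliff': below the FEC threshold the picture
    breaks up; above it, extra margin buys nothing visible."""
    return (expected_cnr_db - cnr_loss_from_cable_db) >= threshold_cnr_db

if __name__ == "__main__":
    EXPECTED_CNR_DB = 14.0    # assumed CNR of a healthy install
    THRESHOLD_CNR_DB = 8.0    # assumed receiver lock/FEC threshold
    for cnr_hit in (0.0, 0.5, 2.0, 5.0, 7.0):
        ok = picture_ok(EXPECTED_CNR_DB, cnr_hit, THRESHOLD_CNR_DB)
        print(f"CNR degraded by {cnr_hit:4.1f} dB -> "
              f"{'picture unchanged' if ok else 'picture breaks up'}")
```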
 