Received an email about PLL LNB stability and blind scan accuracy. The user wanted to know how the LNB stability specification should be interpreted, and why some blind-scanned transponder frequencies matched the published values while others were logged differently. Here is my reply; I thought it might be of interest to other FTA hobbyists.
- - - - - - -
The stability specification describes the stability of the IF across the operational temperature range. The time to reach a stable operational temperature depends on ambient temperature. A crystal is used to provide the reference; it is typically within the +/-50 kHz range and is not user adjustable. The LNB has been confirmed on the factory test bench to output a signal within +/-50 kHz of the closed-source calibration signal over a 10 minute test period at 25°C ambient.
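To put that specification in context, any LO error appears directly in the logged frequency. Here is a minimal Python sketch of the arithmetic, assuming a standard 10750 MHz Ku-band LO and a hypothetical 12103 MHz downlink (illustrative values only, not taken from any datasheet or real carrier):

    # A minimal sketch of how LNB LO error appears in the measured IF.
    LO_NOMINAL_MHZ = 10750.000      # assumed standard Ku-band LO
    DOWNLINK_MHZ   = 12103.000      # hypothetical downlink frequency

    def measured_if_mhz(downlink_mhz, lo_error_khz):
        # IF seen by the receiver = downlink - (nominal LO + LO error)
        return downlink_mhz - (LO_NOMINAL_MHZ + lo_error_khz / 1000.0)

    for err_khz in (-50, 0, 50):    # the +/-50 kHz spec window
        print(f"LO error {err_khz:+d} kHz -> IF {measured_if_mhz(DOWNLINK_MHZ, err_khz):.4f} MHz")

The point is simply that the full +/-50 kHz reference tolerance shows up one-for-one in the IF, and therefore in every frequency the scan logs.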
There are many variables when attempting to calibrate a consumer system using live satellite downlink signals. The logging process analyzes the incoming carrier and takes the midpoint of the detected bandwidth (half the span between the carrier edges) as the center frequency. Downlink frequencies are more often than not centered on the rounded frequencies published on public sites. The calculated center of the carrier can be affected by signal processing errors introduced by interference (terrestrial, adjacent satellite, cross-pol, etc.), hardware variances, and software processing. To accurately reference the downlink frequency, both the LNB and the PCI card would need to be externally referenced to the same source, and one would need to qualify both the LNB stability/offset and the tuner LO stability/offset. I would not consider the results of a blind scan an accurate reflection of minute frequency stability; they would likely only reveal major changes in hardware stability or signal values.
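As an illustration of that centering step, here is a minimal Python sketch of the midpoint calculation with hypothetical carrier edge values, showing how a few kHz of edge smearing from interference shifts the logged center:

    # A minimal sketch of the midpoint calculation a blind scan typically uses.
    # The edge values are hypothetical, purely to illustrate the arithmetic.
    def carrier_center_khz(lower_edge_khz, upper_edge_khz):
        # Half of the detected span added to the lower edge gives the logged center.
        return lower_edge_khz + (upper_edge_khz - lower_edge_khz) / 2.0

    clean = carrier_center_khz(1594250.0, 1595750.0)   # clean edge detection
    noisy = carrier_center_khz(1594235.0, 1595760.0)   # edges smeared by interference
    print(f"clean center: {clean:.1f} kHz, noisy center: {noisy:.1f} kHz, "
          f"shift: {noisy - clean:+.1f} kHz")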
Center-frequency variances that differ from carrier to carrier (rather than a universal, systematic offset) relative to the frequencies documented by the downlink service provider would likely indicate interference or processing issues rather than a hardware problem.
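One way to tell the two cases apart is to compare several logged carriers against their documented frequencies: a roughly constant offset points at the hardware, while per-carrier scatter points at interference or processing. A minimal Python sketch with hypothetical published/logged pairs:

    # A minimal sketch separating a systematic (hardware/LO) offset from
    # per-carrier scatter. The published/logged pairs are hypothetical.
    published_khz = [11898000, 12060500, 12103000]
    logged_khz    = [11898095, 12060610, 12103090]

    offsets = [lg - pub for lg, pub in zip(logged_khz, published_khz)]
    mean_offset = sum(offsets) / len(offsets)
    spread = max(offsets) - min(offsets)

    print(f"per-carrier offsets: {offsets} kHz")
    print(f"mean (systematic) offset: {mean_offset:+.1f} kHz, spread: {spread} kHz")
    # A consistent mean with little spread points at the LO; large spread on
    # individual carriers points at interference or processing instead.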
As a hobbyist aligning a system with live satellite signals, I would suggest starting your system after sunset, letting it run for about 5 minutes, then blind scanning and referencing a confirmed downlink of a narrow SCPC service. You will need to know the exact downlink frequency (not the Lyngsat or similar reported value). This information is often published on the downlink service's website or can be found by referencing link budget parameter agreements. It would be best to test on a carrier (SR 1000 or lower) with a high FEC, high SNR, and a near-perfect BER. After several hours of run time, run the blind scan again and compare the logged frequency on this carrier.
Is the logged carrier center frequency exactly the same as the service's downlink frequency? Did the center frequency of the carrier shift between the tests? If so, by how much? Were there changes in SNR or BER indicating a change in error-free processing that could point to other factors when evaluating the performance of the system hardware?
If the frequency remained within +/-50 kHz between the two tests with no significant change in SNR/BER, note this as the combined offset of your system (LNB, card, and software processing). Adjust the LO setting to calibrate out any systematic offset.
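As a rough illustration of that comparison, here is a minimal Python sketch (all logged values are hypothetical) that reports the warm-up drift and the remaining systematic offset against the known downlink frequency:

    # A minimal sketch of the two-test comparison. All values are hypothetical.
    KNOWN_DOWNLINK_KHZ = 12103000   # exact frequency from the service operator
    cold_scan_khz      = 12103085   # logged shortly after power-up
    warm_scan_khz      = 12103092   # logged after several hours of run time

    drift_khz  = warm_scan_khz - cold_scan_khz
    offset_khz = warm_scan_khz - KNOWN_DOWNLINK_KHZ

    print(f"warm-up drift:     {drift_khz:+d} kHz")
    print(f"systematic offset: {offset_khz:+d} kHz")
    if abs(drift_khz) <= 50:
        print("Drift within the +/-50 kHz window; treat the offset as your system baseline.")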
I typically find the operational drift to be less than 20 kHz total during similar testing. If you find that the center carrier frequency remained within a 20 kHz range but is repeatedly 100 kHz high, then you can assume that the hardware and software processing should be calibrated with a -100 kHz offset. You might even use the results of this testing to further flowchart which component or processing step is responsible for introducing the offset.
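For example, here is a minimal Python sketch of applying such a correction, assuming the receiver software computes the logged downlink as the measured IF plus the LO setting and assuming a standard 10750 MHz Ku-band LO (both are assumptions; the 100 kHz figure is just the example above):

    # A minimal sketch of applying a measured +100 kHz systematic offset,
    # assuming logged downlink = measured IF + LO setting in the software.
    LO_NOMINAL_MHZ      = 10750.000   # assumed standard Ku-band LO setting
    measured_offset_mhz = 0.100       # carriers repeatedly log 100 kHz high

    corrected_lo_mhz = LO_NOMINAL_MHZ - measured_offset_mhz
    print(f"Enter {corrected_lo_mhz:.3f} MHz as the LO to zero out the offset.")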
You will find variances between blind-scan-logged transponder frequencies and published values, but if you know the baseline of your system, you will be able to identify why the logged center frequencies are reported differently.
- - - - - - -