Receiving Signal versus Quality

New to Outernet as of last week. My dish is pointed at Galaxy 19. My initial signal value was 85, with quality between 72 and 80. Fine-tuning the dish a bit, I was able to get my quality up to 96, but this caused my signal to drop to around 34. Should I be concerned about this value dropping so much?

Signal strength is only marginally useful for rough aiming (and often second only to inline meters in unreliability), but Quality is what you should be peaking once you have locked onto the signal. If you are getting the highest possible Quality, then you have correctly aimed and peaked your dish/LNBF, regardless of the Signal indication.
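The "peak on Quality" advice above amounts to a simple hill climb. Here is a minimal sketch in Python; `read_quality()` is a made-up stand-in for whatever gauge your receiver exposes, and the azimuth numbers are arbitrary, not real Galaxy 19 pointing data.

```python
# Sketch of peaking a dish on Quality rather than Signal: nudge the dish
# in whichever direction raises Quality, stop when neither direction helps.
# read_quality() is a simulated gauge, NOT a real tuner API.

def read_quality(azimuth_deg, peak_deg=97.0):
    """Simulated Quality reading: highest when the dish is dead on the bird."""
    return max(0.0, 100.0 - 40.0 * abs(azimuth_deg - peak_deg))

def peak_dish(start_deg, step=0.1):
    """Hill-climb on Quality until no small nudge improves it."""
    pos = start_deg
    best = read_quality(pos)
    while True:
        for cand in (pos + step, pos - step):
            q = read_quality(cand)
            if q > best:
                pos, best = cand, q
                break
        else:
            return pos, best  # neither direction improved: we are peaked

pos, quality = peak_dish(96.0)
```

The same loop works no matter what the Signal gauge does along the way, which is the point: Quality is the value you maximize.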

Thank you for that. I was able to get a bit more signal, but it seems the gauge cannot display the value correctly.

Is this on HDStar?

Yes, this is on the HDStar. The newer 3000 model with the LED that stays red during operation.

On HDStar, the quality indication will probably be a bit off. The specs aren't very clear about how drivers should report it, so driver authors tend to choose arbitrary values that make sense to them.


Is there any guideline for how low the signal and quality can drop before you start losing data?
On satellite STBs you have something called bit error rate (BER), which indicates how much of the data stream is being corrupted. When this value increases, the forward error correction (FEC) is no longer able to handle the faults, and data is lost.
I guess Outernet is not only benefiting from the FEC on the transponder stream, it is also made redundant by sending the same data multiple times?
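The "cliff" behavior described above can be illustrated with a little arithmetic. DVB-S wraps each 188-byte transport packet in a Reed-Solomon (204,188) outer code that can correct up to 8 corrupted bytes per packet; once the byte error rate coming out of the inner decoder pushes the expected error count past that, packet loss rises steeply. This sketch just evaluates the binomial tail, assuming independent byte errors (a simplification of the real channel).

```python
# Rough illustration of the FEC cliff: probability that a 204-byte
# Reed-Solomon (204,188) packet has more than t=8 byte errors and is lost.
from math import comb

def packet_loss_rate(byte_error_rate, n=204, t=8):
    """P(more than t of n bytes corrupted), assuming independent errors."""
    ok = sum(comb(n, k) * byte_error_rate**k * (1 - byte_error_rate)**(n - k)
             for k in range(t + 1))
    return 1 - ok

# Well below the correction limit, essentially every packet is recovered...
low = packet_loss_rate(1e-3)
# ...but as the mean error count approaches and passes 8 per packet,
# the loss rate climbs very quickly.
high = packet_loss_rate(8e-2)
```

That steep transition is why reception tends to be either clean or badly broken, with little in between.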

Yes, it will keep the blocks it has already received and wait for another turn.
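The carousel-style redundancy described above can be sketched as a receiver that keeps whatever blocks it has caught and fills in the gaps on later broadcast passes. The class and block framing here are made up for illustration, not Outernet's actual receiver code.

```python
# Sketch of carousel reception: blocks are broadcast repeatedly, the
# receiver keeps what it already has and completes the file over time.
# The framing (index, payload) is hypothetical.

class CarouselReceiver:
    def __init__(self, total_blocks):
        self.total_blocks = total_blocks
        self.blocks = {}  # index -> payload, kept across carousel rounds

    def on_block(self, index, payload):
        self.blocks.setdefault(index, payload)  # keep the first good copy

    def complete(self):
        return len(self.blocks) == self.total_blocks

    def reassemble(self):
        assert self.complete()
        return b"".join(self.blocks[i] for i in range(self.total_blocks))

rx = CarouselReceiver(total_blocks=3)
rx.on_block(0, b"out")   # first pass: blocks 1 and 2 were missed
rx.on_block(1, b"er")    # a later pass fills one gap...
rx.on_block(0, b"out")   # ...repeated blocks are simply ignored
rx.on_block(2, b"net")   # final pass completes the file
```

So a bad minute of reception only delays completion; it doesn't discard what was already received.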

When I installed my FTA dish with the DiSEqC 1.2 motor a while ago, it took me weeks to get it perfect (my OCD took over). When I point to the Galaxy 19 sat and connect the HDStar/Outernet RPi, I get these values: I noticed the signal level jumping to random numbers from time to time.