Hello,
I have built my own SDR software using SoapySDR, and it also works with SDRplay (I also use other SDR receivers such as the ADALM-Pluto, Radioberry, and RTL-SDR). I use an RSP1A for testing.
When I tune to a frequency, for instance in the 80 m band, the signal level depends on the station's offset within the RSP1A's baseband.
For instance, with a sample rate of 1 MSPS I have a bandwidth of 500 kHz. If the station I am listening to is in the higher part of the baseband, its signal level is much lower than when it is in the lower part of the baseband. I do not see this effect with my other SDR receivers.
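To make this concrete, here is a minimal standalone sketch (Python SoapySDR bindings, not my actual application) of how the effect can be measured: the same station is placed at different baseband offsets by re-tuning the centre frequency, and the power in a narrow slice around each offset is compared. The station frequency, the offsets, and the channel_power_db helper are just illustrative choices:

```python
import numpy as np
import SoapySDR
from SoapySDR import SOAPY_SDR_RX, SOAPY_SDR_CF32

FS = 1e6          # 1 MSPS complex sample rate
STATION = 3.65e6  # hypothetical station in the 80 m band

def channel_power_db(samples, fs, offset_hz, width_hz=3e3):
    # Uncalibrated power (dB) in a narrow slice around offset_hz
    spec = np.fft.fftshift(np.fft.fft(samples * np.hanning(len(samples))))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(samples), 1.0 / fs))
    mask = np.abs(freqs - offset_hz) < width_hz / 2
    return 10.0 * np.log10(np.sum(np.abs(spec[mask]) ** 2) + 1e-20)

sdr = SoapySDR.Device(dict(driver="sdrplay"))
sdr.setSampleRate(SOAPY_SDR_RX, 0, FS)
stream = sdr.setupStream(SOAPY_SDR_RX, SOAPY_SDR_CF32)
sdr.activateStream(stream)
buf = np.empty(65536, np.complex64)

# Re-tune so the same station lands at different baseband offsets
# (avoiding 0 Hz because of the DC spike) and compare its level.
for offset in (-400e3, -100e3, 100e3, 400e3):
    sdr.setFrequency(SOAPY_SDR_RX, 0, STATION - offset)
    sdr.readStream(stream, [buf], len(buf))  # discard one block after re-tuning
    sr = sdr.readStream(stream, [buf], len(buf))
    if sr.ret > 0:
        print("offset %+7.0f Hz -> %6.1f dB"
              % (offset, channel_power_db(buf[:sr.ret], FS, offset)))

sdr.deactivateStream(stream)
sdr.closeStream(stream)
```

With my other receivers this prints roughly the same level at every offset; with the RSP1A the level drops towards the higher part of the baseband.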
The overall signal level from the RSP1A is also much lower than from the other devices, but maybe that is because the RSP1A is 14 bits while the other receivers are 12 or even 8 bits. Is this a common issue?

PS: Using AGC does not influence the effect; I currently switch the AGC off most of the time because it reduces the signal level too much (see the snippet at the end for how I do that). Changing the sample rate does not change the effect either. I am curious whether other people also experience this.
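PS2: For completeness, this is roughly how I switch the AGC off and set the gain through the generic SoapySDR API (the gain elements reported by listGains are driver-specific, and the 40 dB value is only an example):

```python
import SoapySDR
from SoapySDR import SOAPY_SDR_RX

sdr = SoapySDR.Device(dict(driver="sdrplay"))
sdr.setGainMode(SOAPY_SDR_RX, 0, False)  # disable the driver/hardware AGC
print(sdr.listGains(SOAPY_SDR_RX, 0))    # inspect the driver-specific gain elements
sdr.setGain(SOAPY_SDR_RX, 0, 40.0)       # overall gain; the driver distributes it
```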