I have built my own SDR software using SoapySDR, and it also works with SDRplay devices (I also use other SDR receivers such as the ADALM-Pluto, Radioberry and RTL-SDR). I use an RSP1A for testing.
When I tune to a frequency, for instance on the 80 m band, the signal level depends on the station's offset within the RSP1A's baseband.
For instance, with a sample rate of 1 MS/s I have a bandwidth of 500 kHz. If the station I am listening to is in the higher part of the baseband, its signal level is much lower than when it is in the lower part of the baseband. I don't see this effect with other SDR receivers.
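One way to quantify this (without any hardware in the loop) is to measure the peak FFT-bin level of a tone at different baseband offsets: an ideal receiver should read the same level everywhere, so any difference is the roll-off. Below is a minimal numpy sketch using synthetic tones; the sample rate, FFT size and offsets are assumed example values, not taken from the setup above:

```python
import numpy as np

fs = 1_000_000          # assumed sample rate: 1 MS/s
n = 4096                # assumed FFT size

def peak_bin_dbfs(iq):
    """Peak FFT-bin level in dBFS for a block of complex samples."""
    w = np.hanning(len(iq))
    spec = np.fft.fft(iq * w)
    # normalise by the window's coherent gain so a full-scale tone reads ~0 dBFS
    return 20 * np.log10(np.max(np.abs(spec)) / np.sum(w))

# Synthetic unit-amplitude tones at a low and a high baseband offset
# (chosen as exact FFT-bin frequencies to avoid scalloping loss):
t = np.arange(n) / fs
low  = np.exp(2j * np.pi * ( 50 * fs / n) * t)   # ~12 kHz offset
high = np.exp(2j * np.pi * (900 * fs / n) * t)   # ~220 kHz offset

print(peak_bin_dbfs(low), peak_bin_dbfs(high))   # both ≈ 0 dBFS for an ideal source
```

Feeding real captured blocks from the receiver into `peak_bin_dbfs` while moving a reference tone across the baseband would show how many dB the RSP1A path loses towards the band edge.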
Also, the overall signal level from the RSP1A is much lower than from other devices. Maybe that is because the RSP1A is 14-bit while the other receivers are 12-bit or even 8-bit. Is this a common issue? PS: Using AGC does not influence the effect; I currently mostly switch the AGC off because it reduces the signal level too much. The effect is also the same when I change the bitrate. I am curious whether other people experience this effect too.
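As a side note, bit depth by itself should not lower the displayed level of a signal: if samples are normalised to full scale, a 14-bit and an 8-bit device read the same tone at the same level. The extra bits buy dynamic range (a lower quantisation noise floor), roughly 6.02 dB per bit. A quick back-of-envelope check:

```python
# Ideal full-scale quantisation SNR of a b-bit ADC: 6.02*b + 1.76 dB.
# This affects the noise floor, not the level of a strong signal.
def adc_snr_db(bits):
    return 6.02 * bits + 1.76

extra = adc_snr_db(14) - adc_snr_db(12)
print(round(extra, 2))   # 12.04 dB more dynamic range from the two extra bits
```

So a lower overall reading is more likely a gain-setting or sample-scaling difference between drivers than a consequence of the 14-bit ADC.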
I've noticed something similar using SoapySDR and an RSP1A. I get a high noise floor and strong signals up to 250 kHz to either side of the centre, and then it drops off significantly. This seems to be irrespective of the sample rate I'm using, i.e. 500 kS/s through to 8 MS/s.
You can clearly see the drop-off at either edge of the 500 kHz around the centre, and then the roll-off again at the edges of the full sampled spectrum.
I've attached an example screenshot from GQRX. The sample rate is set to 4 MS/s.
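The shape described above (a flat passband, then roll-off towards the edges of the sampled spectrum) is what a decimation low-pass filter looks like. The RSP1A's actual filtering happens inside the receiver chipset, so the sketch below is only an illustrative windowed-sinc low-pass with assumed parameters (101 taps, 250 kHz cutoff at 4 MS/s), not the device's real response:

```python
import numpy as np

# Illustrative windowed-sinc low-pass as a stand-in for a tuner's
# decimation filter; tap count and cutoff are assumptions, not the
# RSP1A's real filter.
def lowpass_response_db(cutoff, fs, ntaps=101, nfft=1024):
    n = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff / fs * n) * np.hamming(ntaps)
    h /= h.sum()                       # unity gain at DC
    H = np.fft.fft(h, nfft)
    return 20 * np.log10(np.abs(H) + 1e-12)

resp = lowpass_response_db(cutoff=250_000, fs=4_000_000)
# Flat near DC, strongly attenuated well past the 250 kHz edge:
print(resp[0], resp[256])   # bin 256 ≈ 1 MHz at fs = 4 MS/s
```

Plotting `resp` gives the same qualitative picture as the GQRX screenshot: flat in the middle, rolling off at the filter edge, with the remaining spectrum out to fs/2 heavily attenuated.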