Hi. Using the spectrum analyzer software I’m trying to measure a signal from my signal generator and it seems as if the measured value is attenuated. Or am I doing something wrong?
This is what I did:
Signal generator, sine wave 7 MHz with varying amplitude (10 mV – 1 V) connected to the RSP1A. By this I mean that I set the signal generator to several different amplitude settings.
What I see: a nice peak at 7 MHz (great!) but the dBm reading is about 7 dB lower than expected.
I calculated the expected value from the peak-to-peak voltage on my oscilloscope: dBm = 10 * log10((Vpp² * 1000) / 400)
I double checked my expected values with various on-line calculators / tables.
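That conversion can be sketched in a few lines of Python (assuming a 50 Ω load, which is where the 400 in the formula comes from: Vrms = Vpp / (2√2), P = Vrms² / 50 = Vpp² / 400 watts):

```python
import math

def vpp_to_dbm(vpp, load_ohms=50.0):
    """Convert a sine wave's peak-to-peak voltage to power in dBm
    delivered into the given load (default 50 ohms)."""
    vrms = vpp / (2 * math.sqrt(2))            # Vrms = Vpp / (2*sqrt(2)) for a sine
    power_mw = (vrms ** 2 / load_ohms) * 1000  # P = Vrms^2 / R, in milliwatts
    return 10 * math.log10(power_mw)

# Example: 1 Vpp into 50 ohms -> 2.5 mW
print(round(vpp_to_dbm(1.0), 2))  # about 3.98 dBm
```

This matches the on-line calculators: 1 Vpp into 50 Ω is about +4 dBm, and 0.632 Vpp is 0 dBm.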
Am I missing something here? Is there any attenuation I have to take into account?
What’s the output impedance of the sig gen? Professional types are 50 Ω and should deliver the stated output into the ~50 Ω input of the RSP. There is also the difference between EMF and PD from the generator – do you know which it specifies?
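To illustrate why the EMF vs. PD distinction matters (a generic resistive-divider sketch, not the specs of any particular generator): a 50 Ω source driving a matched 50 Ω load delivers only half of its open-circuit EMF to the load, i.e. the loaded voltage is 6 dB down on the EMF figure.

```python
import math

def loaded_vpp(emf_vpp, source_ohms=50.0, load_ohms=50.0):
    """Voltage actually seen across the load (simple resistive divider)."""
    return emf_vpp * load_ohms / (source_ohms + load_ohms)

emf = 1.0                    # 1 Vpp open-circuit EMF
v_load = loaded_vpp(emf)     # 0.5 Vpp across a matched 50-ohm load
drop_db = 20 * math.log10(emf / v_load)
print(v_load, round(drop_db, 2))  # 0.5 6.02
```

So a generator that states its output as EMF will read 6 dB lower than expected on a 50 Ω instrument if you take the EMF number at face value.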
It’s just one of those cheap ones (I don’t know its output impedance or other exact specs). I figured that the reading my scope gives me would be enough to calculate the power it puts out, but it probably isn’t that simple. I will try to figure out some other tests. Thnx
Just to report back on this. Turned out I forgot to terminate the cable for my scope measurements. This gave me wrong peak-to-peak values, so the dBm values I expected were incorrect. After using a T-connector with a 50 ohm termination I got much closer to the readings of the SDRplay. Now the difference is only 2 dB, which could be cable/connector loss (different cables for the scope and the SDRplay). The readings are very consistent, so I’m very happy with it. Great tool! Thnx.
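For anyone else hitting this: the numbers line up. An unterminated high-impedance scope input sees roughly twice the Vpp that a 50 Ω-terminated one does (from a 50 Ω source), which inflates the expected power by about 6 dB. A quick sketch with hypothetical voltages:

```python
import math

def vpp_to_dbm(vpp, load_ohms=50.0):
    """Sine-wave Vpp to dBm into the given load."""
    vrms = vpp / (2 * math.sqrt(2))
    return 10 * math.log10(vrms ** 2 / load_ohms * 1000)

terminated = 0.5     # hypothetical Vpp with 50-ohm termination
unterminated = 1.0   # same source without termination: roughly 2x the voltage
error_db = vpp_to_dbm(unterminated) - vpp_to_dbm(terminated)
print(round(error_db, 2))  # 6.02
```

That ~6 dB accounts for most of the original 7 dB discrepancy, leaving the ~2 dB residual plausibly down to cable and connector losses.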