 Softrock40 Interest Group

 Public Group, 10133 members
Using HDSDR to measure RF noise level
In a recent thread involving measuring the receive performance of the Softrocks radios I stated that the displayed noise level on the HDSDR was irrelevant for quantifying the actual noise level. After thinking about it for 24 hours and doing some testing this morning I am once again declaring myself in error. HDSDR can be used to quite accurately measure the absolute noise level of the Softrocks.
When using digital signal processing like HDSDR, the displayed noise level does not change as you change the filter settings. Try it for yourself; as you change the filter settings the noise coming from the speaker varies. Lower bandwidth = lower noise. But the displayed noise level does not change.
However, there is a setting on the HDSDR display that does change the displayed noise level, and this calibrated control is the key to using the software to determine actual noise power for a given bandwidth. That setting is the "RBW" or Resolution Bandwidth control. Changing this setting dramatically affects the displayed noise level but does not change the level coming from the speaker.
The RBW value is displayed in Hz. On the upper HDSDR display the maximum value is 187.5 Hz. On the smaller display in the lower right it is 93.8 Hz. Both work in exactly the same way.
It turns out that the RBW setting correlates directly with the measurement bandwidth for noise. If RBW could be set to 500 Hz it would be ideal, but it cannot. However, the noise level you see at 187.5 Hz RBW can be adjusted to tell you the level for any bandwidth. For example, if we want to know the noise floor of our radio in a 500 Hz bandwidth, we take the ratio of 500 divided by the 187.5 Hz RBW setting, which is 2.67, and convert it to dB: 10 log(2.67) equals 4.3 dB.
So if you add 4.3 dB to the level of noise read from the display you have the "real" noise floor of your receiver for a 500 Hz bandwidth. The same principle applies for any desired noise bandwidth. For 2700 Hz the added factor is 11.6 dB, 5000 is 14.3 dB, etc.
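The scaling above is just a bandwidth ratio expressed in dB, so it can be sketched in a few lines of Python (a sketch of the arithmetic only, not anything read from HDSDR itself):

```python
import math

def noise_bw_correction_db(target_bw_hz, rbw_hz=187.5):
    """dB to add to the noise level displayed at the given RBW to
    estimate the noise power in target_bw_hz, assuming the RBW acts
    as the noise bandwidth."""
    return 10 * math.log10(target_bw_hz / rbw_hz)

for bw in (500, 2700, 5000):
    print(f"{bw} Hz: add {noise_bw_correction_db(bw):.1f} dB")
```

Running it reproduces the 4.3, 11.6 and 14.3 dB factors quoted above.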
Important caveat: in order to get an absolute number, it is critical that you have calibrated your HDSDR using a known level into the receiver and then used the "S-Meter Calibration" (Options/Misc Options/Smeter calibration) to set the displayed level to the known level of the source.
Hint: Use the "Spectrum" sliders to expand the scale and to center it vertically on the scale. Use the "Avg" pulldown to smooth the display so you can determine the number plus or minus 1 dB.
So there you have it. One more extremely useful and valuable measurement tool courtesy of the HDSDR folks.
For free.
Warren Allgyer
9V1TD
Warren:
Is the RBW equal to the noise bandwidth? The two are not necessarily the same. There are some other subtle differences in detector method, i.e., envelope detector versus RMS.
http://cp.literature.agilent.com/litweb/pdf/59664008E.pdf is worth reading in this regard.
Jack K8ZOA
On 4/25/2014 8:21 PM, allgyer@... wrote: [...]
Jack, I honestly don't know the answer to that. But the testing I did this morning confirmed there is a direct correlation so I believe so. Each step of the RBW selector doubles the value. And each step raised the displayed noise floor by exactly 3 dB. What I cannot say for certain is whether or not the displayed value, with the correction factor as calculated, in fact equals the noise power in that bandwidth. I believe that to be the case but my math skills are not up to proving it.
Warren Allgyer
9V1TD
Further to Jack's comments: There is one other parameter that affects the values displayed and that is FFT Windowing (Options/Visualization/FFT Windowing). After reading Jack's article my intuition says the appropriate selection for this purpose would be "Rectangular" but my intuition regularly fails me so if anyone has the key please chime in. I ran through the selections and, depending on the choice, the indicated noise value seems to vary by 13 dB.
Warren Allgyer
9V1TD
Jack:
After reading the article you suggested, along with the Wiki that is referenced within the HDSDR app itself:
http://en.wikipedia.org/wiki/Window_function
I think I have satisfied myself that:
1) For purposes of determining the noise power levels, RBW does equal noise bandwidth. So I am comfortable that scaling the calibrated level value displayed at an RBW of 187.5 Hz in the manner I described is valid.
2) A Rectangular FFT window is the most valid visualization when calibrating and using HDSDR for these types of measurements.
On the other hand I have to admit this is completely new territory for me and I am very willing to be corrected by anyone who knows the subject better.
Warren Allgyer
9V1TD
Warren:
I think the question is mostly whether one is after relative comparisons or absolute measurement of noise power.
If you are comparing two measurements with all settings unchanged other than those related to the device under test, or changing the RBW while keeping everything else unchanged, then the absolute noise power isn't all that important in most cases. Hence even if RBW ≠ noise bandwidth, doubling the RBW will increase the displayed noise power by close enough to 3 dB so as to not make a difference in most circumstances.
If it's critical to know the absolute noise power level, we have to dig further into the details. A proper piece of test equipment will provide internal compensation for noise versus RBW, detector method, etc., so that when changing the vertical axis to dBm/Hz or dBm/sqrt(Hz), the instrument makes an appropriate correction, invisible to the user, to accurately display the selected vertical scale factor. This goes back at least as far as the HP 8568B spectrum analyzer, if not earlier. Or, if an automatic conversion is not provided due to the equipment's age and lack of internal computing power, the equipment manual will often provide enough information to make the conversion manually.
I don't know enough about the internals of the HDSDR software to make an intelligent answer on how the displayed levels correlate with an absolute noise power measurement. One could characterize at least some of the parameters with external measurements, of course.
Jack K8ZOA
On 4/26/2014 1:41 AM, allgyer@... wrote: [...]
I think I agree Jack.
The origin of this exercise has been where a couple of us were comparing the noise rise when we connect antennas to our 10 meter RXII/RXTX. Some, like me, see a significant noise rise. Others, like Alan, see little or none. So the question in my mind has been: Is this the result of variances in the builds or, more likely, a function of environmental noise and antenna performance. It would be helpful to the discussion to have some knowledge of the absolute noise level in each location.
I have worked with it for several hours today and I am less certain now that my results are valid. There seem to be too many variables at work in HDSDR for me to be sure that I have measured an absolute. I think I am going to back off and be satisfied that I can have a reliable comparison tool rather than an absolute reference.
Anyway, thanks so much for your input. I really appreciate your thoughtful and insightful approach to these things.
Warren Allgyer
9V1TD
Warren:
If the purpose is to determine the relative noise floors, and if one assumes that the observers have identical equipment, i.e., no appreciable unit-to-unit variation amongst the receivers, software parameter settings, etc., that all observers use, then your method should be usable to say that observer X has a 3 dB higher noise floor than observer Y. This assumes that both observers see a noise increase. If one observer sees a noise increase and the other does not, then the noise figure of the receiver may be a limiting factor that obviates the measurement result, or at least requires restatement of the results as "X has noise that is at least 3 dB above Y, but it could be as much as 3 dB + the receiver NF."
All the usual caveats apply; if antennas are different then one has to worry about the direction the noise arrives at, etc. And the test should be run with antenna connected and with shielded termination connected to the receiver, not just antenna removed. (Another confounding factor is that the receiving antenna may or may not be 50 ohms impedance, and the receiver input impedance may not be 50 ohms either.)
If you wish to establish an absolute measurement and say that observer X has a noise floor of Z dBm/sqrt(Hz), then one needs an absolute calibration. This likely involves knowing the noise figure of the receiver as well as the bits associated with the digital signal processing.
If you have access to a modern spectrum analyzer for an hour or two you could calibrate the receiver against its readings set for noise density measurement mode.
Ultimately it depends on the accuracy you are trying to achieve. Ye olde test method of loosely coupling a signal generator to the receiver input whilst the antenna is connected and observing the signal level necessary to achieve some target S/N, compared with the same arrangement but with the antenna replaced with a termination, will give you an absolute measurement if you can characterize all the coupling parameters, losses, etc.
However, pursued sufficiently deeply, it will eventually lead you back into needing to know some things about the receiver and software that may not be readily determinable.
Jack
On 4/26/2014 8:21 AM, allgyer@... wrote: [...]
I have posted a screenshot showing HDSDR tuned to a signal that is measuring exactly 3 dB (S+N)/N as measured with an RMS voltmeter at the audio output.
https://groups.yahoo.com/neo/groups/softrock40/photos/albums/80470281
It is pretty interesting that this "minimum detectable signal" is a full 20 dB above the displayed noise floor.
The filter is set to 500 Hz. Display is set at RBW 2.9 Hz with a 128 sample average. FFT windowing is set to Hann and using Amplitude Spectrum display (all settings under "Options").
I believe if someone were to duplicate these settings, display any signal of a known level on a calibrated HDSDR, and subtract 20 dB from the displayed value (which, of course, should be exactly as you know it; otherwise recalibrate), the result would tell you how much that signal was above the MDS of your radio.
For those who are interested to know their MDS and don't have attenuators, RMS voltmeters, and the like, this should allow you to do it.
Warren Allgyer
9V1TD
Warren,
LC's description here https://sites.google.com/site/g4zfqradio/hdsdrsignalmeasurement says:
"Power Spectrum Density (PSD) uses mean - leading to lower level values. But these should be better for measurement purpose. You may also try using some averaging .."
So he thinks PSD in the RF spectrum is preferable? With some averaging.
Certainly a standard measurement setting is a good idea.
Comparative checks are easy.
But for absolute figures my main problem is being certain of my signal reference to calibrate.
I made an Elecraft XG1 clone, checked against a lab-tested one I was lent. But even the XG2 only goes up to 14 MHz.
73 Alan G4ZFQ
Alan
Yes I agree... but I checked both the PSD and ASD setting and, for the screenshot I took, there was no difference in either signal or noise.
I also agree on the calibration..... none of this works unless you have at least one level you can rely on. I was actually wondering if I could use HDSDR to record a short file with a calibrated signal and email it to you...... have to think about that one and I am not home now to test it.
And I could have stated my recommendation better:
If you have calibrated your HDSDR then I believe you can reliably say the actual noise power is 20 dB more than the level shown on the screen, provided you set up the RBW, averaging, and FFT windowing as I have specified.
For those who look at my screenshot and think the actual levels look a little screwy, you are right. I was using a noise generator set at about −92 dBm for these tests. It does not give a very good MDS but it lets me discount the effects of my leaky homemade signal generator.
In softrock40@yahoogroups.com, <alan4alan@...> wrote: [...]
Warren:
From the photo, the FFT bins are 2.9 Hz wide and, I believe, the overall bandwidth is 500 Hz.
There are thus 500 Hz / 2.9 Hz per bin = 172 FFT bins within the 500 Hz bandwidth (rounding the number of bins).
Assuming the noise power is evenly distributed (seems reasonable) then each FFT bin has 1/172nd of the total noise power in 500 Hz, or to state it differently, each FFT bin's noise power is 22.3 dB below the total noise power in 500 Hz.
The injected test signal gives (S+N)/N = 3 dB, so the test signal has the same power as the noise.
Assuming (again this seems reasonable) that the test signal has sufficient spectral purity so that nearly all of its power is contained within a single FFT bin and that there is minimum scalloping across bins, then the FFT bin centered upon the test signal will contain all of the energy of the signal generator. There is clearly some bin-to-bin leakage from the image, probably a combination of the FFT and noise sidebands in the signal source, but for our purpose we'll ignore the leakage.
So, 100% of the signal power is in one bin. Call this 0 dBx, where x represents whatever dBm value the signal generator is outputting. 0 dBx is also the total noise power within the 500 Hz bandwidth, since (S+N)/N = 3 dB. (This bin also contains noise power, but since the signal generator level measured in a 2.9 Hz bandwidth is so much greater, we can neglect the effect of noise in this single FFT bin.)
Each of the other FFT bins will contain (1) no signal and (2) noise at about −22.3 dBx.
Hence, the difference between the FFT bin with the signal and all the other bins should be about 22 dB.
My eyeball reading of your image says the difference between the maximum level FFT bin and the noise level of the other bins is more or less 23 dB.
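This bin arithmetic can be checked with a few lines of Python, under the same assumptions (evenly distributed noise, no bin-to-bin leakage):

```python
import math

total_bw_hz = 500.0   # filter bandwidth from the screenshot
rbw_hz = 2.9          # FFT bin width from the screenshot

bins = total_bw_hz / rbw_hz            # number of FFT bins in the passband
per_bin_db = 10 * math.log10(bins)     # each bin sits this far below the total

# With (S+N)/N = 3 dB the signal power equals the total noise power, so the
# signal bin should stand roughly this many dB above the per-bin noise floor.
print(f"{bins:.0f} bins, each about {per_bin_db:.1f} dB below the total noise power")
```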
I particularly note that the single bin containing the signal (one pixel wide?) looks to be only 3 dB or so above the neighboring bins. At this level of bin-to-bin leakage you can't safely say all the signal power is contained within one bin. This will alter the back-of-the-envelope calculations presented above. Likewise, the noise floor across the 500 Hz bandwidth isn't all that flat.
With respect to absolute levels, kTB noise is −174 dBm/Hz at 290 K, and your image shows −108 dBm or so for a 2.9 Hz bandwidth. Making a whole bunch of assumptions, a 2.9 Hz bandwidth should contain 4.6 dB more noise than −174 dBm, or −169.4 dBm.
This means the noise floor in your image is −108 − (−169.4) = 61.4 dB above kTB noise. Assume a typical HF front-end noise figure is 10 dB; you then have 51.4 dB of excess noise.
You said something about the noise generator being set at −92 dBm ... I don't know in what bandwidth the −92 dBm is defined, unlikely to be per Hz, so without that information more calculations are not possible.
If you look at the noise floor in a 2.9 Hz FFT bandwidth with the receiver input terminated with 50 ohms, and if the receiver noise figure is around 10 dB (probably on the low side, but I don't know what filtering, amplification, etc. is in your receiver), then in theory one should see a bin noise power around −174 dBm/Hz + 4.6 dB + 10 dB ≈ −160 dBm, to keep the results to zero decimal places.
For what it's worth, my netSDR shows about a −132 dBm noise floor with "optimal Blackman" windowing and "default" resolution, but I have no idea what the FFT bin width is. Selecting "highest" resolution clearly reduces the bin width and the noise floor drops to around −140 dBm under this setting. As a guess, it's something around 5 Hz looking at how many distinct peaks are visible in a 1 kHz span. This suggests a noise floor on a 1 Hz basis around −147 dBm.
There are lots of assumptions in these calculations, so don't take them as the last word.
Jack K8ZOA
On 4/27/2014 5:11 AM, allgyer@... wrote: [...]
Jack that is fantastic! I even understood most of it because of the way you wrote it. It is a huge help to know there are numbers supporting the 20-23 dB difference in displayed levels.
My reference to −92 dBm for the noise generator was based on the displayed 500 Hz bandwidth, and I actually backed into that number. The noise generator is uncalibrated and is simply a zener biased right at breakdown. The resulting level is what it is. I put it into one port of a hybrid combiner and the signal generator, through a switchable attenuator, into the other, and feed the result to the receiver under test. I know and can measure the attenuated output of the signal generator and, since I was meeting the 3 dB (S+N)/N criterion, I could say the total noise power in the bandwidth was the same as the signal: −92 dBm.
I am still struggling with the whole concept of noise temperature and trying to wean myself from the concept of noise "level". It is not a "level" is it?... it is only power and the amount of that power that competes with the desired signal is determined by the bandwidth of the system.
I will get there.... and I sure appreciate your help and patience.
Warren Allgyer
9V1TD
Warren:
Glad to be of some assistance.
If the noise source has a power of −92 dBm in 500 Hz, then each of the 172 FFT bins of 2.9 Hz width will be 10 log(172) ≈ 22.4 dB down from the −92 dBm, or about −114 dBm.
This is within several dB of the −108 or thereabouts your screen capture shows.
But, this is all simple math related to how energy splits evenly with bandwidth, assuming there is no bin overlap, etc. Note the bin overlap is critical in these calculations.
If the FFT bins are numbered 1...172:
| 1 | 2 | 3 | ... | 172 |
If we are to say that the total noise power in the sum of bins 1...172 equals 172 times the noise power in any FFT bin then there is an implicit assumption that each FFT bin rejects completely any noise power other than that within its bandwidth, or 2.9 Hz in this case. In other words, the FFT bins have 0 dB loss within the 2.9 Hz pass width and infinite rejection outside of the pass width.
This is obviously not the case; because the bins have finite skirt selectivity, to put it into ham radio terms, some noise power in the frequency range of bin 2 will leak into bin 3, and some noise power from bin 4 will leak into bin 3 as well. And so forth for all 172 bins, plus a few bins above and below the 172 bins. Hence, there is uncertainty, depending upon the bin shape, in exactly how much noise power leaks between bins. The leakage can be computed from a knowledge of the windowing function and some other things, and an appropriate correction made.
Moreover, there is clear unequal distribution of noise power within the 500 Hz bandwidth from the screen image.
Likewise, consider your signal generator output. Assume for the moment that it's perfect, has no phase noise, etc. If the FFT bins are rectangular, i.e., within each bin's 2.9 Hz bandwidth there is 0 dB variation in amplitude, then it does not matter where the signal generator falls with respect to the FFT bin center; whatever bin it is in will capture 100% of the signal generator output.
But, again depending on the windowing and some other factors, suppose the bin has not a perfect rectangular shape, but rather a rounded response. Depending on the windowing function, the peak response within the 2.9 Hz bandwidth may be 3 to almost 4 dB greater than the response at the edges of the bin. Right away we've introduced 3 dB error in measuring your signal generator level. And since repeatability will call for positioning the signal generator and bin center within a couple tenths of a Hz, lots of luck on reproducing your data tomorrow, let alone by someone else, even with identical equipment. Or, perhaps the signal generator happens to land exactly halfway between two adjacent bins. The energy will be split evenly between the two and presumably the receiver will display half the signal in each bin, again a 3 dB error.
And, of course, there's a relationship between bin leakage, windowing and scalloping, such that the best window from the perspective of a flat-top bin is not the best window function with respect to minimum bin-to-bin leakage.
I'm not saying at all that the receiver and software can't be used to make accurate absolute noise level measurements; just that to do so requires more knowledge concerning the internals of the software, etc. as well as calibration measurements anchored to absolute levels. I'm sure all this is possible, but I'm not at all certain all the bits and pieces required to bring it about are available.
Jack K8ZOA
On 4/27/2014 10:05 AM, allgyer@... wrote: [...]
Let me rephrase this sentence as it isn't as clear as it should be:
If we are to say that the total noise power in the full bandwidth (500 Hz in your data) equals 172 times the noise power in each FFT bin, then there is an implicit assumption that each FFT bin rejects completely any noise power other than that within the bin bandwidth, or 2.9 Hz in this case.
Jack
On 4/27/2014 10:56 AM, Jack Smith wrote:
> If we are to say that the total noise power in the sum of bins 1...172
> equals 172 times the noise power in any FFT bin then there is an
> implicit assumption that each FFT bin rejects completely any noise
> power other than that within its bandwidth, or 2.9 Hz in this case.
Jack
You have shown how power is distributed into bins in the passband and warned about inaccuracies due to bins spilling over into each other and some noise power being double counted. Would not a quick check of this be to do two different RBW calculations and see if the total number is the same?
This is so helpful in trying to understand what is happening. For me this is why this hobby still fascinates after 45 years.
Thanks again Jack
Warren Allgyer
9V1TD
Hello all,
First of all, let me state that HDSDR can never be a precise measurement tool for absolute power!
There are mathematical caveats like amplitude/power spectrum, RBW, windowing and averaging. But besides these it will depend on the receiver and the soundcard. I don't know any receiver (below ~1000 EUR) having a really flat frequency response within 1 dB over all its frequency range. The soundcard may be a big source of error, too. Even with direct samplers without the necessity for a soundcard, e.g. ELAD's FDM-S2, there may be errors.
Having said this, of course you may try to measure, keeping the above in mind.
But, do NOT trust and build on your measured values!
From the discussion above I noticed an error in my description at
https://sites.google.com/site/g4zfqradio/hdsdrsignalmeasurement
I had stated that all FFT results were normalized to 1 Hz, which was wrong. I just fixed it.
Normalization to 1 Hz is applied only with the Power spectrum, not with the Amplitude spectrum. This is why the level increases by 3 dB when doubling the RBW with the Amplitude spectrum.
By the way, I'm going to change the 1 Hz normalization when using PSD to some other value in the next HDSDR release.
kind regards,
LC