[hpsdr] phase noise

John Miles jmiles at pop.net
Thu Oct 25 20:33:03 PDT 2007


Reading back over that, I don't think I phrased it very well.  An example
would probably make more sense.

Say your ADC is clocked at 10 MHz, and your Nyquist bandwidth is 4 MHz (to
allow for antialiasing filter rolloff in the absence of an oversampling
scheme).  You feed the output of the ADC to a 1,000-point FFT.  Each bin of
the FFT output window, then, represents 4 kHz of spectrum.  The noise level
in that bin can be normalized to 1 Hz by subtracting 10*log(4000), or 36 dB.
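A quick sketch of that arithmetic in Python (the 10 MHz clock, 4 MHz usable bandwidth, and 1,000-point FFT are the numbers from the example above):

```python
import math

fs = 10e6         # ADC sample clock, Hz
usable_bw = 4e6   # usable (post-antialiasing) bandwidth, Hz
n_bins = 1000     # FFT length

# Each bin of the FFT output covers usable_bw / n_bins of spectrum.
bin_width = usable_bw / n_bins            # 4000 Hz per bin

# Normalize a bin's noise power to a 1 Hz bandwidth by subtracting
# 10*log10(bin_width) from the measured dB value.
norm_db = 10 * math.log10(bin_width)      # ~36 dB

print(bin_width, round(norm_db, 1))
```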

Now you have to consider what your spectrum display actually does.  If you
tell it to look at +/-200 kHz of spectrum, mapping that to a 100-pixel-wide
display, then it can just display the contents of the FFT bins on a 1:1
basis (remember, 4 kHz/bin * 100 pixels = 400 kHz, centered on whatever bin
represents the 'tuned' frequency of interest).

If you tell it to zoom in to +/- 20 kHz, you may think of that as a
"bandwidth change", but I'll bet that all the software does in that case is
replicate every FFT bin 10 times, maybe with a basic smoothing algorithm
(low-pass function) to hide the stairsteps.  Your noise level wouldn't
change in that case, because you didn't actually change the sampling
bandwidth, just the size of the chunk of it that you wanted to see.
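If that guess is right, the "zoom" is nothing more than pixel replication. A minimal sketch of what such a display routine might do (the function name and dB values are hypothetical, and real software would smooth the result):

```python
def zoom_display(bins, factor):
    # "Zoom in" by replicating each FFT bin `factor` times across the
    # display.  The per-bin noise power is untouched, because neither
    # the sampling bandwidth nor the bin width actually changed.
    return [b for b in bins for _ in range(factor)]

bins = [-100.0, -97.0, -101.0]     # per-bin levels in dB (made up)
zoomed = zoom_display(bins, 10)    # 30 pixels, same noise per bin
```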

Does that help?

-- john, KE5FX

  -----Original Message-----
  From: hpsdr-bounces at lists.hpsdr.org
[mailto:hpsdr-bounces at lists.hpsdr.org] On Behalf Of John Miles
  Sent: Thursday, October 25, 2007 6:26 PM
  To: hpsdr at lists.hpsdr.org
  Subject: Re: [hpsdr] phase noise


  Frank,

  I don't have one of the receivers in question, but generally the apparent
noise level is tied to the sampling bandwidth.  If your sampling clock is 10
MHz, your noise bandwidth is 5 MHz, and you can normalize it to 1 Hz by
subtracting 67 dB*.

  But you also have to realize that the noise energy is distributed evenly
between the FFT bins.  Digitally speaking, what does this rig do when you
"change bandwidths", as you say?  Does that mean it just uses a different
number of bins to draw the spectrum graph?  This is important: if the ADC's
sampling clock and front-end bandwidth didn't change, and the FFT kernel
size (# of bins) didn't change, then the noise level in each bin won't
change, either.

  I'm not personally crazy about the idea of using an 8640B cavity as an LO
for a narrowband receiver.  They drift, and they are nothing special when
it comes to close-in phase noise (< 10 kHz).  Instead, consider adding a PLL
to clean up any DDS spurs at inter-channel offsets, or upgrading to a DDS
with better SFDR.

  -- john, KE5FX

  *: In reality, FFTs have "equivalent noise bandwidths" just like
conventional filters do, so the noise-normalization function is not strictly
10*log(BW).  This is a property of the window.  There's a decent tech note
on that phenomenon here:
http://www.bores.com/courses/advanced/windows/files/windows.pdf
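As a sketch of that footnote: the equivalent noise bandwidth of a window, expressed in bins, is N * sum(w^2) / (sum(w))^2. For a Hann window it comes out to about 1.5 bins, so the normalization term is 10*log10(1.5 * bin_width) rather than 10*log10(bin_width):

```python
import numpy as np

N = 1024
w = np.hanning(N)   # Hann window

# Equivalent noise bandwidth in bins: N * sum(w^2) / (sum(w))^2.
# A rectangular window gives exactly 1.0; Hann gives ~1.5.
enbw_bins = N * np.sum(w**2) / np.sum(w)**2
print(round(enbw_bins, 2))
```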