DDS-generated chirp and antialiasing filter: what is the best-suited filter for a chirp?

Signal Processing Asked by AtoM_84 on October 24, 2021

I am defining the required antialiasing (analog) filter for a chirp generated by a DDS. The chirp is produced in an FPGA (NCO generation) and forwarded to the DAC, which generates the samples (with interpolation in the DAC). Because the dynamic range is lower than 16 bits and the DAC naturally produces high-frequency nonlinearities, I need to include an antialiasing filter, which I am trying to define.

I guess I need to fulfill these requirements:

  • I need to low-pass filter my chirp bandwidth (let’s say 10-50 MHz) and cut frequencies above the upper edge of the band,
  • I have to ensure that the group delay is flat within my bandwidth,
  • I have to ensure a given stopband attenuation before the image frequencies (for instance, with a 400 MSPS DAC, the 50 MHz upper band edge has its first image at 400 - 50 = 350 MHz),
  • I need to push the level of those image frequencies below the quantization floor of the DAC to avoid distortion, so for instance with a 10-bit DAC I need roughly 60 dB of attenuation relative to the 0 dB passband gain (a rough sizing sketch follows this list).
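
A minimal scipy sketch to size the analog filter order against these numbers (the 1 dB passband ripple allowance and the use of scipy's buttord/ellipord are assumptions made purely for illustration):

    import numpy as np
    from scipy import signal

    fs_dac = 400e6                     # DAC update rate
    f_pass = 50e6                      # upper edge of the chirp band
    f_stop = fs_dac - f_pass           # first DAC image of the 50 MHz edge (350 MHz)
    atten_db = 6.02 * 10 + 1.76        # ~62 dB quantization floor of an ideal 10-bit DAC

    # Minimum analog filter order to reach the quantization floor at the image,
    # assuming (arbitrarily) a 1 dB passband ripple allowance
    n_butter, _ = signal.buttord(2 * np.pi * f_pass, 2 * np.pi * f_stop,
                                 gpass=1, gstop=atten_db, analog=True)
    n_ellip, _ = signal.ellipord(2 * np.pi * f_pass, 2 * np.pi * f_stop,
                                 gpass=1, gstop=atten_db, analog=True)
    print(f"Butterworth order: {n_butter}, elliptic order: {n_ellip}")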

As I am not very familiar with filters, I read articles on the web and came up with these rules of thumb for designing the best-suited filter for my application:

  • Butterworth filters are good for a sharp (brick-wall style) low-pass response, but the group delay is not flat (so my chirp will be distorted in the phase domain),
  • Bessel filters have a flat group delay (meaning my frequencies will not be phase delayed, or at least will all have the same delay), but the roll-off is not as sharp (the sketch after this list compares the two).
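
A quick way to see this trade-off is to plot the group delay of both filter types. Below is a minimal scipy sketch, assuming 4th-order filters and a hypothetical 70 MHz analog corner chosen just above the 50 MHz chirp edge:

    import numpy as np
    from scipy import signal
    import matplotlib.pyplot as plt

    order = 4
    wc = 2 * np.pi * 70e6                        # hypothetical corner just above the chirp band
    w = 2 * np.pi * np.linspace(1e6, 100e6, 2000)

    designs = {
        "Butterworth": signal.butter(order, wc, analog=True),
        "Bessel": signal.bessel(order, wc, analog=True, norm='mag'),
    }
    for name, (b, a) in designs.items():
        _, h = signal.freqs(b, a, worN=w)
        gd = -np.gradient(np.unwrap(np.angle(h)), w)   # group delay = -dphi/dw
        plt.plot(w / (2 * np.pi * 1e6), gd * 1e9, label=name)

    plt.xlabel("Frequency (MHz)")
    plt.ylabel("Group delay (ns)")
    plt.legend()
    plt.show()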

For the considered case, with a stopband at 350 MHz and a gain of ~1 at 50 MHz, it seems possible to achieve this filtering, especially with Bessel filters (order 3 or 4 is OK). But if I want to reduce cost, and hence the sampling frequency of my DAC, it quickly becomes impossible, because my stopband and my cutoff frequency get closer together and only a Butterworth filter could do the job. But since its group delay is not flat, it is not suitable for chirp generation, am I right?
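
To put rough numbers on that trade-off, here is a small sketch (again with a hypothetical 70 MHz corner and a 4th-order Bessel) that evaluates the rejection at the first image for two DAC rates:

    import numpy as np
    from scipy import signal

    def image_rejection_db(order, f_cut, f_image):
        # attenuation of an analog low-pass Bessel filter at the first DAC image
        b, a = signal.bessel(order, 2 * np.pi * f_cut, analog=True, norm='mag')
        _, h = signal.freqs(b, a, worN=[2 * np.pi * f_image])
        return -20 * np.log10(np.abs(h[0]))

    f_cut = 70e6                                 # hypothetical corner above the 50 MHz band
    for fs_dac in (400e6, 150e6):                # original rate vs. a cheaper, slower DAC
        f_image = fs_dac - 50e6                  # first image of the chirp's upper edge
        rej = image_rejection_db(4, f_cut, f_image)
        print(f"{fs_dac / 1e6:.0f} MSPS: image at {f_image / 1e6:.0f} MHz, rejection {rej:.1f} dB")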

I tried to simulate this with ADIsimDDS (an interesting online tool), for instance with the AD9859 (400 MSPS, 10-bit), and my case study looks good, but I can't see any of the group-delay impact of the different proposed external analog filters.

Thanks for your answers and advice.

One Answer

I recommend a digital FIR filter to compensate both for the DAC sinc droop and for added group delay distortion, combined with a simpler analog filter focused on rejecting the higher images out of the DAC with the minimum number of components (this would eliminate using a Bessel filter, while a Butterworth or elliptic filter would be a viable candidate). The analog filter's cost and complexity can be minimized by utilizing digital pre-distortion to equalize the analog filter's group delay variation to within a reasonable tolerance.
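
As one possible sketch of the inverse-sinc part of such an FIR (the 31-tap length and the 0.8 × Nyquist equalization edge are illustrative assumptions; a complete design would also fold the group delay pre-distortion described above into the target response):

    import numpy as np
    from scipy import signal

    fs = 400e6                                   # DAC update rate
    n_taps = 31                                  # illustrative choice

    # Target magnitude: invert the zero-order-hold sinc droop up to 0.8*Nyquist,
    # then drop toward zero at Nyquist
    f = np.linspace(0, 1, 256)                   # normalized so 1.0 = Nyquist (fs/2)
    droop = np.sinc(0.5 * f)                     # ZOH droop sin(pi*f/fs)/(pi*f/fs)
    target = np.where(f <= 0.8, 1.0 / droop, 0.0)
    taps = signal.firwin2(n_taps, f, target)

    # Combined response of the equalizer and the DAC sinc over the first Nyquist zone
    w, h = signal.freqz(taps, worN=1024, fs=fs)
    combined_db = 20 * np.log10(np.abs(h) * np.abs(np.sinc(w / fs)) + 1e-12)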

For further details on the design of the anti-alias filter, please see this post: Where should I set my anti-aliasing filter corner frequency for this signal?

This post may also be of interest as an implementation approach: there I detail all the code for an optimized chirp with a flat FFT response, which would also have minimum aliasing effects under the same sampling conditions. Instead of a time-domain filter, the response is scaled with a Tukey window, which provides a constant-envelope chirp over most of its duration, while the tapering of the very start and end of the chirp minimizes aliasing effects. This could be done with a weighting factor on the amplitude of the chirp at the output of the DDS. Please see the very bottom of this post for more details showing a flat FFT response over most of the chirp duration: How can I plot the frequency response on a bode diagram with Fast Fourier Transform?
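
Not the exact code from that post, but a minimal sketch of the Tukey-weighted chirp it describes (the 100 µs duration and the alpha = 0.1 taper are arbitrary illustration values):

    import numpy as np
    from scipy import signal

    fs = 400e6                                    # DAC update rate
    t = np.arange(0, 100e-6, 1 / fs)              # 100 us chirp, illustrative duration
    x = signal.chirp(t, f0=10e6, t1=t[-1], f1=50e6, method='linear')
    x *= signal.windows.tukey(len(t), alpha=0.1)  # taper only the first and last 5% of the chirp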

Answered by Dan Boschen on October 24, 2021
