John Monro
06-06-2007, 04:03 AM
prad wrote:
> Hi all,
>
> I am a newbie to DSP. I have a huge number of data samples that I want
> to perform FFT on. But due to the enormous amount of the data samples, I'd
> like to use only 1M samples and perform FFT. I know that this is severe
> undersampling and that it does not satisfy the Nyquist rate. But all I
> want is to prevent aliasing so that I can get correct information about
> the low frequency content in the data. I cannot use an analog anti-alias
> filter as the data is inherently digitized. Is there any time-domain
> method to help with my problem? I read about a half-band digital filter
> that eliminates the frequencies higher than half the sampling frequency.
> Can this filter help my problem as I read a lot of articles that the high
> frequencies have to be eliminated before digitization? If so, can someone
> point me to some link that discusses the design of such a filter?
>
>
> Thanks,
> Pradeep.
Pradeep,
The half-band filter you mention attenuates components above a quarter
of the original sampling frequency, so it only fits decimation by a
factor of two. Its one advantage over other similar (FIR) filters is
speed: nearly half of its coefficients are zero, so it runs about twice
as fast. Otherwise it is not worth the trouble, as it is more
complicated to code.
The filter needs to attenuate frequency components that are above one
half your NEW sample rate. (See http://www.iowegian.com/ for an
evaluation copy of some excellent FIR filter design software.)
The NEW sample rate is the rate you establish, in effect, by selecting
every Nth output sample, N being calculated so that you end up with 1M
output samples.
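As a sketch of the two steps above (filter at the OLD rate, then keep
every Nth sample) — the function names, filter length, and windowed-sinc
design are my own illustration, not anything from the thread:

```python
import numpy as np

def windowed_sinc_lowpass(num_taps, cutoff):
    """Low-pass FIR via the windowed-sinc method.
    cutoff is normalized to the ORIGINAL sample rate (0.5 = old Nyquist)."""
    n = np.arange(num_taps) - (num_taps - 1) / 2.0
    h = 2.0 * cutoff * np.sinc(2.0 * cutoff * n)  # ideal low-pass impulse response
    h *= np.hamming(num_taps)                     # taper to control ripple
    return h / h.sum()                            # unity gain at DC

def filter_then_decimate(x, decimation_n, num_taps=101):
    # Cutoff at half the NEW sample rate, i.e. 1/(2N) of the old rate.
    h = windowed_sinc_lowpass(num_taps, 0.5 / decimation_n)
    y = np.convolve(x, h, mode="same")  # anti-alias filter at the OLD rate
    return y[::decimation_n]            # then keep every Nth sample
```

The order matters: filtering must happen before the samples are thrown
away, or the aliasing has already occurred.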
You can speed up the whole process by calculating only the output
samples you are going to keep, at the cost of slightly more complicated
code.
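That shortcut can be sketched like this (a hypothetical helper of my
own, not code from the thread): rather than convolving the whole record
and discarding most of it, compute each retained output as a single dot
product.

```python
import numpy as np

def decimating_fir(x, h, decimation_n):
    """Compute ONLY every Nth output of the FIR filter h applied to x.
    Does roughly len(x)/N dot products instead of len(x), so the
    filtering cost drops by about a factor of N."""
    num_out = (len(x) - len(h)) // decimation_n + 1
    y = np.empty(num_out)
    for m in range(num_out):
        start = m * decimation_n
        # taps dotted with the (time-reversed) input segment for this output
        y[m] = np.dot(h, x[start : start + len(h)][::-1])
    return y
```

The result matches filtering the full record and then taking every Nth
sample; only the wasted arithmetic is skipped.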
Regards,
John