I am trying to determine frequency with the PS2005, using both channels at 8064 samples (the maximum value reported back by the DLL) and a timebase of 1.
I have used this process before and it has worked, but on a different application.
I basically get the power spectrum and find the highest peak and determine what frequency that is.
I must scale the peak's bin index by the frequency resolution (the reciprocal of the block duration in seconds) to get the right value. For this resolution I used sampling frequency / number of samples, which is 100 MHz / 8064.
The problem is that when I use a function generator (correctly calibrated against a separate oscilloscope), the frequency I get is off by about 0.01 MHz, and the error changes nonlinearly as I sweep the generator's frequency from 1 MHz to 4 MHz. I need accuracy to about 100 Hz.
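To make the process concrete, here is a minimal sketch of the peak-picking method I described, written in Python/NumPy for illustration (my actual code runs in LabVIEW against the PicoScope DLL). The sample rate and block size match my setup; the test tone frequency and the use of `numpy.fft.rfft` are just assumptions for the sketch:

```python
import numpy as np

fs = 100e6      # assumed sampling frequency (Hz), per my timebase setting
n = 8064        # samples per block (max reported by the DLL)
f_tone = 2.0e6  # example function-generator frequency, for illustration only

# Simulated capture of a pure tone (stands in for the scope data)
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * f_tone * t)

# Power spectrum via FFT, positive frequencies only
spectrum = np.abs(np.fft.rfft(signal)) ** 2

# Find the highest peak, skipping the DC bin
peak_bin = np.argmax(spectrum[1:]) + 1

# Each bin spans fs/n Hz, so the measured frequency is:
bin_width = fs / n                 # 100 MHz / 8064, about 12.4 kHz per bin
f_measured = peak_bin * bin_width
```

Note that with these numbers the spacing between bins (fs/n) is on the order of 12 kHz, so any estimate taken straight from the peak bin can only be as fine as that spacing.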
Am I doing this process correctly?
Is there an easier way to calculate frequency of the signal?
Is there a function in the DLL files that will help me?