I'm trying to get to grips with a new ADC-100, which I'm using with a Dell 1.6 GHz machine/Win 2000. I needed the parallel converter and had to set up the patch.
My problem is that, in scope mode, the timebase scaling seems to be wrong by a factor of 3 (a 1 kHz sinewave occupies 3 ms/cycle rather than 1 ms). When I use the spectrum analyser, the frequency is displayed correctly as long as I set the maximum frequency to less than 10 kHz. At 10 kHz, the trace extends only to ~2.5 kHz and the measured frequency appears to be around 330 Hz.
Is this a sample speed/aliasing problem, or have I simply failed to set up the software correctly? The 'help|about' dialog shows that the device is only achieving 37,622 samples/second, which is a problem, as I bought the device to capture a signal running at 20-30 kHz!
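For what it's worth, the numbers above are self-consistent with one hypothesis: the software is labelling the time axis as if the sample rate were one third of what the hardware actually delivers. A sketch of that arithmetic (the specific rates are my assumptions for illustration, not anything read from the ADC-100):

```python
import numpy as np

# Hypothetical scenario: the hardware really samples at fs_actual, but the
# software believes the rate is fs_actual / 3 when labelling its axes.
fs_actual = 30_000.0            # true sample rate in Hz (assumed value)
fs_assumed = fs_actual / 3.0    # rate the software thinks it is getting

t = np.arange(4096) / fs_actual             # real sample instants
x = np.sin(2 * np.pi * 1_000.0 * t)         # a 1 kHz test sinewave

# FFT of the real samples, but with the frequency axis built from the
# (wrong) assumed sample spacing, as a mislabelling software would do.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_assumed)
peak = freqs[np.argmax(spectrum)]

print(f"displayed peak: {peak:.0f} Hz")  # close to 1000/3 = 333 Hz
```

Under this assumption, a 1 kHz sine is reported near 333 Hz and each cycle spans 3 ms on the scope display, which matches both symptoms in the post; it doesn't explain the 37,622 S/s figure, which may be a separate parallel-port throughput limit.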