I am trying to use the ADC-212/3 to capture multiple times over a fixed timebase. For example, I want to be able to capture 20 times at 0.5 millisecond intervals. I looked at the API documentation and the Visual Basic example and thought I would have a go. Firstly, the documentation does not make it entirely clear how the first parameter of adc200_set_timebase works. Quoting from the documentation:
unsigned short adc200_set_timebase (
  unsigned long * ns,
  unsigned char * is_slow,
  unsigned short timebase );
ns        This is the time interval, in nanoseconds, between readings at the
          selected timebase.
is_slow   This is TRUE if the ADC200 works in slow mode at this timebase (i.e.
          you can transfer readings from the ADC200 without stopping it).
timebase  A code between 0 and 19 (not all codes are valid for all units -
          check the return value). Timebase 0 is the fastest timebase,
          timebase 1 is twice the time per sample, timebase 2 is four times,
          etc.
The time per sample is normally fastest_ns * 2^timebase * oversample.
For an ADC-212/3 (333 ns fastest) with oversample 8:

timebase 0: 2664 ns
timebase 1: 5328 ns
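To pin down my reading of this, here is a minimal C sketch of how I understand the call is meant to work. The prototype is the one quoted above; linking would obviously need the Pico driver library, and the idea that the driver reports the actual interval back through ns is my assumption:

#include <stdio.h>

/* Prototype as quoted from the driver documentation. */
unsigned short adc200_set_timebase (unsigned long * ns,
                                    unsigned char * is_slow,
                                    unsigned short timebase);

int main (void)
{
  unsigned long ns      = 0; /* interval in ns, reported back by the driver? */
  unsigned char is_slow = 0; /* TRUE if this timebase runs in slow mode      */

  /* Ask for timebase 10: with oversample 1 I expect
     333 ns * 2^10 = 340992 ns per sample. */
  if (adc200_set_timebase (&ns, &is_slow, 10))
    printf ("interval = %lu ns, slow mode = %d\n", ns, (int) is_slow);

  return 0;
}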
I therefore thought I would try setting timebase to 0, doing one capture, and timing it, so that I could work out the overhead of the setup and of getting the values. My timer has a resolution of 1 millisecond, so I can ignore the 333 nanosecond fastest sample time. The overhead came out at 7 milliseconds, and it seemed to be constant regardless of the number of captures (fewer than 50), so from here on all figures ignore that overhead.
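For what it is worth, I timed it along these lines. The capture-call names here are taken from what I saw in the VB example, but the signatures are my guesses, so treat this as illustrative only:

#include <stdio.h>
#include <time.h>

/* My guesses at the capture calls - signatures assumed, not quoted. */
unsigned short adc200_run (unsigned long no_of_values);
unsigned short adc200_ready (void);
unsigned short adc200_get_values (short * buffer_a, short * buffer_b,
                                  unsigned long no_of_values);

#define NO_OF_VALUES 20

int main (void)
{
  short   buffer_a [NO_OF_VALUES];
  short   buffer_b [NO_OF_VALUES];
  clock_t start = clock ();

  adc200_run (NO_OF_VALUES);         /* start the capture           */
  while (!adc200_ready ())           /* spin until the unit is done */
    ;
  adc200_get_values (buffer_a, buffer_b, NO_OF_VALUES);

  printf ("capture took %ld ms\n",
          (long) ((clock () - start) * 1000 / CLOCKS_PER_SEC));
  return 0;
}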
Right, so to business. I wanted intervals of around 0.5 milliseconds, so I selected timebase=10, because 2^10 * 333 ns * oversample 1 = 340992 ns = 0.34 milliseconds. When I ran a capture of 20 samples I therefore expected it to take a minimum of 6.8 milliseconds, but it took 26 milliseconds. Playing about with the timebase I found that:
timebase 10 takes 26 ms (expected 6.8 ms)
timebase  9 takes 13 ms (expected 3.4 ms)
timebase  8 takes  6 ms (expected 1.7 ms)
timebase  7 takes  3 ms (expected 0.85 ms)
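The expected figures come straight from the documented formula; this little standalone C program reproduces them (nothing device-specific, just the arithmetic, assuming oversample 1 and 20 samples as above):

#include <stdio.h>

int main (void)
{
  /* Expected capture time for 20 samples on an ADC-212/3:
     fastest_ns * 2^timebase * oversample, with fastest_ns = 333
     and oversample = 1. */
  for (unsigned timebase = 7; timebase <= 10; timebase++) {
    unsigned long per_sample_ns = 333UL << timebase;
    printf ("timebase %2u: %lu ns/sample, %.2f ms for 20 samples\n",
            timebase, per_sample_ns, 20.0 * per_sample_ns / 1e6);
  }
  return 0;
}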
For the sake of argument I guessed that maybe timebase 7 was what I wanted (against my understanding of the documentation), then tried to force it to capture at 0.5 millisecond intervals. I tried using the first parameter (ns, the time interval between readings in nanoseconds) to 'make up' the extra time, so I set it to 160000, expecting that to take the interval up to 0.5 milliseconds, but it made no difference. In fact, whatever value I put in made no difference at all.
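Concretely, the attempt looked like this. It is only a sketch, and whether the driver is even supposed to read ns on the way in, rather than only write it back, is exactly what I am unsure about:

/* Prototype as quoted from the documentation. */
unsigned short adc200_set_timebase (unsigned long * ns,
                                    unsigned char * is_slow,
                                    unsigned short timebase);

void try_half_millisecond (void)
{
  unsigned long ns      = 160000; /* my attempt to 'make up' the interval */
  unsigned char is_slow = 0;

  adc200_set_timebase (&ns, &is_slow, 7);

  /* Since ns is passed by pointer, I wonder if the driver simply
     overwrites it with the actual interval for timebase 7, which
     would explain why the value I put in makes no difference. */
}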
I am confused, but it may just be the way that I have interpreted the documentation. Please explain where this is going wrong!