Using ADC-212/3 with adc200_set_timebase

Olly
User
Posts: 3
Joined: Fri Oct 15, 2004 8:55 am

Using ADC-212/3 with adc200_set_timebase

Post by Olly »

Hi,

I am trying to use the ADC-212/3 to capture multiple readings over a fixed timebase. For example, I want to be able to capture 20 times at 0.5 millisecond intervals. I looked at the API documentation and the Visual Basic example and thought I would have a go. Firstly, the documentation does not make it entirely clear how the first parameter of adc200_set_timebase works. Quoting from the documentation:
unsigned short adc200_set_timebase (
unsigned long * ns,
unsigned char * is_slow,
A200_TIME timebase)

ns - This is the time interval, in nanoseconds, between readings at the selected timebase.
is_slow - This is TRUE if the ADC200 works in slow mode at this timebase (i.e. you can transfer readings from the ADC200 without stopping it).
timebase - a code between 0 and 19 (not all codes are valid for all units - check the return value).
Timebase 0 is the fastest timebase, timebase 1 is twice the time per sample, timebase 2 is four times, etc.

The time per sample is normally fastest_ns * 2^timebase * oversample.
.....

For an ADC-212/3 (333 ns fastest) with oversample 8:
timebase 0: 2664 ns
timebase 1: 5328 ns
timebase 2: 10656 ns
....
timebase 15: 87293952 ns
timebase 16: 174587904 ns
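
For reference, this is roughly what the call looks like in C (only a sketch: my real code is the Visual Basic equivalent, the declaration is copied from the documentation above, and the A200_TIME typedef here is just a stand-in for whatever the SDK header actually defines):

#include <stdio.h>

/* Declaration as quoted in the documentation; normally it comes from the
   header supplied with the driver. */
typedef unsigned short A200_TIME;                /* stand-in for the SDK type */
unsigned short adc200_set_timebase(unsigned long *ns,
                                   unsigned char *is_slow,
                                   A200_TIME timebase);

int main(void)
{
    unsigned long ns      = 0;   /* interval in nanoseconds at this timebase */
    unsigned char is_slow = 0;   /* TRUE if the unit runs in slow mode       */

    /* Not all codes are valid for every unit, so check the return value. */
    if (!adc200_set_timebase(&ns, &is_slow, 10))
        printf("timebase 10 is not valid for this unit\n");

    return 0;
}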
I therefore thought I would try setting the timebase to 0, doing one capture and timing it, so that I could work out the overhead of the setup and of getting the values back. My timer has a resolution of 1 millisecond, so I can ignore the 333 nanosecond fastest sample time. The overhead came out at 7 milliseconds and seemed to be constant regardless of the number of captures (fewer than 50), so from here on all figures ignore that overhead.

Right, so to business. I wanted intervals of around 0.5 milliseconds, so I selected timebase = 10, because 2^10 * 333 ns * oversample of 1 = 340992 ns = 0.34 milliseconds. So when I ran a capture of 20 samples I expected it to take a minimum of 6.8 milliseconds, but it took 26 milliseconds. Playing about with the timebase I found that:
timebase 10 takes 26 ms (expected 6.8 ms)
timebase 9 takes 13 ms (expected 3.4 ms)
timebase 8 takes 6 ms (expected 1.7 ms)
timebase 7 takes 3 ms (expected 0.85 ms).

For the sake of argument I guessed that maybe timebase 7 was what I wanted (against my understanding of the documentation), and then tried to force it to capture at 0.5 millisecond intervals. I tried using parameter one (the time interval between readings, in ns) to 'make up' the extra time, setting it to 160000 and expecting that to bring the interval up to 0.5 milliseconds, but it made no difference. In fact, whatever value I put in made no difference at all.
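
For completeness, the timing itself is nothing clever; it is roughly this (a sketch only, with the block-capture calls from my program left out, since they are not the point here):

#include <windows.h>   /* GetTickCount */
#include <stdio.h>

/* Time one capture of n_samples. GetTickCount() only has about millisecond
   resolution, so the 333 ns fastest sample time is ignored, and the ~7 ms
   constant setup overhead measured at timebase 0 is subtracted afterwards. */
void time_capture(int n_samples)
{
    const DWORD overhead_ms = 7;          /* measured separately at timebase 0 */
    DWORD start = GetTickCount();

    /* ... start the block capture of n_samples, wait for the unit to report
       that it is ready, then read the values back ... */

    DWORD elapsed = GetTickCount() - start;
    printf("%d samples: %lu ms total, about %lu ms excluding overhead\n",
           n_samples, (unsigned long)elapsed,
           (unsigned long)(elapsed > overhead_ms ? elapsed - overhead_ms : 0));
}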

I am confused, but it may just be the way that I have interpreted the documentation. Please explain where this is going wrong!

markspencer
Site Admin
Posts: 598
Joined: Wed May 07, 2003 9:45 am

Post by markspencer »

Hi,

I am sorry to hear that you are experiencing this problem.

After looking into what you have said, it seems that the pre-trigger that the unit uses to ensure it is running at the required sample rate is causing this. Unfortunately, this cannot be avoided.

When you get the samples back from the unit, if you use the adc200_get_times_and_values function you should be able to see that the time interval between samples is 341 microseconds at timebase 10.
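
Something along these lines should show it (please take the exact declaration from the header supplied with the driver; the argument order and types below are only illustrative):

#include <stdio.h>

#define NUM_SAMPLES 20

void check_intervals(void)
{
    long  times[NUM_SAMPLES];    /* time of each reading */
    short chan_a[NUM_SAMPLES];
    short chan_b[NUM_SAMPLES];

    /* Illustrative call - check the driver header for the real declaration. */
    adc200_get_times_and_values(times, chan_a, chan_b, NUM_SAMPLES);

    /* The gap between consecutive entries in times[] is the real interval
       between samples; at timebase 10 it corresponds to about 341 microseconds. */
    for (int i = 1; i < NUM_SAMPLES; i++)
        printf("interval %d: %ld\n", i, times[i] - times[i - 1]);
}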

Best regards,

Mark Spencer

Olly
User
Posts: 3
Joined: Fri Oct 15, 2004 8:55 am

Using ADC-212/3 with adc200_set_timebase

Post by Olly »

Hi,

I am trying to follow your advice and use adc200_get_times_and_values() to see what is actually going on, but I find that the function declaration is missing from the Visual Basic adc20032.bas file. I have the latest version installed, downloaded from your website (the beta). I will try to create my own declaration, but can you confirm that I have the correct version, i.e. that there is no later version with the declaration in it?

Thanks in advance

Olly

Olly
User
Posts: 3
Joined: Fri Oct 15, 2004 8:55 am

Using ADC-212/3 with adc200_set_timebase

Post by Olly »

OK, I created my own Visual Basic declaration for adc200_get_times_and_values(), and it seems to work. I now understand that parameter one of adc200_set_timebase() is an output parameter rather than something that can be set (I feel this could be explained better in the documentation).
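
In C terms it behaves like this (a fragment only; 10 is just an example timebase code, and the Visual Basic declaration simply passes the parameter ByRef):

unsigned long ns      = 0;   /* whatever you put here is ignored...               */
unsigned char is_slow = 0;

adc200_set_timebase(&ns, &is_slow, 10);
/* ...and the driver overwrites ns with the actual sample interval in nanoseconds */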

The unexpected differences between timebases arise because the formula in the documentation does not include the number of channels being captured. It currently reads:
The time per sample is normally fastest_ns * 2^timebase * oversample.
I believe it should read:
The time per sample is normally fastest_ns * 2^timebase * oversample * number_of_channels.
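
As a quick sketch of the corrected formula (333 ns is the ADC-212/3 fastest time; two channels and oversample 1 are my settings, and the value adc200_set_timebase reports back through its ns parameter is still the one to trust):

#include <stdio.h>

int main(void)
{
    const unsigned long fastest_ns = 333;   /* ADC-212/3 fastest sample time */
    const unsigned long oversample = 1;
    const unsigned long channels   = 2;     /* both channels enabled         */

    for (unsigned long timebase = 7; timebase <= 10; timebase++)
    {
        unsigned long interval_ns =
            fastest_ns * (1UL << timebase) * oversample * channels;
        printf("timebase %lu: %lu ns per sample\n", timebase, interval_ns);
    }
    return 0;
}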
Thanks to Mike Green at PicoTech for pointing me in the right direction. I can now get things to work as expected.

Regards,

Olly
