Acquiring a large number of samples using ADC-216.


Post by agaikwad »

Hi,
This is a follow-up to a previous question I asked about acquiring a large number of samples. I am using an ADC-216 to sample a voltage signal at 41666 samples/second for some period of time, and I expect the total number of samples to be at least 5 million. I want to store the data in a single file. When I used PicoLog to acquire data, my file was restricted to about 32700 samples no matter what sampling rate I used, so I am thinking of writing my own C program. Is it possible to acquire millions of samples at a high sampling rate using my own program? More specifically, can I control the buffer size, and how do I control buffer read/write operations? Do you have any sample code?

I have enough disk space (50 GB), a fast processor (P4, 2 GHz) and 1 GB of RAM.
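As a rough sanity check on those figures (plain C arithmetic, nothing driver-specific), the whole capture is only about two minutes long and roughly 10 MB of raw 16-bit data, so disk space and RAM should not be the problem:

```c
#include <stdio.h>

int main(void)
{
    const double rate  = 41666.0;       /* samples per second       */
    const long   total = 5000000L;      /* samples wanted           */
    const int    bytes_per_sample = 2;  /* ADC-216 is a 16-bit unit */

    printf("capture time  : %.1f s\n", total / rate);                /* ~120 s  */
    printf("raw file size : %.1f MB\n",
           total * (double)bytes_per_sample / (1024.0 * 1024.0));    /* ~9.5 MB */
    return 0;
}
```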
Really appreciate your help,
Thanks.

markspencer
Site Admin
Posts: 598
Joined: Wed May 07, 2003 9:45 am

Post by markspencer »

Hi,

When using PicoLog in fast block mode, as you are (see your previous post), the file size is restricted to the size of the buffer on the unit. The ADC-216 has a buffer of 32k samples when one channel is used and 16k samples per channel when both are used.

Writing your own software to the specification you describe is possible. However, there are a few limiting factors to be aware of: the number of samples that can be requested at once cannot be more than the unit's buffer size, and no sampling takes place while data is being downloaded from the unit to the PC.

You can choose the number of samples and the timebase yourself, although there are factors that will affect the actual time interval achievable and the number of samples the unit will be able to collect. To get an understanding, please download our simple examples and the manual, which describes the function prototypes.
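Very roughly, your own program would repeat a run / wait / download cycle, appending each block to one file until it has collected all the samples it needs. The sketch below is only an outline of that idea: the adc216_* names in it are placeholders rather than the real exported functions, so take the actual function names and prototypes from the examples and manual linked below.

```c
/*
 * Sketch of a repeated block-capture loop.  The adc216_* calls below are
 * PLACEHOLDERS, not the real exported names: the actual driver functions
 * (open unit, set timebase, run, ready, get values, close) and their
 * prototypes must be taken from the driver manual.
 */
#include <stdio.h>

#define BLOCK_SIZE    32000L    /* <= unit buffer with one channel in use */
#define TOTAL_SAMPLES 5000000L

/* Placeholder prototypes - replace with the real ones from the manual. */
int  adc216_open(void);
void adc216_set_timebase(long *actual_ns, long wanted_ns);
void adc216_run(long no_of_samples);
int  adc216_ready(void);
long adc216_get_values(short *buffer, long no_of_samples);
void adc216_close(void);

int main(void)
{
    static short buffer[BLOCK_SIZE];
    long  collected = 0;
    long  actual_ns = 0;
    long  got       = 0;
    FILE *out = fopen("capture.bin", "wb");

    if (out == NULL || !adc216_open())
        return 1;

    /* Ask for roughly 24000 ns per sample (about 41666 samples/second); */
    /* the driver reports the interval it can actually achieve.          */
    adc216_set_timebase(&actual_ns, 24000);

    while (collected < TOTAL_SAMPLES) {
        adc216_run(BLOCK_SIZE);              /* start one block          */
        while (!adc216_ready())              /* wait for the unit buffer */
            ;                                /*   to fill                */

        got = adc216_get_values(buffer, BLOCK_SIZE);
        fwrite(buffer, sizeof(short), (size_t)got, out);
        collected += got;
        /* No samples are taken while a block is being downloaded, so    */
        /* there is a small gap between consecutive blocks.              */
    }

    adc216_close();
    fclose(out);
    printf("wrote %ld samples\n", collected);
    return 0;
}
```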

examples/manuals

Best regards,

Mark Spencer
