ADC-20 Sample time

Post any questions you may have about our current range of USB data loggers
Post Reply
Shaun
User
Posts: 2
Joined: Wed Jan 18, 2006 8:43 pm

ADC-20 Sample time

Post by Shaun » Wed Jan 18, 2006 9:00 pm

Hi.

I'm using the ADC-20 in a VB application. Sampling a single channel takes 60 ms, which seems quite long. Is this a hardware limitation or a software driver limitation?

Thanks

Michael
Advanced User
Posts: 656
Joined: Thu Jul 07, 2005 12:41 pm
Location: St Neots, Cambridgeshire

Post by Michael » Thu Jan 19, 2006 3:59 pm

Hello Shaun,

The limitation is down to the resolution of the measurement: at full resolution the ADC-20 resolves 1,048,576 (2^20) distinct levels, i.e. 20 bits, and each conversion takes correspondingly longer. Running the ADC-20 at a 60 ms conversion time reduces the resolution but makes it sample faster.
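As a rough sketch of the trade-off: the finer the resolution, the smaller the voltage step each code represents. The ±2500 mV input range below is an assumption for illustration; check the data sheet for the range you actually use.

```python
# Sketch: resolution vs. step size for a high-resolution ADC.
# Assumes a +/-2500 mV input range purely for illustration.
FULL_SCALE_UV = 2 * 2500 * 1000   # span of the assumed range, in microvolts

for bits in (20, 18, 16):
    counts = 2 ** bits            # e.g. 2**20 = 1,048,576 codes at 20 bits
    lsb_uv = FULL_SCALE_UV / counts
    print(f"{bits} bits -> {counts:>9,} codes, step = {lsb_uv:.2f} uV")
```

Halving the conversion time costs you bits, which in turn coarsens the smallest voltage step you can distinguish.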

Best regards,
Michael
Michael - Tech Support
Pico Technology
Web Support Forum

RobH
User
Posts: 6
Joined: Wed Nov 01, 2006 12:05 pm

Post by RobH » Wed Nov 08, 2006 11:24 am

This is going to sound like a bit of a dumb question :oops: ... but here goes nothing:

I am using an ADC-16 to log voltage. I am measuring only a few millivolts, so the 16-bit accuracy is nice to have, but I would also like to sample at 50-100 ms intervals. Whilst I appreciate there is no way to reduce the conversion time (~600 ms), is it worth sampling at 50 ms intervals? Might I "get lucky" every so often and receive more frequent value updates?
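As a back-of-the-envelope sketch of what polling faster than the converter buys you (the ~600 ms figure is from the post; exact timing depends on the driver):

```python
# Sketch: polling every 50 ms while the ADC's conversion takes ~600 ms.
# A new reading only appears once per conversion, so most polls simply
# see the previous value again.
CONVERSION_MS = 600   # approximate ADC-16 conversion time at full resolution
POLL_MS = 50          # desired logging interval
WINDOW_MS = 6000      # simulate 6 seconds of polling

fresh = 0
for t in range(0, WINDOW_MS, POLL_MS):
    if t % CONVERSION_MS == 0:    # a new conversion completed at this poll
        fresh += 1
print(f"{WINDOW_MS // POLL_MS} polls, {fresh} fresh readings")
```

So out of 120 polls you would see only about 10 genuinely new values; the rest are repeats of the last completed conversion.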

Thanks in advance :wink:

Rob

Post Reply