adc200.vi, set_timebase routine

Post your LabVIEW discussions here
Florin
Active User
Posts: 10
Joined: Mon Oct 09, 2006 5:30 pm
Location: Italy

adc200.vi, set_timebase routine

Post by Florin » Tue Dec 05, 2006 2:19 pm

Hi,

I have some problems setting the timebase in the adc200.vi example for LabVIEW.

I updated the software for my Pico ADC-212 card using PicoFull_r5_15_6.exe.
The set_timebase routine from the adc200.vi example has the following inputs and outputs:
inputs: 0 - for the time interval between readings
0 - for is_slow mode
0 - for the timebase

These give me an output of 333 ns, which corresponds to the fastest setting for an ADC-212/3 (as I found in the Pico driver documentation).
The problem is that my real time/sample (the card works fine with the PicoScope software) is on the order of microseconds, possibly 333 microseconds, for those input parameters.
My question: could the output be in microseconds instead of ns?

Regards,
Florin

ziko
Advanced User
Posts: 1705
Joined: Fri Dec 01, 2006 10:03 am
Location: St Neots

Post by ziko » Thu Dec 28, 2006 2:00 pm

Hi and thank you for your post.

Currently we are running a limited service over the Christmas period and as such I cannot answer any programming questions. However we will be operating fully from next week.

Sorry for any inconvenience.

Kind regards
Ziko

Technical Specialist

Florin
Active User
Posts: 10
Joined: Mon Oct 09, 2006 5:30 pm
Location: Italy

solved

Post by Florin » Thu Dec 28, 2006 2:19 pm

Hi,
I solved that problem already, thanks.

