Test and Measurement Forum

adc200.vi, set_timebase routine

Post by Florin » Tue Dec 05, 2006 2:19 pm

Hi,

I have some problems setting the timebase in the adc200.vi example for LabVIEW.

I updated the software for my Pico ADC-212 card using PicoFull_r5_15_6.exe.
The set_timebase routine from the adc200.vi example has the following inputs and outputs:
inputs: 0 for the time interval between readings
0 for is_slow mode
0 for the timebase

These give me an output of 333 ns, which corresponds to the fastest timebase for an ADC-212/3 (as I found in the Pico driver documentation).
The problem is that my real time per sample (the unit works well with the PicoScope software) is on the order of microseconds, possibly 333 microseconds for those input parameters.
My question: could it be microseconds instead of ns?
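For reference, adc200.vi wraps the set_timebase call in the adc200.dll driver, and the driver documentation gives the returned interval in nanoseconds. Below is a minimal C sketch of that call; the prototype and return convention are assumptions based on the ADC-200 driver manual, so check them against the adc200.h header shipped with your driver.

#include <stdio.h>

/* Prototype assumed from the ADC-200 driver manual; take the real one
   from the adc200.h header shipped with the driver. */
short adc200_set_timebase(long *ns, short *is_slow, short timebase);

int main(void)
{
    long  ns = 0;       /* out: interval between readings, in nanoseconds */
    short is_slow = 0;  /* out: non-zero when the driver selects slow mode */
    short timebase = 0; /* in: 0 requests the fastest timebase; the manual
                           describes each step up as doubling the interval */

    if (adc200_set_timebase(&ns, &is_slow, timebase))
        printf("interval: %ld ns, is_slow: %d\n", ns, is_slow);
    else
        printf("timebase %d rejected by the driver\n", timebase);
    return 0;
}

If the doubling rule holds, a microsecond-scale interval would correspond to a much larger timebase value (for example, 333 ns x 2^10 is about 341 microseconds) rather than to timebase 0.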

Regards,
Florin
Florin
Active User
Posts: 10
Joined: Mon Oct 09, 2006 5:30 pm
Location: Italy

Post by ziko » Thu Dec 28, 2006 2:00 pm

Hi and thank you for your post.

We are currently running a limited service over the Christmas period, so I cannot answer any programming questions at the moment. However, we will be operating fully from next week.

Sorry for any inconvenience.

Kind regards
Ziko

Technical Specialist
ziko
Zen Master
Posts: 1705
Joined: Fri Dec 01, 2006 10:03 am
Location: St Neots

Solved

Post by Florin » Thu Dec 28, 2006 2:19 pm

Hi,
I solved that problem already, thanks.

Florin
Active User
Posts: 10
Joined: Mon Oct 09, 2006 5:30 pm
Location: Italy

