I am having some problems setting the timebase in the adc200.vi example for LabVIEW.
I updated the software for my Pico ADC-212 card using PicoFull_r5_15_6.exe.
The set_timebase routine from the adc200.vi example has the following inputs and outputs:
inputs: 0 - for the time interval between readings
0 - for is_slow (i.e. fast mode)
0 - for the timebase
These give me an output of 333 ns, which corresponds to the fastest setting for an ADC-212/3 (as I found in the Pico driver documentation): timebase 0 selects the maximum sampling rate of 3 MS/s, and 1/(3 MHz) ≈ 333 ns. My reading of the call is sketched in C below.
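For reference, here is a minimal C sketch of how I understand the driver call that adc200.vi wraps. The prototype is my assumption, pieced together from the VI's inputs/outputs and the driver documentation; I don't have the header in front of me, so the exact types may differ from the real adc200.h shipped with the SDK:

```c
#include <stdio.h>

/* Assumed prototype for the adc200.dll routine behind adc200.vi:
 * ns and is_slow are filled in by the driver (hence the 0 initial
 * values wired in the VI), timebase is the requested setting. */
short adc200_set_timebase(long *ns, char *is_slow, short timebase);

int main(void)
{
    long  ns       = 0; /* out: sampling interval, documented in ns  */
    char  is_slow  = 0; /* out: non-zero if slow (real-time) mode    */
    short timebase = 0; /* in:  0 should select the fastest timebase */

    if (adc200_set_timebase(&ns, &is_slow, timebase))
    {
        /* For an ADC-212/3 I would expect ns = 333 here,
         * i.e. 1/(3 MHz), if the value really is in ns. */
        printf("interval = %ld ns, is_slow = %d\n", ns, (int)is_slow);
    }
    return 0;
}
```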
The problem is that my actual time per sample (the card works fine with the PicoScope software) is on the order of microseconds, possibly 333 µs for those same input parameters.
My question: could the returned interval be in microseconds instead of nanoseconds?