This is not a beginner's question but rather a question about setup/calibration of the device. Please move it if this is the wrong sub-forum.
We are characterising a PicoScope 3406D MSO and trying to measure the gap time (dead time + setup time) between rapid block mode triggers, using regular pulses from a signal generator (10 mV amplitude, 20 ns wide, 1 kHz-10 MHz). Using the method laid out in Chapter 3.1 of this article*, we obtained a gap time of 2.161 µs, as opposed to the <2 µs stated in the Programmer's Guide. We took the data using a self-written Python program**
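To make the rate model behind our measurement explicit (our own back-of-the-envelope sketch, not code from the article; the function names are ours): with a regular pulse source, the scope can only re-trigger on a pulse edge, so the blind time per capture is quantised to whole pulse periods and a measured capture rate only *brackets* the true gap time.

```python
import math

def expected_capture_rate(f_in, gap_s, capture_s=0.0):
    """Triggers arrive every 1/f_in seconds; after each capture the
    scope is blind for capture_s + gap_s, then fires on the next pulse."""
    period = 1.0 / f_in
    blind = capture_s + gap_s
    # the next trigger lands on the first pulse edge at or after `blind`
    return 1.0 / (math.ceil(blind / period) * period)

def gap_from_rate(f_in, rate, capture_s=0.0):
    """Invert the model: which gap times are consistent with a measured
    capture rate?  Only a bracket is recoverable, because triggering is
    quantised to the pulse period."""
    period = 1.0 / f_in
    n = round((1.0 / rate) / period)          # pulses consumed per capture
    lo = max(0.0, (n - 1) * period - capture_s)
    hi = n * period - capture_s
    return lo, hi
```

For example, a 2.161 µs gap with a 1 MHz pulse train would let every third pulse trigger, i.e. a ~333 kHz capture rate; measuring that rate back only tells you the gap lies between 2 µs and 3 µs, which is why a finer method (like the article's) is needed to resolve it further.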
1) Any idea how the gap time was measured from the company to arrive at <2us?
Doing these measurements we observed strange effects:
2) "Saturation" of the rate (number of captures divided by the elapsed rapid-block time, taken from CPU time) at 1 and 2 MHz input rate, giving a fixed upper rate of ~360 kHz. See the attached figure showing the histogram of rates.
3) "Oscillation" of the rate at 5 and 10 MHz input rate: the rate alternates between two values in subsequent blocks. See the attached figure showing our "raw" data over 1 h.
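Two small sanity checks we can do on these numbers (our own arithmetic, with hypothetical helper names, not anything from the Programmer's Guide): a saturated rate implies a fixed total time consumed per capture, and the two-level oscillation can be quantified by splitting the rate series at its midpoint.

```python
def per_capture_time(rate_hz):
    """A saturated capture rate implies this much total time
    (capture window + setup + gap) consumed per capture."""
    return 1.0 / rate_hz

def split_two_levels(rates):
    """Crude split of an oscillating rate series into its two levels:
    threshold halfway between min and max, then average each side."""
    thr = (min(rates) + max(rates)) / 2.0
    low = [r for r in rates if r < thr]
    high = [r for r in rates if r >= thr]
    return sum(low) / len(low), sum(high) / len(high)
```

For instance, `per_capture_time(360e3)` is about 2.78 µs, noticeably more than the <2 µs gap alone; if that reading is right (an assumption on our part), the remainder would be the capture window plus setup time.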
Any thoughts on these questions would be great!
*https://www.sciencedirect.com/science/a ... 3318302596
To answer your question: the dead time between captures in Rapid Block Mode/rapid trigger mode is heavily dependent on the time required to re-arm the trigger (which, for instance, changes quite dramatically with the sample rate). I created a method to determine the re-arm time for another customer who sent us a support request on our Help Desk, but didn't have time to formally turn it into a guide for the forum.
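Since the re-arm time varies with sample rate, one simple way to map it is to time a whole rapid-block run at each sample rate and subtract the known per-capture duration. A minimal sketch of that timing harness (note: `run_block` is a stand-in for the actual driver sequence that arms and captures the blocks, not a real PicoSDK call, and averaging like this hides any block-to-block oscillation):

```python
import time

def gap_per_capture(elapsed_s, n_captures, capture_s):
    """Average per-capture dead time, given the wall-clock duration of a
    whole rapid-block run and the known length of one capture."""
    return elapsed_s / n_captures - capture_s

def time_rapid_block(run_block, n_captures, capture_s):
    """Time a rapid-block run and back out the average gap.
    `run_block` is a placeholder for the driver call that arms,
    acquires n_captures blocks, and returns when done."""
    t0 = time.perf_counter()
    run_block(n_captures)
    elapsed = time.perf_counter() - t0
    return gap_per_capture(elapsed, n_captures, capture_s)
```

Repeating this over a sweep of sample rates would give the re-arm time as a function of sample rate, which is the dependence mentioned above.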
I'm responding to this quite late, so if this is still a problem for you, then I can finish that off and post it here.
Yes, this is still an open question. My student is still working on it; I think she wants to finalize it within a week, though.
I think you have my email address, as we are corresponding about my PicoPyGui app. So you could also send a not-so-polished version via email, if that accelerates things.