I am finding that the function ps2000_run_streaming_ns adds unwanted pre-trigger samples to the output stream. If I call the function with triggering disabled, and ask for a million samples, I receive exactly a million samples, which is fine. But if I enable simple triggering with the ps2000_set_trigger2 call, I receive a million + N samples, where N is variable, and if I examine a waveform manually I find that the trigger point would have been N samples into the received waveform.
Another example: suppose I enable triggering with ps2000_set_trigger2, setting the trigger level so high that it is above the peak waveform voltage, and also setting the auto_trigger delay parameter to 3 seconds. If I then call ps2000_run_streaming_ns, requesting one second's worth of data, what I receive is a 4-second-long data stream.
The delay variable (parameter 5 of 6) passed to ps2000_set_trigger2 seems to have no effect on this behaviour.
My test setup is a program written in C# running on a 64-bit Windows 10 PC, interfacing with the 64-bit ps2000.dll from your current SDK (driver 18.104.22.168), and communicating with a 2205A scope.