How to ensure no overrun in streaming mode with ps6000 ?

Post general discussions on using our drivers to write your own software here


Post by tfischer » Tue Apr 16, 2019 5:14 pm


I have just started using the SDK in order to stream continuous data from my 6403D scope.
However, I see no way to ensure that the samples are contiguous and that no data have been lost.

For the 2000 series, the documentation states that when a sample has the value "PS2000_LOST_DATA (-32768, 0x8000)" this "Indicates a buffer overrun in fast streaming mode". There is even a function dedicated to this: ps2000_overview_buffer_status.

But for the 6000 series (I have not checked the other series), there is no such buffer-status function, nor a LOST_DATA value for the samples.

Could someone indicate how to ensure a continuous stream without data loss on 6000-series scopes?



Re: How to ensure no overrun in streaming mode with ps6000 ?

Post by Gerry » Mon Apr 29, 2019 12:28 pm

Hi tfischer,

I've just noticed this post and, although I answered your query via a help desk ticket, I thought that the answer might benefit other customers, so I've written a detailed response below.

The overflow flag isn't implemented in the PS6000 driver, so there is no easy way to detect an overrun directly. However, what you can do is test whether your current code implementation causes overruns, by capturing a known waveform and checking it for a discontinuity (which will be the point where data were lost), and then change your code so that it no longer causes overruns.

To explain this in more detail: the hardware buffer in the PicoScope overruns through a combination of a streaming rate that is too fast for your code (as currently implemented) to keep up with, and the overview buffer not being emptied quickly enough, i.e. waiting too long between calls to getValues(). If the software overview buffer fills to capacity, the hardware sample buffer in the scope starts to fill temporarily, holding data until the overview buffer is emptied. If the overview buffer remains full, the hardware buffer eventually reaches the point where old data, not yet sent over USB, is overwritten. So, once you establish that your current implementation causes overruns, you can either reduce the streaming rate or restructure your code so that the time between calls to getValues() is reduced. You can then iterate between testing and adjusting to optimize your code so that no overruns occur.

So, to detect your overruns, create a triangle wave in the signal generator whose period is matched to your sample rate, so that it increments at (or very close to) one sample step per sample interval. This gives you a waveform that is easy to check against a predicted value, at the highest resolution (least noise), which reduces the risk of the test missing an overrun (the test fails to detect anything only where the overwritten old waveform coincides exactly with the new data point in the new waveform, and higher resolution means many more points at which a discontinuity will show up). Run the overrun test for long enough and it is guaranteed to eventually catch an overrun if one occurs (having a low risk of failure just means that you won't have to wait as long for results).

To select the correct triangle-wave frequency, use the following formula:
Linear-slope waveform frequency = 1 / (sample interval * (2 * scope resolution * (sig gen amplitude / input range) - 2))
e.g. for a PS6000 (8-bit, so 256 ADC codes), using a sample interval of 12.8 ns and an input range of ±2 V with a sig gen amplitude of 1 V, the optimum frequency is:
1 / (12.8*10^(-9) * (2*256*(1/2) - 2)) = 1 / (12.8*10^(-9) * 254) = 307.578 kHz. I have attached a psdata file and a screenshot of the waveform below:
PS6000D Linear slope Triangle waveform.png
PS6000D Linear slope Triangle waveform.psdata

With the correct triangle wave selected, you can then effectively set a mask around the waveform: use a variable that tracks the waveform sample values and check that they don't increment or decrement by more than a few codes (the guard band of your mask). You just need to detect the top and bottom of the triangle wave so that you can reverse the tracking direction of the mask.

So, once you have set up the waveform and the mask, run your code, briefly disconnect/reconnect the sig gen to check that the test actually trips, and then restart the test and let the code run long enough to see whether it breaks the mask on its own.

I've used this method in the past to successfully help a customer determine the maximum streaming rate for their code.


Technical Specialist
