I'm using the MATLAB drivers for a PicoScope 5444B. The issue I've run into is that the actual sampling rate changes after the device has been running in streaming mode for a period of time.
The code I'm using is this example (with a few logging statements added): https://github.com/picotech/picosdk-ps5 ... _Example.m
I'm using streaming mode, with the sample interval set to 100 ns in the code.
The actual output is the following:
The input signal on Channel A is a regular square wave. Streaming works fine for the first 25 seconds, and I do get 300 M samples as the input to "preTriggerSample" and "postTriggerSamples", but the sample rate seems to degrade somehow after 25 seconds.
I can imagine a few mechanisms that could cause this degradation, e.g. a buffer overflow, or MATLAB executing too slowly to keep up.
What I would like to know are:
1. Could you help explain what is happening? Why does the sample rate degrade after 25 s?
2. Is there anything I can do about it? Ideally, it would keep recording at a stable 100 ns sample interval until my PC runs out of memory.
3. If there is a hardware limitation, can I estimate the maximum streaming time, or the minimum sampling rate for indefinite streaming? And can I read the actual sampling rate back from the device?
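On the last question: the underlying C function ps5000aRunStreaming takes the requested sample interval by reference and writes back the interval the driver actually applies, so you can at least check it when the run starts. A minimal, untested sketch using calllib; it assumes the ps5000a shared library is already loaded, that `handle` is an open device handle, and the enum values shown are illustrative placeholders:

```matlab
% Assumes the ps5000a library is already loaded (the instrument driver
% normally does this) and 'handle' is a valid open device handle.
requestedIntervalNs = 100;
pInterval = libpointer('uint32Ptr', requestedIntervalNs);

% Enum values below are assumptions - take them from the SDK wrapper
% (e.g. ps5000aEnuminfo) rather than hard-coding them.
timeUnitsNs   = 2;  % PS5000A_NS (assumption)
ratioModeNone = 0;  % PS5000A_RATIO_MODE_NONE (assumption)

status = calllib('ps5000a', 'ps5000aRunStreaming', handle, pInterval, ...
    timeUnitsNs, 0, 300e6, 0, 1, ratioModeNone, 250000);

% The driver writes back the interval it will actually use.
actualIntervalNs = pInterval.Value;
fprintf('Requested %d ns, driver using %d ns\n', ...
    requestedIntervalNs, actualIntervalNs);
```

Note this only reports the interval the driver chose at start-up; it won't reveal data loss that happens later if the host can't keep up with the stream.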
I'm hitting exactly the same problem and wondering if someone could help. I'm trying to stream for ~4.5 s at a high sampling rate (125 MS/s) in MATLAB. After around 1.5 s, distorted signals appear in my input, such as missing sections, a changing interval, or even a lost pulse, as shown in the attached figure.
Chaobo, you could try increasing the buffer size in the code, which worked for me with a lower sample count at the start. My guess is that if the buffer is too small, data may be overwritten before it is copied into MATLAB. But I'm now using a buffer size of 6e8 to collect 6e8 samples, and I don't understand why it still behaves this way.
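For reference, the driver-side buffer that streaming data lands in is registered separately from your application array; if MATLAB doesn't empty it fast enough, the driver wraps around and overwrites it, which shows up as missing or distorted samples. A rough, untested sketch of the relevant call (function signature from the SDK C API; `handle`, the buffer size, and the enum values are assumptions):

```matlab
% ps5000aSetDataBuffer registers the driver-side buffer that streaming
% data is copied into; size it generously relative to your polling rate.
overviewBufferSize = 2^22;   % samples; chosen value is an assumption
driverBuffer = libpointer('int16Ptr', ...
    zeros(overviewBufferSize, 1, 'int16'));

channelA      = 0;  % PS5000A_CHANNEL_A (assumption)
ratioModeNone = 0;  % PS5000A_RATIO_MODE_NONE (assumption)

status = calllib('ps5000a', 'ps5000aSetDataBuffer', handle, channelA, ...
    driverBuffer, overviewBufferSize, 0, ratioModeNone);
```

One caveat: a larger buffer only absorbs short stalls on the host side. Sustained 125 MS/s streaming can still outrun the USB link and the MATLAB polling loop, so data can be lost no matter how large the buffer is.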