I use a PicoScope 6404D to record 1 million samples in block mode with a 0.8 ns sampling interval.
Now, the signals in my waveform have a design separation of 9.838 us, but with my Pico I measure a separation of 10.060 us, so I am off by 222 ns.
To explain this, I checked the data sheet and found that the so-called 'timebase accuracy' is ±2 ppm. As far as I understand this figure, I should be off by about 2 ns at the end of such a long waveform, not ~200 ns somewhere in the middle, right?
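For reference, what 2 ppm should contribute can be checked with simple arithmetic (a sketch in Python, using only the numbers quoted above):

```python
# Sanity check of the 2 ppm timebase accuracy figure against the numbers
# quoted in the thread (a sketch, not a measurement).
samples = 1_000_000
dt = 0.8e-9                    # nominal sample interval, seconds
capture = samples * dt         # total capture length: 800 us
ppm = 2e-6                     # timebase accuracy from the data sheet

drift_full = ppm * capture     # worst-case drift over the whole capture: ~1.6 ns
drift_turn = ppm * 9.838e-6    # drift over one 9.838 us interval: ~20 ps

print(f"over full capture: {drift_full * 1e9:.2f} ns")
print(f"over 9.838 us:     {drift_turn * 1e12:.1f} ps")
```

So a 2 ppm clock error cannot produce a 222 ns offset on a 9.838 us interval; it is orders of magnitude too small.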
Can you post a psdata file so that we can look at the data and the settings? If the data is in one waveform, please save just the CurrentWaveform to reduce the size.
Well, I'm not using the PicoScope software but the LabVIEW SDK. The data was taken during SuperKEKB commissioning a while ago, and I only noticed this while analyzing the data.
To give you an idea: we used a PicoScope 6404D to measure circulating particle bunches in the SuperKEKB collider. The design revolution time of those bunches is 9.838 us; we measured 10.060 us with the Picos. For smaller time differences, e.g. two subsequent particle bunches with a design spacing of ~100 ns, the measured time difference fits like a charm. It only happens with larger time differences...
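One way to quantify the mismatch: the ratio of measured to design interval implies a uniform scale error far above 2 ppm, and the same stretch applied to a ~100 ns spacing is small enough to go unnoticed, which would explain why short intervals look fine (a sketch using the figures above):

```python
# The measured/design ratio implies a uniform scale error, not clock drift
# (numbers from the posts above; just arithmetic).
design = 9.838e-6
measured = 10.060e-6

scale = measured / design              # ~1.0226: everything stretched by ~2.26 %
error_ppm = (scale - 1) * 1e6          # ~22600 ppm, four orders above the 2 ppm spec

# The same stretch applied to a ~100 ns bunch spacing:
offset_100ns = (scale - 1) * 100e-9    # ~2.3 ns, small enough to look like a good fit

print(f"scale error: {error_ppm:.0f} ppm")
print(f"100 ns spacing would be off by {offset_100ns * 1e9:.2f} ns")
```

A constant multiplicative stretch like this is the signature of a wrong assumed sample interval rather than a drifting oscillator.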
Can you draw a quick sketch showing what you would expect to receive in a 5MS packet, and where the timing points are, as I am still a bit unsure how you are calculating the times?
That's what I thought you were saying, and it doesn't make sense. If no changes are made to the settings of the PicoScope, and it is just the external test timings that have changed, I would expect it to read correctly.
Without seeing the actual code, it would be difficult to see where the error could be.
Was LabVIEW completely closed down between the runs? And are there any manual inputs before it can collect data?
It does not make sense, provided the error in the sampling rate really is 2 ppm and not more; that is exactly the question.
Showing you the entire code would not make sense. The Pico settings are set as in the LabVIEW examples that come with the SDK. The system is not turned off between consecutive data takings: after the settings are applied once, the system runs in a loop of data taking and data saving, and no new settings are applied in between.
Therefore, the question is: is the error in the sampling rate really no larger than the 2 ppm stated in the data sheet?
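One thing worth ruling out is a mismatch between the sample interval assumed in the analysis and the one the driver actually used. A hedged sketch of the nominal interval for a 6000-series timebase code, as I read the formula in the ps6000 Programmer's Guide (best verified against the value ps6000GetTimebase2 actually reports for your settings):

```python
# Nominal sample interval for a PicoScope 6000-series timebase code, per the
# formula in the ps6000 Programmer's Guide (an assumption here -- verify
# against what ps6000GetTimebase2 returns for your capture).
def ps6000_interval_ns(timebase: int) -> float:
    if timebase < 5:
        return (2 ** timebase) / 5e9 * 1e9   # 5 GS/s base clock: 0.2, 0.4, 0.8, ... ns
    return (timebase - 4) / 156_250_000 * 1e9

interval = ps6000_interval_ns(2)  # ~0.8 ns, matching the capture described above
```

If the analysis multiplies sample indices by a hard-coded interval that differs from the driver's, every long time difference is stretched by the same factor while short ones stay plausibly close.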
UPDATE:
Problem solved. I had been given wrong information: the measured time difference is correct! Thanks for your help!