The published timebase accuracy of the PICO 3206B USB oscilloscope is ±50 ppm. I am struggling to understand what this actually means. My current understanding is: if the sample interval of the instrument is set to 4 ns (250 MS/s), that 4 ns interval may have an error of ±50 ppm (±80 fs). Over the whole capture (64 MSamples, 64M × 4 ns = 0.256 s) this would give rise to a possible error of ±5.12 µs.
Have I understood this timebase accuracy figure correctly?
Your initial maths was wrong but the principle was correct.
±50 ppm or, to make it easier, ±0.005% would mean ±200 fs (0.2 ps) on an individual sample interval of 4 ns, or ±12.8 µs over 64M samples (0.256 s).
Edit: and so was mine
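Since both posts tripped over the powers of ten, here is a quick sanity check of the ±50 ppm arithmetic in Python, using the figures from the thread (4 ns sample interval, 64 MSamples):

```python
PPM = 50e-6          # ±50 ppm timebase accuracy, as a fraction
interval = 4e-9      # 4 ns sample interval (250 MS/s)
n_samples = 64e6     # 64 MSamples in the capture

capture = interval * n_samples   # total capture length in seconds
err_per_sample = interval * PPM  # worst-case error on one interval
err_capture = capture * PPM      # worst-case error over the capture

print(f"capture length:    {capture} s")          # 0.256 s
print(f"error per sample:  {err_per_sample} s")   # 2e-13 s = 200 fs
print(f"error per capture: {err_capture} s")      # 1.28e-05 s = 12.8 us
```

Note that the per-capture error is just the per-sample error scaled by the sample count: a ppm figure is a fixed fraction, so it applies equally to one interval or to the whole record.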