I have been doing development with a PicoScope 3204B for a while now (writing LabVIEW software for it). Up until recently I had only used timebase settings of 1, 2, 3 and 4.
According to the manual these should give timebases of 0.004, 0.008, 0.016 and 0.032 microseconds per sample, with each subsequent timebase doubling the last. All was well until I took some readings on timebase 5, where the waveforms didn't seem to arrive at the right time.
Further investigation showed the sample interval was not the expected 0.064 microseconds, but approximately 0.048 microseconds per point (approximate because all I could do was measure against a known signal).
Since then I have checked the longer timebases and come up with the following (microseconds per point):
Timebase 6: 0.065 us
Timebase 7: 0.082 us
Timebase 8: 0.1 us
Timebase 9: 0.118 us
Timebase 10: 0.14 us
(all are a bit approximate based on some measurements)
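To make the mismatch concrete, here is a minimal Python sketch (Python just for the arithmetic; my actual code is LabVIEW) comparing the doubling rule as I read it from the manual against the intervals I measured. The `expected_us` helper and the doubling formula are my own reading of the manual, not a confirmed driver formula, and the measured figures are my rough numbers from above:

```python
def expected_us(timebase):
    """Sample interval (us) if each timebase doubles the previous one,
    starting from 0.004 us at timebase 1 - my assumption from the manual."""
    return 0.004 * 2 ** (timebase - 1)

# My rough measured intervals (us per point) from the list above.
measured_us = {5: 0.048, 6: 0.065, 7: 0.082, 8: 0.1, 9: 0.118, 10: 0.14}

for tb in sorted(measured_us):
    print(f"timebase {tb}: expected {expected_us(tb):.3f} us, "
          f"measured {measured_us[tb]} us")

# Differences between consecutive measured values: roughly constant
# (about 0.017-0.022 us per step) rather than doubling.
vals = [measured_us[tb] for tb in sorted(measured_us)]
steps = [round(b - a, 3) for a, b in zip(vals, vals[1:])]
print("measured step sizes:", steps)
```

Interestingly, judging from my own numbers the measured intervals look closer to a linear progression than to the doubling the manual describes, which is part of why I can't make sense of it.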
This is confusing, as the values don't seem to follow any recognisable pattern. Is my PicoScope / software working correctly? More importantly, will these timebases be consistent across all PicoScopes? The software I am writing will need to give the same results on another PicoScope of the same make and model. Has anyone else had similar problems?