Hi there, I hope you can help.
I am using a PicoLog 1012 USB device connected to a desktop PC running Windows XP. I am feeding in an oscillating DC voltage, so I need to run at a fast sampling interval of 5 ms.
At this speed in single streaming mode the data looks good with nice resolution, but I have worked out that the time recorded in milliseconds is not quite correct. If it is set for 1000 samples at a 5 ms interval it should run for 5 seconds, which is what the x-y chart shows. Unfortunately it is losing time, and the run has actually taken a little over 5 seconds. Is there a fix for this, or do I have to introduce a correction factor afterwards? To make sure it was not the PC at fault, I reinstalled the software on my laptop today and it does exactly the same thing. Many thanks - dave
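P.S. In case it helps to show what I mean by a correction factor, this is the sort of post-processing I had in mind (a rough Python sketch only; the measured duration and the sample data here are made-up examples, not real readings or anything from the PicoLog software):

```python
# Rough sketch: rescale the nominal timestamps so the record spans the
# true elapsed time. All values below are assumed examples.

nominal_interval_ms = 5.0
num_samples = 1000
nominal_duration_ms = nominal_interval_ms * num_samples  # 5000 ms expected

# Actual elapsed time for the run, e.g. timed externally (assumed value).
measured_duration_ms = 5120.0
scale = measured_duration_ms / nominal_duration_ms

# samples = [(nominal_time_ms, voltage), ...] as exported from the logger;
# dummy data here just to make the sketch runnable.
samples = [(i * nominal_interval_ms, 0.0) for i in range(num_samples)]

# Stretch each nominal timestamp by the correction factor.
corrected = [(t * scale, v) for t, v in samples]

print(corrected[-1][0])  # last timestamp now matches the measured duration
```

Obviously I would rather the logger recorded the right times in the first place, but if that is not possible this is the workaround I would apply.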