Series 3000 internal oscillator jitter

Post any questions you may have about our current range of oscilloscopes
Post Reply
Posts: 0
Joined: Wed Feb 05, 2020 1:30 pm

Series 3000 internal oscillator jitter

Post by jereq »

Dear all,

I've developed an electronic frequency measurement device and I would like to quantify its precision.
I'd like to use the PicoScope (Series 3000) waveform generator as a reference signal to do it.

Reading some other posts on this forum, I understood that there is a limitation related to the update rate of the DAC. In my tests I'm using a rectangular signal, and I choose a waveform frequency that divides the 20 MS/s generator update rate without remainder, e.g. 40 kHz, to avoid problems with edge uncertainty.

I consulted the datasheet to understand how accurate the generator itself is; however, I did not find figures such as jitter, long-term jitter, phase noise, or the precision of the clock that drives the waveform output. Could somebody from the technical team provide these numbers, if available? Is it the same clock that is used for sampling the inputs as well? (I would assume it is: measuring the mean frequency of the PicoScope generator using its own input, I get virtually 0 Hz variation, so they seem to be in perfect sync.)


Posts: 650
Joined: Mon Aug 11, 2014 11:14 am

Re: Series 3000 internal oscillator jitter

Post by Gerry »

Hi jereq,

I'm assuming that by 'I'd like to quantify its precision' you mean 'frequency accuracy', because, as the developer of the device, you are not likely to know the accuracy to which the measurement can be made without testing it against a known reference. However, you should already know the precision (number of decimal places) to which the device can report a measurement, even if the measurement is not very accurate. That precision should be a function of the effective resolution if the design is digital, or of the smallest error-free increment of the full-scale voltage range if the design is analog.

So, assuming that you mean frequency accuracy, I will also assume that you don't have a high-accuracy frequency reference available (as you're asking about the accuracy of the PicoScope 3000 Sig Gen). Therefore, just to set the right expectation level, you should know that our 3000 Series PicoScopes are not designed for measuring high frequency accuracy (you can do that much better with a 6000 Series PicoScope, which has an external input for a high-precision master sample clock). That said, if you don't need high accuracy, and you can accept the clock errors associated with the PicoScope clock driving both the input channel acquisition (ADC) and the signal generation (DAC), then as long as the combined errors of the PicoScope clock result in frequency errors at least 3 times smaller than the errors of the clock/timing driving your frequency counter, you will be able to get a meaningful measurement of the frequency accuracy (i.e. to within the accuracy of the PicoScope 3000 clock).

Incidentally, the other post that you are referring to concerns our entry-level PicoScopes (the PicoScope 2204A and 2205A), which involve a different mechanism that causes a ghosting effect on the edges of a sampled square-wave signal. That mechanism is not relevant to your PicoScope 3404D, but it makes the entry-level PicoScopes incapable of any kind of accuracy when used for frequency measurement.

Regarding the signal generator for the PicoScope 3404D: as alluded to earlier, the same clock source is used to drive both the ADC in the input channel and the Sig Gen. However, the jitter of the captured waveform will also depend upon the jitter of the clock divider used to derive the timebase, and the jitter of the Sig Gen will also depend upon the jitter of its own clock divider (which depends upon the way it is implemented in the hardware logic, i.e. which resources are used and exactly what division is being done). Unfortunately, we don't quote jitter specs for the signal generator.

So, if you want meaningful frequency measurements you have 2 options:

1/ (a) If you are happy with a measurement that will be limited by the frequency accuracy and jitter of the PicoScope 3000 clock that drives both the input channel ADC and the Sig Gen DAC, then, to quantify the uncertainty of the measurement, you should characterize the jitter of the measurement setup that you are going to use with the input channel and signal generator. The most straightforward way to get approximate values is to create 2 Math channels (from Advanced->Buffered functions, create min[A] and max[A]), set your trigger to 'Repeat', set your preferences to capture as many waveforms as possible (under 'Waveform buffer', set 'Maximum Waveforms' to 10,000), and then perform captures until the waveform count stops. The 2 Math channel waveforms will give you the limits of the jitter for each edge of the captured waveform. To find the limits of the jitter for all edges, export the currently displayed minimum and maximum Math channels as a CSV file by going to 'Save As...', selecting 'CSV' in the 'Save as type:' drop-down list, and then selecting 'Save current waveform only' on the left-hand side. If you then import this into Excel, you can create a formula or macro to find the edges, then take the overall worst-case minimum of the minimum Math channel and the overall worst-case maximum of the maximum Math channel to give you your final aggregated limits of +ve and -ve jitter for your captures.
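If you'd rather script the edge-finding step than use an Excel macro, it can be sketched in Python. This is only a sketch under assumptions not in the post: the exported envelopes are treated as plain lists of times and voltages, the edge finder is a simple linear-interpolated threshold crossing, and the synthetic arrays below merely stand in for the real CSV columns.

```python
def rising_edge_times(t, v, threshold):
    """Return linearly interpolated times where v crosses threshold upward."""
    edges = []
    for i in range(1, len(v)):
        if v[i - 1] < threshold <= v[i]:
            frac = (threshold - v[i - 1]) / (v[i] - v[i - 1])
            edges.append(t[i - 1] + frac * (t[i] - t[i - 1]))
    return edges

# Synthetic stand-in for the exported Math-channel envelopes
# (1 ns sample interval; the max[A] edge falls at 10.5 ns, the min[A] edge at 12.5 ns).
t = [i * 1e-9 for i in range(20)]
max_env = [0.0] * 11 + [1.0] * 9   # envelope with the earliest edge
min_env = [0.0] * 13 + [1.0] * 7   # envelope with the latest edge

t_early = rising_edge_times(t, max_env, 0.5)
t_late = rising_edge_times(t, min_env, 0.5)

# Worst-case peak-to-peak jitter bound for the first edge.
jitter_pp = t_late[0] - t_early[0]
print(jitter_pp)  # about 2e-9 s
```

The idea is that, for a rising edge, the max[A] envelope crosses the threshold earliest and the min[A] envelope latest, so their difference bounds the peak-to-peak jitter of that edge.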

(b) You should then also add the inaccuracy of the clock: consult the specifications in the data sheet, scroll down to 'Oscilloscope - horizontal', and use the value of 'Timebase accuracy' for your specific model of PicoScope to calculate the accuracy error as (number of picoseconds in one sample interval for your chosen timebase / 1,000,000) * timebase accuracy value (in ppm). Then add that value to your worst-case +ve jitter and subtract it from your worst-case -ve jitter to give you the total margin of your uncertainty.
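As a worked example of that accuracy-error formula (the ±50 ppm figure and the 12.5 ns sample interval are illustrative assumptions only; substitute the 'Timebase accuracy' value from the data sheet for your model and your own chosen timebase):

```python
# Illustrative numbers only: check the data sheet for your model.
timebase_accuracy_ppm = 50.0   # assumed +/-50 ppm timebase accuracy
sample_interval_ps = 12_500.0  # e.g. an 80 MS/s timebase -> 12.5 ns per sample

# Accuracy error per sample interval, in picoseconds, following the
# formula above: (ps per interval / 1,000,000) * ppm value.
accuracy_error_ps = (sample_interval_ps / 1_000_000) * timebase_accuracy_ppm
print(accuracy_error_ps)  # 0.625 ps
```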
(Note that, strictly, you would combine the accuracy error with the jitter error in quadrature to get your final value of total error due to uncertainty, as follows:
Total Error = √([accuracy error]^2 + [jitter error]^2)
which would reduce the error slightly compared with simple addition; but that is, to some extent, compensated by the fact that we haven't allowed for how the number of captures affects the confidence interval, which would increase the error.)
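The quadrature combination can be checked numerically; the two error values here are made up purely for illustration:

```python
import math

# Hypothetical error contributions, in picoseconds (illustration only).
jitter_error_ps = 120.0
accuracy_error_ps = 50.0

# Root-sum-square combination of independent error terms.
total_error_ps = math.sqrt(accuracy_error_ps**2 + jitter_error_ps**2)
print(total_error_ps)  # 130.0 ps, versus 170.0 ps for simple addition
```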

2/ If you want to mitigate the uncertainty of the clock accuracy and jitter, to get a higher-accuracy measurement, then you would need to use a high-accuracy external clock source with a PicoScope that can accept one (such as the 6000 Series mentioned above).


Technical Specialist

Post Reply