This way of finding the closest frequency and setting it works fine all the way up to 46566.175308 Hz, but above that the resulting frequency is rounded down by one signalGeneratorPrecision. I tested this with a deep-measurement recording, averaging over 9310 cycles.

So my questions: is the way signalGeneratorPrecision is calculated correct, and how does the scope driver round the frequency set by SetSigGenBuiltIn? This could be explained more clearly in the programming manual!
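For reference, here is a minimal Python sketch of how I compute signalGeneratorPrecision and snap to the nearest settable frequency. The 200 MHz DAC update rate and 32-bit phase accumulator are my assumptions about the DDS, not values taken from the manual, so check them against your scope's programmer's guide:

```python
# Minimal sketch of snapping a request to the DDS frequency grid.
# Assumptions (mine, not from the manual): a 200 MHz DAC update rate and a
# 32-bit DDS phase accumulator, so one frequency step is 200e6 / 2**32 Hz.
DAC_FREQUENCY_HZ = 200e6
PHASE_ACCUMULATOR_SIZE = 2 ** 32

signal_generator_precision = DAC_FREQUENCY_HZ / PHASE_ACCUMULATOR_SIZE

def nearest_settable_frequency(requested_hz: float) -> float:
    """Snap the request to the nearest integer multiple of the precision."""
    steps = round(requested_hz / signal_generator_precision)
    return steps * signal_generator_precision

print(signal_generator_precision)        # about 0.0465661 Hz
print(nearest_settable_frequency(50e3))  # closest representable value to 50 kHz
```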

How does this compare to setting the frequencies in PicoScope 6?
I assume you are using a sine wave, as this would rule out rise-time limits of the AWG.

I would try setting up the AWG with one cycle of a sine wave in the buffer and calling SetSigGenArbitrary().
Calculate the deltaPhase value using the formula on page 100 of https://www.picotech.com/download/manua ... -guide.pdf
Use the maximum size of the AWG buffer for both awgBufferSize and arbitraryWaveformSize so they drop out of the formula.
You could also compare your value of deltaPhase with the one returned by SigGenFrequencyToPhase().
Use a double data type for the frequency in your formula.
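That calculation can be sketched like this in Python. I am assuming a 200 MHz DAC clock, a 32-bit phase accumulator, and that one cycle of sine fills the maximum buffer so the awgBufferSize / arbitraryWaveformSize terms cancel out of the manual's formula; verify all of these against the programmer's guide for your scope:

```python
# Sketch of the deltaPhase calculation, assuming one cycle of sine filling
# the maximum AWG buffer so the buffer-size terms cancel. The 200 MHz DAC
# clock and 32-bit phase accumulator are assumptions, not from this thread.
DAC_FREQUENCY_HZ = 200e6
PHASE_ACCUMULATOR_SIZE = 2 ** 32

def delta_phase_for(frequency_hz: float) -> int:
    # Keep everything in double precision until the final integer truncation.
    return int(frequency_hz * PHASE_ACCUMULATOR_SIZE / DAC_FREQUENCY_HZ)

def frequency_for(delta_phase: int) -> float:
    # Invert the calculation to see which frequency the DDS actually produces.
    return delta_phase * DAC_FREQUENCY_HZ / PHASE_ACCUMULATOR_SIZE

dp = delta_phase_for(50e3)
print(dp, frequency_for(dp))  # round-trip to the actual output frequency
```

Comparing frequency_for(dp) against the value you get back from SigGenFrequencyToPhase() would show whether the driver truncates or rounds the step count.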

You could also try doing this in PicoScope 6 and compare frequencies in 'Sine wave' and 'Arbitrary' mode (set the arbitrary buffer to one cycle of a sine wave as well).
I think it might be related to the data-type limits of startFrequency and stopFrequency. These are both 'float', which is limited to about 7 significant digits.
You are using a double (15-17 digits), so there may also be an issue passing a double into a float, and with the rounding function, depending on the data types of requestedFreq and signalGeneratorPrecision.
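A quick way to see the size of that single-precision effect is to round-trip the request through a 32-bit float, which mirrors what a C float assignment would do (a Python sketch; the specific test frequency is just the threshold value quoted above):

```python
# Check the float-vs-double hypothesis: a C 'float' carries about 7
# significant digits, so a double passed into it is rounded to the nearest
# single-precision value. Near 46 kHz adjacent floats are 2**-8 Hz apart.
import struct

def to_float32(x: float) -> float:
    """Round a Python double to the nearest IEEE-754 single, like a C cast."""
    return struct.unpack('f', struct.pack('f', x))[0]

requested = 46566.175308
as_float = to_float32(requested)
print(requested - as_float)  # single-precision rounding error in Hz
```

The error is only a few millihertz, which is smaller than one signalGeneratorPrecision step on its own, but it could still push a subsequent round() or truncation across a step boundary.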

Hey again Andrew,
Yes, it is a pure sine function!
Today I did a test using the AWG to set a specific frequency.
I did the test in PicoScope 6, and it looks like there is no rounding down above ~46 kHz, so that's good!

So I did a deep-measurement recording of the frequency and calculated the average frequency.
I did that for 10 kHz and 50 kHz, comparing the same frequency set with the AWG and with the sine function:
For 10 kHz, dF = 0.001728027 Hz (I assume this is related to timebase accuracy; another measurement may show a different value).
For 50 kHz, dF = 0.047721352 Hz (this value matches the signalGeneratorPrecision of the AWG pretty closely).

I thought the function generator was just the AWG in 14-bit mode with some pre-coded functions?

But could the step down above ~46 kHz have something to do with the API range limits (max = 20e6, min = 0.03) and some truncation from that?
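To illustrate what I mean, here is a sketch of that truncation hypothesis in Python. The clamp limits are the API range values quoted above, but the idea that the driver truncates rather than rounds the step count is pure speculation on my part:

```python
# Speculative model: the driver clamps the request into [0.03, 20e6] and
# then *truncates* the DDS step count instead of rounding it, which would
# leave the output up to one full precision step low. The 200e6 / 2**32
# step size is my assumption about the DDS, not a documented value.
import math

PRECISION_HZ = 200e6 / 2 ** 32
FREQ_MIN_HZ, FREQ_MAX_HZ = 0.03, 20e6

def set_frequency_truncating(requested_hz: float) -> float:
    clamped = min(max(requested_hz, FREQ_MIN_HZ), FREQ_MAX_HZ)
    return math.floor(clamped / PRECISION_HZ) * PRECISION_HZ  # truncate, not round

f = 50e3
print(f - set_frequency_truncating(f))  # error, up to one full step
```

This model alone would round down at every frequency rather than only above ~46 kHz, so it cannot be the whole story, but it shows the size of error a truncation would produce.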

I did a few more experiments here with my DC509 frequency counter. This time I set it to an averaging mode that gives me 8 digits of precision and let it warm up for an hour to make sure its clock was stable. There does appear to be some shift going on around 46 kHz: see the attached plots. The absolute error is just the measured frequency versus the DDS-predicted frequency to 1 mHz precision. The relative errors are well within the combined spec of the PicoScope and the DC509 counter, but I would not expect a sudden shift like that unless there were some unexpected rounding going on in the driver/scope.

As I mentioned over here (viewtopic.php?p=144939#p144939), I did trace the double-precision value going into SetSigGenBuiltInV2 and it appears as I expect (no rounding issues).