The Spectrum display seems to be wrong if the trigger delay is set to -50%. I seem to get all the frequencies doubled. If I set the trigger point to 0%, it looks OK. Is this normal?
What scope are you using and which revision of software? I have tried this and cannot see a problem. It would be helpful if I can try the exact same setup as you are using to see if the software is at fault.
Hi Sarah,
Attached are two PSD files, and two image files showing the problem I see. One shows a noisy 10 kHz square wave, the other shows a very clean 1 kHz square wave. Notice the spectrum displays. If you leave the display refreshing long enough, occasionally it will show the correct spectrum for a single trace, then switch back to the one I captured. It is almost always showing the display I captured.
-David
I have tried this out using a 2105, the latest software and the file you supplied (as well as my own setup), and all the results I get are normal.
What version of the software are you using? If it is not 5.15.6 then I recommend downloading this from our website.
Some more data points:
If I set the trigger to -25%, the display shows a 1 kHz signal with a peak at 4 kHz (4 times the correct value). If I set the trigger to -10%, the spectrum shows the peak at 10 kHz (10 times the correct value). At -5% it shows up at 20 kHz, at -1% at 100 kHz, etc. This pattern seems to hold for all values in between, until the trigger is at zero (at zero the spectrum is OK). In all cases, the display flashes between being correct and being wrong, with the wrong display showing about 70% to 80% of the time.
Given the apparent resolution of the spectrum display, my guess is that the data is actually correct, but the display routine is stretching it out and only showing the first 10% of the real data that was calculated (in the case of the -10% trigger). Also, in all these tests, the scope display looks normal.
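For what it's worth, every number I've reported fits a simple reciprocal rule: with the trigger delay at -p (as a fraction), the peak shows up at the true frequency divided by p, which is what you'd expect if only the first p of the FFT data were being stretched across the whole axis. A quick Python sketch (purely illustrative, not the scope's actual code) checking my observations against that hypothesis:

```python
# Hypothesis: with trigger delay at -p (fraction of the capture), the
# display stretches the first p of the computed spectrum across the full
# frequency axis, so a true peak at f appears at f / p.

def displayed_peak_khz(true_peak_khz, trigger_fraction):
    """Predicted (wrong) peak position for a negative trigger delay.

    trigger_fraction: magnitude of the trigger delay, e.g. 0.10 for -10%.
    """
    return true_peak_khz / trigger_fraction

# Observations for the 1 kHz square wave: {trigger magnitude: reported peak in kHz}
observations = {0.50: 2.0, 0.25: 4.0, 0.10: 10.0, 0.05: 20.0, 0.01: 100.0}

for p, reported in observations.items():
    predicted = displayed_peak_khz(1.0, p)
    print(f"-{p:.0%} trigger: predicted {predicted:g} kHz, reported {reported:g} kHz")
```

Every case matches, including the original -50% report where the frequencies looked doubled.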
I have included a snapshot and psd file for the -10% case.
Thanks,
-David
I am running 5.15.6. I have tried it on two systems, with the same results both times. When I do Help-->About, I get the following: