Activating the lowpass filter in persistence mode seems to make the waveform disappear whenever the cutoff frequency is set above a very low level. I am using a 3406B to watch a data stream, roughly a 1 MHz square wave, at 200 ns/div. When I switch to the persistence scope, the first thing I notice is that the sample rate drops from 1 GS/s to 500 MS/s and the lowpass filter setting gets deselected. I then go into the Channel A setup to enable filtering again. With the cutoff at 2 MHz, the display shows no waveform. With the cutoff at 200 kHz or lower I do get a waveform, but that is a really low cutoff. Anything above 200 kHz (I want a 2 MHz cutoff) gives no display at all. All of these settings work fine in normal scope mode.
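To sanity-check that a 2 MHz cutoff should not blank a 1 MHz square wave, here is a quick sketch (my own check, not how PicoScope implements its filter): a single-pole lowpass at 2 MHz applied to a 1 MHz square wave sampled at 500 MS/s. The fundamental is well below the cutoff, so the output should be a rounded square wave at nearly full amplitude, not a flat line.

```python
import numpy as np

fs = 500e6          # sample rate after switching to persistence mode
f_sig = 1e6         # square-wave frequency
f_c = 2e6           # requested filter cutoff

t = np.arange(0, 10e-6, 1 / fs)               # 10 us of samples
x = np.sign(np.sin(2 * np.pi * f_sig * t))    # ideal 1 MHz square wave

# Single-pole IIR lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1]),
# with the coefficient chosen for a -3 dB point at f_c.
a = 1 - np.exp(-2 * np.pi * f_c / fs)
y = np.empty_like(x)
y[0] = x[0]
for n in range(1, len(x)):
    y[n] = y[n - 1] + a * (x[n] - y[n - 1])

# The 1 MHz fundamental passes almost unattenuated, so the filtered
# waveform still swings close to +/-1.
print(y.max(), y.min())
```

Since even this crude filter leaves the waveform clearly visible, the blank display at cutoffs above 200 kHz looks like a software issue rather than expected attenuation.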
Also, I see that the "20 MHz Bandwidth Limit" setting carries over from normal scope mode to Persistence mode, but none of the other settings in the channel setup carry over. For example, if I set a DC offset (right next to the 20 MHz Bandwidth Limit setting, in the same box) and then switch from normal to Persistence mode, the DC offset reverts. Some settings appear to be stored separately per mode, while others are shared. This makes Persistence mode a bit time consuming to use, since I have to keep switching back and forth to check the channel settings and copy them all over: I use normal mode to get the scope settings right, then switch over to Persistence mode. Is this normal, or is there a way to configure it so both modes use the same settings?