While developing a programmatic solution around a PicoScope 5244D, we had trouble achieving consistent results. Our procedure looks roughly like this:
1. Open the scope through the synchronous OpenUnit call
2. Set the scope resolution
3. For each waveform configuration we were interested in:
3.1 Configure the channel
3.2 Find the timebase
3.3 Configure the trigger
3.4 Capture a number of waveforms through several RunBlock calls. (We later changed this to a single rapid block mode call; the problem occurred in both cases.)
3.5 Stop the scope through a Stop API call
4. Close the pico
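For concreteness, the steps above correspond roughly to the ps5000a API sequence sketched below. This is only an illustration of our flow, not our actual code: the channel, range, trigger threshold, and sample counts are placeholder values, error handling is mostly omitted, and it requires the Pico SDK to build.

```c
/* Hedged sketch of one "run" against the ps5000a C API; values are illustrative. */
#include <stdio.h>
#include "ps5000aApi.h"   /* Pico Technology SDK header */

int main(void)
{
    int16_t handle;
    PICO_STATUS status;

    /* 1-2. Open the scope synchronously, requesting 15-bit resolution (example). */
    status = ps5000aOpenUnit(&handle, NULL, PS5000A_DR_15BIT);
    if (status != PICO_OK) return 1;

    /* 3.1 Configure channel A: DC coupling, 5 V range (placeholder values). */
    status = ps5000aSetChannel(handle, PS5000A_CHANNEL_A, 1,
                               PS5000A_DC, PS5000A_5V, 0.0f);

    /* 3.2 Check that a candidate timebase supports the desired sample count. */
    float timeIntervalNs;
    int32_t maxSamples;
    uint32_t timebase = 8; /* in practice this is searched/validated */
    status = ps5000aGetTimebase2(handle, timebase, 1000,
                                 &timeIntervalNs, &maxSamples, 0);

    /* 3.3 Simple rising-edge trigger on channel A (placeholder threshold). */
    status = ps5000aSetSimpleTrigger(handle, 1, PS5000A_CHANNEL_A,
                                     1024, PS5000A_RISING, 0, 1000);

    /* 3.4 Capture one block (repeated per waveform in our original code). */
    int32_t timeIndisposedMs;
    status = ps5000aRunBlock(handle, 100, 900, timebase,
                             &timeIndisposedMs, 0, NULL, NULL);
    /* ... poll ps5000aIsReady() and read the data back here ... */

    /* 3.5 and 4. Stop and close. */
    ps5000aStop(handle);
    ps5000aCloseUnit(handle);
    return 0;
}
```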
The sequence above is what we will call a "run" in this scenario.
Now, for some of these runs, in a semi-random fashion, all the captured waveforms would appear distorted in exactly the same way: their fall time would be longer (double or more) and their overall duration would be greater than on "good" runs. This was true for every captured waveform of a given run. On the following run, the problem would (sometimes) be gone, only to reappear randomly on later runs.
We eventually discovered that the problem seemed to be caused by the pico automatically setting the bandwidth filter to 20 MHz instead of leaving it at None (or "Full"; I'm not 100% sure about the terminology, as the PicoScope 6 software seems to call this filter setting "None" while the API documentation calls it "PS5000A_BW_FULL"). Manually setting the bandwidth filter to 20 MHz through the API reproduced the "distorted" waveforms exactly as they sometimes randomly appeared in our results, while systematically setting it to None (PS5000A_BW_FULL) seems to resolve the issue. Note that, while experiencing the problem randomly, we never called the ps5000aSetBandwidthFilter function; we had to call it explicitly to solve the problem.
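For reference, the explicit call that resolved it for us looks like the fragment below. The channel choice is illustrative (we apply it to every enabled channel, right after each ps5000aSetChannel call); `handle` is the value returned by ps5000aOpenUnit.

```c
/* Force the bandwidth limiter off ("full" bandwidth) on channel A. */
PICO_STATUS status = ps5000aSetBandwidthFilter(handle,
                                               PS5000A_CHANNEL_A,
                                               PS5000A_BW_FULL);
/* Passing PS5000A_BW_20MHZ here instead reproduces the distorted waveforms. */
```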
Now, the ps5000a API documentation says very little about this setting. It specifies neither its default upon opening the device nor whether any other setting could cause the pico to change it automatically. The word "filter" appears only three times in the API documentation, and the word "limiter" appears almost as rarely. My questions are as follows:
1. What is the default behavior of the pico with respect to the bandwidth filter setting? Is it supposed to be None (PS5000A_BW_FULL) by default, or is it undefined?
2. Are there other settings or configuration calls, such as the channel configuration for instance, that may set this parameter to something other than None (PS5000A_BW_FULL) as a side effect?
If our evaluation of the problem is correct, we must say this behavior surprised us.
Discussion forum for the new Picoscope Linux software