I've been using the useful Sample Rate Calculator, recommended by Autonerdz -- fantastic web site too!
Sometimes, though, I cannot reach the "Samples on Screen" count the calculator predicts.
Could anyone with one of these models please let me know what "Samples on Screen" figure they can get with a single channel at, say, 5 ms/div (i.e. 50 ms of screen time)? This is "Number of Samples" in PicoScope's terminology.
The predicted count is 500,000, but I only see half of that. It's almost as though only half the buffer is available at timebases above 1 ms/div.
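For reference, here is the arithmetic I'm assuming the calculator is doing (a sketch, not its actual code; the 10 MS/s rate is my inference from the 500,000-sample prediction over a 50 ms screen):

```python
def samples_on_screen(sample_rate_hz, ms_per_div, divs=10):
    """Expected 'Samples on Screen': sample rate x total screen time,
    assuming a 10-division screen (my assumption)."""
    screen_time_s = ms_per_div * divs / 1000.0
    return int(sample_rate_hz * screen_time_s)

# At 10 MS/s and 5 ms/div (50 ms screen) this matches the predicted count:
print(samples_on_screen(10_000_000, 5))  # 500000

# What I actually observe is closer to half, as if the effective rate
# (or available buffer) were halved:
print(samples_on_screen(5_000_000, 5))   # 250000
```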
EDIT: I also notice CPU usage jumping to 100%, which likewise seems to happen routinely above 1 ms/div. It happens occasionally at other timebases, but always at 1 ms/div and 2 ms/div.