I'm looking for a means of getting hardcopy from a scope (something analogue scopes don't provide easily!), and a software scope such as PicoScope would give me this. The circuits I work with are not particularly demanding bandwidth-wise (analogue music synthesizers), but they do tend to use a lot of CD4000-series chips, and one of the things I occasionally need to know is whether the rise/fall times of clock signals etc. are fast enough for the chips. Basically this means they must be no longer than a few microseconds, so I would need to be able to determine easily whether any given rise/fall time was within or outside that limit.

Thus I downloaded the demo and had a look-see. While looking at the edges of the 'dynamic' sample squarewave at various settings, I was a little dismayed to discover that it had rise/fall times of several microseconds, but was being displayed as several large steps of about a microsecond apiece, which didn't match my expectations at all. I then saved the data as text and had a look at it - here is an abridged version (dots are where I have deleted data):
So two big jumps: from -390 to -146, then to 190. Sampling speed is clearly not an issue, and apparently neither is resolution. So my questions are:
1. What is the reason for this?
2. What scale/bandwidth/resolution was the data recorded at?
3. For my intended use, 8-bit resolution and 50 MS/s seems like the minimum requirement - would you agree?
4. I'm running Windows XP Home. The demo crashed on me about 5 times in 2 hours (which didn't impress me at all, though it had been suggested to me beforehand that the software was known not to be 100% bug-free) - do you have a regular schedule of software updates to rectify such problems?
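For what it's worth, the kind of check I'm after on the exported text data is just a 10%-90% rise-time estimate. A minimal sketch of what I mean (the sample values below are illustrative, not the actual export, and I'm assuming the data comes out as time/value pairs):

```python
def rise_time_10_90(samples):
    """Estimate the 10%-90% rise time from (time_us, value) pairs
    covering a single rising edge, sorted by time."""
    lo = samples[0][1]            # settled low level at the start
    hi = samples[-1][1]           # settled high level at the end
    thresh10 = lo + 0.1 * (hi - lo)
    thresh90 = lo + 0.9 * (hi - lo)
    t10 = t90 = None
    for t, v in samples:
        if t10 is None and v >= thresh10:
            t10 = t               # first sample at/above the 10% level
        if t90 is None and v >= thresh90:
            t90 = t               # first sample at/above the 90% level
    return t90 - t10

# Illustrative edge, sampled every 0.1 us, stepping -390 -> -146 -> 190
edge = [(0.0, -390), (0.1, -390), (0.2, -146), (0.3, 190), (0.4, 190)]
print(rise_time_10_90(edge))      # roughly 0.1 us for this made-up edge
```

With data stepping that coarsely, though, the estimate is only as good as the sample spacing - which is really the point of my question 1.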