I'm using fast streaming mode with a PicoScope 3224 on a Windows XP system.
I am not using triggering or autostop, since the application needs to run continuously and analyze the data stream in near real time. Aggregation is set to 2, on the reasoning that a min and a max are returned anyway, so I might as well use both of them.
With an overview buffer size of 300,000 and a max_samples value of 64,000 (or 24,000; it did not seem to matter), everything is dandy with a sample_interval of 2300 nanoseconds. My code runs in a loop, sleeping 150 milliseconds between each call to fetch more data. It runs indefinitely (at least 10 minutes just now) with no problems.
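For reference, here is a simplified sketch of the setup and polling loop, written against the ps3000 C API; the channel ranges, coupling, and missing error handling are illustrative placeholders rather than my actual code:

```c
/* Simplified sketch of the streaming setup and polling loop described
 * above, against the ps3000 C API (ps3000.h / ps3000.dll). Ranges,
 * coupling, and error handling are placeholders, not the real code. */
#include <windows.h>
#include "ps3000.h"

/* Fast-streaming callback: with aggregation, the driver hands back both
 * max and min buffers per channel (order per the programmer's guide:
 * ch A max, ch A min, ch B max, ch B min). */
static void __stdcall get_overview_buffers(short **overviewBuffers,
                                           short overflow,
                                           unsigned long triggeredAt,
                                           short triggered,
                                           short auto_stop,
                                           unsigned long nValues)
{
    unsigned long i;
    (void)overflow; (void)triggeredAt; (void)triggered; (void)auto_stop;
    for (i = 0; i < nValues; i++) {
        short a_max = overviewBuffers[0][i];
        short a_min = overviewBuffers[1][i];
        short b_max = overviewBuffers[2][i];
        short b_min = overviewBuffers[3][i];
        /* ... analyze in near real time ... */
    }
}

int main(void)
{
    short handle = ps3000_open_unit();

    ps3000_set_channel(handle, PS3000_CHANNEL_A, 1 /* enabled */,
                       1 /* DC */, PS3000_5V);
    ps3000_set_channel(handle, PS3000_CHANNEL_B, 1, 1, PS3000_5V);

    /* 2300 ns interval, no auto-stop, 2 samples per aggregate,
     * max_samples of 64,000, 300,000-sample overview buffer. */
    ps3000_run_streaming_ns(handle,
                            2300, PS3000_NS,   /* sample_interval */
                            64000,             /* max_samples */
                            0,                 /* auto_stop off */
                            2,                 /* aggregation */
                            300000);           /* overview buffer size */

    for (;;) {
        /* Fetch whatever has accumulated since the last call, then sleep. */
        ps3000_get_streaming_last_values(handle, get_overview_buffers);
        Sleep(150);
    }

    /* Not reached in this sketch. */
    ps3000_stop(handle);
    ps3000_close_unit(handle);
    return 0;
}
```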
However, at a sample_interval of 2050 nanoseconds, it runs for about 10 seconds, at which point both the min and max values for both channels come back as -32768.
I do not see this value documented anywhere. I am guessing it means something, and not a good thing.
Can you tell me what it means?