Hello, I have a C++ application that interfaces with the PicoScope API. Whenever it runs in fast streaming mode (on a PicoScope 2203, with the latest drivers and headers from the SDK), one CPU core is always pegged at 100% (I have a dual-core machine).
My code polls the driver for new data, processes events if there are any, and otherwise puts the calling thread to sleep for 20 ms.
I don't understand why the PicoScope driver thread is always at 100% utilization when I am only asking it for events every 20 ms.
Even if I delay for a long time between requests (say, 1 second), CPU usage does not change.