And one more question about fast streaming mode.
The ps3000_run_streaming_ns function sets sample_interval instead of timebase. How precise is that value? If I set it to 1 microsecond, can I be sure that the time interval between samples will be 1 microsecond to within 1%?

The advantage in Question 1 is that an overview of the waveform can be seen as it happens. With Question 2 you will still need to clear the overview buffer, unless the overview buffer size and the aggregation are both set high enough that the overview buffer can never fill up. If the overview buffer does fill up, samples will be lost and collection from the device will stall.

The accuracy of the set interval is 50 ns per channel, except when three channels are active, in which case it is 200 ns (the same as when four channels are active).

When you first start up your application and initialize the buffers, they remain available until your program ends, at which point the memory they occupy is released.

As your program runs and data is stored to the memory buffers, you will need to transfer the data points to your application before the buffer is completely full. If you do not, the data points that have not yet been transferred will be lost. You can transfer them using a couple of different methods, such as ps3000_get_streaming_last_values or ps3000_get_streaming_values_no_aggregation. These calls do not clear/delete the buffer; they copy the values to your application before they are overwritten.

Thank you for your response. No, I am afraid that it does not answer my need, which is to stream data for long periods.

I am using ps3000_get_streaming_last_values driven by a timer that is much shorter than the time needed to fill the buffer I assign when invoking ps3000_run_streaming_ns. Once the number of values reaches the size of the buffer (say the default 30 000, or up to 150 000), there is a loss of data plus corruption for about 24 cycles. The data stream then resumes.

The fact that it occurs at exactly the size of the buffer is a clear indication that this is a driver or low-level issue rather than a problem with my code.

I asked about resetting because I was looking at the forum and saw a response from "Ziko" which said "With Question 2 you will still need to clear the overview buffer".

If I were certain that the loss of data and the transients lasted exactly 24 samples, I could write code to work around the problem. However, I have no way of being sure that is the case. Having said that, it seems very consistent over the range of tests that I have run.