I'm having some issues setting up a pulse width qualifier on my trigger!

Let me explain the problem:

I have a 100 Hz rectangular 0 to 2 V signal with a 50% duty cycle, and the trigger fires as follows: when I set my pulse width limit to around 1,250,000 samples of 8 ns each (timebase n = 1), it triggers on 10 ms periods (5 ms at 0 V and 5 ms at 2 V).

But I don't understand why! The pulse only lasts 5 ms, so why isn't the trigger firing around a 625,000-sample limit?

It's important to me because the signals I want to study don't have a 50% duty cycle! When I set the duty cycle of my signal to 20% (1 period = 2 ms at 2 V and 8 ms at 0 V), the trigger behaves exactly the same: it triggers with 1,250,000 samples of 8 ns, even though the pulse duration is very different...

I think there's something I don't understand here, so can anyone help me, please?

But what I want is to set the pulse width for the 2 V pulse duration only, not the entire period of the signal. For example, when I set my pulse width range to 600,000-650,000 samples, the 50% duty cycle signal should trigger, but if I change the duty cycle to 20%, it should no longer trigger. How can I do that?

It would be helpful to know how many samples you need to capture, at what sampling rate, and with how many channels.

The autoStop parameter is set in the call to ps4000RunStreaming().

When collecting data in streaming mode, the trigger can only be activated once. If you set autoStop to false, the device will continue to collect data. However, if you need to wait for another trigger event, you will need to stop data collection and start again.

So I need to trigger every 10 ms (with approx. 10% variability), and my sampling rate can be around 1 to 10 µs/sample. I tried calling the StreamingDataHandler in a while loop, but I get only around 80/1000 triggers in a 10-second acquisition... So I understand it won't be possible in streaming mode, is that right?

But when I collect my triggers for 10 seconds, it seems like only 600/1000 pulses fire a trigger...

If I set my frequency to 10 Hz (instead of 100 Hz), the trigger fires for 100/100 pulses. I think I'm limited by the time it takes to call the BlockDataHandler. What can I do to reduce the time between two triggers?

There is an example in the C console example, and you will need to segment the memory. Rapid block data collection reduces the time between successive captures (it can be down to a few microseconds at the fastest timebases) and then transfers the data to the PC via the ps4000GetValuesBulk() function once data collection is complete.

So this is working now: my trigger is functioning and there are no missed pulses.

But I have a problem: when I use the PC clock to measure my period time, it's not very accurate... there is an average 20 µs difference from reality. As you can see in the screenshots, my application reports a 9977 µs average period, with periods between 9900 and 10000 µs. But when I observe my signal with PicoScope 6, I never see a period under 9999 µs or over 10001 µs.

How can I increase the accuracy of my system? I really need it because I have to report statistics on a signal, and I can't allow those statistics to be so inaccurate.

When you call the ps4000RunBlock() function, there will be a delay of tens of milliseconds while the device is set up for data collection, as well as any subsequent time spent waiting for a trigger, the time for the data collection to complete, and for the device to indicate that it is ready.

The current PicoScope 3000D and 3000D MSO models support trigger timestamping, but they are only 8-bit resolution devices, compared with the 12-bit resolution of the PicoScope 4000 Series devices.

printf("Trigger %d Period Time : %05ld us Total Time : %ld us Average Period = %li us\n",
       j, timeMicroSecondDifference, totalTime, moyenne);
    }
    j++;
}

"there will be a delay of tens of milliseconds while the device is set up for data collection"

Did you mean tens of microseconds? Because I don't see such a delay in my calculations...

One workaround might be to trigger the signal generator output every time the input channel is triggered and measure the time between the output signals. There might be a few milliseconds' delay, though.

Please tell me you meant microseconds as well? If not, I will have to tell my tutor the equipment he gave me cannot be used for our purpose...

ps4000GetTriggerTimeOffset() does not provide the time between two adjacent triggers; it gives the jitter between the true trigger point, which may lie between two ADC samples, and the ADC sample that is tagged as the trigger. The values will often be 0 or very small, and will always be smaller than the actual sample interval being used. The purpose of the function is to adjust the drawing of the trace in PicoScope 6 so that the trace does not jump about.

It is not possible to measure the time difference between adjacent triggers with the 4000 Series devices.

Can you explain what you are trying to measure? Is it the pulse width of the trigger event?

I am trying (and succeeding) to precisely measure time periods, and I want my program to fire triggers when periods are out of a selected range. But I can't afford to miss one if two periods in a row are out of range, so I need continuous acquisition and timestamping (for 3- to 30-day tests).

I eventually settled on streaming mode and implemented a manual trigger in my program, which doesn't need the PicoScope's hardware trigger. For those who would like to see my solution, you can find it in my next topic (which highlights a sampling interval issue).