Detecting One Digital Signal Cycle

Hi, I am very new to the C language, having only used C++ when I was at university. I am now using C at work and am faced with several difficulties:

I am trying to write a program that detects the level of a digital signal (1 or 0) and then gets the duration from the first 0 of a cycle to the first 0 of the next cycle, i.e. the duration of one cycle. For example, the data obtained for one cycle of a digital signal is 0 0 0 1 1 1 0. I need to get the duration from the first 0 to the last 0 to obtain the total duration of the cycle.

How do I write this in C? The key point here would be to get the duration for one cycle using level detection.

You would write it in C largely the same way you would write it in C++, though without resorting to OOP. How would you write this in C++? How would that differ from C?

It sounds like this is an embedded programming problem, so console I/O (i.e., the stdin and stdout that in C++ you learned to access with cin and cout) would not apply.

Read the schematic to see which pin of the microprocessor the signal comes in on. Is it a peripheral I/O port (PIO)? Is it a serial input port? Is it an interrupt input? What kind of pin the signal is hooked up to will determine how you detect it. We do not have the device's schematic before us, nor could you post it, as it is undoubtedly proprietary. I would expect the signal to come in on an interrupt input configured for edge detection, but that decision will already have been made by the hardware designer.

Your system should have a timer interrupt that you're using to keep track of time. So detect the level transition (AKA "edge") and store the start time. When you detect the level transition that marks the end of the cycle, store the stop time and take the difference between the start and stop times. If the signal is coming in on an interrupt line, and hence is used to trigger an interrupt, then the edge detection and start/stop time storage would go in the interrupt service routine (ISR), along with the setting of a flag to indicate the capture of a cycle time. Then, in other non-interrupt code that runs periodically, you can check that flag and perform whatever calculations and reporting you need to.

Again, all that can be done in either C or C++. There's very little that I can think of that you would be doing differently in C than you would have in C++.

I have several issues with this question. First, the fact that you seem to think knowledge of C++ rather than C is any sort of barrier: if you don't know how to do it in C, then you don't know how to do it in C++ either, so you cannot really blame the language.

Second, the fact that this is presented as a real work requirement rather than an academic assignment: it seems far too trivial for that, and in a work environment you would normally have more experienced colleagues to turn to rather than a forum.

Third, you assert that this is a problem of "level detection" when it is in fact one of "edge detection": you need to know when the signal changes from 1 to 0.

Anyhow, the answer to your question will depend entirely on the hardware and target platform you are coding for. It may also depend on the frequency range of the signal, the mark-space ratio, and the accuracy required. A generic C/standard library solution might be possible, but may only be practical for suitably slow signals with a low precision requirement depending on the target and what other tasks must be carried out while timing.

Let us say, for example, that the target is a microcontroller. In that case the best solution would be to connect the signal to an on-chip timer/counter capture input, so that each edge latches the count; this directly measures the length of the last cycle with zero software overhead. Most timer/counter peripherals will generate an interrupt when the count is latched, so your code can get on with other work independently.

If it is not possible to connect the signal to a timer/counter, then the second best option would be to connect the signal to an external interrupt pin, and set it to generate an interrupt on a falling edge of the signal. The interrupt handler can then read some internal clock source (an on-chip timer) to determine the period since the previous edge. This still has a low software overhead, but is likely to be less accurate and have greater jitter if the system has to handle other interrupts of a higher priority.

The worst solution is to continuously poll the input, testing for a transition from 1 to 0, logging the time of each such occurrence, and subtracting the previously logged time from it. This can be very accurate given a suitable clock source and if that is all the system has to do, but it will be less deterministic and/or precise if the system has to do other work while monitoring the signal.

On the other hand, if a solution is required on a general purpose computer running Windows or Linux for example, then since these are not real-time operating systems, precision and resolution will be limited. This may not matter for slow signals and low precision requirements - so you need to specify those if you want a sensible answer.