Eventually this sketch will be for turning solenoid valves on and off, but blinking LEDs is a good stepping stone. It's for this project: turning air on and off to some pneumatic robots.

So, I'd like to use a visual interface in Processing to blink three LEDs on and off without using delay(). The point is to use a slider interface to change the period of time each LED spends on, with a constant off time (~100 ms). It's essentially a super-slow PWM.

I'm mashing this up with the first code example here. The Processing sketch I'm using for the interface is here. The only thing I really need the interface for is handing a value between 0 and 255 to the incomingByte[i] array. Below is the code I've come up with so far:

The results are pretty erratic, with only one LED at a time doing the blink routine. When you drag the dots around the interface, one LED will start blinking, turning off whichever one was blinking before. Right now the on time and the off time are both equal to incomingByte[i]. I'd like to get the off time set by a variable so I can play with it, and adjust the on time with the interface.

Not sure why you insist this should be done without delay(). One approach would be to keep track of how long it will take until the next LED transition, and just delay() until that time. Keep three separate timeToNextBlink counters, delay() for the minimum of the three, and update all three accordingly when the delay() returns.
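That bookkeeping can be sketched in plain C++. The array layout and function name here are my own, not from the thread, and the actual delay() call is stubbed out since this runs off the Arduino:

```cpp
#include <algorithm>
#include <cassert>

// Sketch of "delay until the earliest of three transitions".
// Each LED has a remaining time (ms) until its next transition; we
// "sleep" for the minimum of the three, then subtract that amount
// from every counter. Any counter that reaches 0 is due for a
// transition, and the caller reloads it with that LED's period.
const int NLEDS = 3;

// Returns the amount of time slept. On the Arduino, delay(d) would
// go where the comment is.
unsigned long delayUntilNextTransition(unsigned long timeToNext[NLEDS]) {
    unsigned long d = *std::min_element(timeToNext, timeToNext + NLEDS);
    // delay(d);  // <-- real sleep on the Arduino
    for (int i = 0; i < NLEDS; i++)
        timeToNext[i] -= d;
    return d;
}
```

For example, counters of {300, 500, 200} sleep for 200 ms and come back as {100, 300, 0}, with LED 2 due to toggle.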

A more elegant option is to figure out how to set up three separate timers, and run all the blinking off interrupts.

I'm not entirely certain, but doesn't delay(1000) cause everything on the chip to keep doing what it's doing until 1000 ms have elapsed? Since I'd like to vary the three LEDs' blinking independently of one another, wouldn't the delay() function act like a master clock, causing all three LEDs to stay in their current state until the time has elapsed?

It looks like you are only changing the state of the LEDs if you have received 3 bytes of data on the serial port. If I understand you correctly, you want the LEDs to keep blinking independently of whether serial data is currently being transmitted, so you may wish to try moving that part of the code out of the "if (Serial.available() >= 3)" block.
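To make that concrete: each LED can carry its own timestamped state, so the blinking continues no matter what the serial port is doing. This is a plain-C++ sketch with the clock passed in; on the Arduino, `now` would come from millis() and the toggle would be followed by a digitalWrite(). The `Blinker` struct is my own naming, not code from the original sketch:

```cpp
#include <cassert>

// One non-blocking blinker per LED: each tracks its own last-toggle
// time, so none of them waits on delay() or on serial traffic.
struct Blinker {
    unsigned long onTime;      // ms to stay on (set from the slider byte)
    unsigned long offTime;     // fixed ~100 ms off-time
    unsigned long lastToggle;  // millis() value at the last state change
    bool isOn;

    // Call on every pass through loop(); returns true if the LED
    // state just changed (i.e. you should digitalWrite the new state).
    bool update(unsigned long now) {
        unsigned long interval = isOn ? onTime : offTime;
        if (now - lastToggle >= interval) {
            isOn = !isOn;
            lastToggle = now;
            return true;
        }
        return false;
    }
};
```

loop() would then call update() on all three Blinkers every pass, and check_serial() separately updates each onTime as slider bytes arrive.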

void loop() {
  check_serial();

  // There is a latent bug here; every 50 days, there will be a short cycle.
  // This probably doesn't matter, but here's where you'd fix it if it did.
  unsigned long cur_time = millis();
  unsigned long cycle_pos = cur_time - last_cycle_start;
  if (cycle_pos > cycle_length) {  // this is where the bug is.
    last_cycle_start = cur_time;
    for (int i = 0; i < pincount; i++)
      if (pin_time[i] != 0)
        digitalWrite(pin[i], HIGH);
  } else {
    // cycle_length can't be that long; in fact, it can't be longer than an int.
    int cv = cycle_pos * 256 / cycle_length;
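On the "every 50 days" comment: millis() wraps a 32-bit counter roughly every 49.7 days. The standard way to keep an elapsed-time check correct across that wrap is to rely on unsigned subtraction, which yields the true elapsed interval modulo 2^32. The helper below is my own sketch of that idiom, not code from the thread:

```cpp
#include <cstdint>
#include <cassert>

// Rollover-safe elapsed-time test. Even when `now` has wrapped past
// zero while `last` has not, the unsigned subtraction now - last
// still gives the real number of milliseconds elapsed (mod 2^32),
// so no spuriously short cycle occurs at the wrap.
bool intervalElapsed(uint32_t now, uint32_t last, uint32_t interval) {
    return now - last >= interval;
}
```

So writing the cycle test as a subtraction-and-compare, rather than comparing raw timestamps, is one way to close that particular hole.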

void check_serial() {
  // Protocol: lines containing command byte, ascii register number, '=',
  // new value, then newline.
  //
  // Commands are:
  //   - c: cycle time, in ms. No register number applies.
  //   - p: set pin duty cycle. Register number is pin number (1-n).
  //        Value is from 0-256, where 256 is fully on.
  //
  // Response is either '!' followed by an error message, or '+OK',
  // optionally followed by a response. No commands currently have a response.
  //
  // Examples:
  //   c=1000   // set cycle time to 1s
  //   p1=0     // turn off pin 1
  //   p2=256   // turn on pin 2
  //   p3=128   // set pin 3 to half on
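A minimal parser for that line protocol could look like the following. This is plain C++ using std::string in place of the Arduino serial buffer, and the `Command` struct and function name are my own, offered only as a sketch of the format described in the comments above:

```cpp
#include <string>
#include <cstdlib>
#include <cassert>

// Parsed form of one protocol line, e.g. "p3=128" or "c=1000".
struct Command {
    char cmd;    // command byte: 'c' or 'p'
    int reg;     // pin number for 'p'; -1 when no register applies
    long value;  // new value after '='
    bool ok;     // false if the line didn't parse
};

Command parseLine(const std::string& line) {
    Command c{0, -1, 0, false};
    size_t eq = line.find('=');
    if (line.empty() || eq == std::string::npos || eq < 1)
        return c;  // no command byte or no '=' at all
    c.cmd = line[0];
    if (eq > 1)  // ascii digits sit between the command byte and '='
        c.reg = std::atoi(line.substr(1, eq - 1).c_str());
    c.value = std::atol(line.substr(eq + 1).c_str());
    c.ok = true;
    return c;
}
```

For example, "p3=128" parses to cmd 'p', register 3, value 128, while "c=1000" leaves the register at -1 since no register number applies.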

TQ - I can't thank you enough for hacking on this. After hooking up some test rigs with the bot, it looks like it's exactly what I need. I'm now trying to come up with the smallest possible Processing program to pass commands to the Arduino board. Here's what I've assumed the sequence for sending commands looks like, but I'm missing something crucial along the way.