OK so I have two SICK NT6 greyscale sensors, where one is the digital trigger that tells the Arduino to read the voltage from the other - all working well except for AD wobble.

Being pretty new to just about everything 'electronics', I am experimenting with this and that. I have a 12V power supply measuring 11.94V (give or take) under load, with the load being both NT6's, the Arduino and my electronics, amounting to:

- 'Dividing' the voltage from the digital sensor from 11.94V to <5V
- Restricting the analogue current with a resistor (voltage range is 0..<3V before any tinkering)
- Using a 4V7 zener diode to cut the 11.94V supply down, as an attempt to smooth the reference in conjunction with the AREF pin (also uses a 4K7 resistor in series to allow software reference switching safely)

My wobble problem manifests as readings varying between, say, 56 and 58 (out of 1023) for a typical analogue reading where the 'test' colour is physically static, and I am wondering about Earth/Ground/GND implications.

The wiring that 'works' (with wobble) has the plain divider (not the zener whatsit) connected to the power supply Blue wire (European hardware if relevant) and is in parallel with the GND on the UNO R3 'Power' section. The bottom of the zener divider goes to the GND which is next to the AREF pin. In addition the analogue is pulled down using around 9K Ohm - to the GND in the Uno's 'Power' section.

N.B. The zener setup (AREF) doesn't seem to have done anything to change the wobble either way (compared with setting the ref to DEFAULT) and I haven't bothered dividing by anything other than 1023 yet and of course that will only make the wobble appear worse.

I read a technique about making the processor go to sleep when taking AD readings but the C was unfamiliar (I need more practice) and don't yet understand how to use capacitors. Incidentally I came across how to program the system to automatically calibrate to the AREF. If anyone has an idea about what is wrong (and right!) I would be very grateful.

Not sure what you mean by "wobble" (or SICK NT6 for that matter, hint), but +/- 2 counts on an ADC reading is about right.

"Pete, it's a fool looks for logic in the chambers of the human heart." Ulysses Everett McGill.Do not send technical questions via personal messaging - they will be ignored.I speak for myself, not Arduino.

Yup... SICK is the manufacturer and the NT6 is an optical sensor that views colours in greyscale, rather like a black and white photo. Wobble is the inconsistent readings I'm getting, which are in the region of ±1 (56..58 out of the 0..1023 10-bit range). Actually it is not known whether this is ±1, +2/-0 or +0/-2, and so whether additional wander is possible at the wave of a hand (not a metaphor) I can't say. Neither can I say whether the mean value can change easily, or what can (or perhaps what can't) make it change.

I'd really like to constrain this further because I would feel like I'm cheating to 'program' around it. Perhaps an experiment with a battery will at least give me experience in this matter and I have to say, the idea of noise originating from processor activity makes me think this analogue business might be an art (based on science obviously!).



AWOL, is this mantra always applicable when the inconsistency is this small, or should I try to improve the design - given that I really don't have much idea about earth loops and the like? As this is my first experiment with anything electronic really, I'm all ears. As you say, this result is not bad, so I guess I should begin to focus on what the implications are. I'll take a look at what constitutes a shortfall in performance and what doesn't - given the right method of interpreting the result(s).

Just for your interest, there is really only one shot when it comes to reading the voltage from the sensor - this is because the colour is moving beneath it, and any second read should be labelled as 'second read' and compared with an 'ideal second read' value - at least this is my current thinking.

PeterH I hear you. However, given that I have an issue (or non-issue) at 10 bits, without a reduction in the mV wobble it will only get worse at higher resolution - unless the greater resolution somehow shows up a frequency pattern, won't it?

Cheers

PeterH


If the signal you're measuring is noisy, then increasing the ADC resolution will just give you a more precise measurement of the noise. Averaging is the way to filter that noise out.

I suspect, in this case though, that you've simply reached the limits of the ADC, and the noise you're seeing is quantisation noise and inaccuracies within the ADC rather than a variation in the signal being measured.

If you use a higher resolution ADC it will still suffer from 'wobble' as you term it in the least significant bits but the magnitude of the wobble will be smaller. For example, if you have a 10-bit ADC you may see noise in the tenth bit caused by ADC inaccuracies. If you had a 20-bit ADC you would not expect to see noise in the ten most significant bits, although you may see noise way down the scale towards the twentieth bit.

Oh right. I was struggling a bit with what seemed to be a bad run of luck, in that I felt the quantisation rounding was being 'invoked' far too often - i.e. surely I couldn't be that close to a 0.5 threshold that often.

Given that you think the system is largely working healthily in its current guise, would you expect the only approach (when using a 20-bit ADC) to be simple mean averaging, or do you think a median or mode function might be the route - in other words, might there be a pattern in the data? A while ago I looked into ways to average because I thought it was at least worth considering; I tried to find a library mode function in C# (rather than rolling my own) to no avail, hence an [additional] lingering reluctance to average - at least in C# (I'm using a PC to plot a graph in real time).

Thanks a lot PeterH

PeterH

We seem to be presupposing that the noise matters. Does it actually matter if the chart (or whatever) shows a variation of 0.1%, or whatever? If you're trying to work at that resolution then you're pushing the boundary of the accuracy of a 10 bit ADC.

If you want to get the maximum possible resolution from that ADC, then since you're sending the data to a PC I guess your sampling frequency will be limited by the serial link rather than by the ADC speed. In that case you could simply take several readings per sample and calculate the average of the readings (i.e. sum and divide by the number of readings). It still won't be absolutely steady at all times. If you want to smooth everything out then you can average the readings over time (to produce a decaying average, or whatever other algorithm you want) but by eliminating the higher frequency variation you are losing data and this may not be what you want. It comes back to what you want the data for and what sort of accuracy, resolution, frequency and filtering you need to apply to achieve that.