A paddle wheel flow sensor is installed into a 1" pipe. It takes 11 seconds to fill a tank to the 5 litre mark. The totalizer shows 560 pulses when scale = 1. What divide ratio should be used to display a flow rate in litres/second?
if you know the steps, show me

I'm reading the question as saying that the totalizer has counted 560 pulses and is showing a scale reading of 1 at the point that the 5 L has been reached. In that case, it doesn't matter what the scale is reading; the totalizer sees 560 pulses for every 5 L of fluid.

The problem is that if the sensor just reads out totalized volume, then converting that to a flow rate isn't a matter of simply dividing it by the right scaling number since that does not incorporate the notion of time.

Could you clarify just what the sensor is reading out? Is there just a single output, or are there two (for instance, totalized volume and something different that is the "scale" output)?


Well, you have count per volume and volume per time. All you need to do is divide the counts at the end of the transfer by a constant that renders the flow rate (volume per time), which you already know. So,

count/? = volume/time.

And you've calibrated a raw count to a flow rate. If that's not what is intended by the question, then I don't know what else it's asking.
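Read literally, that step can be sketched in a few lines. The values are the ones from the thread; the constant name `k` is mine, and note that dividing the final count by `k` reproduces only this one transfer's rate:

```python
# Calibration step as described above: solve count / k = volume / time for k,
# i.e. k = count * time / volume.
count = 560      # pulses seen by the totalizer over the transfer
volume = 5.0     # litres transferred
time = 11.0      # seconds taken

k = count * time / volume   # pulse-seconds per litre
rate = count / k            # litres per second -- for this one transfer only
print(k, rate)              # 1232.0, 0.4545...
```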

Well you have count per volume and volume per time. All you need to do is divide the counts at the end of the transfer by a constant that renders the flow rate (volume per time ) which you already know. So,

count/? = volume/time.

And you've calibrated a raw count to a flow rate. If that's not what is intended by the question, then I don't know what else it's asking.


But you haven't calibrated a raw count to a flow rate. If the transfer of 5 L takes 22 s tomorrow, you are still going to have a raw count of 560, and if you use the calibration constant you calculated today, your displayed flow rate will be twice as big as it should be. Also, if it's truly a totalizer, then it will be starting out at 560, and after transferring 5 L tomorrow it will be at 1120, and now your indicated flow rate is four times too big.

Now, if you want to assume that the totalizer is reset to zero at the beginning of each transfer, that's reasonable, but it is also an assumption that should be stated. And you still need to get the time measurement folded in there somehow, and since the problem is asking for the system to display a flow rate, it would seem that it has to be something other than someone timing it on their watch and doing the math each time.

No, if it takes 22 s to transfer 5 L tomorrow, you still use 11 sec as your base, and the system remains calibrated. After 11 seconds, the totalizer will read 560/2. What you're suggesting would be like charging a capacitor for 11 seconds one day, then charging it for 22 seconds the next day, and concluding the capacitor doubled in size.
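The fixed-window scheme being argued for here can be sketched as follows. This is one reading of the post, not a definitive design, and the constant uses the corrected arithmetic (560 × 11 / 5 = 1232 pulse·s per litre):

```python
# Fixed 11 s sampling window: count pulses for exactly 11 seconds,
# then divide by the constant from the original calibration run.
K = 560 * 11 / 5   # 1232 pulse-seconds per litre

def rate_l_per_s(pulses_in_11s_window):
    """Flow rate, valid when pulses are always counted over an 11 s window."""
    return pulses_in_11s_window / K

print(rate_l_per_s(560))  # today's run: 5/11 = 0.4545... L/s
print(rate_l_per_s(280))  # half the flow tomorrow: 0.2272... L/s
```

Under this reading the calibration does hold for any flow, but only because the counting window is pinned to 11 s.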

We obtained 5 L in 11 sec with the totalizer reading 560. So, according to your recommendation, we get our calibration factor using:

count/? = volume/time.

And you've calibrated a raw count to a flow rate.


? = (count*time)/volume = (560 cnt)*(11 sec)/(5 L) = 1232 cnt*sec/L

So you are saying that we make a circuit that divides the output of the counter by 1232 and we have our flow rate. The problem isn't to simply calculate the flow rate for that one use, but to calibrate it so that it "displays" the flow rate.

So three days from now I fill a barrel, and the totalizer displays 5600 when I am done. The display reads 4.55 L/s. Is that correct? When I was halfway done, the display was reading 2.27 L/s. Was it really flowing only half as fast?

Or are you suggesting that we need to post a sign next to the display telling the user to only make note of the displayed value 11 seconds after they start pumping?
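The objection can be made concrete: divide a running total by a fixed constant and the reading grows with elapsed time even at perfectly constant flow. The pulses-per-litre figure below (112 = 560/5) and the steady flow rate are assumptions taken from the thread's own numbers:

```python
K = 1232.0              # pulse-seconds per litre, from the one calibration run
PULSES_PER_LITRE = 112  # 560 pulses / 5 L
flow = 5.0 / 11.0       # actual, constant flow in L/s

# 'Displayed' rate = running total / K at several elapsed times.
for t in (5.5, 11.0, 22.0):
    total_pulses = flow * PULSES_PER_LITRE * t
    print(t, total_pulses / K)   # matches 'flow' only at t = 11 s
```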

My answer does not change. Your questions are irrelevant to the original question. Use the timebase and the calibration is valid. I won't answer questions outside the realm of the sampling I've proposed. Sample for 11 seconds and the system is calibrated. It's just that simple.

An odometer is installed in a car. It takes 11 seconds to travel 550 feet. The odometer shows 275. What divide ratio should be used to display a speed in feet per second?
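Reading the analogy the same way as the flow problem (pulse frequency divided by pulses per unit of distance gives the rate), the numbers work out to:

```python
# Odometer analogy: 275 pulses over 550 ft in 11 s.
pulses = 275
feet = 550.0
seconds = 11.0

pulses_per_foot = pulses / feet        # 0.5 -- the divide ratio
pulse_freq = pulses / seconds          # 25 pulses per second
speed = pulse_freq / pulses_per_foot   # 50 ft/s, and it stays correct
print(pulses_per_foot, speed)          # 0.5, 50.0
```

Dividing the pulse *frequency* (not the running total) by the pulses-per-foot ratio yields a speed that is valid at any moment, which is the point the analogy is driving at.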

I'm still not sure what the "scale = 1" is, but I'm beginning to think it is simply a scale setting on the totalizer, so this is akin to saying that a scope shows 2 divisions when the scale is 1 V/div. Perhaps the totalizer was nominally intended to produce 100 pulses per litre on a scale of 1, and perhaps 1000 pulses per litre on a scale of 10. That's just a guess.

So the resulting speedometer can only be used 11 seconds into a journey. How many systems have you ever encountered in which the calibration was only even close under the exact conditions that the calibration was performed under? Which makes more sense: that this 11 seconds just happened to be how long it took to fill a five-litre container, which just happens to be what they had on hand, or that they wanted to calibrate this thing to know the flow rate exactly 11 seconds into a transfer and to have totally bogus readings both before and after that magical moment in time?

I guess what I am contending is that it is unreasonable to interpret the question in the latter way and you are contending that that is the only possible way to interpret it based on what was given. I guess we'll just have to agree to disagree.

Actually, there are a lot of systems that are precise under the conditions that the calibration was performed. In a sampling scenario, you can only use what you have: time, quantity, scaling factors... One such example is a DMM that uses a VCO and takes a sample in a specific period and calculates the voltage from the quantity of that sample. I worked on such an instrument about 30 years ago.

BTW, I did not contend that that was the only possible way to interpret the question. I only suggested that the question referred to basic sampling theory. It might be something else, but I believe I have it correct.

Now, if your question is more one of whether there are other sample periods for which the calibration is valid, then my answer would be that it depends on whether the system scales linearly. If so, one could use a different sample period and apply an appropriate scaling factor. Otherwise, one would require an array of coefficients and select the correct one depending on the sampled value. The more non-linear the system, the more the calibration becomes an approximation, assuming a constant number of coefficients.
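One way to realize the coefficient-array idea is piecewise-linear interpolation of the calibration factor over sampled counts. The breakpoints and factors below are invented purely for illustration; only the 112 pulses/litre point comes from the thread's numbers:

```python
import bisect

# Hypothetical calibration table: factor varies with the sampled count
# to model a non-linear sensor response.
counts  = [100, 300, 560, 900]   # sampled pulse counts (made up)
factors = [118, 115, 112, 110]   # pulses per litre at each breakpoint (made up)

def pulses_per_litre(count):
    """Piecewise-linear interpolation of the calibration factor."""
    if count <= counts[0]:
        return factors[0]
    if count >= counts[-1]:
        return factors[-1]
    i = bisect.bisect_right(counts, count)
    x0, x1 = counts[i - 1], counts[i]
    y0, y1 = factors[i - 1], factors[i]
    return y0 + (y1 - y0) * (count - x0) / (x1 - x0)

print(pulses_per_litre(560))  # 112 at a calibration point
```

With more breakpoints the approximation tightens, which matches the remark that a constant number of coefficients caps the accuracy on a strongly non-linear system.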