I thought about this to start with, and it may still be a possibility.

The downside is it adds to the complexity of the electronics. I'd have to have a different burden resistor for each different CT. Depending on the output voltage I get, they'd have to be 1 or 2 watt resistors (with more heat dissipation - with 20+ circuits it can add up). And if I can't get the same output, the amplifier gain would have to be different on each one (I want to bring them up to 5VDC max).

Of course, they'd have to be 1 or 2% tolerance or accuracy would suffer (worst case, a 100 amp main circuit could be off by 10 amps with 10% burden resistors). It's also hard to find the exact values I would need.

Finally, each input would be restricted to the one CT designed for it. That wouldn't necessarily be a problem, because you don't replace circuit breakers with ones of a different size (at least not safely!), but it's something to consider if I want to take it to our next house (I'm thinking ahead here).

It may be the way I have to go to keep costs down, though.

Of course, I could go with solid core (much cheaper) for all but the mains and just disconnect the wires (with that circuit off, obviously), run them through the cores and reconnect the wires. I'm not afraid of working in a breaker box (except for the live mains); I just don't like to press my luck.

I hadn't looked at those in particular, but had seen similar ones. However, I don't think they will work in my case. I need something which can clip around a wire - 14 gauge to 000 gauge, depending on the circuit (branch or main). Then there would be a wire pair going to a box with the circuitry in it. Maybe I'm missing something, but it doesn't look like these will work in that situation.

If I am missing something, I would appreciate if you could enlighten me.

Thanks for the ideas. Unfortunately, the RIBXGTF and the RIBTWXV2401B won't work because they are switches which trigger at a pre-set current level. I need to know the actual amount of current flowing. On their site I did see the RIBXG420, which might work - I'll have to look into it more closely. The only problems I see with it are cost (about the same as the Magnelabs SCT-400), and the fact it doesn't have a burden resistor.

The lack of a burden resistor would normally result in high voltage at the terminals if the burden resistor became disconnected (or, for instance, if it were on the board and the transducer were disconnected from the board). However, the fact that it claims DC output is interesting; it must have some circuitry in it to rectify the AC.

It also shows some non-linearity in the bottom 10% of its operating range, but that could be managed.

I'll have to investigate this one more; the DC output is a definite advantage; it removes the need for a precision rectifier on each sensor.

I had some time to investigate it further today. It looks like you have to supply it with 9-35 VDC (depending on load resistance), and it varies the current accordingly. This is way more than the circuits I'll be dealing with are made to handle (max is about 3.3 VDC). I could handle it with voltage dividers, but that means another (higher) voltage supply is required.

But the real killer on this one is the statement in their bulletin: "CAUTION: RISK OF ELECTRICAL SHOCK - MORE THAN ONE DISCONNECT MAY BE REQUIRED TO DEENERGIZE THE DEVICE BEFORE SERVICING". This might be OK in an industrial environment, but I don't want someone (that means me!) getting shocked when I disconnect the current transformers to play with the electronics some more. And I wouldn't want it to shock anyone else (wife, kids, etc.) who unplugs a current transformer accidentally or on purpose.

It was a good idea, but I just don't think it will work for what I need. Thanks, though, for the idea. It was well worth the time to investigate.

That statement is just to protect against touching the wire the current sensor is connected around. The current sensor itself is completely safe to touch; the output terminals or wires, depending on the model you get, are low voltage and safe. Of course, you still have to be careful with the line voltage wire these are wrapped around (your house wiring).

Also, if you use the RIBXGTV5-20, 50, or 100, no power supply is required. The power supply is only required for the 4-20mA output devices (RIBXGT420 for example).

That may be, but it's not what it says. And I have to go by what it says. Please note that it would be very unusual for a current transformer to be wrapped around an uninsulated wire. But even if it were, it would not require a warning on the transformer.

And the RIBXGTV5- series are current sensors to switch a load, not current transformers. They do not provide the output I need.

Sorry for the delay in responding. I've been on three higher priority projects over the last three months.

Amplification is not a problem; I want to bring it up to about 3V P-P for better accuracy in the precision rectifier anyway. And this will be feeding an in-house developed controller running at 3.3V. The biggest problem seems to be finding reasonably priced split-core transformers, which you helped me with tremendously. Thanks!

Jerry, you can take your measurements with one current transformer, no hardware changes and no firmware changes. For example, configuring your circuit to use the full range of a 16-bit A/D at 100 amps provides a resolution of about 3 mA. Even a 12-bit A/D provides a resolution of about 24 mA.
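A quick back-of-the-envelope check on those resolution figures (the span assumption is mine: per-LSB resolution doubles if the code range covers a full ±100 A swing rather than a 0-100 A span, which is likely why the two figures above don't come from the same mapping):

```python
# Back-of-the-envelope ADC resolution for a CT front end.
# The per-code resolution depends on how much current span is mapped
# onto the converter's full code range (0-100 A vs. -100 to +100 A).

def adc_resolution_amps(span_amps: float, bits: int) -> float:
    """Current represented by one LSB when span_amps covers 2**bits codes."""
    return span_amps / (2 ** bits)

if __name__ == "__main__":
    # 100 A unipolar span
    print(f"16-bit, 100 A span: {adc_resolution_amps(100, 16) * 1000:.2f} mA/LSB")  # -> 1.53
    print(f"12-bit, 100 A span: {adc_resolution_amps(100, 12) * 1000:.2f} mA/LSB")  # -> 24.41
    # +/-100 A bipolar span (200 A total), which matches the ~3 mA figure
    print(f"16-bit, +/-100 A span: {adc_resolution_amps(200, 16) * 1000:.2f} mA/LSB")  # -> 3.05
```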

The LEM USA model TT 100-SD split core transformer is quite linear from 0 to 100 amps and is available at Digikey for $25.20 each in quantities of 10. The Digikey part number is 398-1079-ND.

The biggest problem with such a design is the non-linearity of precision rectifiers at the low end, unless the semiconductors are closely matched. Another problem is that you wouldn't want to show a 15 amp circuit at the same chart scale as a 200 amp circuit, so multiplication is necessary anyway.

The software isn't a problem. It's easy to program in the setup, and even easier for the multiplication.

Jason, thanks, but it won't work. I need one with the burden resistor as a part of the device, the main reason being that if the transformer becomes disconnected from the burden resistor, the transformer can output dangerous voltages. That is acceptable in some situations, e.g. industrial equipment, but not in the situation we have in mind.

Diode linearity should not matter with an active precision rectifier. In fact, the ADC can be driven with a DC-level-shifted AC waveform, and with some math it can calculate the RMS voltage with no rectification needed.

When amplifying the signal, a potential 0.2% or 2% gain error is introduced by the 0.1% or 1% Op Amp resistors. And the Op Amp introduces an offset. These issues can be compensated for in firmware if each unit is calibrated.

It's not diode linearity that's the problem - it's the nonlinearity inherent to precision rectifiers at low voltages. This is most often caused by different cutoff voltages in the diodes being used (none are perfect) and slight differences in resistor values. It isn't much of a problem except when you are very near the cutoff voltages of the diodes, where it can become very apparent.

And shifting the waveform and calculating the RMS voltage is much more complicated than a precision rectifier, and less accurate unless you have a pure waveform on the input.

We are not amplifying the signal 0.2% or 2%; we are amplifying it approximately 900%. A 2% error is acceptable, which is easily within our current design.

Finally, manual calibration is out. The labor time alone to calibrate each unit would increase the cost beyond what is acceptable. It has to be accurate enough off the assembly line, which it is.

There are a number of problems associated with this. First of all, with a 333 mV max input level, the rest of the circuitry can easily be protected with a pair of silicon diodes front-to-back across the input. A 2V P-P level would require 4 diodes (assuming 10% overcurrent). Second, it would still need to be amplified to get to the 3V P-P desired for the precision rectifier. Third, the input impedance of the precision rectifier is neither constant nor linear; it is best to feed one from a low-impedance source, so even if I had the 3V P-P I'd still want a buffer in front of it. On the other hand, an amplifier is very simple - 1/4 of an op amp and a couple of resistors is all that's required.
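The diode-clamp arithmetic above can be sketched quickly (assuming a generic ~0.7 V silicon forward drop; real clamp levels depend on the diodes chosen and the current through them):

```python
# Rough clamp arithmetic for antiparallel silicon diode input protection.
# Assumes ~0.7 V forward drop per junction (a generic figure, not from
# any specific datasheet).

V_F = 0.7

def clamp_level(series_diodes: int) -> float:
    """Peak voltage allowed before the protection string conducts."""
    return series_diodes * V_F

# A 333 mV signal: one diode each way (2 total) clamps at ~0.7 V
print(clamp_level(1))  # 0.7
# 2 V P-P (1 V peak) with 10% headroom (1.1 V) needs two in series
# each way, i.e. 4 diodes total, clamping at ~1.4 V
print(clamp_level(2))  # 1.4
```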

Ah, sorry - I missed the RMS. I read it as 2V P-P. We could divide it down, but that brings up other problems, like how to protect against overvoltage on the input. As I said earlier, the 333 mV input is easy to protect, and even if there's a problem which causes a peak at 700 mV (and the protection diodes to conduct), the output is still limited by the power supply voltage, so the rest of the circuitry can't be damaged.

Additionally, I see this one is only available in 25, 50, 75 and 100 amp versions; we need them from 15 to 200 amps.

A good idea, but not workable in our situation. Thanks for the suggestion, however.

The accuracy of a precision half wave rectifier does not depend on the diodes. The two dominant error sources are Op Amp input offset voltage and the resistor tolerance. The circuit can be configured for a single power supply. A dual power supply precision rectifier is shown below.

If the A/D is followed by a microprocessor capable of doing some math, no rectification is needed. The microprocessor can simply measure the waveform P-P value and calculate the RMS value. If significant waveform distortion is anticipated, the microprocessor can perform a true RMS measurement rather than multiplying the P-P value by 1/(2√2).
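Since several posts here hinge on the P-P-times-1/(2√2) shortcut versus a true RMS computation, here's a small sketch of both (purely illustrative; the 20% third harmonic is an arbitrary stand-in for load distortion):

```python
import math

def rms(samples):
    """True RMS of a zero-mean sampled waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def rms_from_pp(samples):
    """P-P shortcut: RMS = P-P / (2*sqrt(2)). Exact only for a pure sine."""
    return (max(samples) - min(samples)) / (2 * math.sqrt(2))

if __name__ == "__main__":
    n = 1200  # samples over one cycle
    sine = [math.sin(2 * math.pi * k / n) for k in range(n)]
    # 20% third harmonic as a stand-in for a non-linear load
    distorted = [s + 0.2 * math.sin(3 * 2 * math.pi * k / n)
                 for k, s in enumerate(sine)]
    print(f"pure sine: true {rms(sine):.4f}, via P-P {rms_from_pp(sine):.4f}")
    print(f"distorted: true {rms(distorted):.4f}, via P-P {rms_from_pp(distorted):.4f}")
```

For the pure sine both methods agree (≈0.7071); with the distorted waveform the P-P shortcut under-reads by roughly 15%, which is exactly the concern raised about distorted loads.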

When using A/D converters, as with any instrumentation, an error analysis of each stage should be performed. I've seen cases where an analog front end increased A/D measurement error rather than reducing it. Running an A/D at a fraction of its range can sometimes be better than adding an amplifier before it to use the entire range. One thing to watch out for is that the circuit should be able to source sufficient charge when the A/D sample-and-hold samples. This can be done by placing a resistor between the signal source and the A/D with a capacitor to ground at the A/D input.

I am well aware of the inherent errors in amplifiers, especially offset error in op amps and resistor tolerances. However, I believe you are incorrect about the diodes. You can easily see the distortion in the output waveform near the zero-crossing point. Resistor tolerance is part of it, yes. But so are the diodes. All diodes are not created equal, and there are slight differences in their operation. In most circuits it does not matter; however, in a circuit such as a precision rectifier, when the input is close to 0V, that difference becomes apparent.

And yes, there will be some waveform distortion due to load reactance. Just measuring the P-P value will not give an accurate reading. And while there is a microprocessor involved, it is doing a lot of other things, such as taking various actions depending on the input. Plus, we are monitoring multiple circuits, and there is not enough processing power in the microprocessor to compute all of the currents for all of the circuits and the rest of the work it needs to do.

I've been designing circuits since the early 70's, and am well aware of what I'm doing. And we've been over the design many times, throwing out other ones for one reason or another. This is the best design for this project.

Given an ideal Op Amp (infinite GBP) in the circuit shown, the diode's non-linear forward voltage drop is completely compensated for. The voltage at the summing junction is zero volts, and that is set by the voltage at the diode cathode. The voltage at the diode anode does not enter into the error calculation, given the ideal Op Amp. However, with real-world Op Amps, a small error voltage needs to exist and be amplified to overcome the diode forward voltage drop. In a practical implementation, for example using the LT1012, the finite gain of the Op Amp creates the need for this error voltage. The LT1012 gain is about 10,000 at 60 Hz. A diode forward voltage of 500 mV then leads to an error of 500 mV/10,000 = 50 uV at the peak of the AC waveform. LTspice shows an error of 30 uV. The Op Amp/diode gain error at the zero crossing - where the diode forward voltage drop is 0 volts - is zero. The Op Amp/diode FS gain error for a 14-bit A/D having a FS of 1 volt is about 1 LSB. This error - if it's an issue - can be reduced by using an Op Amp having a higher gain at the frequency of interest.
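The finite-gain arithmetic is easy to sanity-check; a quick sketch using the figures quoted above (500 mV diode drop, gain of ~10,000 at 60 Hz, 14-bit/1 V FS converter):

```python
def finite_gain_error(v_diode: float, open_loop_gain: float) -> float:
    """Peak error voltage the Op Amp must absorb in a precision rectifier:
    the diode forward drop divided by the available open-loop gain."""
    return v_diode / open_loop_gain

err = finite_gain_error(0.5, 10_000)  # 500 mV drop, gain ~10,000 at 60 Hz
print(f"peak error: {err * 1e6:.0f} uV")  # -> 50 uV

# One LSB of a 14-bit converter with a 1 V full scale, for comparison
lsb = 1.0 / 2 ** 14
print(f"14-bit LSB at 1 V FS: {lsb * 1e6:.0f} uV")  # -> 61 uV
```

The 50 uV figure lands just under one 61 uV LSB, consistent with the "about 1 LSB" estimate.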

The Op Amp input offset voltage is another error source and it can be compensated for in firmware or it can be minimized using an Op Amp having a suitably low Vos. No calibration is needed as the information needed to perform a firmware input offset compensation is present in the Op Amp waveform.

The ADC needs to have considerable headroom to measure non-linear loads having a high crest factor. I see some instruments designed for a 5:1 crest factor. To measure 20 A RMS with 5:1 headroom, the ADC needs to be able to measure 100 A peaks without clipping. The CT can still be rated for 20 A RMS.

A simple topology is to drive the CT into a true RMS converter and drive that into the ADC.

A more accurate way is to drive the CT into the ADC and then drive that into a uP. The uP then does the RMS computation. Note that in neither topology is rectification needed. To drive the AC waveform into the ADC, the CT drives through a capacitor to a DC bias voltage at the ADC input. The ADC measures the waveform periodically (perhaps at a 1200 Hz rate - it depends on the accuracy and resolution needed) and performs an RMS computation.
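To make the sample-and-compute approach concrete, here's a rough sketch of the uP-side math (the 3.3 V / 12-bit figures and 1200 Hz rate are my assumptions for illustration; the DC bias is estimated as the mean over a whole number of line cycles):

```python
import math

def rms_from_codes(codes, vref=3.3, bits=12):
    """RMS of an AC waveform sampled through a DC-biased ADC input.

    codes: raw ADC readings covering a whole number of line cycles.
    The DC bias is estimated as the mean and subtracted before the
    sum-of-squares, so no analog rectification is needed.
    """
    volts = [c * vref / (2 ** bits) for c in codes]
    bias = sum(volts) / len(volts)
    return math.sqrt(sum((v - bias) ** 2 for v in volts) / len(volts))

if __name__ == "__main__":
    # Simulate a 1 V-peak 60 Hz waveform riding on a 1.65 V bias,
    # sampled at 1200 Hz (20 samples/cycle) for exactly 6 cycles.
    fs, f, cycles = 1200, 60, 6
    n = fs * cycles // f
    samples = [1.65 + math.sin(2 * math.pi * f * k / fs) for k in range(n)]
    codes = [round(v * 4096 / 3.3) for v in samples]
    print(f"RMS = {rms_from_codes(codes):.4f} V")  # expect ~0.7071
```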

One can get fancy and derive the circuit power from the CT itself. To do this, the circuit periodically disconnects the burden resistor and substitutes a shunt regulator which charges a cap. An easier (but more expensive) method is to use a second CT to power the circuit. To go along with this, a WiFi link would be great. Clamp the device onto a wire and it can be polled for the RMS current by a PC.

In any event the design should conform to the appropriate safety standard. I don't know which one offhand but for my own use I would default to UL60950. If I was designing it as part of my job and having it certified I would locate and purchase the correct safety standard.

Yes, some instruments require a 5:1 crest factor. But that is not necessary for our usage.

Additionally, we are quite familiar with the appropriate safety standards and our design follows them completely.

And yes, there are other ways to do it. But as I have said before - we have the design planned, and it is the best one for what we need. We are not new at this, and are definitely not the hobbyists you seem to think we are.

I really don't think there is anything to add. All I wanted here was a less expensive source for split core CTs, for which I have received a good response.

If I could add a little light (and not too much heat) to this discussion, a precision ground-referenced rectifier can be made with 2 GOOD rail-to-rail op amps and no diodes, per the attached diagram. When Vin>0, all op amp inputs and outputs go to Vin, so Vout=Vin. When Vin<0, op amp output A goes to its negative rail, which is 0 volts, and this is applied to the + input of op amp B. The output of op amp B does what it has to do to bring its - input to 0 volts; it does this by taking its output to +|Vin|. It is possible to also get some gain out of this rectifier by using additional resistors.
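A behavioral model of this rectifier (assuming ideal rail-to-rail op amps; the neg_gain parameter is my stand-in for resistor matching, with 1.0 meaning perfectly matched):

```python
def diodeless_rectifier(vin: float, neg_gain: float = 1.0) -> float:
    """Behavioral model of the two-op-amp, no-diode precision rectifier.

    For vin > 0, both amplifiers pass the signal straight through.
    For vin < 0, op amp A rails at 0 V and op amp B inverts, with a
    gain set by resistor matching (neg_gain = 1.0 when matched).
    """
    return vin if vin >= 0 else -vin * neg_gain

print(diodeless_rectifier(0.5))         # 0.5
print(diodeless_rectifier(-0.5))        # 0.5 with matched resistors
print(diodeless_rectifier(-0.5, 1.02))  # 0.51: a 2% half-wave gain mismatch
```

The last line illustrates how resistor mismatch skews only the negative half-cycles, which is the matching concern raised in the replies.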

As I mentioned to Dave - the design is complete. There will be no further changes.

And yes, your circuit does not require diodes - but it still suffers from the same crossover distortion as precision rectifiers with diodes. The only difference is that the resistors must be very closely matched. Even 1% resistors can theoretically produce up to 2% crossover error in this circuit.

Some crossover distortion is inherent in any rectifier; the project as a whole will determine the amount which is allowed. What we have is within the limits of what is required.