A new force test stand, the HPTS-500N, has been added to our product range. It is based on a rack-and-pinion hand press.

With the addition of brackets to raise the throat clearance for the attached gauge and button clamp, plus two toggle clamps and a retention plate to restrain the material to which the button under test is attached, we have a stand-alone, low-cost button pull-off test solution.

A convenient and inexpensive method of calibrating the DFTS5000i in-house is to use a separate calibrated smart load cell. Linking the calibrated load cell to the unit under test via appropriate springs enables the recording and comparison of force measurements over the entire working range. The smart load cell can be treated as a gold standard and used for regular calibration checks, with traceability maintained by periodic returns to the manufacturer for recalibration, without interrupting production testing.
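
The comparison described above can be sketched in a few lines. This is an illustrative example only: the readings and the 1% tolerance are invented for the purpose of showing the check, not taken from any real gauge.

```python
# Hedged sketch: comparing a unit under test (UUT) against a calibrated
# reference ("gold standard") load cell at several points across the
# working range.  All figures below are illustrative.

def deviations(reference, unit_under_test):
    """Percentage deviation of each UUT reading from the reference."""
    return [100.0 * (u - r) / r for r, u in zip(reference, unit_under_test)]

def within_tolerance(reference, unit_under_test, tol_percent=1.0):
    """True when every point agrees with the reference within tol_percent."""
    return all(abs(d) <= tol_percent
               for d in deviations(reference, unit_under_test))

# Paired readings (N) taken with both cells linked via a spring.
ref = [10.00, 20.00, 30.00, 40.00, 50.00]
uut = [10.05, 20.08, 29.90, 40.20, 49.85]

print(within_tolerance(ref, uut))  # True: every point within +/-1 %
```

A real check would also log the deviations themselves, so that drift can be tracked between calibration cycles.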

Many tests require the data to be validated by a certificate of calibration, and this can quickly become an expensive overhead. So the first question: is it necessary? The answer is subjective, and probably not. We will explore this answer later in the blog, but to do so we first have to understand calibration. So what is it?

Calibration is the comparison of two measurements obtained in a similar way under similar conditions. One measurement is of known magnitude or correctness; the other is taken from the "unit under test", the item requiring calibration.

Traditionally, if you wanted to ascertain whether an instrument was accurate within its specification, say +/-1%, you would compare its measurement with that taken by a superior instrument previously calibrated to +/-0.25%, giving an uncertainty ratio of 4:1. These days, with more sophisticated electronic systems, a ratio of 10:1 is the norm.
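
The ratio is simply the tolerance of the unit under test divided by the uncertainty of the reference, as this small sketch shows:

```python
# Test uncertainty ratio: UUT tolerance over reference uncertainty.

def tur(uut_tolerance_percent, reference_uncertainty_percent):
    return uut_tolerance_percent / reference_uncertainty_percent

print(tur(1.0, 0.25))  # 4.0  -> the traditional 4:1 ratio
print(tur(1.0, 0.1))   # 10.0 -> the 10:1 now common
```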

So, if we need to calibrate a 50N force gauge in tension at two relevant fixed points, we could acquire a 1Kg and a 4Kg weight, certified accurate to 0.1% (1gm and 4gm respectively), suspend them individually from the gauge and take readings. If the indicated readings were between 990gm and 1010gm for the 1Kg weight, and between 3960gm and 4040gm for the 4Kg weight, then we could realistically say that the force gauge was calibrated, of course quoting the test procedure and the identity and certification (traceability) of the weights used.
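
The pass/fail windows above follow directly from the gauge's +/-1% specification, as this sketch of the two-point dead-weight check illustrates:

```python
# Sketch of the two-point dead-weight check.  A reading passes when it
# falls within +/-1 % of the applied mass; the 990-1010 gm and
# 3960-4040 gm windows are exactly that tolerance.

def limits(nominal_g, tol_percent=1.0):
    span = nominal_g * tol_percent / 100.0
    return nominal_g - span, nominal_g + span

def check(nominal_g, indicated_g, tol_percent=1.0):
    lo, hi = limits(nominal_g, tol_percent)
    return lo <= indicated_g <= hi

print(limits(1000))       # (990.0, 1010.0)
print(limits(4000))       # (3960.0, 4040.0)
print(check(4000, 3975))  # True: inside the 4Kg window
```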

One can see that this is really only an indication, as we have examined just two fixed points; a nonlinear measurement function could have significant error mid-scale at 3Kg without the calibration check making it visible. Now comes the debate: how many calibration points are required? More points mean greater equipment cost (calibrated weights) and a longer, more difficult procedure. Costs escalate as the measurement range increases, as do the logistics and safety implications of handling larger weights. Hence the gravitation towards comparing the test instrument with a "standard" instrument and automating the process. In the next blog we will examine the use of the DFTS5000i and a LabVIEW application to self-calibrate.

The definitive metric for establishing the quality of a connector or terminal crimp is the force required to pull the terminal off the wire. Crimp connectors used on power cables up to 30kV are designed and type tested to the European standard EN 61238-1:2003. The subsequent bonding of the connector to the wire is done with a crimp tool, usually specifically designed and supplied by the crimp connector manufacturer. It is the quality and consistency of this crimping process that is the subject of pull testing during the production of a cable harness. This testing is less stringent than the type testing outlined in the above standard and is usually specified by the connector manufacturer. As such, absolute accuracy is less important than consistency, as the procedure is designed to highlight problems with the tool operation, not the crimp design.

The National Instruments LabVIEW Modbus VI is modified to set the speed of the DFTS5000i test stand, initiate the test from an event (a mouse click on the start button), then capture the force measurement from the panel meter, displaying values in real time on the simulated chart recorder.

The panel meter and motor drive each have an RS485 serial port and can be configured to use either the ASCII or the RTU (binary) Modbus protocol. In this instance RTU is used for its increased efficiency (hence speed). Each port is a slave on the bus, controlled by the master port on the laptop. As the laptop does not have an RS485 serial port, one is provided by a USB-to-serial (RS485) converter, which Windows presents as a virtual serial port available to the LabVIEW software.
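
To give a feel for what travels on the bus, here is a hedged sketch of RTU framing in Python. The function code and register address are illustrative only; they are not the actual register map of the panel meter or drive, which is defined in the manufacturers' documentation.

```python
# Hedged sketch of Modbus RTU framing: slave address, function code,
# data, then a CRC-16 appended low byte first.  Register 0x2100 below
# is a hypothetical address, not the real device map.

def crc16_modbus(frame: bytes) -> int:
    """Standard Modbus CRC-16 (reflected poly 0xA001, init 0xFFFF)."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_request(slave: int, register: int, count: int = 1) -> bytes:
    """Build a function-03 (read holding registers) RTU request."""
    body = bytes([slave, 0x03,
                  register >> 8, register & 0xFF,
                  count >> 8, count & 0xFF])
    crc = crc16_modbus(body)
    return body + bytes([crc & 0xFF, crc >> 8])  # CRC low byte first

# Poll slave 1 for one register at the hypothetical address 0x2100.
print(read_request(1, 0x2100).hex())
```

RTU's efficiency advantage over ASCII comes from exactly this framing: each byte is sent as raw binary rather than as two hexadecimal characters, roughly halving the frame length.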

The rate of force application to a test specimen is usually dictated by the test standard being complied with. BS 5G 178-1:1993, the specification for crimped joints (connectors) for aircraft electrical cables and wires, states that the jaws clamping the test specimen shall separate at a steady rate of 25 mm/min to 50 mm/min. Other standards, such as EN 61238-1:2003 (connectors for power cables), are more stringent and will be discussed later.
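
For a sense of scale, here is what those permitted rates mean in practice for a hypothetical 10 mm jaw travel:

```python
# Time taken for a given jaw separation at a fixed rate.
# The 10 mm travel figure is illustrative, not from any standard.

def seconds_for_travel(distance_mm, rate_mm_per_min):
    return 60.0 * distance_mm / rate_mm_per_min

print(seconds_for_travel(10, 25))  # 24.0 s at the slowest permitted rate
print(seconds_for_travel(10, 50))  # 12.0 s at the fastest
```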

The speed of the test stand platform is determined by the drive motor and the mechanical gearing. The gearing is fixed at design time, leaving the motor speed as the controlled variable. The DFTS5000 series of test stands uses a three-phase motor to apply load, with the voltage and frequency provided by a variable frequency drive (VFD). It is the VFD that is configured and controlled from the PC/laptop/tablet via a serial connection.
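
The relationship between VFD output frequency and platform speed can be sketched as follows. The pole count and the millimetres-per-motor-revolution figure are assumptions for illustration; the real gearing ratio is fixed by the stand's design.

```python
# Sketch of how VFD output frequency maps to platform speed.
# Pole count (4) and gearing (0.02 mm of platform travel per motor
# revolution) are illustrative assumptions, not DFTS5000 figures.

def synchronous_rpm(freq_hz, poles=4):
    """Synchronous speed of an AC induction motor: 120 * f / poles."""
    return 120.0 * freq_hz / poles

def platform_speed_mm_per_min(freq_hz, mm_per_motor_rev=0.02, poles=4):
    return synchronous_rpm(freq_hz, poles) * mm_per_motor_rev

print(synchronous_rpm(50))            # 1500.0 rpm at 50 Hz
print(platform_speed_mm_per_min(50))  # 30.0 mm/min with the assumed gearing
```

(A loaded induction motor actually runs slightly below synchronous speed due to slip, which is one reason closed-loop control or calibration of the speed setting matters.)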

Digital Speed Control

The VFD-EL series used has an industry-standard RS485 physical interface and supports both Modbus ASCII and RTU protocols, making it compatible with a host of test software. Demonstration LabVIEW VIs are available.