Smart Cameras Image a Better Laser Weld

Laser welding has long played a role in auto manufacturing. Rapid control of the laser’s output power has been the goal of manufacturing engineers seeking high-quality welds in a competitive industry. Control of the process, however, has been less than optimal.

Now researchers at the Fraunhofer Institute for Physical Measurement Techniques (IPM) in Freiburg, Germany, have developed a superfast control method for laser welding processes that evaluates 14,000 images a second. The method adjusts the laser’s output power in real time, enabling optimum-quality welds in car manufacturing. Imaging-based control at such high speeds – about an order of magnitude faster than is viable with conventional imaging software – is achieved by using special CNN-based cameras.

It’s not TV

This CNN is not a TV channel but rather stands for cellular neural networks, a technology that is seen as one of the hot trends in high-speed imaging and data processing. CNN’s methodology for parallel computing is similar to that of neural networks; however, each processing cell or “neuron” exchanges information only with its adjacent neighbors – which turns out to be particularly useful in two-dimensional applications such as image processing, pattern recognition and feature extraction.

In CNN vision systems, each pixel has its own processor, which performs simple operations locally without having to transfer all the data to a computer. Such calculations – e.g., subtracting a pixel value from its neighbors’ to detect edges in the image – are performed via analog computation, and programming is done in a “single instruction, multiple data” fashion addressing all 25,000 processors simultaneously, as Andreas Blug, IPM project manager, explains.
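The neighbor arithmetic Blug describes can be sketched in ordinary NumPy. The snippet below compares each pixel against its four direct neighbors to flag edges; on the CNN chip every cell computes its result simultaneously in analog circuitry, whereas here the operation is emulated serially with array shifts. The threshold value is an illustrative assumption, not a parameter from IPM's system.

```python
import numpy as np

def cnn_style_edge_map(image, threshold=0.1):
    """Sketch of the neighbor-difference operation a CNN cell array
    performs: each pixel subtracts its neighbors' values to detect
    edges. (Serial emulation of what the chip does in parallel.)"""
    img = image.astype(float)
    # Replicate the border so edge pixels also have four neighbors.
    padded = np.pad(img, 1, mode="edge")
    up    = padded[:-2, 1:-1]
    down  = padded[2:,  1:-1]
    left  = padded[1:-1, :-2]
    right = padded[1:-1, 2:]
    # A pixel is an edge if it differs strongly from any neighbor.
    diff = np.maximum.reduce([abs(img - up), abs(img - down),
                              abs(img - left), abs(img - right)])
    return diff > threshold

# Example: a bright square on a dark background yields edges at its border.
frame = np.zeros((8, 8))
frame[2:6, 2:6] = 1.0
edges = cnn_style_edge_map(frame)
```

The point of the hardware is that the `up`/`down`/`left`/`right` comparisons, done here as four sequential array passes, happen in one step across all 25,000 cells at once.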

Blug sees the institute at the forefront of applying this technology to industrial processes, with the optimization of laser welding among the first applications. A commonly used laser welding method is so-called keyhole welding, where a highly focused laser beam generates deep and slender weld seams with a minimized heat-affected zone. The challenge is that the laser output has to be set within a narrow power range. If the power is too low, the connection does not extend over the full cross section of the material, and if the power is too high, the laser cuts right through.

Laser welding of metal sheets has to date been considered too fast for in-line control. However, smart image sensors with built-in processors open up the opportunity for mastering this industrial process in real time.

Full penetration is visible in a coaxial camera image as a dark zone, the so-called full penetration hole, directly behind the laser interaction zone. For closed-loop control, however, a frame rate of 10 kHz, i.e., 10,000 images per second, is required to monitor the rapidly changing full penetration hole. Because this was not possible with standard microprocessor-based image processing, welders have relied on preproduction trials, after which a constant laser power is set. However, the protective glass used to shield the beam delivery system from debris tends to get dirty, reducing the delivered power over time. This typically happens within 10 to 20 hours but can be strongly process-dependent.
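The control loop implied here – watch the full penetration hole at 10 kHz and nudge the laser power up or down to hold it at a set size – can be sketched as a simple proportional controller. The gain, power limits, and the use of hole area (in pixels) as the feedback signal are illustrative assumptions for the sketch, not IPM's actual algorithm or numbers.

```python
def adjust_laser_power(power, hole_area, target_area,
                       gain=0.5, p_min=800.0, p_max=1200.0):
    """One control step at the 10 kHz frame rate (illustrative values,
    power in watts, areas in pixels).

    hole_area: measured size of the dark full penetration hole behind
    the laser interaction zone. Too small a hole means incomplete
    penetration (raise power); too large means the beam is close to
    cutting through (lower power)."""
    error = target_area - hole_area       # positive -> underpowered
    power = power + gain * error          # proportional correction
    return min(max(power, p_min), p_max)  # clamp to a safe range

# As the protective glass dirties and delivered power drops, the hole
# shrinks frame by frame and the controller compensates.
p = 1000.0
for hole in (20.0, 18.0, 15.0):           # shrinking hole area
    p = adjust_laser_power(p, hole, target_area=20.0)
```

This is exactly the correction a fixed preproduction power setting cannot make: without feedback, the slow transmission loss through the dirty glass goes uncompensated.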

A new solution

The new solution uses Eye-RIS camera hardware from AnaFocus, a spinoff from the University of Seville, Spain, featuring a 176 × 144-pixel cell array. Each cell is interconnected in several ways with its eight neighbors and includes circuitry for image pre- and postprocessing. Preprocessing extracts useful information from the input image flow, for example, eliminating redundant data for the specific algorithm required. Postprocessing supports making complex decisions and taking action, such as deciding whether the welding seam is good or whether power adjustments are required.
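The preprocessing/postprocessing split described above might be sketched as a two-stage pipeline: reduce each frame to only the data the algorithm needs, then make the go/no-go decision on that reduced data. The thresholds, the region of interest behind the laser spot, and the decision strings are all hypothetical stand-ins for the chip's actual operations.

```python
import numpy as np

def preprocess(frame, dark_threshold=0.2):
    """Preprocessing sketch: reduce the full grey-level frame to a
    binary map of dark pixels, discarding redundant data
    (threshold is an illustrative assumption)."""
    return frame < dark_threshold

def postprocess(dark_map, roi, min_hole_pixels=5):
    """Postprocessing sketch: inspect the region of interest behind
    the laser interaction zone and decide whether the full
    penetration hole is visible."""
    r0, r1, c0, c1 = roi
    hole_pixels = int(dark_map[r0:r1, c0:c1].sum())
    return "seam ok" if hole_pixels >= min_hole_pixels else "raise power"

# A frame with a dark patch inside the ROI is judged fully penetrated.
frame = np.ones((16, 16))
frame[6:9, 6:9] = 0.0   # 9 dark pixels: the full penetration hole
decision = postprocess(preprocess(frame), roi=(4, 12, 4, 12))
```

On the Eye-RIS hardware both stages run on the cell array itself, so only the final decision – not the image – has to leave the sensor.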

“The main know-how is in these algorithms, which include the system and application knowledge,” Blug said. In a joint project with Stuttgart University’s IFSW (Institute of Laser Beam Devices), the method has been successfully tested in the lab. The next step is taking it to the shop floor, a move currently being discussed with interested parties.