Peter Kronfeld, born in 1962, has always taken a keen interest in technological change in the economy, society and business. His interest began during his studies of economics and communication, and he has followed these topics ever since as a journalist and as managing director of HighTech communications GmbH.

“The evolution of AI in the smart factory” – by Trevor Galbraith

After meeting Trevor Galbraith, publisher and editor-in-chief of Global SMT & Packaging, at SMT Hybrid Packaging, I asked him for a guest commentary. Here we go – many thanks to Trevor for this contribution to smart-smt-factory-forum.com.

When you sit back and think about it, the evolution of surface mount assembly in the early 1970s was a milestone achievement. Accurately gluing a wide range of different-sized components to a substrate, then baking them in a reflow oven, where the substrate, interconnection materials, board finishes and so on all expand at different rates (their coefficients of thermal expansion, or CTE, differ), and having the assembly still work is remarkable.

Now layer on today’s challenge of optimizing this process in a ‘smart factory’ environment, and the fun really begins. Using a widening array of sensors that can replicate all the human senses (sight, sound, smell, touch and taste), manufacturers of electronic assemblies can gather huge amounts of data, which they can either store for use at a later date or apply immediately to optimize the manufacturing process.

Step 1: Collecting and storing large amounts of data

The ability to collect and store large amounts of data is ‘Step One’ in the long journey towards a ‘smart factory’. It is virtually impossible to predict exactly what data you will need in 3-5 years’ time, but with storage relatively inexpensive, it is prudent to collect as much data as you can.
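The “collect everything now, decide what matters later” approach can be sketched very simply. The snippet below logs raw readings into an SQLite table; the machine and sensor names are purely illustrative assumptions, and a real line would write to a long-term store rather than an in-memory database.

```python
import sqlite3
import time

# In-memory database stands in for whatever long-term store the line uses.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE sensor_log (ts REAL, machine TEXT, sensor TEXT, value REAL)"
)

def log_reading(machine: str, sensor: str, value: float) -> None:
    """Store one raw reading; keep everything, decide what matters later."""
    db.execute(
        "INSERT INTO sensor_log VALUES (?, ?, ?, ?)",
        (time.time(), machine, sensor, value),
    )

# Hypothetical readings from a printer and a reflow oven.
log_reading("printer-1", "squeegee_pressure_kg", 6.2)
log_reading("oven-1", "zone3_temp_c", 245.5)

rows = db.execute("SELECT machine, sensor, value FROM sensor_log").fetchall()
```

The point of the flat schema is that any machine and any sensor can be logged without deciding up front which signals will matter in 3-5 years.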
On the cyber-physical roadmap towards the smart factory, it is logical to begin at the front of the production line with the printer. This is the area where most experts claim 60-70% of all process-related defects originate. Artificial Intelligence (AI) algorithms already collect large amounts of data from solder paste inspection (SPI) systems and make closed-loop corrections to the printing process. Similar closed-loop instructions flow between post-placement automated optical inspection (AOI) and the pick-and-place machine, and there is also communication between the AOI and SPI systems.
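A minimal sketch of such a closed loop, assuming the SPI system reports a mean deposit misalignment in microns and the printer accepts an alignment offset. The function names, units and gain value are illustrative assumptions, not any vendor’s API; the gain below half applies only part of the measured error each cycle to avoid overcorrecting.

```python
GAIN = 0.5  # apply only part of the measured error to avoid oscillation

def correct_printer(current_offset_um: float, spi_error_um: float) -> float:
    """Return the new printer alignment offset after one SPI feedback cycle."""
    return current_offset_um - GAIN * spi_error_um

# Hypothetical scenario: the printer starts 20 µm off; on each cycle the
# SPI system measures the error that remains on the printed board.
offset = 0.0
true_misalignment_um = 20.0
for _ in range(5):
    measured = true_misalignment_um + offset  # what SPI would see
    offset = correct_printer(offset, measured)
```

After a few cycles the residual error shrinks towards zero, which is the essence of closed-loop process correction regardless of how sophisticated the underlying model is.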

These systems perform more effectively and cover a wider range of potential defects when they are communicating with inspection machines and assembly equipment from the same manufacturer. Of course, they can still perform with third-party equipment, but the range of metrology-related functions is likely to be more limited.

Emerging data standards for M2M communication

There are still many roadblocks that prevent AI systems from working at their optimum, not least of which is the lack of a recognized industry standard for machine-to-machine communication. At the time of writing, there are three main contenders in this race: the IPC CFX standard, Mentor/Siemens’ OML and the newly announced Hermes standard. All have merits and I am confident one will emerge as the preferred standard by the end of the year.

Once the industry standard is established, AI can continue to the next phase of its evolution: becoming self-learning. That is, teaching the system to identify potential process outliers and auto-correct without human intervention. In other words, true ‘predictive testing’.
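The bare idea of self-learning outlier detection can be sketched in a few lines: learn what “normal” looks like from recent measurements, then flag anything outside the learned band. A production system would use far richer models; the paste-volume numbers and the three-sigma threshold below are illustrative assumptions.

```python
import statistics

def is_outlier(history: list[float], new_value: float, k: float = 3.0) -> bool:
    """Flag a measurement more than k standard deviations from the
    mean of recent history - the system 'learns' normal from the data."""
    mean = statistics.fmean(history)
    sigma = statistics.stdev(history)
    return abs(new_value - mean) > k * sigma

# Hypothetical recent paste-volume readings (percent of target).
paste_volumes = [100.2, 99.8, 100.5, 99.9, 100.1, 100.3, 99.7, 100.0]

flag_normal = is_outlier(paste_volumes, 100.4)  # within normal variation
flag_drift = is_outlier(paste_volumes, 110.0)   # clearly abnormal
```

Catching the abnormal reading before it becomes a defect, and feeding a correction back automatically, is what separates ‘predictive testing’ from after-the-fact inspection.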

On the wider stage, this latter phase is controversial. Warren Buffett believes that AI could be “enormously disruptive” yet beneficial in making the economy more efficient. On the other hand, the esteemed physicist and author Dr Stephen Hawking claims that AI could one day become mankind’s greatest threat. There may be a grain of truth to this, but regrettably it is also true that you cannot ‘uninvent’ technological advances. Perhaps, like nuclear proliferation, Dolly the sheep and other examples of human genius, it will become one of those areas that we will have to monitor and regulate carefully in the future.