Gary Mintchell

All of the analog data acquired from manufacturing and products—a.k.a. the Internet of Things (IoT)—dwarfs what is currently known as Big Data.

Big Data headlines not only tech news but also popular news—as in, what is the government doing with all the information it’s storing about us? Yet Big Data is just a twig compared with the full-grown oak of Big Analog Data. National Instruments Fellow Tom Bradicich mentioned twice, in separate interviews during NIWeek last month, that all of the analog data acquired from manufacturing and products—a.k.a. the Internet of Things (IoT)—dwarfs what is currently known as Big Data.

When thinking about data, consider the flow. First comes acquisition from analog measurements, which may or may not be used in real time. Then there is data in motion and data at rest. Finally, there is archiving the data. Data can also be characterized by where it resides, but the insight comes from how it is used. Real time is important if you are monitoring a motor about to catch fire. On the other hand, you may want to go through three years of data to look for a trend.

“In test and measurement, we might debate with IT about whose data is bigger,” Bradicich says. “It’s not just size, but also velocity. When data leaves NI devices, it’s in motion. Then first it hits a switch, server or workstation. Now it is at rest in an IT server. Now the IT world takes over for analytics, then archiving. The question for us is, Where do customers want to derive insight? Maybe closer to the instrument, or maybe later at the desk. The four variables of data classically are volume, velocity, variety and value. We have added a fifth—visibility—for who needs to see and analyze results.”

Since NI is a measurement company, it has partnered with several other companies to deliver a Big Data solution. IBM has become a close partner—not surprising, given that NI’s senior vice president of R&D and Bradicich both came from IBM. Specifically, the product from IBM is InfoSphere Streams, part of the IBM Big Data platform. It processes vast amounts of streaming data in real time and allows user-developed applications to quickly ingest, analyze and correlate information as it arrives from thousands of real-time sources. The solution can handle very high data throughput rates—up to millions of events or messages per second.
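The streaming pattern that InfoSphere Streams embodies (analyze and correlate events as they arrive, rather than storing everything first and querying later) can be sketched generically in a few lines of Python. Everything below is invented for illustration: the event format, the sensor name and the rolling-average check are assumptions for the sketch, not the IBM API.

```python
from collections import deque

def ingest():
    """Stand-in event source; a real deployment would read from thousands
    of networked sensors, not a hard-coded loop."""
    for i in range(10):
        yield {"sensor": "motor-1", "temp_c": 60 + i * 3}

def analyze(events, window=3, limit_c=75):
    """Correlate each reading with its recent history and flag trouble
    as the data flows past, instead of storing it all first."""
    recent = deque(maxlen=window)
    for ev in events:
        recent.append(ev["temp_c"])
        avg = sum(recent) / len(recent)
        if avg > limit_c:
            yield (ev["sensor"], avg)

for sensor, avg in analyze(ingest()):
    print(f"{sensor}: rolling average {avg:.1f} C exceeds limit")
```

The point of the pattern is that the alert fires while the data is still in motion—the same question Bradicich raises about where customers want to derive insight.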

Terabytes of data
An NI partner, Phasor Measurement, has developed a solution to monitor the electric power grid. Bradicich says it can generate 5 TB of data per month. A wind turbine can generate 10 TB per day, and a jet engine 20 TB per hour. It’s easy to see how this fast, streaming data adds up quickly.
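To put those figures on a common scale, a quick back-of-the-envelope conversion to megabytes per second helps. The units here are assumptions for illustration: decimal megabytes (1 TB = 1,000,000 MB) and a 30-day month.

```python
# Convert the data rates quoted above to an average MB/s figure.
RATES_TB = {
    "grid monitoring (5 TB/month)": (5, 30 * 24 * 3600),
    "wind turbine (10 TB/day)":     (10, 24 * 3600),
    "jet engine (20 TB/hour)":      (20, 3600),
}

def mb_per_second(terabytes, period_s):
    """Average throughput in MB/s for a volume produced over a period."""
    return terabytes * 1e6 / period_s

for name, (tb, period) in RATES_TB.items():
    print(f"{name}: {mb_per_second(tb, period):,.1f} MB/s")
```

The jet engine works out to more than 5 GB every second—well beyond what a store-first, analyze-later architecture can comfortably absorb.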

Duke Energy built a system to conquer the problem of monitoring and analyzing diagnostics of its “fossil fuel fleet” of generating plants. The old way sent condition monitoring specialists to each site with handheld data collection devices. The company figured that the specialists spent 80 percent of their time merely collecting data while using only 20 percent of their time actually analyzing the data. Implementing a Big Analog Data solution, predictive maintenance specialists in remote centers watch key signatures from equipment and note abnormalities. They can then compare these signatures when necessary to a fault signature database and take corrective action much more quickly.
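The article describes the fault-signature comparison only conceptually. A minimal sketch, assuming each signature can be reduced to a feature vector and matched by nearest neighbor—the fault names, signature values and Euclidean metric below are all invented for illustration:

```python
import math

# Hypothetical fault-signature database: fault name -> feature vector.
FAULT_DB = {
    "bearing wear":     [0.1, 0.8, 0.3, 0.1],
    "shaft misalign":   [0.7, 0.2, 0.1, 0.6],
    "rotor imbalance":  [0.4, 0.4, 0.4, 0.4],
}

def euclidean(a, b):
    """Straight-line distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_fault(measured, db=FAULT_DB):
    """Return the fault whose stored signature is nearest the measurement."""
    return min(db, key=lambda name: euclidean(measured, db[name]))

print(closest_fault([0.15, 0.75, 0.35, 0.1]))
```

A real condition-monitoring system would use richer signatures (vibration spectra, temperatures, trends) and more sophisticated matching, but the principle—compare a live signature against a fault database and act on the closest match—is the same.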

When you delve into the guts of a buzzword, sometimes you find a solution to some intractable problems. So, don't get turned off by all the hype of Big Data. See how you can use it to solve your major engineering problems.

We’ve been discussing the idea of things talking for many years. Now technologies and use cases are finally starting to coalesce into the ‘next big thing’ for improving manufacturing and production.

It will be interesting when things really begin to talk. And listen. And talk back. We’ve been discussing the idea of things talking for many years. Now technologies and use cases are finally starting to coalesce into the “next big thing” for improving manufacturing and production.

You know the technology is finally stable enough when suppliers that are not thought of as specifically “automation” suppliers are embedding connectivity into their products. That, for example, is one of the top goals for OPC UA, according to Tom Burke, president of the OPC Foundation (www.opcfoundation.org). The technology should just be used.

The original conversations were distinctly more limited—as is often the case as technologies are developed. We started with something called “machine to machine” (M2M). This was really just substituting cellular networks as the new broadband for remote SCADA applications. But it was the beginning of the conversation.

>> Can We Talk? Let the work begin on the Internet of Things, M2M, Pervasive Internet—or “Industrial Internet.” http://bit.ly/1cjfum6

Before too long, wireless technology development was blossoming. Wi-Fi (the IEEE 802.11 specifications) had become ubiquitous. Wireless mesh network development, based on IEEE 802.15.4, had progressed beyond the chaotic, posturing stage and settled into two camps of industrial standards—WirelessHART and ISA100. All of these wireless technologies provide a stable connectivity medium with enough options that engineers can design a system suitable for their applications.

What if things talked?
Many years ago, I had a “philosophical” conversation with Peter Martin, vice president at Invensys Operations Management (iom.invensys.com), around the thought, “What if you could easily and inexpensively install sensors in many critical areas of the plant? How much could you know about how the plant is really working?” Thereafter, Invensys began revealing software development designed to exploit information directly from the plant, as well as any other relevant data source, in order to help its customers make better decisions at every level of the organization.

Similar conversations with John Berra, then CEO of Emerson Process Management (www.emersonprocess.com), and Peter Zornio, chief strategic officer of Emerson Process Management, delved into the technologies of wireless sensor networks (WSNs) that would enable these visions of more and better plant operations data. Emerson—along with several other companies—backed the WSN developed by the HART Communication Foundation known as WirelessHART. Others backed the technology espoused by the ISA100 committee (www.isa.org/isa100). These technologies are now being rapidly deployed.

You know that technologies are maturing when companies not known specifically as “controls” companies begin embedding them in products. The SKF Group (www.skf.com/group), which manufactures bearings and related components, has recently done just that. It has released SKF Insight, a bearing with sensors and a WirelessHART transmitter embedded. Now your condition monitoring of rotating equipment can go to an entirely new level.

The basic technologies will never realize their full power until something binds them all together. Recently, I talked with David Friedman, a founder and the CEO of Ayla Networks (www.aylanetworks.com), a company that just emerged from stealth mode with a couple of announcements. I think what the company is doing is significant to manufacturing and production (and a lot more besides).

“We are at the beginning of a major evolutionary step for the Internet,” Friedman says. “We have built a platform that eliminates the hurdles involved in building great connected devices and bringing them to market. We have also created a business model with key partnerships to deliver on this vision.”

The company’s platform seeks to simplify and accelerate product development for manufacturers while enhancing usability from the consumer’s perspective, leading to greater overall satisfaction and lower costs for everyone. The efficiency by which the company can provide its service also minimizes the cost of connectivity.

Users give examples that show the benefits of digital technologies, including reduced scrap through better data, and money savings in operations.

“Are all those technologies real?” he asked.

A professional in the audience during my presentation on digital technologies (networking, fieldbuses, OPC, MES, diagnostics and the like) at the recent Maintenance and Reliability Technology Summit (MARTS) interrupted my presentation with that question. I didn’t mind the interruption. In fact, I prefer them.

But that question really stopped me. Only one person in the room was up to date on the benefits of using Foundation Fieldbus or Profibus PA. Another voiced outdated information that continues to survive: engineering (or someone) had told him it was impossible to extract diagnostic data from all the HART devices they had in the field, and that they could not feed a manufacturing execution system (MES) with data so that his maintenance crew could get more than just alarm data from operations before going out to check on a problem.

This discussion could not have been more timely. There are few better places to gather information about technologies that work than a conference focused on operations management. Only the week before, I attended MESA International’s 2013 North American Conference in Greenville, S.C. Several people who had implemented an MES/MOM solution spoke to the lessons learned and benefits gained from the technology.

>> Cross-Industry Learning: You probably know far more about the leader in your specific industry than you do about the leaders in other manufacturing and production industries. Can you ever become the leader with just that type of view? http://bit.ly/13LC99l

Mike Yost, MESA president, discussed how his first MES project several years ago yielded many benefits that were never documented. Enough benefits were documented to justify the project, but the team never received credit for the rest. Members and practitioners alike need to educate the market, Yost said.

Users speak
I was able to attend two sessions presented by manufacturers who had recently implemented an MES/MOM solution. The first speaker used the Catalyst Workflow application from Savigent Software (www.savigent.com) as part of a continuous improvement program specifically targeted at improving overall equipment effectiveness (OEE) numbers. The second implemented a solution from Rockwell Software (www.rockwellautomation.com/rockwellsoftware), also as part of a continuous improvement effort, saying he was “pursuing previously hidden information” to aid it.

Rather than saying, “What gets measured can get managed,” we should say, “What gets managed gets improved,” according to the first speaker. He had been looking for a platform to automate data collection; the company needed to reduce the need for additional capital while cutting costs and improving OEE. With the new platform, the company was able to track metrics and manage accountability. Increased visibility into manufacturing processes allowed gap analysis to improve manufacturing metrics. Manual data entry had given false OEE readings; better data gave insight that led to a scrap reduction of 6 percent.
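OEE itself is a simple figure: the standard definition multiplies three factors—availability, performance and quality—each expressed as a fraction of the ideal. The sample numbers below are hypothetical; the article does not give the speaker’s actual inputs.

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness: the product of the three
    standard factors, each a fraction between 0 and 1."""
    return availability * performance * quality

# e.g. 90% uptime, 95% of ideal rate, 98% good parts
print(f"OEE = {oee(0.90, 0.95, 0.98):.1%}")
```

Because the factors multiply, a false reading in any one of them (the manual-entry problem the speaker describes) distorts the whole number, which is why automated data collection matters so much here.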

The second speaker, from a major automotive manufacturer, spoke of using a holistic approach to pursue previously hidden knowledge and value. Much of the project involved plant visualization: making line status and problems visible so that no problems stay hidden. When the project began, plants had either no visualization system or only a legacy one, which also meant a lack of data for problem identification.

The program unlocked machine data and added visualization into the process. One of the most important discoveries was the need for a unified data model. Environmental reporting improved greatly with access to more information than ever before. Critical process checks were made reliably and automatically, where previously people wandered around checking instruments and writing down the data. The plant moved five dispatchers to more critical areas, saving money in the operation.

Both of these projects were IT projects done in cooperation with process engineering, showing that it is possible for the two groups to collaborate.

So, it is real. Failure to use all the tools that technology offers is a recipe for a failing plant.

Germany today is awash in conversations regarding an initiative known as “Industry 4.0.” I just returned from the Hannover Fair in Germany. As it was explained to me, Industry 4.0 began as a German government initiative to spur the industrial sector, which is very important to the German economy.

We hear all manner of conjectures about the future of manufacturing. Some worry about the new generation of engineers, managers and technicians. Others worry about corporate management’s lack of understanding of manufacturing and the subsequent devaluation of manufacturing as a strategic corporate resource.

I read “The Second Industrial Divide,” a 1984 book in which the authors (Michael Piore and Charles Sabel) argue that there have been two industrial “divides,” meaning that we were then about to enter a third phase of manufacturing.

The first industrial divide was the Industrial Revolution, in which steam- and water-powered mechanical machines assumed much of the work of individual artisans. After that divide came the second industrial revolution of mass production. The authors argue that a new phase would follow the second great divide, one combining the two—a sort of return to artisans, but within the mass-production model.

Germany today is awash in conversations regarding an initiative known as “Industry 4.0.” I just returned from the Hannover Fair in Germany. As it was explained to me, Industry 4.0 began as a German government initiative to spur the industrial sector, which is very important to the German economy. The government even contracted with what we would call a “think tank” to define the concept.

Siegfried Russwurm, member of the executive committee of Siemens AG (www.siemens.com) and also CEO of Siemens’ Industry group, defined the 4.0 concept during the company’s press conference in Hannover.

Industry 1.0 was based on the introduction of mechanical production equipment driven by water and steam power, he said. Industry 2.0 was based on mass production achieved by division of labor and the use of electrical energy. Industry 3.0 was based on the use of electronics and IT to further automate production, Russwurm said. Industry 4.0 was based on the use of cyber-physical systems.

“Cyber-physical systems” was defined by the think tank contracted by the German government, Russwurm said. Actually, James Truchard, president and CEO of National Instruments (www.ni.com), promulgated this concept in a presentation in 2006. Those familiar with NI know about its founding concept of a “virtual instrument”; that is, doing instrumentation in software. In this case, he was referring to a virtual representation of a manufacturing process in software. Think simulation, for example.

Humans needed
If this sounds like a lot of automation and computerization, it is. But Russwurm, formerly head of human resources for Siemens, responded that humans will always have a place in manufacturing. “Humans conceptualize, design the product and determine production rules and parameters. CPS (or virtual manufacturing) simulates and compares production options on the basis of instructions, then proposes compliant ‘optimal’ production paths. Step 5 is selection of an optimal production path and implementation of product.”

Festo also emphasized Industry 4.0 in its discussion with me in Hannover. As its spokesman explained, “In the era of Integrated Industry, individual workpieces will themselves determine what functions they need production installations to provide. An important milestone on the road towards Integrated Industry is the concept of integrated automation from Festo, based on the automation platform CPX.”

This represents a shift from rigid, centralized factory control systems to decentralized intelligence. “Tasks that are currently still performed by a central master computer will be taken over by components,” predicted Peter Post, head of corporate research & program strategy at Festo. “These will network with one another in an intelligent way, carry out their own configuration with minimal effort and independently meet the varying requirements of production orders.”

The German companies took great pains to explain how their existing products already fit Industry 4.0. Meanwhile, in the United States, an initiative known as the Smart Manufacturing Leadership Coalition (see www.automationworld.com/operations/-administration-supports-advanced-manufacturing) is also working on the future of manufacturing and, in my opinion, is seeking a more comprehensive model. We’ll see what happens.

Even though I had picked up signs of skepticism about cloud computing from end users, the conversations at the ARC World Industry Forum implied that moving to the cloud is no longer controversial.

Enterprise Information Technology (IT) professionals and suppliers always question why I’m interviewing them about using the cloud as part of their IT strategy. As far as they are concerned, it is a “done deal.” Manufacturing IT professionals, manufacturing managers and automation engineers remain skeptical, however, about the security and robustness of storing data in the cloud.

The funny thing is, the cloud has become part of our everyday life. From online banking to enterprise resource planning applications, everyone is probably using some form of cloud-based system—even the skeptics. We are discovering many applications where IT and engineering find the cloud useful for sharing data among geographically dispersed users.

I moderated a webcast on January 31 (www.automationworld.com/cloud-manufacturing-path-success) that is still viewable online. In it, an end user described the system he developed and implemented using cloud storage and a variety of devices to keep managers abreast of the latest production status and enable better, faster decisions.

Rick Moulton, the supervisor for plant systems and automation with Construction Resources Management Inc. (www.jobscrm.com) of Waukesha, Wis., was the third featured speaker. He is in charge of the process control systems and IT systems for all of the asphalt plants owned and operated by Northeast Asphalt, Inc. (www.neasphalt.com) and Payne and Dolan, Inc. (www.payneanddolan.com).

If you have not yet listened to the webcast, I highly recommend it. Rick is an articulate speaker and his message of getting information out to a diverse group of people in the many locations of his company using cloud technologies brings the concept to life. The cloud does not have to be a scary place. It is a useful tool for manufacturers.

Making progress
I just returned from the 17th Annual ARC World Industry Forum in Orlando, Fla., sponsored by the ARC Advisory Group. There were many conversations about the cloud in sessions, in private meetings and in hallway conversations.

Even though I had picked up signs of skepticism about the cloud from end users, the conversations at ARC implied that moving to the cloud is no longer controversial. Many people referred to Amazon S3 (a Web-hosted cloud service of Amazon.com) quite matter-of-factly.

Another conversation, with Chris Brighouse, product manager with documentation solutions provider EMC (www.emc.com), covered the ways EMC has enabled collaboration among the various parts of a manufacturing enterprise using the cloud. In fact, the EMC Documentum Engineering, Plant and Facilities Management (EPFM) OnDemand solution can greatly assist handover from the EPC firm to the owner/operator by placing the P&ID, equipment list and other engineering documentation in the cloud, where operations and maintenance can readily access them at startup.

Innovation dead?
Also at ARC, Invensys Operations Management hosted a lunch with CEO Mike Caliel with opportunity for an informal question-and-answer session. One analyst present stated that there had not been any innovation in the industrial automation space for some time and that “innovation must be done.”

During a private meeting later in the week, Caliel asked what I thought about that remark. “Good God, no,” was my response. Instrumentation keeps getting smarter, offering enhanced diagnostics; wireless sensor networks have yielded more plant information; and the cloud, along with mobility, better analytics and better visualization, has improved decision-making. No, innovation remains alive and well in our market.