The Convergence of Engineering Disciplines in Modern Product Development

Modern systems such as electric cars, unmanned trains and airplanes, smartphones, and healthcare equipment contain a growing number of electronic components and software, delivering functionality to end users that was unimaginable only a few years ago. As complexity increases, design challenges increase exponentially. This complexity is the result of two major factors: the increase in scale (e.g., the number of product functions, computation units, lines of code, and so on) and the fact that any modern product today is in fact a cyber-physical product: its behavior is highly dependent on the interactions between multiple engineering domains such as mechanical, electrical, electronic, and software. The multi-domain, cyber-physical nature of modern complex systems poses great challenges in coordination between the various disciplines. Engineering teams work within complex and heterogeneous environments with unprecedented scales of data that must be continuously analyzed and reported on as the program evolves. The heterogeneous nature of these programs, as well as the typical multi-level supply chains across both hardware and software disciplines, poses even greater challenges for product delivery and quality.

These challenges can be categorized as the data management convergence challenge and the dynamic convergence challenge. Each can be analyzed separately and addressed by different technologies, but only the combination of these solutions provides true end-to-end engineering discipline convergence.

Data Management Convergence: Controlling the design process in current engineering environments requires a degree of coordination, collaboration, and information exchange that is almost impossible with today's technologies. Modern engineering environments are typically characterized by hundreds (and sometimes thousands) of engineering toolsets and data repositories that need to be integrated, by geographically distributed design teams, and by an outsourcing trend that requires managing relationships with distant design teams under different legal and commercial entities.

The systems industry is crying out for new technologies and innovation to resolve these challenges. In answer to this clear and urgent business demand, new technologies inspired by the Linked Data approach (http://linkeddata.org/) are emerging and present great opportunities for improving efficiency, quality, and productivity in the design and development of complex products and systems.

Linked Data, when applied to the engineering domain, is sometimes called Linked Lifecycle Data. This approach moves away from traditional (and painful) integration approaches such as peer-to-peer APIs and a single database schema toward more flexible, federated, and extensible integration principles. While engineering tools that support the Linked Lifecycle Data approach must implement a set of architectural principles inspired by the way the Internet was designed, one could argue that the most important concept distinguishing this approach from other integration approaches is the complete separation between a tool's data and a tool's logic. Just as a consumer of Internet content should not care how a webpage was created in the first place (e.g., what programming model or language was used, what operating system the web server runs, or what data sources were used to generate the content), a systems engineer, mechanical engineer, or software engineer should be able to consume any engineering data regardless of the tool, vendor, or tool version that was used to create it. This concept looks basic and almost trivial, but this simple architectural principle is one of the main reasons the Internet became what it is today: the most scalable, open, and integrated system of systems ever built by mankind.
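To make the data/logic separation concrete, here is a minimal sketch (not a real OSLC client): engineering artifacts from two hypothetical tools are exposed as plain subject/predicate/object triples, and a consumer queries them by shared vocabulary alone. The URIs, property names, and tool descriptions are illustrative assumptions, not any vendor's actual data model.

```python
# Hypothetical linked lifecycle data: each fact is a (subject, predicate,
# object) triple. One triple set, regardless of which tool produced it.
triples = [
    # Produced by a (hypothetical) requirements tool
    ("https://example.org/req/42", "dcterms:title", "Max braking distance"),
    ("https://example.org/req/42", "rdf:type", "Requirement"),
    # Produced by a (hypothetical) mechanical CAD tool
    ("https://example.org/part/brake", "dcterms:title", "Brake assembly"),
    ("https://example.org/part/brake", "satisfies", "https://example.org/req/42"),
]

def query(s=None, p=None, o=None):
    """Match triples by any combination of subject/predicate/object.

    The consumer never calls a vendor API; it only relies on the shared
    vocabulary, so the producing tool's logic is irrelevant to it.
    """
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Which artifacts satisfy requirement 42, whatever tool authored them?
links = query(p="satisfies", o="https://example.org/req/42")
print([s for s, _, _ in links])
```

The point of the sketch is that adding a third tool's data changes nothing in the consumer: it is just more triples in the same open representation.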

So why is this simple-to-grasp concept not used more often in modern engineering environments? The reason is commercial. As opposed to the Internet, a system designed from the start to be open and integrated, engineering tools were traditionally designed as closed systems in which the tool's logic and the data it consumes or produces are a single entity. This approach gives tool providers considerable control and power over their customers, but it is also the main inhibitor of the engineering convergence described above. To answer growing customer demand for open integration, some of these vendors claim to support an "open interface." What this usually means, however, is an open API that lets users invoke tool functions through a proprietary interface. Users are still required to own and know the specific tool or system being used, and to analyze its proprietary data structures.

With the growing adoption of Linked Lifecycle Data, and specifically through standardization efforts such as OSLC, we are seeing early exploitation of engineering convergence. Linked Lifecycle Data enables new types of real-time insight across the product design lifecycle that were not possible before. When data is openly accessible, innovation emerges. The best example of such innovation is Google Maps, a web application that analyzes and presents data created by other systems in an innovative way, one that creates new value for its users. In engineering, with the recent introduction of Rational Engineering Lifecycle Manager (RELM), we are seeing similar evidence. Cross-domain search, structured queries, dynamic views of engineering data, and impact analysis provide significant additional value on top of existing capabilities and can change the way systems engineers perform their day-to-day tasks.
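Impact analysis over linked lifecycle data is, at its core, a traversal of the link graph. The sketch below shows the idea in the spirit of such tools; the artifact names and link structure are illustrative assumptions, not RELM's actual model.

```python
from collections import deque

# Hypothetical link graph: artifact -> artifacts that depend on it.
# The artifacts span requirements, SysML, software, mechanical, and test
# domains -- the cross-domain reach is the whole point.
links = {
    "req:stop-distance": ["sysml:braking-function"],
    "sysml:braking-function": ["sw:abs-controller", "mech:brake-disc"],
    "sw:abs-controller": ["test:hil-brake-suite"],
    "mech:brake-disc": [],
    "test:hil-brake-suite": [],
}

def impact(start):
    """Breadth-first traversal: everything downstream of a changed artifact."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for dep in links.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Changing the stop-distance requirement touches systems, software,
# mechanical, and test artifacts alike -- a cross-domain impact set.
print(sorted(impact("req:stop-distance")))
```

With data locked inside per-tool silos, this traversal would stop at each tool boundary; with linked data, it crosses domains in one query.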

Dynamic Convergence: As mentioned before, modern systems are characterized by the interconnection of cyber and physical aspects. No modern system today is composed of just software, hardware, or mechanical components. The integration of these disciplines into a single operating unit is what makes these new systems smarter, but also much harder to design, analyze, test, and verify. Cross-domain data management convergence is mandatory for such analysis, but it is not sufficient. To address the challenges of these new cyber-physical systems, and especially to reduce the risk of unwanted subsystem interactions discovered late in the development process, new technologies need to be developed. These new engineering approaches should focus on the dynamic aspect of systems: how they behave over time and how their components, whether mechanical, electrical, electronic, software, or any combination of these, interact. Hence, there is a need to analyze the interaction between the dynamics of different engineering domains. These technologies should be more rigorous, built on solid mathematical foundations, and should enable the analysis, optimization, and verification of large, multi-domain complex systems. System requirements will need to be expressed in a more formal, semantic way so they can be used later in the design cycle to provide guarantees of performance and reliability against customer requirements. Architectures must be built in a more modular and extensible way. Last but not least, new techniques, methods, and analytical tools that address the multi-domain nature and differing semantic approaches of the various engineering disciplines must be in place.
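As a toy illustration of what "formalized requirements checked against system dynamics" can mean, the sketch below closes a software control law (cyber) around a first-order physical plant and checks a settling requirement against the simulated trace. The plant model, gain, and requirement numbers are invented for illustration; real cyber-physical verification uses far richer models and tooling.

```python
# Toy cyber-physical loop: software controller + physical plant x' = u,
# integrated with a simple Euler step. All parameters are assumptions.
def simulate(kp=2.0, ref=1.0, dt=0.01, horizon=5.0):
    x, t, trace = 0.0, 0.0, []
    while t < horizon:
        u = kp * (ref - x)   # software control law (cyber side)
        x += dt * u          # Euler step of the plant dynamics (physical side)
        t += dt
        trace.append((t, x))
    return trace

# Requirement, stated formally so it can be *checked*, not just read:
#   |x(t) - ref| < 0.02 for all t >= 3.0 s.
trace = simulate()
settled = all(abs(x - 1.0) < 0.02 for t, x in trace if t >= 3.0)
print("requirement met:", settled)
```

The design point is that the requirement is executable against the coupled dynamics, so an unwanted interaction (say, a gain change that slows settling) is caught by re-running the check rather than discovered late in integration.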

To summarize, with the combination of engineering data management convergence and the new technologies that will enable dynamic convergence, systems engineers will be able to search, access, analyze, simulate, verify, and optimize modern cyber-physical systems. They will be able to predict product performance faster, before the prototyping and manufacturing phases, and to collaborate and share data in a streamlined way across the product lifecycle. As complexity continues to increase, these innovations are critical for the sustainability of our industry.

About the Author

Amit is IBM Rational's Technical Client Relationship Manager for the Systems Industry, in charge of promoting and pushing forward innovative new Systems Engineering solutions in the Aerospace and Defense, Automotive, and Electronics industries. He is also a member of the IBM Industry Academy, the most prestigious IBM industry forum. Prior to joining IBM Software Group, Amit was a senior manager at IBM Research, Haifa, where he worked closely with select IBM clients on developing new approaches for complex systems design and analysis, business optimization, and transformation solutions. Prior to joining IBM, Amit served as an Information Systems engineering officer in the Israeli Air Force.