Continuous Systems Validation - System Simulation Configurations

In the last post, we dove into the progressive means of representing and assessing designs in the mechanical, electrical, and software domains. We found that in each discipline, the design representations changed as development progressed, and that the means by which each representation is assessed progressed right along with them.

Now why is that important?

Simulating Systems by Leveraging Discipline-Specific Assessments

When it comes to simulating systems, you can always create a representative system model and use it to simulate a system's performance. However, and here's the catch, the holistic system model will always lack the fidelity of discipline-specific representations and assessments. Essentially, I'm saying that the system model will never predict the behavior of the mechanical design more accurately than the mechanical model does.

Instead, what some organizations are trying to accomplish is to assemble system models from the disparate discipline-specific models. But two challenges await such organizations.

Connecting Simulations for a Holistic Picture

How do you connect these various design representations and assessments to each other? They need to interact. Here are a few examples to drive the point home.

The logic of the controller should affect the behavior of the mechanical simulation. And the opposite should be true as well. With sensors, the resulting behavior of the mechanical simulation should act as an input for the controller software.
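That two-way interaction can be sketched as a toy co-simulation loop. Everything here is an assumption for illustration, not any particular tool's API: a proportional controller standing in for the software model, and a one-degree-of-freedom mass-damper standing in for the mechanical simulation.

```python
# Toy co-simulation sketch (all models and gains are illustrative assumptions):
# the controller commands a force on the mechanical model, and the mechanical
# model's resulting position feeds back to the controller as a sensor reading.

def controller_step(sensor_position: float, target: float) -> float:
    """Toy proportional controller: command a force based on position error."""
    kp = 2.0
    return kp * (target - sensor_position)

def plant_step(position: float, velocity: float, force: float, dt: float):
    """Toy 1-DOF mechanical model: a damped mass driven by the commanded force."""
    mass, damping = 1.0, 0.5
    accel = (force - damping * velocity) / mass
    velocity += accel * dt
    position += velocity * dt
    return position, velocity

position, velocity, dt = 0.0, 0.0, 0.01
for _ in range(1000):
    # Software logic acts on the mechanical simulation...
    force = controller_step(position, target=1.0)
    # ...and the mechanical behavior acts as the controller's sensor input.
    position, velocity = plant_step(position, velocity, force, dt)

print(f"position after 10 s: {position:.2f}")
```

The point of the sketch is the loop structure, not the physics: each time step, a signal crosses the discipline boundary in both directions, which is exactly the coupling a disconnected pair of simulations cannot capture.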

When the board or processor overheats, sensors communicating with thermal management software should kick a fan on or circulate cooling liquids.
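Here is a minimal sketch of that thermal interaction, assuming made-up temperature thresholds and a toy first-order thermal model; the hysteresis logic stands in for the thermal-management software, and the temperature equation stands in for the board's thermal simulation.

```python
# Hypothetical thermal-management sketch: a simulated board temperature rises
# under processor load, and control logic with hysteresis switches a fan on
# above 80 C and off below 60 C (thresholds and coefficients are assumptions).

def fan_logic(temp_c: float, fan_on: bool) -> bool:
    """Hysteresis control: turn on above 80 C, off below 60 C, else hold state."""
    if temp_c > 80.0:
        return True
    if temp_c < 60.0:
        return False
    return fan_on

temp_c, fan_on, dt = 25.0, False, 1.0
fan_states = []
for _ in range(600):
    heating = 0.5                       # heat added by the processor each step
    cooling = 1.5 if fan_on else 0.2    # the fan sharply improves heat rejection
    temp_c += (heating - cooling * (temp_c - 25.0) / 60.0) * dt
    fan_on = fan_logic(temp_c, fan_on)
    fan_states.append(fan_on)

print(f"final temp: {temp_c:.1f} C, fan on: {fan_on}")
```

Run it and the temperature settles into a limit cycle between the two thresholds, with the fan cycling on and off; that closed-loop behavior only emerges when the sensor, the software, and the thermal model are actually connected.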

And the list of interactions can go on and on and on. At some point before the design gets released, those interactions need to be vetted. And if you're spending big bucks on building prototypes or testing labs, you don't want the first glimpse of true system performance to occur in the test environment. You want it to be a final validation.

Connecting Hardware for a More Complete Picture

Furthermore, it's not just about connecting all the simulations together. As mechanical or electrical hardware becomes available, you want to be able to plug that in and perform tests. That means you will have various combinations of software-in-the-loop and hardware-in-the-loop. Which leads to the most fundamental and important concept in this three part blog series.

Advancing with the Design Progression

Connect these concepts together and you realize that there is a massive and dynamic configuration management problem at the heart of progressively validating system performance. Here, let me show you a few examples to make my point.

Early on in development, you need to connect a 2D sketch of the mechanical design with kinematics and digital calculations to a 2D logic diagram of a board to a UML software model.

In detailed design, you need to connect a 3D model with flexible bodies to a 3D board assembly that has a thermal cooling simulation with partially completed software code.

Close to testing, you want to hook up a mechanical physical prototype to a prototype board and compiled software code.
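The three stages above can be sketched as configuration records. All of the stage names, representation labels, and class names below are hypothetical; the point is that each system simulation is a specific combination of one representation per discipline, and that swapping any one of them changes whether you're running software-in-the-loop or hardware-in-the-loop.

```python
# Illustrative sketch (names are assumptions, not from any real tool): each
# system-simulation configuration records which representation of each
# discipline is currently "in the loop".

from dataclasses import dataclass

@dataclass(frozen=True)
class DisciplineRep:
    name: str          # e.g. "2d-sketch", "prototype-board"
    is_hardware: bool  # True once real hardware replaces the digital model

@dataclass(frozen=True)
class SystemSimConfig:
    stage: str
    mechanical: DisciplineRep
    electrical: DisciplineRep
    software: DisciplineRep

    def loop_type(self) -> str:
        """Hardware-in-the-loop as soon as any discipline plugs in real hardware."""
        reps = (self.mechanical, self.electrical, self.software)
        return ("hardware-in-the-loop" if any(r.is_hardware for r in reps)
                else "software-in-the-loop")

concept = SystemSimConfig(
    "concept",
    DisciplineRep("2d-sketch-with-kinematics", False),
    DisciplineRep("2d-logic-diagram", False),
    DisciplineRep("uml-model", False),
)
near_test = SystemSimConfig(
    "pre-test",
    DisciplineRep("physical-prototype", True),
    DisciplineRep("prototype-board", True),
    DisciplineRep("compiled-code", False),
)

print(concept.loop_type())    # software-in-the-loop
print(near_test.loop_type())  # hardware-in-the-loop
```

Managing which combination is valid at which moment, for every discipline advancing on its own schedule, is the configuration management problem in miniature.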

All that is complex enough, but it gets more difficult. Organizations don't want to simulate system performance at just these three points, but continuously as designs progress from one representation to the next. What does that mean? Well, when the electrical engineering team goes from a 2D logic diagram to a 2D layout of the board, the systems engineering team doesn't wait around for every other discipline to increment its design representation. They want to incorporate that new representation into the early conceptual system simulation model right away.

Summary and Questions

Time to recap.

Organizations are increasingly wanting to continuously simulate system performance instead of doing it incrementally.

Organizations are also wanting to replace digital simulations with real operating hardware when it becomes available.

This combination of trends presents a huge configuration challenge for system simulations, especially as different engineering disciplines incrementally improve the representations and assessments of their designs.

Those are my thoughts. What are yours? How continuously does your organization assess system performance today?