Cloud CAD is probably getting to the point where it becomes a real thing. Autodesk Fusion 360, Onshape, SolidWorks Industrial Design. Cloud CAD is likely to absorb some PDM functionality to make collaboration, branching, revisions and other data management tasks easier. Cloud CAD means no files, so engineers have nothing to mess with… Life is getting more Googley if you read John McEleney's Onshape blog.

However, here is the thing… What if (for some crazy reason, which is easy to imagine when you deal with engineers :)) a customer decides to work with two cloud CAD systems? It is not unusual to see multiple desktop CAD systems in engineering organizations, so why would cloud CAD be any different?

In my blog yesterday – Cloud CAD infrastructure is getting more PDM-ish – I drew a picture of a cloud CAD/PDM bundle helping us collaborate and manage revisions. Now, how will two cloud CAD systems work together? I tried to stretch my cloud imagination and thought about the co-existence of Google Docs and Office 365 services. Actually, it is not a very nice story – I can easily get my files scattered between my Dropbox, Google Drive and OneDrive accounts. So, what if my parts are stored on Google Drive and my assembly on Dropbox? Not sure I will like it…

A similar problem in the PLM world created many debates and issues. Do you remember the Dassault CATIA V6 story, which required an ENOVIA backend to run? It made a few customers unhappy when they discovered they needed to run two PDM/PLM systems. I can see some similarity with the co-existence and interoperability of multiple cloud CAD/PDM bundles.

What is my conclusion? How will engineers collaborate using multiple cloud CAD tools? Cloud technology is great, but it looks like it cannot magically resolve some old fundamental problems of multiple systems, collaboration and interoperability. I wish cloud CAD/PDM vendors would think about it upfront, before customers find themselves in the middle of messy CAD import/export/migrate data scenarios. Just my thoughts…

Data interoperability is one of the most heavily debated topics in the CAD/PLM industry. I remember discussions about data interoperability and standards 10-15 years ago. Even though vendors made some progress in establishing independent data formats, the problem is still here. At the same time, I'm convinced that successful interoperability will play one of the key roles in the future of CAD/PLM. Navigate your browser to my article with a few examples showing how important data interoperability is for building the granular architecture of future applications and collaboration.
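To make the "independent data format" idea concrete, here is a minimal sketch in Python. None of this is a real CAD API – the record shapes and field names are invented for illustration – but it shows the pattern behind neutral formats: each system maps to and from one shared schema instead of mapping to every other system directly.

```python
# Toy sketch (not any real CAD API): two hypothetical native part
# representations exchange data through a neutral intermediate schema.
# All field names here are invented for illustration.

def export_to_neutral(native_part: dict) -> dict:
    """Map a hypothetical System A record to the neutral schema."""
    return {
        "part_number": native_part["pn"],
        "revision": native_part["rev"],
        "units": "mm",
        # Geometry is the hard part in real life; here it is a placeholder.
        "geometry": native_part.get("body"),
    }

def import_from_neutral(neutral: dict) -> dict:
    """Map the neutral schema into a hypothetical System B record."""
    return {
        "PartNo": neutral["part_number"],
        "Rev": neutral["revision"],
        "Body": neutral["geometry"],
    }

# Round trip between the two imaginary systems:
system_a_part = {"pn": "BRK-100", "rev": "B", "body": "<brep data>"}
system_b_part = import_from_neutral(export_to_neutral(system_a_part))
```

The win is in the math: with N systems, a neutral format needs N translators instead of N×(N-1) pairwise ones. The pain is also visible here – anything the neutral schema cannot express (the real geometry, features, design intent) gets lost in translation.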

IoT (Internet of Things) is a relatively new trend. We started to discuss it only recently. Applications of IoT are bringing lots of interesting opportunities in many domains – smart houses, connected devices, infrastructure operations and many others. However, here is the thing – we can see many companies looking at how to get into the IoT field. By nature, this field is very diverse. I can hardly imagine a single manufacturer supplying everything you need for your "smart house". So, we are getting (again) into the problem of interoperability between devices, services and processes.

What will slow rapid adoption of IoT? Standardization, including data standards, wireless protocols and technologies. A wide number of consortiums, standards bodies, associations and government/region policies around the globe are tackling the standards issues. Ironically, with so many entities each working on their own interests, we expect the lack of standards to remain a problem over the next three to five years. In contrast, dropping costs of technology, a larger selection of IoT-capable technology vendors and the ease of experimenting continue to push trials, business cases and implementations of IoT forward.

It made me think about two issues. The first is that the problem of standardization and data interoperability can only be solved by the business interests of vendors. In the absence of mutual business interests, we will see dumb devices that cannot interconnect and exchange data, and the value of IoT solutions will suffer. The second problem is related to PLM vendors consuming data from multiple devices and services to improve decision making. Standardization in that field can provide an advantage and represent a solid business interest for vendors.
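What "dumb devices not exchanging data" means in practice: without a shared schema, every consumer of device data has to carry a vendor-specific adapter. The sketch below uses two imaginary vendors with invented payload formats; only the pattern matters.

```python
# Illustration only: two imaginary smart-home vendors report the same kind
# of temperature reading with incompatible payloads. Every consumer must
# carry adapters like these until a common standard exists.

VENDOR_ADAPTERS = {
    "acme":   lambda p: {"sensor": p["id"], "temp_c": p["t"]},
    "globex": lambda p: {"sensor": p["device"],
                         "temp_c": (p["tempF"] - 32) * 5 / 9},
}

def normalize(vendor: str, payload: dict) -> dict:
    """Translate a vendor payload into one common reading schema."""
    return VENDOR_ADAPTERS[vendor](payload)

readings = [
    normalize("acme",   {"id": "kitchen", "t": 21.0}),
    normalize("globex", {"device": "hall", "tempF": 69.8}),
]
# Both readings now share one shape: {"sensor": ..., "temp_c": ...}
```

Every new vendor means another adapter, and every adapter is a place where data gets dropped or misread – which is exactly why the standards question decides how much value IoT data has downstream, for PLM vendors included.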

What is my conclusion? We can see an entire new industry of IoT under development these days. Data interoperability is a problem that needs to be resolved sooner rather than later. The roots of data interoperability problems are usually related to hidden business interests of vendors. Learning from previous mistakes of the CAD/PLM industry can help. CAD/PLM vendors can provide tools that help manufacturing companies build better connected devices. Just my thoughts…

Data. Conversion. Interoperability. Translation. The discussion about these topics is endless in the CAD/PLM world. Customers are looking for interoperability between different product versions, competitive products, data models, data formats, databases and geometric kernels. Customers have always been the first to be impacted by interoperability problems. The lifecycle of engineering and manufacturing work is longer than the typical lifecycle of a product version or even an engineering IT solution. Technically, data interoperability is a complex problem. It is not easy to solve, even if you want to do so. Evan Yares recently posted an interesting article about interoperability – CAD Interoperability today. Interoperability plays an important role in product lifecycle applications in large OEMs and the supply chain.

Until now, the perception was that customers are the ones most impacted by data interoperability problems. That was true until very recently. However, I can see some new trends and changes in this space. Consumerization, BYOD and cloud trends are introducing new elements into product development roadmaps. CAD/PLM vendors are forced to think about cloud and mobile development, as well as potential disruptive competition coming from newcomers and other vendors. New design applications are becoming more granular, focusing on a specific functionality or target customer. Two examples from recent announcements are Autodesk Fusion 360 and SolidWorks Mechanical Conceptual. These applications were born to co-exist with old products. Existing products won't retire tomorrow. The ability to re-use data with existing product lines such as Inventor (for Autodesk), SolidWorks (for Dassault) and other CAD packages will be vital for the success of the new products. I've been reading the GraphicSpeak article – SolidWorks Mechanical Conceptual introduced but not delivered – earlier today. Randall Newton talks about SolidWorks Mechanical Conceptual (SWMC), announced by SolidWorks during SolidWorks World 2013 in Orlando last week. SWMC is built on top of the Dassault 3DEXPERIENCE platform. I found the following passage interesting:

Reading between the lines, so to speak, of what was said at SolidWorks World, it seems two critical challenges remain before SWMC will be a selling product. It must prove to be fully and seamlessly interoperable with SolidWorks, and it must be more cloud-based. Interoperability has always been a significant challenge in the 3D CAD industry. 3D kernels are complicated. Dassault’s 3D Experience platform uses the CGM 3D kernel; SolidWorks uses the Parasolid 3D kernel from Dassault’s rival Siemens PLM. Completely accurate automated moving of files from Catia V5 and V6 is not commonly possible, and they share the same 3D kernel. Most of us can only imagine the complexity of moving between CGM and Parasolid.

Granularity is one of the most trending topics these days. Everybody is thinking about apps. Companies are moving away from developing heavy and complex product suites towards granular applications. Al Dean of Develop3D wrote an interesting article about granularity a few years ago – Why granularity is going to rock your future… This is my favorite passage:

There are two things that might influence this and push us into further levels of explicit detail and granularity. The first is the ‘cloud’ (yes, I broke my own rules). When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of. Consider what would happen when you start to work on today’s products, in a highly collaborative environment, where data is being passed globally, between teams, between languages, between professional disciplines. And you still need to track data down to this type of level. And when you’re working on a product that looks like an X-ray image.
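The "packets of change" Al Dean describes can be sketched roughly as follows. This is my own toy illustration, not any vendor's data model: instead of versioning whole files, every edit becomes a small, self-describing record that can be exchanged over the network and replayed independently.

```python
# A minimal sketch of the "packets of change" idea: each edit is a small
# record identifying who touched which fine-grained entity, and how.
# All names are invented for illustration.
import itertools

_ids = itertools.count(1)

def change_packet(author: str, entity: str, op: str, data: dict) -> dict:
    """One granular unit of change, small enough to send over the wire."""
    return {
        "id": next(_ids),   # ordering key for replay
        "author": author,
        "entity": entity,   # e.g. a single sketch line, not a whole file
        "op": op,           # "create" | "modify" | "delete"
        "data": data,
    }

def replay(packets: list, model: dict) -> dict:
    """Rebuild model state by applying packets in order."""
    for p in sorted(packets, key=lambda p: p["id"]):
        if p["op"] == "delete":
            model.pop(p["entity"], None)
        else:
            model[p["entity"]] = p["data"]
    return model

log = [
    change_packet("alice", "sketch.line42", "create", {"len": 10}),
    change_packet("bob",   "sketch.line42", "modify", {"len": 12}),
]
state = replay(log, {})
```

Note what the data management system now has to track: not "bracket.sldprt rev B" but individual entities touched by individual people, in order, across teams. That is the gap between file-level PDM and the granularity Al Dean is talking about.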

What is my conclusion? I agree with Al Dean. We will see more granularity in data and new applications. Interoperability becomes a very important factor in the future success of new apps. A new level of data compatibility is required. Vendors will be forced to improve the interoperability of their existing products as well as their new apps. Interesting times, and change is happening these days. Vendors need to take note. Important. Just my thoughts…

I read the Fortune CNN Money blog article by Jon Fortt – Chrysler's Engineering Software Shift. In the competitive world of PLM software, it again raises the question of which is the better choice – Open or Closed? The context of this article is leaked information about Chrysler's move from CATIA to NX or, perhaps more broadly, from DS PLM product lines to Siemens PLM. However, the author made a nice association between the engineering software story and the bigger story of closed-platform strategies at companies such as Apple, Oracle and Cisco. It made me think about how I see the future of Open vs. Closed routes in PLM.

CAD Openness
The debates about the openness of CAD, and later PDM/PLM software, are not big news in the industry. From the very beginning, CAD applications tried to protect themselves by creating proprietary formats to store geometric models and drawings. For a long period of time, and until now, practically all leading CAD vendors have used closed file formats. This created a separate industry of companies working on translators and supporting so-called "interoperability". The discussion about CAD openness and interoperability is probably the longest one I can remember in the industry's history. I'm not sure we'll ever see the end of this story. The current situation clearly reflects the conflict between vendors' business models and user interests. CAD industry veterans predict that the future of CAD (MCAD) will remove this barrier and make CAD products more open. You can take a look at my blog post – CAD Future: How To Liberate Data.

PDM/PLM and CAD Integration
In the beginning, PDM was just about managing metadata about CAD files. It started with revision management and release control. Most PDM systems on the market managed to have a multi-CAD integration strategy by supporting multiple vendors. However, customers were interested in more integrated products. The evolution of PDM products into PLM, including their ability to manage a diverse set of product data and processes, just added more fuel to the development of future PLM platform strategies. It was the time when vendors started to think seriously about how to create completely integrated product suites. Dassault V6 was the first system of its kind to introduce a CAD/PLM bundle.
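The early-PDM core mentioned above – metadata about a CAD file plus revision management and release control – can be sketched in a few lines. This is a deliberately simplified illustration with invented field names, not any real PDM system's model.

```python
# Toy sketch of early PDM: metadata *about* a CAD file, with simple
# revision management and release control. All names are invented.

RELEASE_STATES = ["in-work", "in-review", "released"]

class PdmRecord:
    def __init__(self, part_number: str, cad_file: str):
        self.part_number = part_number
        self.cad_file = cad_file   # PDM stores metadata, not the geometry
        self.revision = "A"
        self.state = "in-work"

    def promote(self):
        """Move the record one step through the release workflow."""
        i = RELEASE_STATES.index(self.state)
        self.state = RELEASE_STATES[min(i + 1, len(RELEASE_STATES) - 1)]

    def revise(self):
        """Start a new revision; only released data can be revised."""
        if self.state != "released":
            raise ValueError("release the current revision first")
        self.revision = chr(ord(self.revision) + 1)  # A -> B -> C ...
        self.state = "in-work"

rec = PdmRecord("BRK-100", "bracket.sldprt")
rec.promote()   # in-work  -> in-review
rec.promote()   # in-review -> released
rec.revise()    # revision "B", back to in-work
```

Notice that nothing here knows anything about geometry or CAD features – which is exactly why the pressure toward integrated CAD/PLM suites (and bundles like Dassault V6) appeared once customers wanted more than metadata management.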

CAD vs. PLM Openness
What is the difference between CAD and PLM openness? In my view, it is an interesting turning point in the overall story of engineering software. In the real world, customers work with a diverse set of tools. In the world of pure CAD, the decision to work with multiple CAD products was hard, but doable. Many customers (especially big ones) have historically worked with multiple CAD products. PLM adds new flavors to the old story of CAD openness and interoperability. This is the place where the world of CAD files ends and companies start to think more about how to manage all their engineering and enterprise data assets.

The Future Is Open?
The most important question is how we can move into a future where data is more open. I think many companies are thinking about how to solve this problem. A future of open data looks very promising. However, the business reality is different, and companies continue to make profits from closed platforms. The following quote from the Fortune article is, in my view, the most important:

When the closed strategy works, it can yield outsize profits — Apple’s recent financial results being a prime example. But it can backfire, too. If a company’s bundle of products doesn’t work together well enough to justify the added cost, customers can get turned off. That’s the danger for Dassault. Joe Barkai, analyst at IDC Manufacturing Insights, says that in this age of consolidation, automakers are more likely to be looking for flexible design systems that can easily share data with a new partner or supplier.

What is my conclusion? I think we are going to see more and more stories related to the development of Open strategies. There are clearly two possible options: 1/ to create excellent integrated product suites and sell them to customers (i.e. the Apple story) or 2/ to develop open strategies. My take – I think the Open game is hard. However, the prize can be big. Just my thoughts…