The F-35 is about to get a lot smarter

War in the 21st century runs on data, a lot of it in the case of the F-35 Joint Strike Fighter. The Mission Data Files that inform F-35 deployments and missions can take up to 18 months to compile, bringing in info on everything from enemy radar and anti-aircraft missiles to waveforms and cyber weapons. Now the Pentagon has hired a California company to shrink that compilation time to just one month, using artificial intelligence.

The company, C3, sees itself as a sort of AI tailor, stitching together different methodologies, from simple machine learning to more sophisticated deep learning, and combining heterogeneous forms of data that don’t play well together, from images to diagnostic readings to text, into products tailored to the problem. Some might lean more heavily on deep learning, some on machine learning; where labeled data is required, the company works to accelerate the laborious task of data labeling.

They’ve been quietly doing business with the Defense Department for 15 months, after an initial outreach from the Defense Innovation Unit. Already they’re involved in nine projects, mostly related to predictive maintenance for aircraft such as the E-3 Sentry AWACS, the C-5 Galaxy, the F-16, and soon, the F-35, predicting when a part or computer system might fail on the basis of weather, deployment, mission, the age and condition of its components, and so forth.

Of course, the F-35 already has an onboard diagnostic system, the Autonomic Logistics Information System, or ALIS. Nikhil Krishnan, C3’s vice president for products, said their software won’t replace ALIS, or anything that Lockheed Martin or its F-35 subcontractors have already built. Instead, it aims to combine information from those sources to create a better, fuller picture of what’s going on with the plane.

Besides ALIS, Krishnan said, C3’s software will devour “operational data, sorties, it could include weather, the history of the part, was there repair work done on it before? We’re really on a higher level than any of these subsystems, including ALIS.” The hope is to be able to pre-position parts and maintainers to make fast repairs or modifications not only in response to what the plane has been through but also, perhaps, to what it’s about to go through.
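To make the idea concrete, here is a minimal sketch of what such a predictive-maintenance score might look like. Nothing here comes from C3 or Lockheed Martin: the feature names, weights, and threshold are invented for illustration; a real system would learn its weights from labeled maintenance records.

```python
from dataclasses import dataclass

@dataclass
class PartSnapshot:
    """Hypothetical aggregated view of one part's history and context."""
    flight_hours: float         # hours since the part was installed
    prior_repairs: int          # times the part was repaired before
    harsh_weather_sorties: int  # sorties flown in icing, sand, or salt air
    days_since_inspection: int

def failure_risk(p: PartSnapshot) -> float:
    """Toy linear risk score, capped at 1.0. Weights are made up;
    a trained model would replace this hand-tuned formula."""
    score = (0.002 * p.flight_hours
             + 0.10 * p.prior_repairs
             + 0.05 * p.harsh_weather_sorties
             + 0.01 * p.days_since_inspection)
    return min(score, 1.0)

def needs_prepositioned_spare(p: PartSnapshot, threshold: float = 0.5) -> bool:
    # Flag parts whose risk crosses the threshold, so spares and
    # maintainers can be staged before the failure actually occurs.
    return failure_risk(p) >= threshold
```

The point of the sketch is the shape of the problem, not the arithmetic: heterogeneous history and context get folded into one number that drives a logistics decision.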


All that is separate from C3’s work on Mission Data File optimization, which is set to complete development next summer. The file serves as a sort of threat library. “It’s the data on board that proactively notifies the pilot of the aircraft of upcoming threats. The problem today is that it takes way too long to actually generate that Mission Data File. We can apply the data aggregation capabilities that C3 has and AI to make that process an order of magnitude faster so the data are more current,” said Edward Abbo, C3’s President and CTO.

The process today is heavily manual, largely because the data sources and types are so diverse. “The analyst today would have to go data source by data source and then, within data source, data field by data field, looking, for instance, to see if this database here has this field for an object in the theatre,” explained Krishnan. Much of the data is highly unstructured, such as free-text comments that software doesn’t handle well. The hope is to automate the process of looking through sources and present the operator with a list of problems, such as potential discrepancies in the intelligence, along with recommendations.
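The field-by-field comparison Krishnan describes can be sketched in a few lines. This is a hypothetical illustration, not C3's software: the record layout and the example field name are invented, and real sources would need schema mapping before any comparison is possible.

```python
# Compare the same field for the same object across two intelligence
# sources and surface disagreements for an analyst to review.

def find_discrepancies(source_a: dict, source_b: dict, field: str) -> list:
    """Return (object_id, value_a, value_b) triples where the two
    sources disagree on `field` for an object both sources know."""
    issues = []
    for obj_id, rec_a in source_a.items():
        rec_b = source_b.get(obj_id)
        if rec_b is None:
            continue  # object only known to one source
        if rec_a.get(field) != rec_b.get(field):
            issues.append((obj_id, rec_a.get(field), rec_b.get(field)))
    return issues
```

An automated pass like this turns the analyst's job from hunting for mismatches into adjudicating a short list of flagged ones.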

The company is also developing a new AI-based tool for gathering intelligence on potential targets. It’s similar to what Google was doing for the Pentagon under Project Maven, but with a boost. Whereas the focus of Maven was applying AI to recognizing objects in images, the new project, in development, would integrate a variety of data from diverse sources to construct a fuller picture, similar to the way the brain works to combine sensory input with lived experience and intuition in order to create an understanding of what’s going on.

“Let’s say you’re looking for a Toyota Corolla on the freeway [and] you have streaming video. We’re doing two things, analyzing the video for object identification and classification and then the second is contextualizing that information,” said Abbo. “Was a Toyota Corolla spotted by someone else five minutes ago?… That information can be added to the fact that you just spotted that Toyota Corolla now, and you can determine it’s the same one based on speed or other factors.” The objective is predictive battlefield tracking. Right now, battlefield tracking is “a very isolated set of observations if you will. But if you could aggregate those observations together you would have a much better sense of predicting where someone might be going.”
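The "same one based on speed" test Abbo mentions amounts to a plausibility check: could one vehicle have covered the distance between the two sightings in the elapsed time? A minimal sketch, with made-up coordinates and a made-up highway speed cap:

```python
import math

def could_be_same_vehicle(pos_then, pos_now, minutes_elapsed: float,
                          max_speed_kmh: float = 130.0) -> bool:
    """pos_* are (x_km, y_km) map coordinates. Returns True when the
    distance between the two sightings is reachable within the elapsed
    time at or below max_speed_kmh. The cap is an illustrative guess."""
    dx = pos_now[0] - pos_then[0]
    dy = pos_now[1] - pos_then[1]
    distance_km = math.hypot(dx, dy)            # straight-line distance
    implied_speed = distance_km / (minutes_elapsed / 60.0)
    return implied_speed <= max_speed_kmh
```

A real tracker would fuse many such constraints probabilistically rather than apply a single hard threshold, but the kernel of "aggregate observations to link and predict" is the same.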

Abbo wouldn’t say what branch or service of the military hired C3 for the intelligence project, but the effort bears a lot of resemblance to what Google and others were doing with the special operations community.

Of course, what the company can’t solve are the two bigger problems confronting officials looking to more fully embrace AI for defense and intelligence. The first is explainability: neural networks often outperform simpler machine learning methods but are harder to account for in legal and policy terms. The second is data integrity, where bad data can throw off good models and skew results, a life-or-death matter in the case of military intelligence.