
DARPA planning AI system to predict world events

The Defense Advanced Research Projects Agency (DARPA) wants to create an artificial intelligence that sifts the media for early signals of potentially impactful events, such as terrorist attacks, financial crises or cold wars.

The system is called KAIROS: Knowledge-directed Artificial Intelligence Reasoning Over Schemas. Schemas are small stories made up of linked events that people use to make sense of the world. For example, the "buying a gift" schema involves entering a shop, browsing for an item, selecting the item, experiencing pangs of self-doubt, bringing it to the till, paying for it, then leaving the shop.
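The linked-event idea can be sketched as a very small data structure: a schema as an ordered list of event labels, matched as a subsequence against an observed stream of events. This is purely illustrative; DARPA has not published how KAIROS will actually represent schemas, and all names below are invented for the example.

```python
# Hypothetical representation of a schema: an ordered list of event
# labels. All names are illustrative, not taken from DARPA's programme.
GIFT_SCHEMA = [
    "enter_shop",
    "browse",
    "select_item",
    "bring_to_till",
    "pay",
    "leave_shop",
]

def matches_schema(observed, schema):
    """Return True if the schema's events occur in order within the
    observed sequence (unrelated events may appear in between)."""
    it = iter(observed)
    # 'step in it' advances the iterator, so order is enforced.
    return all(step in it for step in schema)

# An observed sequence with an extra, unrelated event still matches:
observed = ["enter_shop", "browse", "self_doubt", "select_item",
            "bring_to_till", "pay", "leave_shop"]
print(matches_schema(observed, GIFT_SCHEMA))  # True
```

A real system would need to recognise these events in raw text first; the hard part of KAIROS is that recognition step, not the matching sketched here.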

KAIROS will begin by ingesting massive amounts of data so it can build a library of basic schemas. Once it has compiled a set of schemas about the world, the system will try to use them to extract narratives about complex real-world events.

According to the agency, KAIROS "aims to develop a semi-automated system capable of identifying and drawing correlations between seemingly unrelated events or data, helping to inform or create broad narratives about the world around us."

So what?

Artificial intelligence technologies are not yet advanced enough to tease out complex narratives from the deluge of media produced every day, and DARPA does not yet know exactly how to build KAIROS either; the agency is currently soliciting proposals. If such a system can be built, it is likely to be used for military and defence purposes – at least initially.

Artificially intelligent systems able to predict world events could one day be used by humanitarian organisations to plan for or even prevent catastrophes.

At the moment, KAIROS is a purely theoretical project that may not even be feasible, at least in a reasonable timeframe.