As a foreign correspondent for a large news agency, I know how important it is to improve the flow of meaningful news and information to clients’ newsrooms. That task is increasingly challenging as social media (especially Twitter) and online organisations deliver faster, and sometimes more reliable, news than traditional wire services.

For news agencies to stay relevant – and even foster a revolution in journalism – I believe they must embrace the concept of Structured Journalism.

Structured Journalism is a new way of organizing, linking, indexing and retrieving information relevant to journalists and their audiences.

Google applies the concepts of structured data and linked data to allow machines to understand the relation between pieces of information. This represents the first stage of a faster global knowledge network, and will eventually pave the way towards a world where Artificial Intelligence can answer complex questions and transmit meaningful information, even before we ask.

Well-structured journalism focuses on the classification of every dataset and every publicly available source a reporter uses for a given story. This includes defining the origin and relative value of each source, and the relation between the story as a whole and other available content.

Through a Structured Journalism approach, every quote included in an article will automatically be linked to the person who said it.

Date references like “this Thursday” will be stored in a standardised format so that the information remains useful to readers and machines much later (instead of ‘this Thursday’, the actual date is stored, ie Thursday 27 August 2015).
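As a minimal sketch of that idea, the snippet below resolves a relative reference like “this Thursday” against the story’s filing date and stores the result in ISO 8601 form. The function name and the filing date are illustrative assumptions, not part of any real wire-service system.

```python
from datetime import date, timedelta

WEEKDAYS = ["monday", "tuesday", "wednesday", "thursday", "friday",
            "saturday", "sunday"]

def resolve_this_weekday(reference: str, filed_on: date) -> str:
    """Map 'this <weekday>' to an absolute ISO 8601 date in the filing week."""
    # Hypothetical helper: extract the weekday name from the phrase.
    target = WEEKDAYS.index(reference.lower().removeprefix("this ").strip())
    delta = target - filed_on.weekday()
    return (filed_on + timedelta(days=delta)).isoformat()

# Filed on Tuesday 25 August 2015, "this Thursday" means 27 August 2015.
print(resolve_this_weekday("this Thursday", date(2015, 8, 25)))  # 2015-08-27
```

Storing the absolute date rather than the phrase is what keeps the reference meaningful to readers and machines years later.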

Any reports cited will be linked to the article; datasets will be related to figures in the article; and notes and other references will also be stored and shared.
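The elements above can be pictured as one story record broken into typed, linkable parts. The sketch below uses invented field names and content purely for illustration; it is not a real news-agency schema.

```python
# A minimal sketch of one story stored as structured components rather than
# a flat block of text. All names, figures and URLs here are hypothetical.
story = {
    "headline": "Unemployment falls for third straight month",
    "quotes": [
        {
            # Each quote is linked to the person who said it.
            "text": "The labour market is finally turning a corner.",
            "speaker": {"name": "Jane Smith", "role": "Chief Economist"},
        }
    ],
    # Absolute date in ISO 8601 form, never a phrase like "this Thursday".
    "published": "2015-08-27",
    # Cited reports and datasets become explicit, machine-readable references.
    "sources": [
        {"type": "report", "title": "Monthly Labour Survey",
         "url": "https://example.org/report.pdf"},
        {"type": "dataset", "title": "Unemployment figures, 2010-2015",
         "format": "csv"},
    ],
}

print(story["quotes"][0]["speaker"]["name"])
print([s["type"] for s in story["sources"]])
```

Once quotes, dates and sources exist as discrete fields, linking them to other stories, people and datasets becomes a database operation rather than a text search.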

But there’s more to this approach. Computers will be able to access and make sense of all this data we are storing and classifying in the process of writing a story.

The concept of Linked Data, and the creation of tools to connect every piece of information or metadata (CSV files, geolocation, social impact, etc) to a wider context, can help redefine the field of journalism.

The story as a whole can be related automatically to other stories when they share common elements – a richer link than keywords or other simple taxonomies alone. This would create an extremely useful body of knowledge that can be interpreted by machines and navigated by humans.
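A toy sketch of that automatic linking: stories are related through the structured elements (people, organisations, datasets) they share, rather than through loose keywords. The entity sets and the threshold of two shared elements are invented for illustration.

```python
def shared_elements(a: dict, b: dict) -> set:
    """Return the structured elements two stories have in common."""
    return a["entities"] & b["entities"]

# Hypothetical stories, each carrying a set of identified entities.
story_a = {"id": "a1", "entities": {"ECB", "Mario Draghi", "eurozone-cpi-2015"}}
story_b = {"id": "b2", "entities": {"ECB", "eurozone-cpi-2015", "IMF"}}

common = shared_elements(story_a, story_b)
# Link the stories automatically when they share enough elements.
related = len(common) >= 2
print(sorted(common))  # ['ECB', 'eurozone-cpi-2015']
print(related)         # True
```

Because the shared elements are real entities rather than strings, the same mechanism can also surface *why* two stories are related, not just that they are.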

Developments in this direction could also be used to create visualisations in seconds, automate fact-checking processes, predict which stories are most useful to readers, analyse audiences, accelerate investigative journalism and, in a not-very-distant future, even allow robots to write stories, freeing journalists from the most tedious tasks.

Imagine a reporter writing her story in a content management system that is integrated with the news agency’s databases. As she writes, the agency’s software pulls in recommendations of data she might want to check and social media comments related to her topic, or performs automatic fact-checking – for example, correcting unemployment figures or offering the latest numbers.
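The fact-checking step in that scenario could work roughly as sketched below: figures in the draft are matched against the agency’s database and mismatches are flagged. The database, figures and function name are all hypothetical assumptions for illustration.

```python
import re

# Hypothetical stand-in for the agency's database of latest figures.
LATEST_FIGURES = {"unemployment_rate": 5.1}  # percent

def check_unemployment(draft: str) -> list:
    """Flag unemployment percentages in a draft that differ from the latest figure."""
    warnings = []
    for match in re.finditer(r"unemployment .*?(\d+(?:\.\d+)?)\s*%", draft):
        quoted = float(match.group(1))
        latest = LATEST_FIGURES["unemployment_rate"]
        if quoted != latest:
            warnings.append(f"Draft says {quoted}%, latest figure is {latest}%")
    return warnings

print(check_unemployment("The unemployment rate stands at 5.4%."))
```

A real system would of course draw on live statistical feeds and far more robust entity extraction; the point is that structured figures make this kind of check mechanical.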

This goal can be reached in two simultaneous phases:

Structuring the data:

Newswire content should not be an impenetrable brick wall: slabs of text coupled with an image, a video or hyperlinks are not enough. Last month, the first Manifesto for Structured Journalism by the BBC News Lab summed up this idea in a few words: “There is a wealth of knowledge created during the ‘gathering and assessing’ phases of reporting that most publishing systems ignore.”

For that reason it is necessary to break the current wall of a news piece down into its components, its bricks: text, names, organisations, PDF sources, datasets, references, etc. Storing and classifying these creates deeper and more replicable news. Applying the five-star Open Data approach and an academic rigour to content creation is essential.

Linking the data:

The concepts of linked data and the semantic web can offer a new way of articulating the vast amount of information sifted, curated, vetted and produced by news agencies. Journalists, better than anyone, can determine the relationships between those different bricks and pieces of information. Creating a detailed, multi-layered map of information will make news agencies’ databases (their most important asset after their reporters) many times more valuable, actionable and useful, not only for other media outlets but also for corporations, independent agencies, governments and researchers.