Combining the information present in bathymetric and imagery-based products is a key requirement for any modern shipwreck-detection approach in areas where optical means cannot be adopted. If the data sources and the processing involved are correctly weighted in a fusion algorithm, the detection task can be extended beyond a simple binary (presence/absence) decision to provide a meaningful metric of confidence in the presence of new features. In combination with other existing information (e.g., Electronic Navigational Charts, ENCs), this metric can become a proxy for areas with a high probability of change (shipwrecks to be either added or removed) with respect to the baseline knowledge of the area. The dual, and partially contradictory, goals of such a system are to highlight areas with a high probability of change, and to use the existing nautical documentation as a spatial filter that limits resource consumption on known features. Striking an appropriate balance between the two is a central challenge.
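To make the idea concrete, the following minimal sketch shows one way such a fusion could work: per-cell detector scores from the two modalities are combined with fixed weights, and the fused score is turned into a change-confidence value by comparing it with the charted (ENC) baseline. The function names, weights, and normalization are illustrative assumptions, not the method described here.

```python
# Illustrative sketch (assumed names and weights, not the actual algorithm):
# fuse normalized detector scores, then score the likelihood of a change
# with respect to the charted baseline.

def fuse_confidence(bathy_score, imagery_score, w_bathy=0.6, w_imagery=0.4):
    """Weighted combination of normalized detector scores in [0, 1]."""
    return w_bathy * bathy_score + w_imagery * imagery_score

def change_confidence(fused_score, charted):
    """Confidence that a cell differs from baseline knowledge:
    a high fused score on an uncharted cell suggests a feature to add;
    a low fused score on a charted cell suggests a feature to remove."""
    return fused_score if not charted else 1.0 - fused_score
```

Under this toy scheme, a strong fused detection over an uncharted cell and a weak fused detection over a charted wreck both yield high change confidence, which is exactly the "proxy for areas with high probability of change" described above.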

Based on these considerations, we describe an approach for combining the results of different target-detection algorithms and for comparing those results with existing ENCs and geographic databases. The main goal is to help the analyst focus on specific areas (those with a higher likelihood of new features), prioritize them using safety-of-navigation criteria, and reduce the common pitfall of subjectivity in the processing workflow. Although mainly aimed at reducing the time required to bring survey data onto the chart, the approach is also well suited to other scenarios, such as rapid response after hurricanes.
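The prioritization step might be sketched as a simple ranking of candidate features, where shallower least depths (more navigationally significant) and higher detection confidence both raise the review priority. The field names, depth normalization, and weights below are illustrative assumptions, not the prototype's actual criteria.

```python
# Illustrative sketch: rank candidate features for analyst review by a
# simple safety-of-navigation score. All fields and weights are assumed.

def review_priority(candidate):
    """Higher score = review sooner. Shallow features threaten
    navigation most, so depth risk dominates detection confidence."""
    depth_risk = max(0.0, 1.0 - candidate["least_depth_m"] / 40.0)
    return 0.7 * depth_risk + 0.3 * candidate["confidence"]

candidates = [
    {"id": "A", "least_depth_m": 35.0, "confidence": 0.9},
    {"id": "B", "least_depth_m": 8.0, "confidence": 0.6},
]
ranked = sorted(candidates, key=review_priority, reverse=True)
# "B" (shallow, navigationally significant) outranks "A" despite
# its lower detection confidence.
```

The point of such a ranking is to replace an analyst's ad hoc ordering with an explicit, repeatable criterion, which is how subjectivity in the workflow is reduced.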

These concepts are tested and demonstrated by an application prototype that uses real data products, existing nautical documentation, and publicly available geospatial services to support analyst decisions. The application also supports a schema-based mechanism for consistent data exchange and content validation.
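A schema-based exchange check of the kind mentioned above could, in its simplest form, verify that each candidate-feature record carries the fields and types downstream tools expect. The schema content below is an illustrative assumption (the prototype's actual schema is not shown here).

```python
# Illustrative sketch: validate candidate-feature records against a
# minimal field/type schema before exchange. Schema content is assumed.

SCHEMA = {"id": str, "lat": float, "lon": float, "confidence": float}

def validate_record(record, schema=SCHEMA):
    """Return a list of validation errors; an empty list means valid."""
    errors = [f"missing field: {k}" for k in schema if k not in record]
    errors += [
        f"wrong type for {k}: expected {t.__name__}"
        for k, t in schema.items()
        if k in record and not isinstance(record[k], t)
    ]
    return errors

ok_record = {"id": "W-001", "lat": 43.1, "lon": -70.7, "confidence": 0.82}
bad_record = {"id": "W-002", "lat": "43.1"}  # lon/confidence missing, lat mistyped
```

In practice a standard schema language (e.g., JSON Schema or an XML schema) would replace this hand-rolled check, but the principle is the same: malformed content is rejected at the exchange boundary rather than discovered later in the charting workflow.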