Abstract: Publication date: March 2018 Source:Computers & Geosciences, Volume 112 Author(s): Shaoming Pan, Lian Xiong, Zhengquan Xu, Yanwen Chong, Qingxiang Meng Replication is one of the effective solutions for meeting service response-time requirements: by preparing data in advance, it avoids the delay of reading data from disks. This paper presents a new method for creating copies that considers the selection of the replica set, the number of copies for each replica, and the placement of all copies. First, the popularities of all data are computed, considering both the historical access records and the timeliness of those records. The replica set is then selected based on recent popularities, and an enhanced Q-value scheme is proposed to assign the number of copies for each replica. Finally, a placement strategy is designed to meet the requirement of load balance. In addition, we present several experiments that compare the proposed method with other replication management strategies. The results show that the proposed model outperforms the other algorithms in all respects, and experiments with different parameters further demonstrate the effectiveness and adaptability of the proposed algorithm.
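The popularity computation the abstract sketches (historical access records weighted by their timeliness) can be illustrated with a simple exponential-decay score. This is only a sketch of the idea; the half-life parameter and the data names below are hypothetical, not taken from the paper:

```python
import math

def popularity(access_times, now, half_life=7.0):
    """Time-decayed popularity: recent accesses weigh more.

    access_times: access timestamps (days); now: current time (days).
    half_life: days for an access's weight to halve (hypothetical parameter).
    """
    decay = math.log(2) / half_life
    return sum(math.exp(-decay * (now - t)) for t in access_times)

# Rank data objects by recent popularity and keep the hottest as the replica set.
records = {"tile_a": [1, 5, 9, 10], "tile_b": [0, 1, 2], "tile_c": [9, 10, 10]}
scores = {k: popularity(v, now=10.0) for k, v in records.items()}
replica_set = sorted(scores, key=scores.get, reverse=True)[:2]
```

Note how "tile_b", despite having three accesses like "tile_c", scores far lower because its accesses are old: the timeliness weighting, not the raw count, drives selection.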

Abstract: Publication date: March 2018 Source:Computers & Geosciences, Volume 112 Author(s): Sait I. Ozkaya Fracture corridors are interconnected large fractures in a narrow, sub-vertical, tabular array; they usually traverse the entire reservoir vertically and extend for several hundreds of meters laterally. With their huge conductivities, fracture corridors constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages with layering and editing capabilities can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox that enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and to export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and of exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual-porosity simulation, or used for field development and well planning.

Abstract: Publication date: March 2018 Source:Computers & Geosciences, Volume 112 Author(s): Chao Zhou, Kunlong Yin, Ying Cao, Bayes Ahmed, Yuanyao Li, Filippo Catani, Hamid Reza Pourghasemi Landslides are a common natural hazard, responsible for extensive damage and losses in mountainous areas. In this study, Longju in the Three Gorges Reservoir area in China was taken as a case study for landslide susceptibility assessment in order to develop effective risk prevention and mitigation strategies. To begin, 202 landslides were identified, including 95 colluvial landslides and 107 rockfalls. Twelve landslide causal factor maps were prepared initially, and the relationship between these factors and each landslide type was analyzed using the information value model. The unimportant factors were then identified and eliminated using the information gain ratio technique. The landslide locations were randomly divided into two groups: 70% for training and 30% for verification. Two machine learning models, the support vector machine (SVM) and the artificial neural network (ANN), and a multivariate statistical model, logistic regression (LR), were applied to landslide susceptibility modeling (LSM) for each type. The LSM index maps, obtained by combining the assessment results for the two landslide types, were classified into five levels. The performance of the LSMs was evaluated using the receiver operating characteristic curve and the Friedman test. Results show that the elimination of noise-generating factors and the separate modeling of each landslide type significantly increased the prediction accuracy. The machine learning models outperformed the multivariate statistical model, and the SVM model was found to be ideal for the case study area.
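The factor-screening step can be sketched with a bare-bones information gain ratio, the measure the abstract names for eliminating unimportant causal factors. The data are toy values, not the study's factors:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of discrete labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(factor_values, labels):
    """Information gain of `factor_values` about `labels`, normalised by the
    factor's own entropy (split information)."""
    n = len(labels)
    cond = 0.0
    for v in set(factor_values):
        subset = [l for f, l in zip(factor_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    split_info = entropy(factor_values)
    return (entropy(labels) - cond) / split_info if split_info > 0 else 0.0

# Toy screening: a factor that mirrors the labels scores high, noise scores low.
labels      = [1, 1, 1, 0, 0, 0]          # landslide / no landslide
informative = ["steep", "steep", "steep", "flat", "flat", "flat"]
noise       = ["a", "b", "a", "b", "a", "b"]
```

Factors whose gain ratio falls below a chosen threshold would be dropped before training the SVM, ANN and LR models.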

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Jingyin Tang, Corene J. Matyas Big Data in geospatial technology poses a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to providing feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in Cloud Computing and HPC environments with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
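The decoupled code-generation pattern described here can be illustrated generically. This is only the pattern, not arc4nix's actual API: `build_call`, `remote_exec` and the toy `buffer_distance` function are made up for the sketch, and a dictionary namespace stands in for the remote interpreter:

```python
def build_call(func_name, *args, **kwargs):
    """Generate Python source text for a single function call (meta-programming):
    the client never imports the geospatial library itself."""
    parts = [repr(a) for a in args] + [f"{k}={v!r}" for k, v in kwargs.items()]
    return f"result = {func_name}({', '.join(parts)})"

def remote_exec(source, server_env):
    """Stand-in for shipping `source` to the remote interpreter and
    retrieving the bound result."""
    exec(source, server_env)
    return server_env["result"]

# The "server" exposes its own functions; the client only builds code text.
server_env = {"buffer_distance": lambda x, d: x + d}
code = build_call("buffer_distance", 10.0, d=2.5)
answer = remote_exec(code, server_env)
```

In a real deployment the generated source would travel over a socket or job queue to the Windows-side ArcGIS process, which is the part this sketch deliberately omits.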

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Magdalena Oryaëlle Chevrel, Jérémie Labroquère, Andrew J.L. Harris, Scott K. Rowland Lava flow advance can be modeled by tracking the evolution of the thermo-rheological properties of a control volume of lava as it cools and crystallizes. An example of such a model was conceived by Harris and Rowland (2001), who developed a 1-D model, FLOWGO, in which the velocity of a control volume flowing down a channel depends on rheological properties computed along the thermal path estimated via a heat balance box model. We provide here an updated version of FLOWGO written in Python, an open-source, modern and flexible language. Our software, named PyFLOWGO, allows selection of the heat fluxes and rheological models of the user's choice to simulate the thermo-rheological evolution of the lava control volume. We describe its architecture, which offers more flexibility while reducing the risk of errors when changing models compared with the previous FLOWGO version. Three cases are tested using actual data from channel-fed lava flow systems, and the results are discussed in terms of model validation and convergence. PyFLOWGO is open-source and packaged as a Python library that can be imported and reused in any Python program (https://github.com/pyflowgo/pyflowgo).
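The core loop of such a model marches a control volume downstream, updating temperature and velocity together. The sketch below is a heavily simplified stand-in, assuming a Newtonian Jeffreys velocity, radiative cooling as the only heat flux, and invented parameter values; PyFLOWGO's actual flux and rheology model set is far richer:

```python
import math

rho, cp, g = 2600.0, 1150.0, 9.81        # density, heat capacity, gravity
eps, sigma = 0.95, 5.67e-8               # emissivity, Stefan-Boltzmann
h, slope = 1.0, math.radians(5.0)        # channel depth (m), channel slope
T, T_stop = 1450.0, 1350.0               # eruption / stopping temperature (K)

def viscosity(T):
    """Illustrative temperature-dependent viscosity law (Pa s)."""
    return 100.0 * math.exp(0.04 * (1450.0 - T))

x, dx, temps = 0.0, 10.0, []
while T > T_stop and x < 2e4:
    # Jeffreys velocity for Newtonian flow in a channel of depth h
    v = rho * g * h**2 * math.sin(slope) / (3.0 * viscosity(T))
    # heat balance reduced to surface radiation only
    dTdx = -eps * sigma * T**4 / (rho * cp * v * h)
    T += dTdx * dx
    x += dx
    temps.append(T)

run_out = x  # distance at which the control volume reaches T_stop
```

As the lava cools, viscosity rises, velocity falls, and the cooling per metre steepens, so the march terminates at a finite run-out distance; this feedback is the essence of the FLOWGO approach.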

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Benoît Dessirier, Chin-Fu Tsang, Auli Niemi Deep crystalline bedrock formations are targeted to host spent nuclear fuel owing to their overall low permeability. They are, however, highly heterogeneous, and a few preferential paths pertaining to a small set of dominant rock fractures usually carry most of the flow or mass fluxes, a behavior known as channeling that needs to be accounted for in the performance assessment of repositories. Channel network models have been developed and used to investigate the effect of channeling. They are usually simpler than discrete fracture networks based on rock fracture mappings and rely on idealized fully or sparsely populated lattices of channels. This study reexamines the fundamental parameter structure required to describe a channel network in terms of groundwater flow and solute transport, leading to an extended description suitable for unstructured, arbitrary networks of channels. An implementation of this formalism in a Python scripting library is presented and released along with this article. A new algebraic multigrid preconditioner delivers a significant speedup in the flow solution step compared to previous channel network codes. 3D visualization is readily available for verification and interpretation by exporting the results to a free and open dedicated software package. The new code is applied to three example cases to verify its results on fully populated uncorrelated lattices of channels and sparsely populated percolation lattices, and to exemplify the use of unstructured networks to accommodate knowledge of local rock fractures.
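On an unstructured network, steady flow reduces to mass balance at the nodes. The sketch below assembles conductances for an arbitrary small network and relaxes the heads with Gauss-Seidel; this stands in for the algebraic multigrid solver the paper uses, and the geometry and values are invented:

```python
# Channels as (node_i, node_j) -> conductance; any topology is allowed.
channels = {("A", "B"): 2.0, ("B", "C"): 1.0, ("B", "D"): 1.0, ("C", "D"): 2.0}
fixed = {"A": 1.0, "D": 0.0}              # prescribed heads at boundary nodes

nodes = {n for pair in channels for n in pair}
neigh = {n: [] for n in nodes}
for (i, j), c in channels.items():
    neigh[i].append((j, c))
    neigh[j].append((i, c))

head = {n: fixed.get(n, 0.5) for n in nodes}
for _ in range(200):                      # Gauss-Seidel sweeps on interior nodes
    for n in nodes - fixed.keys():
        head[n] = (sum(c * head[m] for m, c in neigh[n])
                   / sum(c for _, c in neigh[n]))

# Flow in each channel: conductance times head difference.
flow = {(i, j): c * (head[i] - head[j]) for (i, j), c in channels.items()}
```

For this network the exact interior heads are 6/11 at B and 2/11 at C, and the inflow through A-B equals the total outflow at D, which is the mass balance the solver enforces.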

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Denis Marcotte, Denis Allard Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions naturally occur when associating latent Gaussian fields to category fields obtained by discrete simulation methods like multipoint, sequential indicator and object-based simulation. The latent Gaussians are often used in data assimilation and history matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfactory, as it can diverge and does not reproduce exactly the desired covariance. A better approach is to use Gaussian Markov Random Fields (GMRF), which make it possible to compute the conditional distributions at any point without computing and inverting the full covariance matrix. As the GMRF is locally defined, it allows simultaneous updating of all points that do not share neighbors (coding sets). We propose a new simultaneous Gibbs updating strategy on coding sets that can be efficiently computed by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, the correlation range and the GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than for the finite case studied in the literature. However, in the truncated Gaussian case, we show that short-scale correlation is quickly restored and the conditioning categories at each lattice point imprint the long-scale correlation. Hence our approach makes it possible to realistically apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
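The truncated-Gaussian building block can be sketched in its classic two-variable form. The paper's contribution, simultaneous convolution-based updates on coding sets of a GMRF lattice, is more elaborate; this only shows Gibbs conditionals combined with acceptance/rejection truncation:

```python
import random

def trunc_normal(mu, sigma, lo, hi, rng):
    """Acceptance/rejection sampling of a normal restricted to [lo, hi]."""
    while True:
        x = rng.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

def gibbs_bivariate(rho, lo, hi, n_iter, rng):
    """Gibbs sampling of a correlated standard-Gaussian pair, each component
    truncated to [lo, hi]. The conditional of x1 given x2 is
    N(rho * x2, 1 - rho**2), and symmetrically for x2 given x1."""
    s = (1.0 - rho * rho) ** 0.5
    x1 = x2 = 0.5 * (lo + hi)
    out = []
    for _ in range(n_iter):
        x1 = trunc_normal(rho * x2, s, lo, hi, rng)
        x2 = trunc_normal(rho * x1, s, lo, hi, rng)
        out.append((x1, x2))
    return out

rng = random.Random(42)
draws = gibbs_bivariate(rho=0.8, lo=0.0, hi=3.0, n_iter=4000, rng=rng)
mean1 = sum(x for x, _ in draws) / len(draws)
```

On a lattice, the same conditional-plus-rejection update would be applied to every point of a coding set at once, since none of those points are neighbors of each other.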

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Konstantinos Evangelidis, Theofilos Papadopoulos, Konstantinos Papatheodorou, Paris Mastorokostas, Constantinos Hilas Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities in combination with the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): M. Malovichko, N. Khokhlov, N. Yavich, M. Zhdanov This paper presents a parallel algorithm for frequency-domain acoustic modeling by the method of integral equations (IE), applied to seismic simulation. The IE method reduces the size of the problem but leads to a dense system matrix. Tolerable memory consumption and numerical complexity were achieved by applying an iterative solver, accompanied by an effective matrix-vector multiplication operation based on the fast Fourier transform (FFT). We demonstrate that the IE system matrix is better conditioned than that of the finite-difference (FD) method, and discuss its relation to a specially preconditioned FD matrix. We considered several methods of matrix-vector multiplication for free-space and layered host models. The developed algorithm and computer code were benchmarked against an FD time-domain solution. It was demonstrated that the method can accurately calculate the seismic field for models with sharp material boundaries and with a point source and receiver located close to the free surface. We used OpenMP to speed up the matrix-vector multiplication, while MPI was used to speed up the solution of the system of equations and to parallelize across multiple sources. Practical examples and efficiency tests are presented as well.
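The FFT-based matrix-vector product relies on translation invariance: a circulant (or, after embedding, Toeplitz) matrix applied to a vector becomes an element-wise product in the Fourier domain. A self-contained 1-D sketch with a pure-Python radix-2 FFT; the paper's kernels are 3-D Green's functions, so this shows the idea only:

```python
import cmath

def fft(a, invert=False):
    """Radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return list(a)
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1j if invert else -1j
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2 * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def circulant_matvec(c, x):
    """y = C x for a circulant C with first column c, computed in
    O(n log n) as an element-wise product in the Fourier domain."""
    y = fft([a * b for a, b in zip(fft(c), fft(x))], invert=True)
    return [v.real / len(c) for v in y]

# Check against the O(n^2) definition y[i] = sum_j c[(i - j) % n] * x[j].
c = [4.0, 1.0, 0.0, 1.0, 0.5, 0.0, 0.0, 2.0]
x = [1.0, -1.0, 2.0, 0.0, 3.0, 1.0, 0.0, -2.0]
direct = [sum(c[(i - j) % 8] * x[j] for j in range(8)) for i in range(8)]
fast = circulant_matvec(c, x)
```

Replacing the dense O(n^2) product with this O(n log n) operation is what makes the iterative IE solver tractable despite the dense system matrix.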

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): A.G. Camacho, J. Fernández, F. Cannavò We present a software package to carry out inversions of surface deformation data (any combination of InSAR, GPS, and terrestrial data, e.g., EDM, levelling) as produced by 3D free-geometry extended bodies with anomalous pressure changes. The anomalous structures are described as an aggregation of elementary cells (whose effects are estimated as coming from point sources) in an elastic half space. The linear inverse problem (considering some simple regularization conditions) is solved by means of an exploratory approach. This software represents the open implementation of a previously published methodology (Camacho et al., 2011). It can be freely used with large data sets (e.g. InSAR data sets) or with data coming from small control networks (e.g. GPS monitoring data), mainly in volcanic areas, to estimate the expected pressure bodies representing magmatic intrusions. Here, the software is applied to some real test cases.
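The elementary-cell effect named in the abstract, a point pressure source in an elastic half-space, is classically described by the Mogi source. A sketch of that standard surface-displacement kernel, with illustrative numbers (the package aggregates many such cells; this is just the single-cell forward model):

```python
import math

def mogi(x, y, depth, dV, nu=0.25):
    """Surface displacement (ux, uy, uz) of a Mogi point source of volume
    change dV (m^3) at the given depth (m) under the origin, observed at
    surface point (x, y); nu is Poisson's ratio."""
    r2 = x * x + y * y
    R3 = (r2 + depth * depth) ** 1.5
    k = (1.0 - nu) * dV / math.pi
    return (k * x / R3, k * y / R3, k * depth / R3)

# Uplift directly above a 1e6 m^3 inflation at 2 km depth: about 6 cm.
ux, uy, uz = mogi(0.0, 0.0, depth=2000.0, dV=1.0e6, nu=0.25)
```

The linear inverse problem then amounts to finding the aggregation of such cells whose summed displacements best fit the InSAR, GPS or terrestrial observations.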

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Eric L. Geist, Tom Parsons Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in > 10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
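The synthetic-catalog step, randomly sampled magnitudes following a Gutenberg-Richter relation, is a one-line inverse transform. The b-value, magnitude bounds and moment-magnitude constants below are standard choices, not necessarily the paper's exact settings:

```python
import math
import random

def gr_sample(n, b, m_min, m_max, rng):
    """Inverse-transform sampling of n magnitudes from a truncated
    Gutenberg-Richter distribution, N(>=M) proportional to 10**(-b*M)."""
    span = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return [m_min - math.log10(1.0 - rng.random() * span) / b for _ in range(n)]

def moment(m):
    """Seismic moment (N m) from moment magnitude (Hanks-Kanamori);
    the slip assigned to each event scales from this."""
    return 10.0 ** (1.5 * m + 9.05)

rng = random.Random(1)
catalog = gr_sample(20000, b=1.0, m_min=5.0, m_max=8.0, rng=rng)
frac_small = sum(1 for m in catalog if m < 6.0) / len(catalog)
```

With b = 1, roughly 90% of the events fall below magnitude 6, which is why the binary placement problem over a multi-millennial catalog grows past a million variables.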

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Robert W. Porritt, Meghan S. Miller Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle, or they can be inverted for seismic velocity, either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking, while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services, and changes in the underlying Matlab platform have necessitated a significant revision of the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculation via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In describing the tools, we present their application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of the USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software and at the IRIS seismic software repository, https://seiscode.iris.washington.edu/.
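H-κ stacking works by summing receiver-function amplitudes at the arrival delays predicted for each candidate crustal thickness H and Vp/Vs ratio κ. The delay-time formulas below are the standard ones; the crustal values are merely plausible numbers, not FuncLab defaults:

```python
import math

def hk_delays(H, kappa, vp, p):
    """Predicted receiver-function delays (s) after direct P for the Ps
    conversion and its crustal multiples, for Moho depth H (km), Vp/Vs
    ratio kappa, crustal Vp (km/s) and ray parameter p (s/km)."""
    vs = vp / kappa
    qs = math.sqrt(1.0 / vs**2 - p * p)   # vertical S slowness
    qp = math.sqrt(1.0 / vp**2 - p * p)   # vertical P slowness
    return {"Ps": H * (qs - qp),
            "PpPs": H * (qs + qp),
            "PpSs+PsPs": 2.0 * H * qs}

# Typical continental crust: H = 35 km, kappa = 1.75, Vp = 6.3 km/s.
t = hk_delays(H=35.0, kappa=1.75, vp=6.3, p=0.06)
```

The stack evaluates these three delays on a grid of (H, κ) pairs and picks the pair that maximizes the weighted summed amplitude.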

Abstract: Publication date: Available online 9 December 2017 Source:Computers & Geosciences Author(s): Siya Chen, Tieli Sun, Fengqin Yang, Hongguang Sun, Yu Guan Remote sensing image segmentation is a key technology for processing remote sensing images. The segmentation results can be used for feature extraction, target identification and object description; thus, segmentation directly affects the subsequent processing results. This paper proposes a novel Optimum-Path Forest (OPF) clustering algorithm that can be used for remote sensing segmentation. The method utilizes the principle that cluster centres are characterized by their densities and by their distances to samples with higher densities. Based on this principle, a new probability density function for the OPF clustering algorithm is defined and applied to remote sensing image segmentation. Experiments are conducted on five remote sensing land cover images. The experimental results illustrate that the proposed method can outperform the original OPF approach.
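The selection principle stated here, that a centre has high density and lies far from any sample of higher density, can be sketched directly. The Gaussian kernel, cutoff and toy points are invented for illustration and stand apart from the actual OPF formulation:

```python
import math

def density_peak_scores(points, d_c):
    """Score candidate cluster centres: local density rho times delta,
    the distance to the nearest point of strictly higher density."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    n = len(points)
    rho = [sum(math.exp(-(dist(p, q) / d_c) ** 2) for q in points)
           for p in points]
    scores = []
    for i in range(n):
        higher = [dist(points[i], points[j]) for j in range(n) if rho[j] > rho[i]]
        # The globally densest point takes the largest distance instead.
        delta = min(higher) if higher else max(dist(points[i], q) for q in points)
        scores.append(rho[i] * delta)
    return scores

# Two tight pixel clusters plus an outlier: one centre per cluster wins.
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.12, 5), (5, 5.1), (2.5, 2.5)]
scores = density_peak_scores(pts, d_c=0.5)
centres = sorted(range(len(pts)), key=scores.__getitem__, reverse=True)[:2]
```

Points inside a cluster score low because a denser neighbour sits nearby; the lone outlier scores low because its density is small, leaving exactly one high scorer per cluster.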

Abstract: Publication date: Available online 5 December 2017 Source:Computers & Geosciences Author(s): John A. Barker A model that has been widely applied to fractured rock comprises randomly distributed and oriented plates. Formulae are given for the intersection statistics of infinite systems of such plates of mixed shapes and sizes with lines, planes and each other; the results are expressed in terms of the number density, n, and the average area A and perimeter P of the plates. From Monte-Carlo studies it has been found that, for a mixture of elliptical plates, each of area A and perimeter P, the dimensionless density ρ = A^k P^(3−2k) n with k = 0.774 is approximately invariant at the percolation threshold, with a critical value of about ρ_c = 8.2 ± 0.2 for aspect ratios up to 16. The same result is found to apply to any mixture of convex plate shapes and sizes provided that, for each plate, A and P are replaced by the area and perimeter of an ellipse with the same aspect ratio and product AP. The results should be of particular value in the interpretation of observed fracture statistics and in the construction of discrete fracture network models.
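The invariant can be evaluated directly: given the semi-axes, compute the ellipse's area and perimeter (Ramanujan's approximation is accurate enough here) and form ρ; inverting ρ_c then gives the critical number density. The numbers below are illustrative:

```python
import math

def ellipse_props(a, b):
    """Area and perimeter (Ramanujan's approximation) of an ellipse
    with semi-axes a >= b."""
    area = math.pi * a * b
    h = ((a - b) / (a + b)) ** 2
    perim = math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))
    return area, perim

def dimensionless_density(a, b, n, k=0.774):
    """rho = A**k * P**(3 - 2k) * n; the critical value at the percolation
    threshold is about rho_c = 8.2."""
    A, P = ellipse_props(a, b)
    return A**k * P ** (3.0 - 2.0 * k) * n

# Number density of 2:1 ellipses (semi-axes 2 m and 1 m) needed to percolate.
rho_c = 8.2
A, P = ellipse_props(2.0, 1.0)
n_crit = rho_c / (A**0.774 * P ** (3.0 - 2 * 0.774))
```

For a non-elliptical convex plate, one would first replace A and P by those of the ellipse with the same aspect ratio and product AP, as the abstract prescribes.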

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Ramendra Sahoo, Vikrant Jain Drainage network pattern and its associated morphometric ratios are important planform attributes of a drainage basin. These attributes are usually extracted by spatial analysis of the basin's elevation data and are used as input for studying numerous process-response interactions inside the physical premise of the basin. One important use of the morphometric ratios is in deriving the hydrologic response of a basin using the GIUH concept. Hence, the accuracy of the basin's hydrological response to any storm event depends upon the accuracy with which the morphometric ratios can be estimated, which in turn is affected by the spatial resolution of the source data, i.e. the digital elevation model (DEM). We have estimated the sensitivity of the morphometric ratios and the GIUH-derived hydrograph parameters to the resolution of the source data using a 30 m and a 90 m DEM. The analysis has been carried out for 50 drainage basins in a mountainous catchment. A simple and comprehensive algorithm has been developed for estimating the morphometric indices from a stream network. We calculated all the morphometric and hydrograph parameters for each basin as extracted from the two DEMs and compared them using the paired t-test and the sign test. Our results did not show any statistically significant difference in any of the parameters calculated from the two source datasets. Along with the comparative study, a first-hand empirical analysis of the frequency distribution of the morphometric and hydrologic response parameters is also presented. Further, a comparison with other hydrological models suggests that the planform-morphometry-based GIUH model is more consistent under resolution variability than topography-based hydrological models.
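One of the morphometric ratios in question, Horton's bifurcation ratio, can be estimated from per-order stream counts with a simple log-linear fit. The counts below are a textbook-style example, not data from the study:

```python
import math

def horton_ratio(counts):
    """Least-squares Horton ratio from per-order stream counts: fit
    log10(N_w) against order w; the ratio is 10**(-slope)."""
    orders = list(range(1, len(counts) + 1))
    logs = [math.log10(c) for c in counts]
    n = len(orders)
    mean_w = sum(orders) / n
    mean_l = sum(logs) / n
    slope = (sum((w - mean_w) * (l - mean_l) for w, l in zip(orders, logs))
             / sum((w - mean_w) ** 2 for w in orders))
    return 10.0 ** (-slope)

# A basin with 64, 16, 4 and 1 streams of orders 1-4: bifurcation ratio 4.
Rb = horton_ratio([64, 16, 4, 1])
```

The same fit applied to mean stream lengths and mean catchment areas yields the length and area ratios, and the trio feeds the GIUH-derived hydrograph; resolution sensitivity enters through the counts extracted from each DEM.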

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Shiluo Xu, Ruiqing Niu Every year, landslides pose huge threats to thousands of people in China, especially in the Three Gorges area. It is thus necessary to establish an early warning system to help prevent property damage and save people's lives. Most of the landslide displacement prediction models that have been proposed are static models, yet landslides are dynamic systems. In this paper, the total accumulative displacement of the Baijiabao landslide is divided into trend and periodic components using empirical mode decomposition. The trend component is predicted using an S-curve estimation, and the periodic component is predicted using a long short-term memory (LSTM) neural network. LSTM is a dynamic model that can remember historical information and apply it to the current output. Six triggering factors were chosen to predict the periodic term using the Pearson cross-correlation coefficient and mutual information: the cumulative precipitation during the previous month, the cumulative precipitation during a two-month period, the reservoir level during the current month, the change in the reservoir level during the previous month, the cumulative increment of the reservoir level during the current month, and the cumulative displacement during the previous month. With one-step-ahead prediction, LSTM yields a root mean squared error (RMSE) of 6.112 mm, while the support vector machine for regression (SVR) and the back-propagation neural network (BP) yield 10.686 mm and 8.237 mm, respectively, and the Elman network yields 6.579 mm. With multi-step-ahead prediction, LSTM obtains an RMSE of 8.648 mm, while SVR, BP and the Elman network obtain 13.418 mm, 13.014 mm, and 13.370 mm, respectively. The predicted results indicate that, to some extent, the dynamic model (LSTM) achieves more accurate results than the static models (SVR and BP). LSTM even outperforms the Elman network, which is also a dynamic method.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Jacek Andrzej Urbanski This paper presents a Glacier Termini Tracking (GTT) toolbox for the two-dimensional analysis of glacier-terminus position changes. The input consists of a vector layer with several termini lines relating to the same glacier at different times. The output layers allow analyses to be conducted of glacier-terminus retreats, changes in retreats over time and along the ice face, and glacier-terminus fluctuations over time. The application of three tools from the toolbox is demonstrated via the analysis of eight glacier-terminus retreats and fluctuations at the Hornsund fjord in south Svalbard. It is proposed that this toolbox may also be useful in the study of other line features that change over time, like coastlines and rivers. The toolbox has been coded in Python and runs via ArcGIS.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Đorđe M. Đurđević, Igor I. Tartalja In this paper, we present a method for lossless compression of residuals with efficient SIMD-parallel decompression. The residuals originate from lossy or near-lossless compression of height fields, which are commonly used to represent terrain models. The algorithm is founded on the existing RBUC method for compression of non-uniform data sources. We have adapted the method to capture the 2D spatial locality of height fields, and developed a data decompression algorithm for the modern GPU architectures already present even in home computers. In combination with the point-level SIMD-parallel lossless/lossy height field compression method HFPaC, characterized by fast progressive decompression and a seamlessly reconstructed surface, the newly proposed method trades a small efficiency degradation for a non-negligible compression ratio benefit (measured at up to 91%).

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Camilo Restrepo-Estrada, Sidgley Camargo de Andrade, Narumi Abe, Maria Clara Fava, Eduardo Mario Mendiondo, João Porto de Albuquerque Floods are one of the most devastating types of worldwide disasters in terms of human, economic, and social losses. If authoritative data is scarce, or unavailable for some periods, other sources of information are required to improve streamflow estimation and early flood warnings. Georeferenced social media messages are increasingly being regarded as an alternative source of information for coping with flood risks. However, existing studies have mostly concentrated on the links between geo-social media activity and flooded areas. Thus, there is still a gap in research with regard to the use of social media as a proxy for rainfall-runoff estimations and flood forecasting. To address this, we propose using a transformation function that creates a proxy variable for rainfall by analysing geo-social media messages and rainfall measurements from authoritative sources, which are later incorporated within a hydrological model for streamflow estimation. We found that the combined use of official rainfall values with the social media proxy variable as input for the Probability Distributed Model (PDM), improved streamflow simulations for flood monitoring. The combination of authoritative sources and transformed geo-social media data during flood events achieved a 71% degree of accuracy and a 29% underestimation rate in a comparison made with real streamflow measurements. This is a significant improvement on the respective values of 39% and 58%, achieved when only authoritative data were used for the modelling. This result is clear evidence of the potential use of derived geo-social media data as a proxy for environmental variables for improving flood early-warning systems.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Jonathan Edwards, Florent Lallier, Guillaume Caumon, Cédric Carpentier We discuss the sampling and the volumetric impact of stratigraphic correlation uncertainties in basins and reservoirs. From an input set of wells, we evaluate the probability for two stratigraphic units to be associated using an analog stratigraphic model. In the presence of multiple wells, this method sequentially updates a stratigraphic column defining the stratigraphic layering for each possible set of realizations. The resulting correlations are then used to create stratigraphic grids in three dimensions. We apply this method to a set of synthetic wells sampling a forward stratigraphic model built with Dionisos. To cross-validate the method, we introduce a distance comparing the relative geological time of two models at each geographic position, and we compare the models in terms of volumes. Results show the ability of the method to automatically generate stratigraphic correlation scenarios, and also highlight some challenges in sampling stratigraphic uncertainties from multiple wells.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Ruixun Lai, Min Wang, Ming Yang, Chao Zhang The accuracy of widely used two-dimensional hydrodynamic numerical models depends on the quality of the river terrain model, particularly in the main channel. However, in most cases, the bathymetry of the river channel is difficult or expensive to obtain in the field, and there is a lack of available data describing the geometry of the river channel. We introduce a method, originating from grid generation with elliptic equations, to generate streamlines of the river channel. The streamlines are numerically solved from the Laplace equations: streamlines in the physical domain are first computed in a computational domain and then transformed back to the physical domain. The interpolated streamlines are integrated with the surrounding topography to reconstruct the entire river terrain model. The approach was applied to a meandering reach of the Qinhe River, a tributary in the middle reaches of the Yellow River, China. Cross-sectional validation and the two-dimensional shallow-water equations were used to test the performance of the generated river terrain. The results show that the approach can reconstruct the river terrain from measured cross-section data. Furthermore, the created terrain maintains a geometrical shape consistent with the measurements while generating a smooth main channel. Finally, several limitations and opportunities for future research are discussed.
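The heart of such a method is an elliptic (Laplace) solve: boundary values come from the measured cross-sections and banks, and interior values are filled in harmonically, which is what makes the interpolated channel smooth. A minimal Jacobi sketch on a toy grid, with invented sizes and boundary values:

```python
# Interior values of a gridded quantity (e.g. a streamline coordinate
# between two measured cross-sections) satisfy the discrete Laplace equation.
ni, nj = 5, 5
grid = [[0.0] * nj for _ in range(ni)]
for j in range(nj):                 # upstream / downstream cross-sections
    grid[0][j] = 0.0
    grid[ni - 1][j] = 1.0
for i in range(ni):                 # left / right banks vary linearly
    grid[i][0] = grid[i][nj - 1] = i / (ni - 1)

for _ in range(500):                # Jacobi sweeps on the interior points
    new = [row[:] for row in grid]
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    grid = new

centre = grid[2][2]                 # harmonic interpolation of the boundaries
```

Because the boundary data here vary linearly downstream, the converged interior is exactly linear too; with real, curved cross-sections the same solve yields the smooth in-between streamlines that the terrain reconstruction needs.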

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): M. Adams, T. Kempka, E. Chabab, M. Ziegler Estimating the efficiency and sustainability of geological subsurface utilization, e.g., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes involved, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Thereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Babak Shabani, Javier Vilcáez A new TOUGH2 module to simulate geological CO2 storage (GCS) in saline aquifers is developed based on the widely employed ECO2N module of TOUGH2. The newly developed TOUGH2 module uses a new non-iterative fugacity-activity thermodynamic model to obtain the partitioning of CO2 and H2O between the aqueous and gas phases. Simple but robust thermophysical correlations are used to obtain density, viscosity, and enthalpy of the gas phase. The implementation and accuracy of the employed thermophysical correlations are verified by comparisons against the National Institute of Standards and Technology (NIST) online thermophysical database. To assess the computation accuracy and efficiency, simulation results obtained with the new TOUGH2 module for a one-dimensional non-isothermal radial and a three-dimensional isothermal system are compared against the simulation results obtained with the ECO2N module. Treating salt mass fraction in the aqueous phase as a constant, along with the inclusion of a non-iterative fugacity-activity thermodynamic model and simple thermophysical correlations, resulted in simulations much faster than simulations with the ECO2N module, without losing numerical accuracy. Both modules yield virtually identical results. Additional field-scale simulations of CO2 injection into an actual non-isothermal and heterogeneous geological formation confirmed that the new module is much faster than the ECO2N module in simulating complex field-scale conditions. Owing to its capability to handle CO2-CH4-H2S-N2 gas mixtures and its compatibility with TOUGHREACT, this new TOUGH2 module offers the possibility of developing a fast and robust TOUGHREACT module to predict the fate of CO2 in GCS sites under biotic conditions where CO2, CH4, H2S, and N2 gases can be formed.

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Yihui Xiong, Renguang Zuo Mineralization is a special type of singularity event, and can be considered a rare event, because within a specific study area the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map the mineral prospectivity in the southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and the aeromagnetic anomaly, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with ordinary logistic regression (OLR) demonstrates that the GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
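As a rough illustration of the idea (not the authors' implementation; rare-events logistic regression in the literature typically follows a formal correction such as King and Zeng's), the sketch below fits a plain logistic regression by weighted gradient descent on synthetic evidence layers, up-weights the rare positive cells, and reports odds ratios as relative importance. All data and parameters are made up.

```python
import numpy as np

def fit_logistic(X, y, w, lr=0.1, n_iter=3000):
    """Plain logistic regression fitted by (weighted) gradient descent."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))
        beta += lr * X1.T @ (w * (y - p)) / w.sum()
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(20000, 2))            # two synthetic evidence layers
logit = -4.0 + 1.5 * X[:, 0] + 0.5 * X[:, 1]
y = (rng.random(20000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Up-weight the rare prospective cells (a crude stand-in for a formal
# rare-events correction such as King and Zeng's estimator)
w = np.where(y == 1, (y == 0).sum() / max((y == 1).sum(), 1), 1.0)
beta = fit_logistic(X, y, w)
odds_ratios = np.exp(beta[1:])             # exp(coefficient) per layer
```

The exponentiated coefficients play the role of the paper's odds ratios: a larger value for the first layer flags it as the more important indicator of mineralization.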

Abstract: Publication date: February 2018 Source:Computers & Geosciences, Volume 111 Author(s): Weimin Xu, Shi Chen Spectral methods provide many advantages for calculating gravity anomalies. In this paper, we derive a kernel function for a three-dimensional (3D) fault model in the wave number domain, and present the full Fortran source code developed for the forward computation of the gravity anomalies and related derivatives obtained from the model. The numerical error and computing speed obtained using the proposed spectral method are compared with those obtained using a 3D rectangular prism model solved in the space domain. The error obtained using the spectral method is shown to be dependent on the sequence length employed in the fast Fourier transform. The spectral method is applied to some examples of 3D fault models, and is demonstrated to be a straightforward and alternative computational approach to enhance computational speed and simplify the procedures for solving many gravitational potential forward problems involving complicated geological models. The proposed method can generate a great number of feasible geophysical interpretations based on a 3D model with only a few variables, and can thereby improve the efficiency of inversion.

Abstract: Publication date: Available online 8 November 2017 Source:Computers & Geosciences Author(s): Ruo-Qian Wang, Huina Mao, Yuan Wang, Chris Rae, Wesley Shaw Hyper-resolution datasets for urban flooding are rare. This problem prevents detailed flooding risk analysis, urban flooding control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques are applied to the data collected from Twitter and MyCoast (a crowdsourcing app). We found these big data based flood monitoring approaches can complement the existing means of flood data collection. The extracted information is validated against precipitation data and road closure reports to examine the data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.

Abstract: Publication date: Available online 6 November 2017 Source:Computers & Geosciences Author(s): Pedro A.A. Penna, Nelson D.A. Mascarenhas The development of new methods to denoise images still attracts researchers, who seek to combat the noise with minimal loss of resolution and detail, such as edges and fine structures. Many algorithms aim to remove additive white Gaussian noise (AWGN). However, it is not the only type of noise that interferes with the analysis and interpretation of images. Therefore, it is extremely important to extend the filters' capacity to the different noise models present in the literature, for example the multiplicative noise called speckle that is present in synthetic aperture radar (SAR) images. The state-of-the-art algorithms in the remote sensing area work with similarity between patches. This paper develops two approaches based on non-local means (NLM), originally developed for AWGN, extending its capacity to speckle in intensity SAR images. The first approach is grounded on the use of stochastic distances based on the G0 distribution without transforming the data to the logarithm domain, as in the homomorphic transformation. It takes into account the speckle and backscatter to estimate the parameters necessary to compute the stochastic distances in NLM. The second method first applies NLM denoising with a homomorphic transformation and uses the inverse Gamma distribution to estimate the parameters that are then used in NLM with stochastic distances. The latter method also presents a new alternative to compute the parameters of the G0 distribution. Finally, this work compares and analyzes the synthetic and real results of the proposed methods against some recent filters from the literature.

Abstract: Publication date: Available online 6 November 2017 Source:Computers & Geosciences Author(s): M.A. Jardine, J.A. Miller, M. Becker Texture is one of the most basic descriptors used in the geological sciences. The value derived from textural characterisation extends into engineering applications associated with mining, mineral processing and metal extraction, where quantitative textural information is required for models predicting the response of the ore through a particular process. This study extends the well-known 2D grey level co-occurrence matrices methodology into 3D as a method for image analysis of 3D X-ray computed tomography grey scale volumes of drill core. Subsequent interrogation of the information embedded within the grey level co-occurrence matrices (GLCM) indicates they are sensitive to changes in the mineralogy and texture of samples derived from a magmatic nickel sulfide ore. The position of the peaks in the GLCM is an indication of the relative density (specific gravity, SG) of the minerals and, when interpreted using a working knowledge of the mineralogy of the ore, provides a means to determine the relative abundance of the sulfide minerals (SG > 4), dense silicate minerals (SG > 3), and lighter silicate minerals (SG < 3). The spread of the peaks in the GLCM away from the diagonal is an indication of the degree of grain boundary interaction, with wide peaks representing fine grain sizes and narrow peaks representing coarse grain sizes. The method lends itself to application as part of a generic methodology for routine use on large XCT volumes, providing quantitative, timely, meaningful and automated information on mineralogy and texture in 3D.
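A minimal sketch of a 3D grey level co-occurrence matrix, restricted to non-negative offsets for brevity; the toy volume, offset, and number of grey levels are illustrative assumptions:

```python
import numpy as np

def glcm_3d(vol, offset=(0, 0, 1), levels=8):
    """3D grey level co-occurrence matrix (non-negative offsets only, for
    brevity): counts pairs (vol[p], vol[p + offset]) over the volume."""
    dz, dy, dx = offset
    nz, ny, nx = vol.shape
    a = vol[: nz - dz, : ny - dy, : nx - dx]   # reference voxels
    b = vol[dz:, dy:, dx:]                     # voxels at +offset
    glcm = np.zeros((levels, levels), dtype=np.int64)
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)
    return glcm

# Toy volume with two grey levels: a light "sulfide" slab (level 5) in a
# dark "silicate" host (level 0); off-diagonal mass counts phase contacts
vol = np.zeros((4, 4, 4), dtype=np.int64)
vol[:, :, 2:] = 5
g = glcm_3d(vol, offset=(0, 0, 1))
```

The diagonal entries count same-level voxel pairs (coarse, homogeneous regions), while the off-diagonal entry g[0, 5] counts the phase boundary crossings, mirroring the paper's interpretation of peak position and spread.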

Abstract: Publication date: January 2018 Source:Computers & Geosciences, Volume 110 Author(s): Xinye Ji, Chaopeng Shen Geoscientific models manage myriad and increasingly complex data structures as trans-disciplinary models are integrated. They often incur significant redundancy with cross-cutting tasks. Reflection, the ability of a program to inspect and modify its structure and behavior at runtime, is known as a powerful tool to improve code reusability, abstraction, and separation of concerns. Reflection is rarely adopted in high-performance geoscientific models, especially in Fortran, where it was previously deemed implausible. Practical constraints of language and legacy often limit us to feather-weight, native-language solutions. We demonstrate the usefulness of gd, a structural-reflection-emulating, dynamically linked metaObject. We show real-world examples including data structure self-assembly, effortless input/output (I/O) and upgrades to parallel I/O, recursive actions and batch operations. We share gd and a derived module that reproduces MATLAB-like structures in Fortran and C++. We suggest that both a gd representation and a Fortran-native representation be maintained to access the data, each for separate purposes. Embracing emulated reflection allows generically-written codes that are highly re-usable across projects.

Abstract: Publication date: January 2018 Source:Computers & Geosciences, Volume 110 Author(s): Hamzeh Sadeghisorkhani, Ólafur Gudmundsson, Ari Tryggvason We present a graphical user interface (GUI) package to facilitate phase-velocity dispersion measurements of surface waves in noise-correlation traces. The package, called GSpecDisp, provides an interactive environment for the measurements and presentation of the results. The selection of a dispersion curve can be done automatically or manually within the package. The data are time-domain cross-correlations in SAC format, but GSpecDisp measures phase velocity in the spectral domain. Two types of phase-velocity dispersion measurements can be carried out with GSpecDisp: (1) the average velocity of a region, and (2) single-pair phase velocity. Both measurements are done by matching the real part of the cross-correlation spectrum with the appropriate Bessel function. Advantages of these two types of measurements are that no prior knowledge about surface-wave dispersion in the region is needed, and that phase velocity can be measured up to that period for which the inter-station distance corresponds to one wavelength. GSpecDisp can measure the phase velocity of Rayleigh and Love waves from all possible components of the noise correlation tensor. First, we briefly present the theory behind the methods that are used, and then describe different modules of the package. Finally, we validate the developed algorithms by applying them to synthetic and real data, and by comparison with other methods. The source code of GSpecDisp can be downloaded from: https://github.com/Hamzeh-Sadeghi/GSpecDisp.
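The spectral matching step can be sketched as follows: for ambient-noise cross-correlations between two stations at distance r, the real part of the spectrum is approximately proportional to J0(ωr/c), so a grid search over trial velocities recovers c. The distance, frequency band, and the synthetic "observed" spectrum below are illustrative assumptions, and J0 is evaluated from its integral form to keep the sketch dependency-free (not GSpecDisp code).

```python
import numpy as np

def bessel_j0(x, n=2000):
    """J0 via its integral representation (midpoint rule), to avoid
    external dependencies; accurate enough for a grid search."""
    theta = (np.arange(n) + 0.5) * np.pi / n
    return np.cos(np.asarray(x, float)[..., None] * np.sin(theta)).mean(axis=-1)

r = 100e3                              # inter-station distance in m (assumed)
c_true = 3000.0                        # synthetic "true" phase velocity, m/s
omega = 2 * np.pi * np.linspace(0.05, 0.5, 50)   # angular frequencies (rad/s)
obs = bessel_j0(omega * r / c_true)    # idealized real part of cross-spectrum

# Match the observed real spectrum against J0(omega * r / c) over trial c
trial_c = np.linspace(2000.0, 4000.0, 401)
misfit = [np.sum((obs - bessel_j0(omega * r / c)) ** 2) for c in trial_c]
c_est = trial_c[int(np.argmin(misfit))]
```

In practice one would fit frequency-by-frequency to obtain a dispersion curve rather than a single velocity; the single-c search above only illustrates the Bessel-matching principle.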

Abstract: Publication date: January 2018 Source:Computers & Geosciences, Volume 110 Author(s): Xuesong Yan, Zhixin Zhu, Qinghua Wu Seismic exploration is a method of oil exploration that uses seismic information: by inverting the seismic data, useful information on the reservoir parameters can be obtained to carry out exploration effectively. Pre-stack data are characterised by a large volume and abundant information, and their inversion can yield rich information on the reservoir parameters. Owing to the large amount of pre-stack seismic data, existing single-machine environments can no longer meet the computational needs of such huge amounts of data; thus, an efficient and fast method to solve the inversion problem of pre-stack seismic data is urgently needed. Optimising the elastic parameters with a genetic algorithm easily falls into a local optimum, which leads to poor inversion results, especially for the density. Therefore, an improved intelligent optimisation algorithm is proposed in this paper and used for the elastic parameter inversion of pre-stack seismic data. This algorithm improves the population initialisation strategy by using the Gardner formula and improves the genetic operations of the algorithm, and the improved algorithm obtains better inversion results in a model test with logging data. All of the elastic parameters obtained by inversion fit the logging curves of the theoretical model well, which effectively improves the inversion precision of the density. The algorithm was implemented with a MapReduce model to solve the seismic big-data inversion problem. The experimental results show that the parallel model can effectively reduce the running time of the algorithm.
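The Gardner-based initialisation can be sketched as follows: instead of drawing density independently, each candidate model ties density to P-wave velocity through Gardner's empirical relation ρ = a·Vp^b (a ≈ 0.31, b ≈ 0.25 with Vp in m/s and ρ in g/cm³). The population size, layer count, velocity range, and Vp/Vs ratio below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def gardner_density(vp, a=0.31, b=0.25):
    """Gardner's relation: rho [g/cm^3] = a * Vp**b, with Vp in m/s."""
    return a * vp ** b

# Initialise a GA population of layered (Vp, Vs, rho) models: Vp is drawn
# randomly, Vs follows an assumed Vp/Vs ratio, and rho is tied to Vp via
# Gardner rather than drawn independently, constraining the density search
rng = np.random.default_rng(1)
n_pop, n_layers = 50, 20
vp = rng.uniform(2500.0, 4500.0, size=(n_pop, n_layers))
vs = vp / 1.8                        # assumed Vp/Vs ratio
rho = gardner_density(vp)
population = np.stack([vp, vs, rho], axis=-1)
```

Seeding the population on the Gardner trend keeps every initial individual in a physically plausible region of parameter space, which is one way such an initialisation can help the poorly resolved density converge.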

Abstract: Publication date: January 2018 Source:Computers & Geosciences, Volume 110 Author(s): Sanja Scitovski The possibility of applying the density-based clustering algorithm Rough-DBSCAN to earthquake zoning is considered in this paper. By using density-based clustering for earthquake zoning it is possible to recognize nonconvex shapes, which gives much more realistic results. Special attention is paid to the problem of determining an appropriate value of the parameter ɛ in the algorithm, since the size of ɛ significantly influences the number and configuration of the recognized earthquake zones. A method for selecting the parameter ɛ in the case of big data is also proposed. The method is applied to the problem of earthquake data zoning in a wider area of the Republic of Croatia.
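A common heuristic for choosing DBSCAN's ɛ (not necessarily the authors' method, which is tailored to big data) is the sorted k-distance curve: compute each point's distance to its k-th nearest neighbour, sort the values, and place ɛ near the knee of the curve. A crude sketch on synthetic epicentres, with made-up cluster locations:

```python
import numpy as np

def k_distance(points, k=4):
    """Sorted distance from every point to its k-th nearest neighbour."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)                 # column 0 is the zero self-distance
    return np.sort(d[:, k])

rng = np.random.default_rng(2)
# Two dense epicentre clusters plus sparse background seismicity
pts = np.vstack([rng.normal(0.0, 0.1, size=(100, 2)),
                 rng.normal(3.0, 0.1, size=(100, 2)),
                 rng.uniform(-5.0, 8.0, size=(20, 2))])
kd = k_distance(pts, k=4)
# Crude knee estimate: the value just before the largest jump in the curve
eps = kd[int(np.argmax(np.diff(kd)))]
```

Points inside dense zones sit on the flat lower part of the curve, background events on the steep tail; ɛ chosen at the transition separates the two regimes, which is exactly the sensitivity the abstract highlights.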

Abstract: Publication date: December 2017 Source:Computers & Geosciences, Volume 109 Author(s): Pablo Calvín, Juan J. Villalaín, Antonio M. Casas-Sainz, Lisa Tauxe, Sara Torres-López The Small Circle (SC) methods are founded upon two main starting hypotheses: (i) the analyzed sites were remagnetized contemporaneously, acquiring the same paleomagnetic direction; (ii) the deviation of the acquired paleomagnetic signal from its original direction is only due to tilting around the bedding strike, and therefore the remagnetization direction must be located on a small circle (SC) whose axis is the strike of bedding and which contains the in situ paleomagnetic direction. Therefore, if we analyze several sites (with different bedding strikes), their SCs will intersect at the remagnetization direction. The SC methods have two applications: (1) the Small Circle Intersection (SCI) method is capable of providing adequate approximations to the expected paleomagnetic direction when dealing with synfolding remagnetizations; by comparing the SCI direction with that predicted from an apparent polar wander path, the (re)magnetization can be dated. (2) Once the remagnetization direction is known, the attitude of the beds (at each site) can be restored to the moment of the acquisition of the remagnetization, providing a palinspastic reconstruction of the structure. Some caveats are necessary under more complex tectonic scenarios, in which SC-based methods can lead to erroneous interpretations. However, the graphical output of the methods helps to avoid ‘black-box’ effects and can minimize misleading interpretations or even help, for example, to identify local or regional vertical-axis rotations. In any case, the methods must be used with caution, always taking the tectonic framework into account. In this paper, some utilities for SC analysis are automated by means of a new Python code, and a new technique for defining the uncertainty of the solution is proposed. 
With pySCu the SC methods can be easily and quickly applied, first obtaining a set of text files containing all the calculated information and then generating a graphical output on the fly.
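The geometric core of the SC construction can be sketched directly: untilting a site's in-situ direction about its bedding-strike axis by any angle leaves the angle between the direction and the axis unchanged, which is exactly why the possible remagnetization directions trace a small circle. The direction and strike below are toy values, not pySCu code:

```python
import numpy as np

def rotate(v, axis, angle_deg):
    """Rodrigues rotation of vector v about a (normalized) axis."""
    a = np.radians(angle_deg)
    k = axis / np.linalg.norm(axis)
    return (v * np.cos(a) + np.cross(k, v) * np.sin(a)
            + k * np.dot(k, v) * (1 - np.cos(a)))

# Toy in-situ paleomagnetic direction (unit vector) and an E-W strike axis
d = np.array([0.3, 0.4, 0.866])
d = d / np.linalg.norm(d)
strike = np.array([1.0, 0.0, 0.0])   # hypothetical bedding strike

# Untilting by any angle keeps d on a small circle about the strike axis:
# the dot product with the axis (cosine of the apical half-angle) is fixed
cosines = np.array([rotate(d, strike, t) @ strike for t in range(-90, 91, 10)])
```

Because this cosine is invariant under the tilt, each site constrains the remagnetization direction to one such circle, and the SCI method intersects the circles from sites with different strikes.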

Abstract: Publication date: January 2018 Source:Computers & Geosciences, Volume 110 Author(s): Nils Hempelmann, Carsten Ehbrecht, Carmen Alvarez-Castro, Patrick Brockmann, Wolfgang Falk, Jörg Hoffmann, Stephan Kindermann, Ben Koziol, Cathy Nangini, Sabine Radanovics, Robert Vautard, Pascal Yiou Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files “at home” with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named ‘flyingpigeon’ that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the current processes we developed in flyingpigeon relating to commonly-used processes (preprocessing steps, spatial subsets at continent, country or region level, and climate indices) as well as methods for specific climate data analysis (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.

Abstract: Publication date: Available online 28 September 2017 Source:Computers & Geosciences Author(s): Xianhai Meng, Zhongxiang Duan, Qin Yang, Xing Liang The 2.5D PEBI (PErpendicular BIsector) grid, which is the projection or extrusion of the 2D PEBI grid, has advantages in practical reservoir modeling. However, appropriately handling the geological features, especially reverse faults in the reservoir, remains a difficult problem. To address this issue, we propose a local PEBI grid generation method in this paper. By constructing the Voronoi cell of a seed based on the search of its neighboring seeds in a background grid, our method is demonstrated to be efficient and adaptable to reverse fault constraints. In addition, the vertical and horizontal well constraints are also tackled, and the cell quality is improved through the Centroidal Voronoi Tessellations (CVT) principle. The results demonstrate that our method enables the formation of high-quality grids and guarantees conformity to the geological features in reservoirs.

Abstract: Publication date: Available online 28 September 2017 Source:Computers & Geosciences Author(s): Jose J. Camata, Vítor Silva, Patrick Valduriez, Marta Mattoso, Alvaro L.G.A. Coutinho Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a solution based on provenance data to extract and relate strategic simulation data in transit from multiple data files for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring of the sediment appearance at runtime and steering of the simulation based on the solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

Abstract: Publication date: Available online 23 September 2017 Source:Computers & Geosciences Author(s): Oscar F. Peredo, Daniel Baeza, Julián M. Ortiz, José R. Herrero Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is to generate identical realizations as with the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution in a single machine.

Abstract: Publication date: Available online 22 September 2017 Source:Computers & Geosciences Author(s): E.P. Francisco, L.F.R. Espath, S. Laizet, J.H. Silvestrini Three-dimensional highly resolved Direct Numerical Simulations (DNS) of particle-laden gravity currents are presented for the lock-exchange problem in an original basin configuration, similar to delta formation in lakes. For this numerical study, we focus on gravity currents over a flat bed for which density differences are small enough for the Boussinesq approximation to be valid. The concentration of particles is described in an Eulerian fashion by using a transport equation combined with the incompressible Navier-Stokes equations, with the possibility of particle deposition but no erosion or re-suspension. The focus of this study is on the influence of the Reynolds number and settling velocity on the development of the current, which can freely evolve in the streamwise and spanwise directions. It is shown that the settling velocity has a strong influence on the spatial extent of the current, the sedimentation rate, the suspended mass and the shape of the lobe-and-cleft structures, while the Reynolds number mainly affects the size and number of vortical structures at the front of the current, and the energy budget.

Abstract: Publication date: Available online 18 September 2017 Source:Computers & Geosciences Author(s): Zhangang Wang, Honggang Qu, Zixing Wu, Xianghong Wang A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, which is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).

Abstract: Publication date: December 2017 Source:Computers & Geosciences, Volume 109 Author(s): Zuzana Majdisova, Vaclav Skala Approximation of scattered data is often a task in many engineering problems. The Radial Basis Function (RBF) approximation is appropriate for big scattered datasets in n–dimensional space. It is a non-separable approximation, as it is based on the distance between two points. This method leads to the solution of an overdetermined linear system of equations. In this paper the RBF approximation methods are briefly described, a new approach to the RBF approximation of big datasets is presented, and a comparison for different Compactly Supported RBFs (CS-RBFs) is made with respect to the accuracy of the computation. The proposed approach uses symmetry of a matrix, partitioning the matrix into blocks and data structures for storage of the sparse matrix. The experiments are performed for synthetic and real datasets.
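A minimal dense sketch of the approach (without the paper's block partitioning and sparse-matrix storage): build the rectangular collocation matrix from a compactly supported Wendland RBF with fewer centers than data points, and solve the resulting overdetermined system by least squares. The test function, center count, and support radius are assumptions for illustration.

```python
import numpy as np

def wendland(r):
    """Wendland C2 compactly supported RBF: (1-r)^4 (4r+1), zero for r >= 1."""
    return np.clip(1.0 - r, 0.0, None) ** 4 * (4.0 * r + 1.0)

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, size=(400, 2))          # scattered sample points
f = np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])     # values to approximate

centers = rng.uniform(0.0, 1.0, size=(60, 2))     # fewer centers than data
support = 0.5                                     # support radius (assumed)
r = np.linalg.norm(x[:, None] - centers[None], axis=-1) / support
A = wendland(r)                                   # 400 x 60 collocation matrix
coef, *_ = np.linalg.lstsq(A, f, rcond=None)      # overdetermined LSQ solve

rms = np.sqrt(np.mean((A @ coef - f) ** 2))
```

Compact support makes most entries of A exactly zero, which is what the paper exploits with sparse storage and block partitioning for big datasets; the dense solve above is only meant to show the structure of the problem.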

Abstract: Publication date: December 2017 Source:Computers & Geosciences, Volume 109 Author(s): Ali Seydi Keçeli, Aydın Kaya, Seda Uzunçimen Keçeli Radiolarians are planktonic protozoa and are important biostratigraphic and paleoenvironmental indicators for paleogeographic reconstructions. Radiolarian paleontology remains a low-cost and one of the most convenient ways to obtain dates for deep ocean sediments. Traditional methods for identifying radiolarians are time-consuming and cannot scale to the granularity or scope necessary for large-scale studies. Automated image classification will allow these analyses to be made promptly. In this study, a method for automatic radiolarian image classification is proposed on Scanning Electron Microscope (SEM) images of radiolarians to ease species identification of fossilized radiolarians. The proposed method uses both hand-crafted features such as invariant moments, wavelet moments, Gabor features and basic morphological features, and deep features obtained from a pre-trained Convolutional Neural Network (CNN). Feature selection is applied over the deep features to reduce their high dimensionality. Classification outcomes are analyzed to compare hand-crafted features, deep features, and their combinations. Results show that the deep features obtained from a pre-trained CNN are more discriminative than the hand-crafted ones. Additionally, feature selection reduces the computational cost of the classification algorithms and has no negative effect on classification accuracy.

Abstract: Publication date: December 2017 Source:Computers & Geosciences, Volume 109 Author(s): Xiuwei Yang, Peimin Zhu A new stochastic seismic inversion method based on the local gradual deformation method is proposed, which can incorporate seismic data, well data, geology and their spatial correlations into the inversion process. Geological information, such as sedimentary facies and structures, could provide significant a priori information to constrain an inversion and arrive at reasonable solutions. The local a priori conditional cumulative distributions at each node of model to be inverted are first established by indicator cokriging, which integrates well data as hard data and geological information as soft data. Probability field simulation is used to simulate different realizations consistent with the spatial correlations and local conditional cumulative distributions. The corresponding probability field is generated by the fast Fourier transform moving average method. Then, optimization is performed to match the seismic data via an improved local gradual deformation method. Two improved strategies are proposed to be suitable for seismic inversion. The first strategy is that we select and update local areas of bad fitting between synthetic seismic data and real seismic data. The second one is that we divide each seismic trace into several parts and obtain the optimal parameters for each part individually. The applications to a synthetic example and a real case study demonstrate that our approach can effectively find fine-scale acoustic impedance models and provide uncertainty estimations.
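The fast Fourier transform moving average (FFT-MA) step used to generate the probability field can be sketched in 1D: convolve white noise with the spectral square root of a circulant covariance, yielding a correlated Gaussian field. The grid size and covariance model below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1024
h = np.minimum(np.arange(n), n - np.arange(n))    # circulant lag distances
cov = np.exp(-(h / 30.0) ** 2)                    # Gaussian covariance, range 30

# FFT-MA: multiply the noise spectrum by the 'square root' of the
# covariance spectrum (clipped at zero against tiny negative eigenvalues)
s = np.sqrt(np.maximum(np.fft.fft(cov).real, 0.0))
noise = rng.standard_normal(n)
field = np.fft.ifft(np.fft.fft(noise) * s).real   # correlated Gaussian field
```

A key practical property, and the reason FFT-MA suits gradual deformation, is that perturbing only the white noise (not the covariance) produces a new realization with the same spatial correlation structure.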

Abstract: Publication date: December 2017 Source:Computers & Geosciences, Volume 109 Author(s): Peter Morse, Anya Reading, Christopher Lueg Geoscientists are required to analyze and draw conclusions from increasingly large volumes of data. There is a need to recognise and characterise features and changing patterns of Earth observables within such large datasets. It is also necessary to identify significant subsets of the data for more detailed analysis. We present an innovative, interactive software tool and workflow to visualise, characterise, sample and tag large geoscientific datasets from both local and cloud-based repositories. It uses an animated interface and human-computer interaction to utilise the capacity of human expert observers to identify features via enhanced visual analytics. ‘Tagger’ enables users to analyze datasets that are too large in volume to be drawn legibly on a reasonable number of single static plots. Users interact with the moving graphical display, tagging data ranges of interest for subsequent attention. The tool provides a rapid pre-pass process using fast GPU-based OpenGL graphics and data-handling and is coded in the Quartz Composer visual programing language (VPL) on Mac OSX. It makes use of interoperable data formats, and cloud-based (or local) data storage and compute. In a case study, Tagger was used to characterise a decade (2000–2009) of data recorded by the Cape Sorell Waverider Buoy, located approximately 10 km off the west coast of Tasmania, Australia. These data serve as a proxy for the understanding of Southern Ocean storminess, which has both local and global implications. This example shows use of the tool to identify and characterise 4 different types of storm and non-storm events during this time. Events characterised in this way are compared with conventional analysis, noting advantages and limitations of data analysis using animation and human interaction. 
Tagger provides a new ability to make use of humans as feature detectors in computer-based analysis of large-volume geoscience and other data.
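The tag-then-extract workflow described above — a user marks ranges of interest during animated playback, and the tool pulls those subsets out for detailed analysis — can be sketched with a hypothetical data model. The `(start, stop)` index pairs and the helper name are assumptions for illustration, not Tagger's actual API:

```python
import numpy as np

def extract_tagged(series, tags):
    """Return the sample ranges a user tagged during playback.

    tags: list of (start, stop) index pairs, a stand-in for the
    ranges marked interactively on the moving display.
    """
    return {(a, b): series[a:b] for a, b in tags}

# Stand-in for a decade of buoy wave-height samples
wave_height = np.sin(np.linspace(0, 20, 200))
subsets = extract_tagged(wave_height, [(10, 20), (150, 160)])
```

Each extracted subset would then feed conventional analysis (e.g. storm-event classification), which is where the case study compares human tagging against static-plot workflows.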

Abstract: Publication date: Available online 31 July 2017 Source:Computers & Geosciences Author(s): David Marks, Paul Elmore, Cheryl Ann Blain, Brian Bourgeois, Frederick Petry, Vicki Ferrini Many oceanographic applications require multi-resolution representation of gridded data such as bathymetry. Although triangular irregular networks (TINs) allow for variable resolution, they do not provide a gridded structure. Right TINs (RTINs) are compatible with a gridded structure. We explored the use of two approaches for RTINs, termed top-down and bottom-up implementations. We illustrate why the latter is most appropriate for gridded data and describe for this technique how the data can be thinned. While both the top-down and bottom-up approaches accurately preserve the surface morphology of any given region, the top-down method of vertex placement can fail to match the actual vertex locations of the underlying grid in many instances, resulting in obscured topology/bathymetry. Finally, we describe the use of the bottom-up approach and data thinning in two applications. The first is to provide thinned, variable-resolution bathymetry data for tests of storm surge and inundation modeling, in particular Hurricane Katrina. Second, we consider the application of the approach to an oceanographic data grid of 3-D ocean temperature.
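The thinning criterion behind a bottom-up approach — a grid vertex is removable when linear interpolation of its retained neighbours reproduces its elevation within tolerance — can be illustrated in one dimension. This is a simplified analogue for intuition, not the paper's RTIN implementation:

```python
import numpy as np

def thin_midpoints(z, tol):
    """Mark which samples of a 1-D elevation profile to keep.

    An interior sample is droppable when the average of its two
    neighbours reproduces it within `tol`; endpoints are always kept.
    """
    keep = np.ones_like(z, dtype=bool)
    interp = 0.5 * (z[:-2] + z[2:])          # linear interpolation of neighbours
    keep[1:-1] = np.abs(z[1:-1] - interp) > tol
    keep[0] = keep[-1] = True
    return keep

z = np.array([0.0, 1.0, 2.0, 5.0, 2.0])
# z[1] = 1.0 is exactly the interpolation of its neighbours -> droppable;
# z[3] = 5.0 deviates strongly -> kept
mask = thin_midpoints(z, 0.1)
```

In the 2-D RTIN case the same idea is applied hierarchically over right-triangle hypotenuse midpoints, which is what keeps thinned vertices aligned with the underlying grid.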

Abstract: Publication date: Available online 25 July 2017 Source:Computers & Geosciences Author(s): Franciszek J. Hasiuk, Chris Harding, Alex Raymond Renner, Eliot Winer An open-source web application, TouchTerrain, was developed to simplify the production of 3D-printable terrain models. Direct Digital Manufacturing (DDM) using 3D printers can change how geoscientists, students, and stakeholders interact with 3D data, with the potential to improve geoscience communication and environmental literacy. No other manufacturing technology can convert digital data into tangible objects quickly at relatively low cost; however, the expertise necessary to produce a 3D-printed terrain model can be a substantial burden: knowledge of geographical information systems, computer-aided design (CAD) software, and 3D printers may all be required. Furthermore, printing models larger than the build volume of a 3D printer can pose further technical hurdles. The TouchTerrain web application simplifies DDM for elevation data by generating digital 3D models customized for a specific 3D printer's capabilities. The only required user input is the selection of a region-of-interest using the provided web application with a Google Maps-style interface. Publicly available digital elevation data is processed via the Google Earth Engine API. To allow the manufacture of 3D terrain models larger than a 3D printer's build volume, the selected area can be split into multiple tiles without third-party software. This application significantly reduces the time and effort required for a non-expert, such as an educator, to obtain 3D terrain models for use in class. The web application is deployed at http://touchterrain.geol.iastate.edu, while source code and installation instructions for a server and a stand-alone version are available on GitHub: https://github.com/ChHarding/TouchTerrain_for_CAGEO.
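At its core, splitting a selected region into printer-sized pieces amounts to partitioning the DEM grid into tiles. A hypothetical sketch of that step (the real TouchTerrain tiler also handles model walls, bases, and physical scaling):

```python
import numpy as np

def split_into_tiles(dem, n_tiles_x, n_tiles_y):
    """Partition a DEM array into an n_tiles_y x n_tiles_x grid of tiles,
    so each printed piece fits within a printer's build volume."""
    rows = np.array_split(dem, n_tiles_y, axis=0)
    return [np.array_split(r, n_tiles_x, axis=1) for r in rows]

dem = np.arange(36).reshape(6, 6)   # toy 6x6 elevation grid
tiles = split_into_tiles(dem, 2, 2) # 2x2 grid of 3x3 tiles
```

`np.array_split` tolerates dimensions that do not divide evenly, which matters for arbitrary user-selected regions; adjacent tiles share no overlap, so printed pieces butt together edge to edge.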

Abstract: Publication date: Available online 21 July 2017 Source:Computers & Geosciences Author(s): Ryan Martin, Jeff Boisvert Geological domains contain non-stationary features that cannot be described by a single direction of continuity. Non-stationary estimation frameworks generate more realistic curvilinear interpretations of subsurface geometries. A radial basis function (RBF) based implicit modeling framework using domain decomposition is developed that permits introduction of locally varying orientations and magnitudes of anisotropy for boundary models to better account for the local variability of complex geological deposits. The interpolation framework is paired with a method to automatically infer the locally predominant orientations, which results in a rapid and robust iterative non-stationary boundary modeling technique that can refine locally anisotropic geological shapes automatically from the sample data. The method also permits quantification of the volumetric uncertainty associated with the boundary modeling. The methodology is demonstrated on a porphyry dataset and shows improved local geological features.
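The RBF machinery underlying such an implicit framework — solve a kernel system at the sample points, then evaluate the interpolant elsewhere — can be sketched with an isotropic Gaussian kernel. This is a generic sketch; the paper's contribution layers locally varying anisotropy and domain decomposition on top of a scheme like this:

```python
import numpy as np

def rbf_fit(points, values, eps=1.0):
    """Solve for Gaussian-kernel RBF weights at the sample points."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    A = np.exp(-(eps * d) ** 2)      # symmetric positive-definite kernel matrix
    return np.linalg.solve(A, values)

def rbf_eval(points, weights, query, eps=1.0):
    """Evaluate the fitted interpolant at query locations."""
    d = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ weights

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
vals = np.array([1.0, -1.0, -1.0])   # signed inside/outside indicator at samples
w = rbf_fit(pts, vals)
```

In an implicit boundary model the geological contact is the zero level set of this interpolant; introducing locally varying anisotropy amounts to replacing the isotropic distance `d` with a locally rotated and scaled metric.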