The latest version of the ROMS/TOMS svn repository, revision 568, was frozen and tagged as ROMS 3.5. This version is very stable and includes Phase I of multiple grid nesting. The technical details of nesting Phase I are described in a separate post. This was a major overhaul of the ROMS nonlinear (NLM), tangent linear (TLM), representer (RPM), and adjoint (ADM) numerical kernels.

Warning: Phase II of nesting will be released on Friday, September 23, 2011. All lateral boundary condition C-preprocessing options will be removed to facilitate the nesting layers (refinement and composite grids). This update is massive, in part because of the periodic boundary conditions. I highly recommend that you save a copy of your current customized version of the code before you update. This will allow you to compare your customized code against the Phase II code and migrate to Phase II using the documentation provided in the following post.

This post only discusses a few updates and corrections to nesting Phase I.

What Is New:

Corrected the computation of potential vorticity (pvor) in routine vorticity.F. For details, please check the following trac ticket. Many thanks to Carlos Moffat for reporting this bug.

Updated all the biological models to allocate parameters that depend on the nested grid parameter Ngrids. In ROMS 3.5 all the parameters that depend on Ngrids need to be allocated. This change affected the following files located in the ROMS/Nonlinear/Biology directory: ecosim_mod.h, fennel_mod.h, nemuro_mod.h, npzd_Franks_mod.h, npzd_Powell_mod.h, and npzd_iron_mod.h.
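The allocation pattern can be sketched as follows. This is a minimal illustration in Python, not ROMS code: the point is that each grid-dependent parameter becomes an array of length Ngrids rather than a scalar, mirroring the Fortran `allocate ( Param(Ngrids) )` idiom used in the biology modules. The parameter names below are purely illustrative.

```python
# Sketch only (not ROMS code): parameters that vary per nested grid are
# sized by Ngrids instead of being scalars. Names are hypothetical.

Ngrids = 2  # number of nested grids in the application

def allocate_bio_params(ngrids):
    """Allocate one slot per grid for each grid-dependent parameter."""
    return {
        "AttSW": [None] * ngrids,   # light attenuation coefficient (hypothetical)
        "K_NO3": [None] * ngrids,   # nitrate half-saturation constant (hypothetical)
    }

params = allocate_bio_params(Ngrids)
params["AttSW"][0] = 0.04   # value for grid 1
params["AttSW"][1] = 0.04   # value for grid 2
```

Each grid can then be configured independently while sharing one set of module-level parameter names.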

Corrected a bug caused by the missing initialization of the integer parameters idefAVG and idefHIS in mod_ncparam.F. This affected the creation of new averages and history NetCDF files during restart. See the following trac ticket for details. Many thanks to Alessandro Coluccelli for reporting this problem.

In this version, I introduced (see ticket 512) and then corrected (see ticket 515) a parallel bug in step3d_uv.F in the coupling of 2D and 3D momentum. We cannot use the switches DOMAIN(ng)%Southern_Edge(tile) or DOMAIN(ng)%Northern_Edge(tile) inside a pipelined J-loop.

Corrected a bug in inquire.F when processing multiple files (time-split data). It turns out that the local variable Fcount was not assigned after the first field was processed in a new file. For example, in time-split climatology files the temperature was processed correctly but not the salinity: the logic failed to initialize Fcount when processing salinity, and compilers treat uninitialized data differently. New logic was added to correct this problem and to report uninitialized Fcount failures in the future. Many thanks to Fred Castruccio for reporting this problem. I also added logic, suggested by Kate, to take into account a C-language null termination character in NetCDF string variable attributes. This occurs when NetCDF files are created with C programs.
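The null-termination cleanup can be sketched in a few lines. This is a minimal illustration, not the ROMS code: a string attribute written by a C program may carry a trailing NUL character (`'\0'`), which must be stripped before the value is compared or parsed.

```python
# Minimal sketch: NetCDF string attributes written by C programs may end
# with a C-style NUL terminator ('\0'). Strip it before using the value.

def clean_attribute(value):
    """Remove trailing C null-termination characters from a string."""
    return value.rstrip("\x00")

units = "days since 1900-01-01 00:00:00\x00"
print(clean_attribute(units))  # -> "days since 1900-01-01 00:00:00"
```

Without this cleanup, an exact string comparison such as `units == "days since 1900-01-01 00:00:00"` silently fails because of the invisible trailing byte.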

Introduced the new CPP option POSITIVE_ZERO to impose positive zero data in output NetCDF files. Starting with F95, zero values can be signed (-0.0 or +0.0) following the IEEE 754 floating-point standard. This may produce different output data in serial and parallel applications. To enforce positive output data when this option is activated, changes were made to the following output routines: nf_fwrite2d, nf_fwrite3d, nf_fwrite4d, nf_fwrite2d_bry, and nf_fwrite3d_bry. This is essential when comparing serial and parallel solutions to guarantee a code free of parallel bugs.
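The signed-zero normalization can be illustrated with a short sketch. This is not the ROMS Fortran code, just a demonstration of the IEEE 754 behavior involved: `-0.0` and `+0.0` compare equal but have different sign bits, and adding `+0.0` converts a negative zero to a positive zero (under the default round-to-nearest mode) while leaving every other value unchanged.

```python
import math

# Sketch of the POSITIVE_ZERO idea: IEEE 754 zeros are signed, and serial
# vs. parallel runs can legitimately differ in the sign of a zero result.
# Adding +0.0 normalizes -0.0 to +0.0 without affecting nonzero values.

def positive_zero(x):
    """Return x with any signed zero normalized to +0.0."""
    return x + 0.0 if x == 0.0 else x

print(math.copysign(1.0, -0.0))                 # -1.0: the zero is negative
print(math.copysign(1.0, positive_zero(-0.0)))  # 1.0: normalized to +0.0
```

This is why byte-for-byte comparison of serial and parallel output files can flag spurious differences unless the sign of zero is normalized on output.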

Added calls to periodic boundary conditions in set_avg.F to guarantee periodic output in the time-averaged NetCDF file. The full ranges (IstrR,JstrR) and (IendR,JendR) have different values in periodic applications. Because of the way the periodic boundary conditions are computed, the range values are redefined to be one point inside the full range of the C-grid. Therefore, for output purposes, we need to apply periodic boundary conditions or the values of the fields at those points will be zero. Many thanks to John Wilkin for bringing this to my attention. This has been in the code for years; I guess we never checked, in detail, whether the averaged data was really periodic. This only affects applications with periodic boundary conditions.
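The effect can be sketched with a simplified 1-D field. This is a hypothetical layout, not the ROMS grid: one ghost point on each side of N interior points. Periodicity requires each ghost point to hold the value from the opposite interior edge; if the wrap is never applied, the ghost points keep whatever they started with, such as zeros in a freshly accumulated average.

```python
# Sketch only (hypothetical 1-D layout, not the ROMS C-grid): interior
# points are indices 1..N, with one ghost point on each side. A periodic
# field must wrap the ghost points; otherwise they stay at their initial
# value (here 0.0, as in an accumulated average that was never wrapped).

N = 5
field = [0.0] + [10.0, 20.0, 30.0, 40.0, 50.0] + [0.0]  # ghosts are 0.0

def apply_periodic_bc(f):
    f[0] = f[N]      # western ghost <- eastern interior edge
    f[N + 1] = f[1]  # eastern ghost <- western interior edge

apply_periodic_bc(field)
print(field)  # -> [50.0, 10.0, 20.0, 30.0, 40.0, 50.0, 10.0]
```

Skipping `apply_periodic_bc` before output is exactly the symptom described above: zeros at the periodic edges of the averaged fields.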

Introduced new CPP options to write time-averaged fields for the other numerical kernels:

AD_AVERAGES: use if writing out time-averaged adjoint model fields. Used primarily in adjoint sensitivity simulations.

RP_AVERAGES: use if writing out time-averaged representer model fields. Used only in the R4D-Var algorithm (W4DVAR) to save the representer model time-averaged fields for each outer loop in separate files.

TL_AVERAGES: use if writing out time-averaged tangent linear model fields. This will be used in the future to load averages of the TLM kernel.

Notice that only one of these CPP options can be activated at a time. They are mutually exclusive! This is because the same internal arrays (declared in mod_averages.F) are used to store the accumulated field sums. The I4D-Var and 4D-PSAS algorithms (IS4DVAR and W4DPSAS, respectively) will write out the time-averaged and diagnostic fields for each outer loop in separate files if the AVERAGES and/or DIAGNOSTICS_TS/DIAGNOSTICS_UV options are activated. Many thanks to Andy Moore for suggesting these useful capabilities for the 4D-Var data assimilation algorithms.
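The mutual-exclusivity rule can be expressed as a simple check. The option names come from the text above; the check itself is an illustrative Python sketch, not ROMS code (in ROMS this is enforced at the C-preprocessor level).

```python
# Sketch only: at most one of the averaged-output options may be active,
# because all three share the same accumulation arrays (mod_averages.F).

def check_exclusive(ad_averages, rp_averages, tl_averages):
    """Raise if more than one averaged-output option is activated."""
    active = sum([ad_averages, rp_averages, tl_averages])
    if active > 1:
        raise ValueError(
            "AD_AVERAGES, RP_AVERAGES, and TL_AVERAGES are mutually exclusive")
    return active

check_exclusive(True, False, False)    # fine: one option active
# check_exclusive(True, True, False)   # would raise ValueError
```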

Changed the default NetCDF creation mode (CMODE) in mod_netcdf.F to allow large file support.
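Large file support means ORing the 64-bit-offset flag into the creation mode, i.e. something along the lines of `CMODE = ior(nf90_clobber, nf90_64bit_offset)` in Fortran. The sketch below shows the flag arithmetic in Python; the constant values are taken from the NetCDF C library headers and are stated here as assumptions, not read from the ROMS source.

```python
# Flag-arithmetic sketch (not ROMS code). Constant values assumed from the
# NetCDF C library (NC_CLOBBER = 0, NC_64BIT_OFFSET = 0x0200). ORing in
# the 64-bit-offset flag selects the large-file (CDF-2) format, lifting
# the 2 GiB limit of the classic NetCDF format.

NF90_CLOBBER = 0x0000        # overwrite an existing file
NF90_64BIT_OFFSET = 0x0200   # 64-bit offset (large file) format

CMODE = NF90_CLOBBER | NF90_64BIT_OFFSET  # Fortran: ior(...)
print(hex(CMODE))  # -> 0x200
```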

The parallel I/O is still broken with the newer versions of the NetCDF/HDF5 libraries; it works with NetCDF version 4.1.1. The parameter nf90_mpiio is a newer parameter that is not available in older versions of the NetCDF library. Correcting the parallel I/O in ROMS with the latest version of the NetCDF library is on my to-do list. Please be patient.
