Sample records for years global estimates

Full Text Available This study examines the suitability of 250 m MODIS (MODerate Resolution Imaging Spectroradiometer) data for mapping global cropland extent. A set of 39 multi-year MODIS metrics incorporating four MODIS land bands, NDVI (Normalized Difference Vegetation Index) and thermal data was employed to depict cropland phenology over the study period. Sub-pixel training datasets were used to generate a set of global classification tree models using a bagging methodology, resulting in a global per-pixel cropland probability layer. This product was subsequently thresholded to create a discrete cropland/non-cropland indicator map using data from the USDA-FAS (Foreign Agricultural Service) Production, Supply and Distribution (PSD) database describing per-country acreage of production field crops. Five global land cover products, four of which attempted to map croplands in the context of multiclass land cover classifications, were subsequently used to perform regional evaluations of the global MODIS cropland extent map. The global probability layer was further examined with reference to four principal global food crops: corn, soybeans, wheat and rice. Overall results indicate that the MODIS layer best depicts regions of intensive broadleaf crop production (corn and soybean), both in correspondence with existing maps and in associated high probability matching thresholds. Probability thresholds for wheat-growing regions were lower, while areas of rice production had the lowest associated confidence. Regions absent of agricultural intensification, such as Africa, are poorly characterized regardless of crop type. The results reflect the value of MODIS as a generic global cropland indicator for intensive agriculture production regions, but with little sensitivity in areas of low agricultural intensification. Variability in mapping accuracies between areas dominated by different crop types also points to the desirability of a crop-specific approach rather than a single generic one.
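The thresholding step described above can be sketched as a search for the probability cutoff at which the mapped cropland area best matches a country's reported acreage. Everything in this sketch is synthetic: the probabilities, the reported acreage, and the pixel count are illustrative stand-ins, not the study's data.

```python
import numpy as np

# Pick the per-pixel probability cutoff whose mapped cropland area best
# matches a (hypothetical) reported PSD acreage. A 250 m pixel covers
# 250 * 250 m^2 = 6.25 ha.
rng = np.random.default_rng(7)
prob = rng.uniform(0.0, 1.0, 10_000)   # synthetic per-pixel probabilities
pixel_area_ha = 6.25                   # ha per 250 m pixel
reported_area_ha = 31_250.0            # hypothetical country acreage

thresholds = np.linspace(0.0, 1.0, 101)
mapped = np.array([(prob >= t).sum() * pixel_area_ha for t in thresholds])
best_threshold = thresholds[np.argmin(np.abs(mapped - reported_area_ha))]
```

With uniform synthetic probabilities and a reported area equal to half the total pixel area, the recovered cutoff sits near 0.5; with real probability layers the cutoff reflects how confidently croplands are mapped in that country.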

Full Text Available Leptospirosis, a spirochaetal zoonosis, occurs in diverse epidemiological settings and affects vulnerable populations, such as rural subsistence farmers and urban slum dwellers. Although leptospirosis can cause life-threatening disease, no global burden of disease estimate in terms of Disability-Adjusted Life Years (DALYs) is available. We utilised the results of a parallel publication that reported global estimates of morbidity and mortality due to leptospirosis. We estimated Years of Life Lost (YLLs) from age- and gender-stratified mortality rates. Years of Life with Disability (YLDs) were developed from a simple disease model indicating likely sequelae. DALYs were estimated as the sum of YLLs and YLDs. The study suggested that globally approximately 2.90 million DALYs are lost per annum (UIs 1.25-4.54 million) from the approximately 1.03 million annual cases reported previously. Males are predominantly affected, with an estimated 2.33 million DALYs (UIs 0.98-3.69), or approximately 80% of the total burden. For comparison, this is over 70% of the global burden of cholera estimated by GBD 2010. Tropical regions of South and South-east Asia, the Western Pacific, Central and South America, and Africa had the highest estimated leptospirosis disease burden. Leptospirosis imparts a significant health burden worldwide, which approaches or exceeds those encountered for a number of other zoonotic and neglected tropical diseases. The study findings indicate that the highest burden estimates occur in resource-poor tropical countries, including regions of Africa where the burden of leptospirosis has been under-appreciated and possibly misallocated to other febrile illnesses such as malaria.
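The DALY arithmetic used here is additive; a minimal sketch using the abstract's point estimates (only the two totals come from the abstract — everything else is illustrative):

```python
# DALYs are the sum of years of life lost to premature mortality (YLLs)
# and years lived with disability (YLDs).
def dalys(ylls: float, ylds: float) -> float:
    return ylls + ylds

total_dalys = 2.90e6   # global DALYs lost per annum (abstract's estimate)
male_dalys = 2.33e6    # DALYs in males (abstract's estimate)
male_share = male_dalys / total_dalys   # ~0.80, matching the reported ~80%
```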

Owing to human activity, global nitrogen (N) cycles have been altered. In the past 100 years, global N deposition has increased. Currently, the monitoring and estimation of N deposition and the evaluation of its effects on global carbon budgets are the focus of many researchers. NO2 columns retrieved by space-borne sensors provide a new way of exploring global N cycles and can be used to estimate N deposition. However, the limited time range of the NO2 column record makes the estimation of N deposition over long timescales difficult. In this study we used ground-based NOx emission data to extend the NO2 column record, and 40 years of N deposition (1970–2009) were inverted using a multivariate linear model with the expanded NO2 columns. The dynamics of N deposition were examined at both global and biome scales. The results show that the average N deposition was 0.34 g N m-2 year-1 in the 2000s, an increase of 38.4% compared with the 1970s. The total N deposition in different biomes is unbalanced. Forest biomes receive only 38.0% of the global total: 25.9%, 11.3%, and 0.7% in tropical, temperate, and boreal forests, respectively. Boreal forests, which are N-limited, saw little increase in N deposition. However, N deposition increased by a total of 59.6% in tropical forests and croplands, which are N-rich biomes. Such characteristics may influence the effects of N deposition on global carbon budgets.
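The inversion step can be sketched as an ordinary least-squares fit linking NO2 columns (plus another predictor) to observed deposition, which is then applied to the extended column record to hindcast earlier years. All variables, coefficients, and data below are synthetic placeholders, not the study's.

```python
import numpy as np

# Fit a multivariate linear model: deposition ~ NO2 column + other predictor,
# then hindcast deposition for a year covered only by the expanded columns.
rng = np.random.default_rng(0)
no2 = rng.uniform(1.0, 10.0, 30)     # NO2 columns (arbitrary units)
other = rng.uniform(0.0, 1.0, 30)    # e.g. a hypothetical precipitation index
dep = 0.03 * no2 + 0.1 * other + rng.normal(0.0, 0.01, 30)  # synthetic truth

X = np.column_stack([no2, other, np.ones_like(no2)])
coef, *_ = np.linalg.lstsq(X, dep, rcond=None)

hindcast = coef @ np.array([5.0, 0.5, 1.0])  # deposition for a past year
```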

In "Global health 2035: a world converging within a generation," The Lancet Commission on Investing in Health (CIH) adds the value of increased life expectancy to the value of growth in gross domestic product (GDP) when assessing national well-being. To value changes in life expectancy, the CIH relies on several strong assumptions to bridge gaps in the empirical research. It finds that the value of a life year (VLY) averages 2.3 times GDP per capita for low- and middle-income countries (LMICs) assuming the changes in life expectancy they experienced from 2000 to 2011 are permanent. The CIH VLY estimate is based on a specific shift in population life expectancy and includes a 50 percent reduction for children ages 0 through 4. We investigate the sensitivity of this estimate to the underlying assumptions, including the effects of income, age, and life expectancy, and the sequencing of the calculations. We find that reasonable alternative assumptions regarding the effects of income, age, and life expectancy may reduce the VLY estimates to 0.2 to 2.1 times GDP per capita for LMICs. Removing the reduction for young children increases the VLY, while reversing the sequencing of the calculations reduces the VLY. Because the VLY is sensitive to the underlying assumptions, analysts interested in applying this approach elsewhere must tailor the estimates to the impacts of the intervention and the characteristics of the affected population. Analysts should test the sensitivity of their conclusions to reasonable alternative assumptions. More work is needed to investigate options for improving the approach.

Full Text Available Rotavirus is a leading cause of diarrhoeal mortality in children, but there is considerable disagreement about how many deaths occur each year. We compared CHERG, GBD and WHO/CDC estimates of under-5 (U5) rotavirus deaths at the global, regional and national level using a standard year (2013) and a standard list of 186 countries. The global estimates were 157,398 (CHERG), 122,322 (GBD) and 215,757 (WHO/CDC). The three groups used different methods: (i) to select data points for rotavirus-positive proportions; (ii) to extrapolate data points to individual countries; (iii) to account for rotavirus vaccine coverage; (iv) to convert rotavirus-positive proportions to rotavirus-attributable fractions; and (v) to calculate uncertainty ranges. We conducted new analyses to inform future estimates. We found that acute watery diarrhoea was associated with 87% (95% CI 83-90%) of U5 diarrhoea hospitalisations, based on data from 84 hospital sites in 9 countries, and 65% (95% CI 57-74%) of U5 diarrhoea deaths, based on verbal autopsy reports from 9 country sites. We reanalysed data from the Global Enteric Multicenter Study (GEMS) and found 44% (55% in Asia and 32% in Africa) rotavirus-positivity among U5 acute watery diarrhoea hospitalisations, and 28% rotavirus-positivity among U5 acute watery diarrhoea deaths. 97% (95% CI 95-98%) of the U5 diarrhoea hospitalisations that tested positive for rotavirus were entirely attributable to rotavirus. For all clinical syndromes combined, the rotavirus-attributable fraction was 34% (95% CI 31-36%). This increased by a factor of 1.08 (95% CI 1.02-1.14) when the GEMS results were reanalysed using a more sensitive molecular test. We developed consensus on seven proposals for improving the quality and transparency of future rotavirus mortality estimates.
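The fractions reported here chain multiplicatively when converting all-cause U5 diarrhoea deaths into rotavirus deaths. A sketch of that chain, with a hypothetical starting death count (the two fractions are the abstract's; the input is not):

```python
# Chain the abstract's fractions: share of U5 diarrhoea deaths that are
# acute watery diarrhoea, then rotavirus positivity among those deaths.
u5_diarrhoea_deaths = 500_000   # hypothetical global input
watery_fraction = 0.65          # verbal-autopsy estimate from the abstract
rota_positive = 0.28            # GEMS reanalysis estimate from the abstract

rota_deaths = u5_diarrhoea_deaths * watery_fraction * rota_positive
```

Differences between the CHERG, GBD and WHO/CDC totals arise largely because each group plugs different values into links of this chain.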

Exposure to ambient air pollution increases morbidity and mortality, and is a leading contributor to global disease burden. We explored spatial and temporal trends in mortality and burden of disease attributable to ambient air pollution from 1990 to 2015 at global, regional, and country levels. We estimated global population-weighted mean concentrations of particle mass with aerodynamic diameter less than 2·5 μm (PM2·5) and ozone at an approximate 11 km × 11 km resolution with satellite-based estimates, chemical transport models, and ground-level measurements. Using integrated exposure-response functions for each cause of death, we estimated the relative risk of mortality from ischaemic heart disease, cerebrovascular disease, chronic obstructive pulmonary disease, lung cancer, and lower respiratory infections from epidemiological studies, using non-linear exposure-response functions spanning the global range of exposure. Ambient PM2·5 was the fifth-ranking mortality risk factor in 2015. Exposure to PM2·5 caused 4·2 million (95% uncertainty interval [UI] 3·7 million to 4·8 million) deaths and 103·1 million (90·8 million to 115·1 million) disability-adjusted life-years (DALYs) in 2015, representing 7·6% of total global deaths and 4·2% of global DALYs, with 59% of these in east and south Asia. Deaths attributable to ambient PM2·5 increased from 3·5 million (95% UI 3·0 million to 4·0 million) in 1990 to 4·2 million (3·7 million to 4·8 million) in 2015. Exposure to ozone caused an additional 254 000 (95% UI 97 000-422 000) deaths and a loss of 4·1 million (1·6 million to 6·8 million) DALYs from chronic obstructive pulmonary disease in 2015. Ambient air pollution contributed substantially to the global burden of disease in 2015, a burden that has increased over the past 25 years owing to population ageing, changes in non-communicable disease rates, and increasing air pollution in low-income and middle-income countries. Modest reductions in burden will require substantial reductions in exposure.

This paper introduces a new bias-reducing method for kernel hazard estimation. The method is called global polynomial adjustment (GPA). It is a global correction which is applicable to any kernel hazard estimator. The estimator works well from a theoretical point of view, as it asymptotically reduces the bias.

National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...

In this study, we have developed a time series of global temperature for 1980-97 based on Microwave Sounding Unit (MSU) Ch 2 (53.74 GHz) observations taken from polar-orbiting NOAA operational satellites. In order to create this time series, systematic errors (approx. 0.1 K) in the Ch 2 data arising from inter-satellite differences are removed objectively. Smaller systematic errors (approx. 0.03 K) in the data due to the orbital drift of each satellite, however, cannot be removed objectively. Such errors are expected to remain in the time series and leave an uncertainty in the inferred global temperature trend. With the help of a statistical method, the error in the MSU-inferred global temperature trend resulting from orbital drifts and residual inter-satellite differences of all satellites is estimated to be 0.06 K/decade. Incorporating this error, our analysis shows that the global temperature increased at a rate of 0.13 +/- 0.06 K/decade during 1980-97.
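The trend calculation reduces to a straight-line fit to the anomaly series, with the slope reported per decade. A sketch on a synthetic anomaly series (the true slope and noise level below are chosen to resemble, not reproduce, the paper's numbers):

```python
import numpy as np

# Fit a line to synthetic annual global temperature anomalies and
# express the slope in K/decade, mirroring the 0.13 K/decade figure.
years = np.arange(1980, 1998)
rng = np.random.default_rng(1)
anoms = 0.013 * (years - 1980) + rng.normal(0.0, 0.05, years.size)

slope, intercept = np.polyfit(years, anoms, 1)
trend_per_decade = slope * 10.0   # K/decade
```

The residual scatter of the fit is what, together with the systematic-error budget, sets the +/- uncertainty on the trend.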

Background: Disability-adjusted life years (DALYs) link data on disease occurrence to health outcomes, and they are a useful aid in establishing country-specific agendas regarding cancer control. The variables required to compute DALYs are, however, multiple and not readily available in many countries.

The problem of oceanographic state estimation, by means of an ocean general circulation model (GCM) and a multitude of observations, is described and contrasted with the meteorological process of data assimilation. In practice, all such methods reduce, on the computer, to forms of least-squares. The global oceanographic problem is at the present time focussed primarily on smoothing, rather than forecasting, and the data types are unlike meteorological ones. As formulated in the consortium Estimating the Circulation and Climate of the Ocean (ECCO), an automatic differentiation tool is used to calculate the so-called adjoint code of the GCM, and the method of Lagrange multipliers used to render the problem one of unconstrained least-squares minimization. Major problems today lie less with the numerical algorithms (least-squares problems can be solved by many means) than with the issues of data and model error. Results of ongoing calculations covering the period of the World Ocean Circulation Experiment, and including among other data, satellite altimetry from TOPEX/POSEIDON, Jason-1, ERS-1/2, ENVISAT, and GFO, a global array of profiling floats from the Argo program, and satellite gravity data from the GRACE mission, suggest that the solutions are now useful for scientific purposes. Both methodology and applications are developing in a number of different directions.

Dry deposition at the Earth's surface is an important sink of atmospheric ozone. Currently, dry deposition of ozone to the ocean surface in atmospheric chemistry models has the largest uncertainty compared to deposition to other surface types, with implications for the global tropospheric ozone budget and the associated radiative forcing. Most global models assume that the dominant term of surface resistance in the parameterisation of ozone dry deposition velocity at the oceanic surface is constant. There have been recent mechanistic parameterisations for air-sea exchange that account for the simultaneous waterside processes of ozone solubility, molecular diffusion, turbulent transfer, and first-order chemical reaction of ozone with dissolved iodide and other compounds, but there are questions about their performance and consistency. We present a new two-layer parameterisation scheme for the oceanic surface resistance by making the following realistic assumptions: (a) the thickness of the top water layer is of the order of a reaction-diffusion length scale (a few micrometres), within which ozone loss is dominated by chemical reaction and the influence of waterside turbulent transfer is negligible; (b) in the water layer below, both chemical reaction and waterside turbulent transfer act together and are accounted for; and (c) chemical reactivity is present through the depth of the oceanic mixing layer. The new parameterisation has been evaluated against dry deposition velocities from recent open-ocean measurements. It is found that the inclusion of only the aqueous iodide-ozone reaction satisfactorily describes the measurements. In order to better quantify the global dry deposition loss and its interannual variability, modelled 3-hourly ozone deposition velocities are combined with 3-hourly MACC (Monitoring Atmospheric Composition and Climate) reanalysis ozone for the years 2003-2012. The resulting ozone dry deposition to the ocean is found to be 98.4 ± 30.0 Tg O3 yr-1.
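The reaction-diffusion scaling invoked in assumption (a) can be illustrated numerically: the thin-layer thickness goes as sqrt(D/a) and the waterside uptake velocity for the reactive-only limit goes as alpha·sqrt(a·D). The parameter values below are order-of-magnitude illustrations (typical literature ranges), not the paper's fitted values.

```python
import math

# Reactive-uptake scaling for ozone at the sea surface.
alpha = 0.25    # dimensionless ozone solubility (Henry constant, ~ this order)
D = 1.8e-9      # ozone molecular diffusivity in water, m^2/s
a = 100.0       # pseudo-first-order O3 + iodide reaction rate, 1/s (illustrative)

reaction_diffusion_length = math.sqrt(D / a)   # m; a few micrometres
v_water = alpha * math.sqrt(a * D)             # m/s; waterside uptake velocity
```

With these values the length scale comes out at roughly 4 micrometres, consistent with the "few micrometres" top layer assumed in the scheme.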

require the further re-examination of inter-mission consistency issues. Here we present an assessment of these recent improvements to the accuracy of the 17-year sea surface height time series, and evaluate the subsequent impact on global and regional mean sea level estimates.

Density estimation employed in multi-pass global illumination algorithms gives cause to a trade-off problem between bias and noise. The problem is seen most evident as blurring of strong illumination features. This thesis addresses the problem, presenting four methods that reduce both noise...

Urban material resource requirements are significant at the global level and are expected to expand with future urban population growth. However, there are no global-scale studies on the future material consumption of urban areas. This paper provides estimates of global urban domestic material consumption (DMC) in 2050 using three approaches based on: current gross statistics; a regression model; and a transition theoretic logistic model. All methods use UN urban population projections and assume a simple 'business-as-usual' scenario wherein historical aggregate trends in income and material flow continue into the future. A collation of data for 152 cities provided a year 2000 world average DMC/capita estimate, 12 tons/person/year (±22%), which we combined with UN population projections to produce a first-order estimate of urban DMC at 2050 of ~73 billion tons/year (±22%). Urban DMC/capita was found to be significantly correlated (R² > 0.9) to urban GDP/capita and area per person through a power-law relation used to obtain a second estimate of 106 billion tons (±33%) in 2050. The inelastic exponent of the power law indicates a global tendency for relative decoupling of direct urban material consumption with increasing income. These estimates are global and influenced by the current proportion of developed-world cities in the global population of cities (and in our sample data). A third method employed a logistic model of transitions in urban DMC/capita with regional resolution. This method estimated global urban DMC to rise from approximately 40 billion tons/year in 2010 to ~90 billion tons/year in 2050 (modelled range: 66–111 billion tons/year). DMC/capita across different regions was estimated to converge from a range of 5–27 tons/person/year in the year 2000 to around 8–17 tons/person/year in 2050. The urban population does not increase proportionally during this period and thus the global average DMC/capita increases from ~12 to ~14 tons/person/year.
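The "inelastic exponent" point can be made concrete with a toy power law: when the exponent is below one, doubling income less than doubles material use (relative decoupling). The coefficient and exponent below are hypothetical stand-ins, not the paper's fitted values.

```python
# Power-law DMC/capita model: dmc = k * gdp_pc**beta, with beta < 1
# encoding relative decoupling. k and beta are illustrative only.
def dmc_per_capita(gdp_pc: float, k: float = 0.8, beta: float = 0.6) -> float:
    return k * gdp_pc ** beta   # tons/person/year

# Doubling income multiplies material use by 2**beta < 2:
ratio = dmc_per_capita(20_000.0) / dmc_per_capita(10_000.0)  # = 2**0.6
```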

Empirical Models for the Estimation of Global Solar Radiation in Yola, Nigeria. ... and average daily wind speed (WS) for a three-year interval (2010–2012), measured using various instruments at Yola, with recorded data collected from the Center for Atmospheric Research (CAR), Anyigba, are presented and analyzed.

The goal of the first phase of the NASA Energy and Water Cycle Study (NEWS) Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. Here we describe results of the water cycle assessment, including mean annual and monthly fluxes over continents and ocean basins during the first decade of the millennium. To the extent possible, the water flux estimates are based on (1) satellite measurements and (2) data-integrating models. A careful accounting of uncertainty in each flux was applied within a routine that enforced multiple water and energy budget constraints simultaneously in a variational framework, in order to produce objectively-determined, optimized estimates. Simultaneous closure of the water and energy budgets caused the ocean evaporation and precipitation terms to increase by about 10% and 5% relative to the original estimates, mainly because the energy budget required turbulent heat fluxes to be substantially larger in order to balance net radiation. In the majority of cases, the observed annual, surface and atmospheric water budgets over the continents and oceans close with much less than 10% residual. Observed residuals and optimized uncertainty estimates are considerably larger for monthly surface and atmospheric water budget closure, often nearing or exceeding 20% in North America, Eurasia, Australia and neighboring islands, and the Arctic and South Atlantic Oceans. The residuals in South America and Africa tend to be smaller, possibly because cold land processes are a non-issue. Fluxes are poorly observed over the Arctic Ocean, certain seas, Antarctica, and the Australasian and Indonesian Islands, leading to reliance on atmospheric analysis estimates. Other details of the study and future directions will be discussed.
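The variational closure step described above can be sketched as a weighted least-squares adjustment: nudge each flux toward budget closure in proportion to its uncertainty, enforcing the constraint with a Lagrange multiplier. The fluxes, uncertainties, and single-constraint setup below are a toy illustration, not the study's actual budget terms.

```python
import numpy as np

# Toy closure: adjust P, E, R so that P - E - R = 0, minimising the
# uncertainty-weighted squared departure from the original estimates.
obs = np.array([1.00, 0.60, 0.45])     # P, E, R (arbitrary units)
sigma = np.array([0.05, 0.10, 0.05])   # 1-sigma uncertainties
A = np.array([1.0, -1.0, -1.0])        # closure constraint: A @ x = 0

# Closed-form Lagrange-multiplier solution of
#   min sum(((x - obs) / sigma)**2)  subject to  A @ x = 0:
w = sigma ** 2 * A
x = obs - w * (A @ obs) / (A @ w)
```

By construction the adjusted fluxes close the budget exactly, and the largest adjustment lands on the flux with the largest stated uncertainty (here E), which mirrors how the optimisation shifted ocean evaporation and precipitation in the study.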

Full Text Available Sampling from a Boltzmann distribution is NP-hard and so requires heuristic approaches. Quantum annealing is one promising candidate. The failure of annealing dynamics to equilibrate on practical time scales is a well understood limitation, but does not always prevent a heuristically useful distribution from being generated. In this paper we evaluate several methods for determining a useful operational temperature range for annealers. We show that, even where distributions deviate from the Boltzmann distribution due to ergodicity breaking, these estimates can be useful. We introduce the concepts of local and global temperatures that are captured by different estimation methods. We argue that for practical application it often makes sense to analyze annealers that are subject to post-processing in order to isolate the macroscopic distribution deviations that are a practical barrier to their application.

The U.S. Congress has determined that the safe use of nuclear materials for peaceful purposes is a legitimate and important national goal. It has entrusted the Nuclear Regulatory Commission (NRC) with the primary Federal responsibility for achieving that goal. The NRC's mission, therefore, is to regulate the Nation's civilian use of byproduct, source, and special nuclear materials to ensure adequate protection of public health and safety, to promote the common defense and security, and to protect the environment. The NRC's FY 1998 budget requests new budget authority of $481,300,000 to be funded by two appropriations - one is the NRC's Salaries and Expenses appropriation for $476,500,000, and the other is NRC's Office of Inspector General appropriation for $4,800,000. Of the funds appropriated to the NRC's Salaries and Expenses, $17,000,000 shall be derived from the Nuclear Waste Fund and $2,000,000 shall be derived from general funds. The proposed FY 1998 appropriation legislation would also exempt the $2,000,000 for regulatory reviews and other assistance provided to the Department of Energy from the requirement that the NRC collect 100 percent of its budget from fees. The sums appropriated to the NRC's Salaries and Expenses and NRC's Office of Inspector General shall be reduced by the amount of revenues received during FY 1998 from licensing fees, inspection services, and other services and collections, so as to result in a final FY 1998 appropriation for the NRC of an estimated $19,000,000 - the amount appropriated from the Nuclear Waste Fund and from general funds. Revenues derived from enforcement actions shall be deposited to miscellaneous receipts of the Treasury.

The International Thermodynamic Equation of Seawater - 2010 has defined the thermodynamic properties of seawater in terms of a new salinity variable, Absolute Salinity, which takes into account the spatial variation of the composition of seawater. Absolute Salinity more accurately reflects the effects of the dissolved material in seawater on the thermodynamic properties (particularly density) than does Practical Salinity. When a seawater sample has standard composition (i.e. the ratios of the constituents of sea salt are the same as those of surface water of the North Atlantic), Practical Salinity can be used to accurately evaluate the thermodynamic properties of seawater. When seawater is not of standard composition, Practical Salinity alone is not sufficient and the Absolute Salinity Anomaly needs to be estimated; this anomaly is as large as 0.025 g kg-1 in the northernmost North Pacific. Here we provide an algorithm for estimating Absolute Salinity Anomaly for any location (x, y, p) in the world ocean. To develop this algorithm, we used the Absolute Salinity Anomaly that is found by comparing the density calculated from Practical Salinity to the density measured in the laboratory. These estimates of Absolute Salinity Anomaly however are limited to the number of available observations (namely 811). In order to provide a practical method that can be used at any location in the world ocean, we take advantage of approximate relationships between Absolute Salinity Anomaly and silicate concentrations (which are available globally).
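The construction described here can be sketched as a reference conversion from Practical Salinity plus a regionally estimated anomaly. The reference factor 35.16504/35 is the TEOS-10 conversion; the silicate coefficient below is a hypothetical stand-in for the fitted relationship, not the paper's value.

```python
# Absolute Salinity = reference-composition conversion of Practical
# Salinity plus the Absolute Salinity Anomaly (g/kg).
UPS = 35.16504 / 35.0   # TEOS-10 g/kg per unit Practical Salinity

def absolute_salinity(sp: float, delta_sa: float) -> float:
    return sp * UPS + delta_sa   # g/kg

def delta_sa_from_silicate(si_umol_kg: float, coeff: float = 1e-4) -> float:
    # coeff is an illustrative placeholder for the fitted proportionality.
    return coeff * si_umol_kg    # g/kg

sa = absolute_salinity(34.5, delta_sa_from_silicate(150.0))
```

With the placeholder coefficient, a North Pacific silicate concentration of 150 μmol/kg yields an anomaly of 0.015 g/kg, within the 0.025 g/kg maximum the abstract reports.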

This report contains the fiscal year budget justification to Congress. The budget provides estimates for salaries and expenses and for the Office of the Inspector General for fiscal years 1994 and 1995.

Full Text Available For the past 50 years, the public has been made to feel guilty about the tragedy of human-caused biodiversity loss due to the extinction of hundreds or thousands of species every year. Numerous articles and books from the scientific and popular press and publicity on the internet have contributed to a propaganda wave about our grievous loss and the beginning of a sixth mass extinction. However, within the past few years, questions have arisen about the validity of the data which led to the doom scenario. Here I show that, for the past 500 years, terrestrial animals (insects and vertebrates) have been lost to human causes at a rate of fewer than two species per year. The majority of the extinctions have occurred on oceanic islands with little effect on continental ecology. In the marine environment, losses have also been very low. At the same time, speciation has continued to occur, and biodiversity gain by this means may have equaled or even surpassed the losses. While species loss is not, so far, a global conservation problem, ongoing population declines within thousands of species that are at risk on land and in the sea constitute an extinction debt that will be paid unless those species can be rescued.

National Aeronautics and Space Administration — The Global Estimated Net Migration by Decade: 1970-2000 data set provides estimates of net migration over the three decades from 1970 to 2000. Because of the lack of...

Degenerate parabolic operators have received increasing attention in recent years because they are associated with both important theoretical analysis, such as stochastic diffusion processes, and interesting applications to engineering, physics, biology, and economics. This manuscript has been conceived to introduce the reader to global Carleman estimates for a class of parabolic operators which may degenerate at the boundary of the space domain, in the normal direction to the boundary. Such a kind of degeneracy is relevant to study the invariance of a domain with respect to a given stochastic diffusion flow, and appears naturally in climatology models.

Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given reference temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a globally constant BR, largely due to the lack of a functional description for BR. In this study, we redefined BR to be the ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data from 79 research sites located at latitudes ranging from ∼3°S to ∼70°N. Results showed that the mean annual ER rate closely matches the ER rate at the mean annual temperature. Incorporating site-specific BR into a global ER model substantially improved simulated ER compared to using an invariant BR at all sites. These results confirm that ER at the mean annual temperature can be considered as BR in empirical models. A strong correlation was found between mean annual ER and mean annual gross primary production (GPP). Consequently, GPP, which is typically modeled more accurately, can be used to estimate BR. A light use efficiency GPP model (EC-LUE) was applied to estimate global GPP, BR and ER with input data from MERRA (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate Resolution Imaging Spectroradiometer). The global ER was 103 Pg C yr−1, with the highest respiration rates over tropical forests and the lowest values in dry and high-latitude areas.
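The redefinition can be written down as an empirical temperature-response model referenced to the mean annual temperature (MAT): when T equals MAT, the model returns BR by construction. A Q10-type response is used here as a common illustrative choice; the paper's exact functional form and Q10 value are not assumed.

```python
# Empirical ER sketch with BR defined as ER at the mean annual
# temperature. Q10 = 2.0 is a conventional illustrative value.
def ecosystem_respiration(br: float, temp_c: float, mat_c: float,
                          q10: float = 2.0) -> float:
    return br * q10 ** ((temp_c - mat_c) / 10.0)

er_at_mat = ecosystem_respiration(3.0, 10.0, 10.0)   # returns BR: 3.0
```

Referencing the exponent to MAT rather than a fixed temperature is exactly what lets a site's mean annual ER stand in for its BR.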

Microwave Sounding Unit (MSU) Ch 2 data sets, collected from sequential, polar-orbiting, Sun-synchronous National Oceanic and Atmospheric Administration operational satellites, contain systematic calibration errors that are coupled to the diurnal temperature cycle over the globe. Since these coupled errors in MSU data differ between successive satellites, it is necessary to make compensatory adjustments to these multisatellite data sets in order to determine long-term global temperature change. With the aid of the observations during overlapping periods of successive satellites, we can determine such adjustments and use them to account for the coupled errors in the long-term time series of MSU Ch 2 global temperature. In turn, these adjusted MSU Ch 2 data sets can be used to yield a global temperature trend. In a pioneering study, Spencer and Christy (SC) (1990) developed a procedure to derive the global temperature trend from MSU Ch 2 data. That procedure can leave unaccounted residual errors in the time series of the temperature anomalies, which could lead to a spurious long-term temperature trend. In the present study, we have developed a method that avoids the shortcomings of the SC procedure: the magnitude of the coupled errors is not determined explicitly; instead, based on some assumptions, these errors are eliminated in three separate steps. Based on our analysis, we find a global warming of 0.23+/-0.12 K between 1980 and 1991. Also, in this study, the time series of global temperature anomalies constructed by removing the global mean annual temperature cycle compares favorably with a similar
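The core of the overlap adjustment can be sketched in a few lines: estimate the inter-satellite offset from the months both instruments observe, then remove it before merging. The synthetic series, offset value, and noise levels below are assumptions for illustration only, not MSU data:

```python
import numpy as np

# Two satellites observe the same slowly varying anomaly; satellite B
# carries a constant calibration offset (0.35 K, assumed).
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0, 0.02, 120))           # "true" anomaly, 120 months
sat_a = truth[:80] + rng.normal(0, 0.01, 80)          # months 0-79
sat_b = truth[60:] + 0.35 + rng.normal(0, 0.01, 60)   # months 60-119, offset 0.35 K

# Estimate the offset from the overlap (months 60-79), then merge.
offset = np.mean(sat_b[:20] - sat_a[60:80])
merged = np.concatenate([sat_a, sat_b[20:] - offset])

print(round(offset, 2))  # recovers a value close to the true 0.35 K offset
```

Chaining such adjustments across successive satellite pairs is what allows a single long-term time series to be assembled.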

This paper presents a new estimator of the global regularity index of a multifractional Brownian motion. Our estimation method is based upon a ratio statistic, which compares the realized global quadratic variation of a multifractional Brownian motion at two different frequencies. We show that a logarithmic transformation of this statistic converges in probability to the minimum of the Hurst functional parameter, which is, under weak assumptions, identical to the global regularity index of the path.
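The ratio idea can be demonstrated numerically: for a path with Hurst index H sampled at step 1/n, the realized quadratic variation scales like n^(1-2H), so the ratio of quadratic variations at two frequencies recovers H after a logarithmic transformation. The sketch below uses standard Brownian motion (H = 0.5) because it is easy to simulate; it is an illustration of the scaling, not the paper's estimator:

```python
import numpy as np

# Simulate standard Brownian motion (H = 0.5) on [0, 1] with n steps.
rng = np.random.default_rng(1)
n = 2 ** 16
increments = rng.normal(0.0, np.sqrt(1.0 / n), n)
path = np.concatenate([[0.0], np.cumsum(increments)])

def quad_var(x, step):
    # Realized quadratic variation of the path subsampled at the given step.
    d = x[::step][1:] - x[::step][:-1]
    return np.sum(d * d)

# S_n / S_{n/2} -> 2**(2H - 1), hence H = (1 + log2(S_n / S_{n/2})) / 2.
ratio = quad_var(path, 1) / quad_var(path, 2)
h_hat = 0.5 * (1.0 + np.log2(ratio))
print(round(h_hat, 2))  # close to 0.5 for Brownian motion
```

For a multifractional path the same statistic converges to the minimum of the Hurst function rather than a single H.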

This report contains the Nuclear Regulatory Commission (NRC) fiscal year budget justification to Congress. The budget provides estimates for salaries and expenses and for the Office of the Inspector General for fiscal year 1995. The NRC 1995 budget request is $546,497,000. This is an increase of $11,497,000 above the proposed level for FY 1994. The NRC FY 1995 budget request is 3,218 FTEs. This is a decrease of 75 FTEs below the 1994 proposed level.

Coastal Indigenous peoples rely on ocean resources and are highly vulnerable to ecosystem and economic change. Their challenges have been observed and recognized at local and regional scales, yet there are no global-scale analyses to inform international policies. We compile available data for over 1,900 coastal Indigenous communities around the world representing 27 million people across 87 countries. Based on available data at local and regional levels, we estimate a total global yearly seafood consumption of 2.1 million (1.5 million-2.8 million) metric tonnes by coastal Indigenous peoples, equal to around 2% of global yearly commercial fisheries catch. Results reflect the crucial role of seafood for these communities; on average, consumption per capita is 15 times higher than non-Indigenous country populations. These findings contribute to an urgently needed sense of scale to coastal Indigenous issues, and will hopefully prompt increased recognition and directed research regarding the marine knowledge and resource needs of Indigenous peoples. Marine resources are crucial to the continued existence of coastal Indigenous peoples, and their needs must be explicitly incorporated into management policies.
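The per-capita figure implied by the totals quoted above can be checked with simple arithmetic (values taken directly from the abstract):

```python
# Back-of-envelope check of the abstract's figures.
total_tonnes = 2.1e6      # estimated yearly seafood consumption, metric tonnes
people = 27e6             # coastal Indigenous population covered

per_capita_kg = total_tonnes * 1000 / people
print(round(per_capita_kg))  # roughly 78 kg of seafood per person per year
```

A per-capita consumption near 78 kg/yr is indeed many times the global average, consistent with the "15 times higher" comparison in the abstract.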

This paper analyses the relationship between global solar irradiation and sunshine duration using different estimation models for the island of Gran Canaria (Spain). These parameters were taken from six measurement stations around the island, selected for their reliability and the long period they covered. All data used in this paper were provided by the Canary Islands Technological Institute (I.T.C.). As a first approach, the Angstrom linear model was studied. To improve knowledge of the solar resource, a Typical Meteorological Year (TMY) was created from all daily data. The TMY shows differences between southern and northern locations, where the Trade Winds generate clouds during the summer months. The TMY condenses a data bank much longer than a year into a characteristic one-year series for each location, for both irradiation and sunshine duration; weighted means were used to smooth extreme values. At first, the Angstrom linear model was used to estimate global solar irradiation from sunshine duration values using the TMY, but the linear model did not give satisfactory results when applied to all daily sunshine duration data. For this reason, different models based on both parameters were used. The parameters of these models were estimated both from the TMY daily and monthly series and from all daily data for every location. Because of the stability of the weather throughout the year on the island, most of the daily data are concentrated in a narrow range, biasing the linear fits. To avoid this bias, a limiting condition was proposed, taking into account values outside the main cloud of data. Additionally, different models (quadratic, cubic, logarithmic and exponential) were fitted to all daily data. The best results were obtained with the exponential model proposed in this paper. The
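The Angstrom linear model mentioned above relates the clearness index H/H0 to the relative sunshine duration S/S0 as H/H0 = a + b (S/S0). A minimal least-squares fit can be sketched as follows; the data are synthetic and the coefficient values are typical textbook magnitudes, not Gran Canaria results:

```python
import numpy as np

# Synthetic daily data (assumed, for illustration only).
rng = np.random.default_rng(2)
s_rel = rng.uniform(0.2, 0.9, 200)          # relative sunshine duration S/S0
a_true, b_true = 0.25, 0.50                 # commonly cited magnitudes (assumed)
h_rel = a_true + b_true * s_rel + rng.normal(0, 0.02, 200)  # clearness index H/H0

# Ordinary least squares; np.polyfit returns (slope, intercept) for degree 1.
b, a = np.polyfit(s_rel, h_rel, 1)
print(round(a, 2), round(b, 2))  # recovers values near 0.25 and 0.50
```

When the daily data cluster in a narrow range of S/S0, as on Gran Canaria, such a fit is poorly constrained, which is exactly the bias the paper's limiting condition and nonlinear models address.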

The performance of Sayigh's universal formula for the estimation of global solar radiation is tested against that of the Angstrom-Black model for 13 stations in Ghana, using monthly mean daily global solar radiation averaged over the years 1957-1981. Sayigh's model is found not to perform as credibly as the Angstrom-Black model in the estimation of monthly global solar radiation in Ghana. Of the 156 values of monthly global solar radiation estimated by Sayigh's model, 123 (or 78.8%) had discrepancies of more than 10% from the measured values. The corresponding number for the Angstrom-Black model was 7 (or 4.5%).

The budget estimates for the NRC for fiscal year 1990 provide for obligations of $475,000,000, to be funded in total by two new appropriations---one is NRC's Salaries and Expenses appropriation for $472,100,000 and the other is NRC's Office of the Inspector General appropriation of $2,900,000. Of the funds appropriated to the NRC's Salaries and Expenses, $23,195,000 shall be derived from the Nuclear Waste Fund. The sum appropriated to the NRC's Salaries and Expenses shall be reduced by the amount of revenues received during fiscal year 1990 from licensing fees, inspection services, other services and collections, and from the Nuclear Waste Fund, excluding those moneys received for the cooperative nuclear safety research program, services rendered to foreign governments and international organizations, and the material and information access authorization programs, so as to result in a final fiscal year 1990 appropriation estimated at not more than $292,155,000.

Full Text Available HCFC-22 (CHClF2, chlorodifluoromethane) is an ozone-depleting substance (ODS) as well as a significant greenhouse gas (GHG). HCFC-22 has been used widely as a refrigerant fluid in cooling and air-conditioning equipment since the 1960s, and it has also served as a traditional substitute for some chlorofluorocarbons (CFCs) controlled under the Montreal Protocol. A low-frequency record of tropospheric HCFC-22 since the late 1970s is available from measurements of the Southern Hemisphere Cape Grim Air Archive (CGAA) and a few Northern Hemisphere air samples (mostly from Trinidad Head) using the Advanced Global Atmospheric Gases Experiment (AGAGE) instrumentation and calibrations. Since the 1990s high-frequency, high-precision, in situ HCFC-22 measurements have been collected at these AGAGE stations. Since 1992, the Global Monitoring Division of the National Oceanic and Atmospheric Administration/Earth System Research Laboratory (NOAA/ESRL) has also collected flasks on a weekly basis from remote sites across the globe and analyzed them for a suite of halocarbons including HCFC-22. Additionally, since 2006 flasks have been collected approximately daily at a number of tower sites across the US and analyzed for halocarbons and other gases at NOAA. All results show an increase in the atmospheric mole fractions of HCFC-22, and recent data show a growth rate of approximately 4% per year, resulting in an increase in the background atmospheric mole fraction by a factor of 1.7 from 1995 to 2009. Using data on HCFC-22 consumption submitted to the United Nations Environment Programme (UNEP), as well as existing bottom-up emission estimates, we first create globally-gridded a priori HCFC-22 emissions over the 15 yr since 1995. We then use the three-dimensional chemical transport model, Model for Ozone and Related Chemical Tracers version 4 (MOZART v4), and a Bayesian inverse method to estimate global as well as regional annual emissions. Our inversion indicates

Background Rabies is a notoriously underreported and neglected disease of low-income countries. This study aims to estimate the public health and economic burden of rabies circulating in domestic dog populations, globally and on a country-by-country basis, allowing an objective assessment of how much this preventable disease costs endemic countries. Methodology/Principal Findings We established relationships between rabies mortality and rabies prevention and control measures, which we incorporated into a model framework. We used data derived from extensive literature searches and questionnaires on disease incidence, control interventions and preventative measures within this framework to estimate the disease burden. The burden of rabies impacts on public health sector budgets, local communities and livestock economies, with the highest risk of rabies in the poorest regions of the world. This study estimates that globally canine rabies causes approximately 59,000 (95% Confidence Intervals: 25-159,000) human deaths, over 3.7 million (95% CIs: 1.6-10.4 million) disability-adjusted life years (DALYs) and 8.6 billion USD (95% CIs: 2.9-21.5 billion) economic losses annually. The largest component of the economic burden is due to premature death (55%), followed by direct costs of post-exposure prophylaxis (PEP, 20%) and lost income whilst seeking PEP (15.5%), with only limited costs to the veterinary sector due to dog vaccination (1.5%), and additional costs to communities from livestock losses (6%). Conclusions/Significance This study demonstrates that investment in dog vaccination, the single most effective way of reducing the disease burden, has been inadequate and that the availability and affordability of PEP needs improving. Collaborative investments by medical and veterinary sectors could dramatically reduce the current large, and unnecessary, burden of rabies on affected communities. Improved surveillance is needed to reduce uncertainty in burden estimates and to

Full Text Available Rabies is a notoriously underreported and neglected disease of low-income countries. This study aims to estimate the public health and economic burden of rabies circulating in domestic dog populations, globally and on a country-by-country basis, allowing an objective assessment of how much this preventable disease costs endemic countries. We established relationships between rabies mortality and rabies prevention and control measures, which we incorporated into a model framework. We used data derived from extensive literature searches and questionnaires on disease incidence, control interventions and preventative measures within this framework to estimate the disease burden. The burden of rabies impacts on public health sector budgets, local communities and livestock economies, with the highest risk of rabies in the poorest regions of the world. This study estimates that globally canine rabies causes approximately 59,000 (95% Confidence Intervals: 25-159,000) human deaths, over 3.7 million (95% CIs: 1.6-10.4 million) disability-adjusted life years (DALYs) and 8.6 billion USD (95% CIs: 2.9-21.5 billion) economic losses annually. The largest component of the economic burden is due to premature death (55%), followed by direct costs of post-exposure prophylaxis (PEP, 20%) and lost income whilst seeking PEP (15.5%), with only limited costs to the veterinary sector due to dog vaccination (1.5%), and additional costs to communities from livestock losses (6%). This study demonstrates that investment in dog vaccination, the single most effective way of reducing the disease burden, has been inadequate and that the availability and affordability of PEP needs improving. Collaborative investments by medical and veterinary sectors could dramatically reduce the current large, and unnecessary, burden of rabies on affected communities. Improved surveillance is needed to reduce uncertainty in burden estimates and to monitor the impacts of control efforts.

The simulation of gross primary production (GPP) at various spatial and temporal scales remains a major challenge for quantifying the global carbon cycle. We developed a light use efficiency model, called EC-LUE, driven by only four variables: normalized difference vegetation index (NDVI), photosynthetically active radiation (PAR), air temperature, and the Bowen ratio of sensible to latent heat flux. The EC-LUE model may have the most potential to adequately address the spatial and temporal dynamics of GPP because its parameters (i.e., the potential light use efficiency and optimal plant growth temperature) are invariant across the various land cover types. However, the application of the previous EC-LUE model was hampered by poor prediction of the Bowen ratio at the large spatial scale. In this study, we substituted the Bowen ratio with the ratio of evapotranspiration (ET) to net radiation, and revised the RS-PM (Remote Sensing-Penman Monteith) model for quantifying ET. Fifty-four eddy covariance towers, including various ecosystem types, were selected to calibrate and validate the revised RS-PM and EC-LUE models. The revised RS-PM model explained 82% and 68% of the observed variations of ET for all the calibration and validation sites, respectively. Using estimated ET as input, the EC-LUE model performed well in calibration and validation sites, explaining 75% and 61% of the observed GPP variation for calibration and validation sites respectively. Global patterns of ET and GPP at a spatial resolution of 0.5° latitude by 0.6° longitude during the years 2000–2003 were determined using the global MERRA dataset (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate Resolution Imaging Spectroradiometer). The global estimates of ET and GPP agreed well with the other global models from the literature, with the highest ET and GPP over tropical forests and the lowest values in dry and high latitude areas. However, comparisons with observed

Full Text Available Illegal markets represent a phenomenon of considerable economic, political and social significance, with annual revenues exceeding one trillion USD. Illegal market participants are beyond the reach of government institutions and the rule of law, while social connections and personal acquaintances play the important role of a functional substitute. In the last decade there was a significant increase in illegal trafficking of narcotics, people, firearms, counterfeit products and natural resources. Both the sale and purchase of these and other kinds of products and services at illegal markets are generally characterized by a high level of organization and the presence of strong criminal groups and networks. Although these activities existed in the past, their present scope and geographic distribution are without precedent. Measuring unlawful financial flows at illegal markets is quite a complex task: the divergent estimates result from the absence of a uniform and generally accepted methodology. A further problem is the complicity of market actors, which keeps the phenomenon of illegal markets and the distribution of products and services at these markets largely hidden. The paper defines and analyzes the key features of illegal markets, the role of organized crime at illegal markets, as well as estimates of the value of financial flows at the markets for counterfeit products, narcotics, people as goods (human organs and sexual services), weapons, tobacco products and dirty money.

Background: The public health impact of foodborne diseases globally is unknown. The WHO Initiative to Estimate the Global Burden of Foodborne Diseases was launched out of the need to fill this data gap. It is anticipated that this effort will enable policy makers and other stakeholders to set appropriate, evidence-informed priorities in the area of food safety. Methods: The Initiative aims to provide estimates on the global burden of foodborne diseases by age, sex, and region; strengthen country capacity for conducting burden of foodborne disease assessments in parallel with food safety policy...

Population modelling--forecasting. To estimate the global incidence of traumatic spinal cord injury (TSCI). An initiative of the International Spinal Cord Society (ISCoS) Prevention Committee. Regression techniques were used to derive regional and global estimates of TSCI incidence. Using the findings of 31 published studies, a regression model was fitted using a known number of TSCI cases as the dependent variable and the population at risk as the single independent variable. In the process of deriving TSCI incidence, an alternative TSCI model was specified in an attempt to arrive at an optimal way of estimating the global incidence of TSCI. The global incidence of TSCI was estimated to be 23 cases per 1,000,000 persons in 2007 (179,312 cases per annum). World Health Organization's regional results are provided. Understanding the incidence of TSCI is important for health service planning and for the determination of injury prevention priorities. In the absence of high-quality epidemiological studies of TSCI in each country, the estimation of TSCI obtained through population modelling can be used to overcome known deficits in global spinal cord injury (SCI) data. The incidence of TSCI is context specific, and an alternative regression model demonstrated how TSCI incidence estimates could be improved with additional data. The results highlight the need for data standardisation and comprehensive reporting of national level TSCI data. A step-wise approach from the collation of conventional epidemiological data through to population modelling is suggested.
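The single-predictor regression described above (case counts regressed on population at risk) can be sketched as follows. The study counts, noise level, and world population figure below are synthetic assumptions for illustration, not the ISCoS data:

```python
import numpy as np

# 31 hypothetical studies: population at risk and reported TSCI cases.
rng = np.random.default_rng(3)
population = rng.uniform(1e6, 5e7, 31)
true_rate = 23e-6                                  # 23 cases per million (assumed)
cases = true_rate * population * rng.normal(1.0, 0.1, 31)

# Least-squares regression through the origin: cases = rate * population.
rate_hat = np.sum(population * cases) / np.sum(population ** 2)

# Apply the fitted rate to a total population (approximate 2007 world population).
global_cases = rate_hat * 6.6e9
print(round(rate_hat * 1e6, 1), int(global_cases))
```

Extrapolating a rate fitted on available studies to the whole population at risk is the essence of the population-modelling approach; its accuracy hinges on how representative those studies are.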

We estimated global cyanobacterial biomass in the main reservoirs of cyanobacteria on Earth: marine and freshwater plankton, arid land soil crusts, and endoliths. Estimates were based on typical population density values as measured during our research, or as obtained from literature surveys, which were then coupled with data on global geographical area coverage. Among the marine plankton, the global biomass of Prochlorococcus reaches 120 × 10^12 grams of carbon (g C), and that of Synechococcus some 43 × 10^12 g C. This makes Prochlorococcus and Synechococcus, in that order, the most abundant cyanobacteria on Earth. Tropical marine blooms of Trichodesmium account for an additional 10 × 10^12 g C worldwide. In terrestrial environments, the mass of cyanobacteria in arid land soil crusts is estimated to reach 54 × 10^12 g C and that of arid land endolithic communities an additional 14 × 10^12 g C. The global biomass of planktic cyanobacteria in lakes is estimated to be around 3 × 10^12 g C. Our conservative estimates, which did not include some potentially significant biomass reservoirs such as polar and subarctic areas, topsoils in subhumid climates, and shallow marine and freshwater benthos, indicate that the total global cyanobacterial biomass is in the order of 3 × 10^14 g C, surpassing a thousand million metric tons (10^15 g) of wet biomass.

Global stereo matching algorithms achieve high accuracy in disparity map estimation, but the time consumed in the optimization process remains a major bottleneck, especially for image pairs with high resolution and large baseline settings. To improve the computational efficiency of the global algorithms, a disparity range estimation scheme for global stereo matching is proposed to estimate the disparity map of rectified stereo images in this paper. The projective geometry in a parallel binocular stereo vision system is investigated to reveal a relationship between the two disparities at each pixel in the rectified stereo images under different baselines, which can be used to quickly obtain a predicted disparity map in a long baseline setting from that estimated in the short one. The drastically reduced disparity ranges at each pixel under a long baseline setting can then be determined from the predicted disparity map. Furthermore, the disparity range estimation scheme is introduced into graph cuts with expansion moves to estimate the precise disparity map, which can greatly reduce the cost of computing without loss of accuracy in the stereo matching, especially for dense global stereo matching, compared to the traditional algorithm. Experimental results with the Middlebury stereo datasets are presented to demonstrate the validity and efficiency of the proposed algorithm.
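The baseline relationship underlying the scheme can be made concrete: for a rectified parallel rig, disparity d = f·B/Z, so at the same pixel the disparities under two baselines satisfy d_long/d_short = B_long/B_short. A minimal sketch (focal length and baselines below are assumed values, not from the paper):

```python
# Disparity for a rectified parallel stereo rig: d = f * B / Z.
f = 700.0                        # focal length in pixels (assumed)
b_short, b_long = 0.06, 0.18     # short and long baselines in metres (assumed)

def disparity(depth_m, focal_px, baseline_m):
    return focal_px * baseline_m / depth_m

d_short = disparity(3.0, f, b_short)      # cheap short-baseline estimate
d_pred = d_short * (b_long / b_short)     # predicted long-baseline disparity
d_true = disparity(3.0, f, b_long)        # what the long baseline actually sees
print(d_pred, d_true)                     # both near 42 px at 3 m depth
```

Centering a narrow search window on d_pred at each pixel is what shrinks the label set handed to the graph-cut optimization.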

Full Text Available Climate, land use, and other anthropogenic and natural drivers have the potential to influence fire dynamics in many regions. To develop a mechanistic understanding of the changing role of these drivers and their impact on atmospheric composition, long-term fire records are needed that fuse information from different satellite and in situ data streams. Here we describe the fourth version of the Global Fire Emissions Database (GFED) and quantify global fire emissions patterns during 1997–2016. The modeling system, based on the Carnegie–Ames–Stanford Approach (CASA) biogeochemical model, has several modifications from the previous version and uses higher quality input datasets. Significant upgrades include (1) new burned area estimates with contributions from small fires, (2) a revised fuel consumption parameterization optimized using field observations, (3) modifications that improve the representation of fuel consumption in frequently burning landscapes, and (4) fire severity estimates that better represent continental differences in burning processes across boreal regions of North America and Eurasia. The new version has a higher spatial resolution (0.25°) and uses a different set of emission factors that separately resolves trace gas and aerosol emissions from temperate and boreal forest ecosystems. Global mean carbon emissions using the burned area dataset with small fires (GFED4s) were 2.2 × 10^15 grams of carbon per year (Pg C yr−1) during 1997–2016, with a maximum in 1997 (3.0 Pg C yr−1) and minimum in 2013 (1.8 Pg C yr−1). These estimates were 11% higher than our previous estimates (GFED3) during 1997–2011, when the two datasets overlapped. This net increase was the result of a substantial increase in burned area (37%, mostly due to the inclusion of small fires) and a modest decrease in mean fuel consumption (−19%, to better match estimates from field studies), primarily in savannas and
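The percentages quoted above are roughly mutually consistent, which can be checked in one line (treating emissions as proportional to burned area times mean fuel consumption, a simplification):

```python
# A 37% burned-area increase combined with a 19% fuel-consumption decrease
# implies a net emissions factor of about 1.37 * 0.81, i.e. ~11% higher.
net = 1.37 * 0.81
print(round(net, 2))  # 1.11
```

The ~11% net increase reported for GFED4s over GFED3 matches this back-of-envelope product.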

Using three sets of satellite data for burned areas together with the tree cover imagery and a biogeochemical component of the Integrated Science Assessment Model (ISAM), the global emissions of CO and associated uncertainties are estimated for the year 2000. The available fuel load (AFL) is calculated using the ISAM biogeochemical model, which accounts for the aboveground and surface fuel removed by land clearing for croplands and pasturelands, as well as the influence on fuel load of various ecosystem processes (such as stomatal conductance, evapotranspiration, plant photosynthesis and respiration, litter production, and soil organic carbon decomposition) and important feedback mechanisms (such as climate and fertilization feedbacks). The ISAM-estimated global total AFL in the year 2000 was about 687 Pg. All forest ecosystems account for about 90% of the global total AFL. The estimated global CO emissions based on three global burned area satellite data sets (GLOBSCAR, GBA, and Global Fire Emissions Database version 2 (GFEDv2)) for the year 2000 range between 320 and 390 Tg CO. Emissions from open fires are highest in tropical Africa, primarily due to forest cutting and burning. The estimated overall uncertainty in global CO emission is about ±65%, with the highest uncertainty occurring in the North Africa and Middle East region (±99%). The results of this study suggest that the uncertainties in the calculated emissions stem primarily from the burned area data.

Though urban agriculture (UA), defined here as growing of crops in cities, is increasing in popularity and importance globally, little is known about the aggregate benefits of such natural capital in built-up areas. Here, we introduce a quantitative framework to assess global aggregate ecosystem services from existing vegetation in cities and an intensive UA adoption scenario based on data-driven estimates of urban morphology and vacant land. We analyzed global population, urban, meteorological, terrain, and Food and Agriculture Organization (FAO) datasets in Google Earth Engine to derive global scale estimates, aggregated by country, of services provided by UA. We estimate the value of four ecosystem services provided by existing vegetation in urban areas to be on the order of USD 33 billion annually. We project potential annual food production of 100-180 million tonnes, energy savings ranging from 14 to 15 billion kilowatt hours, nitrogen sequestration between 100,000 and 170,000 tonnes, and avoided storm water runoff between 45 and 57 billion cubic meters annually. In addition, we estimate that food production, nitrogen fixation, energy savings, pollination, climate regulation, soil formation and biological control of pests could be worth as much as USD 80-160 billion annually in a scenario of intense UA implementation. Our results demonstrate significant country-to-country variability in UA-derived ecosystem services and reduction of food insecurity. These estimates represent the first effort to consistently quantify these incentives globally, and highlight the relative spatial importance of built environments to act as change agents that alleviate mounting concerns associated with global environmental change and unsustainable development.

This paper addresses global error estimation and control for initial value problems for ordinary differential equations. The focus lies on a comparison between a novel approach based on the adjoint method combined with a small sample statistical initialization and the classical approach
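One classical approach to global error estimation, mentioned above, is to solve the IVP at two step sizes and compare. A minimal sketch on y' = -y, y(0) = 1 (a test problem chosen here for illustration; the adjoint-based method of the paper is not reproduced):

```python
import math

def euler(f, y0, t_end, n):
    # Forward Euler with n uniform steps on [0, t_end].
    h, y, t = t_end / n, y0, 0.0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

f = lambda t, y: -y
y_h = euler(f, 1.0, 1.0, 100)     # step h
y_h2 = euler(f, 1.0, 1.0, 200)    # step h/2

# For a first-order method the global error roughly halves with the step,
# so y_h2 - y_h estimates the global error of the finer solution.
err_est = y_h2 - y_h
true_err = math.exp(-1.0) - y_h2  # exact solution exp(-t) is known here
print(err_est, true_err)
```

The estimate tracks the size and sign of the true global error; step-doubling of this kind is the baseline against which adjoint-based estimators are usually compared.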

We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

The author has attempted to detect the presence of low-dimensional deterministic chaos in temperature data by estimating the correlation dimension with the Hill estimator recently developed by Mikosch and Wang. There is no convincing evidence of low dimensionality in either the global dataset (Southern Hemisphere monthly average temperatures from 1858 to 1984) or the local temperature dataset (daily minimums at Auckland, New Zealand). Any apparent reduction in the dimension estimates appears to be due largely, if not entirely, to effects of statistical bias, although the series is not a purely random stochastic process either. The dimension of the climatic attractor may be significantly larger than 10.

Studies on the contribution of milk production to global greenhouse gas (GHG) emissions are rare (FAO 2010) and often based on crude data which do not appropriately reflect the heterogeneity of farming systems. This article estimates GHG emissions from milk production in different dairy regions of the world based on harmonised farm data and assesses the contribution of milk production to global GHG emissions. The methodology comprises three elements: (1) the International Farm Comparison Network (IFCN) concept of typical farms and the related globally standardised dairy model farms representing 45 dairy regions in 38 countries; (2) a partial life cycle assessment model for estimating GHG emissions of the typical dairy farms; and (3) standard regression analysis to estimate GHG emissions from milk production in countries for which no typical farms are available in the IFCN database. Across the 117 typical farms in the 38 countries analysed, the average emission rate is 1.50 kg CO2 equivalents (CO2-eq.)/kg milk. The contribution of milk production to global anthropogenic emissions is estimated at 1.3 Gt CO2-eq./year, accounting for 2.65% of total global anthropogenic emissions (49 Gt; IPCC, Synthesis Report for Policy Makers, Valencia, Spain, 2007). We emphasise that our estimates of the contribution of milk production to global GHG emissions are subject to uncertainty. Part of the uncertainty stems from the choice of the appropriate methods for estimating emissions at the level of the individual animal.
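The headline share quoted above follows directly from the two totals in the abstract and can be verified with one division:

```python
# Arithmetic check of the abstract's figures.
milk_emissions_gt = 1.3          # Gt CO2-eq per year from milk production
total_anthropogenic_gt = 49.0    # Gt CO2-eq per year (IPCC 2007 figure)

share = milk_emissions_gt / total_anthropogenic_gt
print(f"{share:.2%}")  # 2.65%
```

The division reproduces the 2.65% contribution reported in the article.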

A data set of quality controlled radiation observations from stations scattered throughout Australia was formed and further screened to remove residual doubtful observations. It was then divided into groups by solar elevation, and used to find average relationships for each elevation group between relative global radiation (clearness index - the measured global radiation expressed as a proportion of the radiation on a horizontal surface at the top of the atmosphere) and relative diffuse radiation. Clear-cut relationships were found, which were then fitted by polynomial expressions giving the relative diffuse radiation as a function of relative global radiation and solar elevation. When these expressions were used to estimate the diffuse radiation from the global, the results had a slightly smaller spread of errors than those from an earlier technique given by Spencer. It was found that the errors were related to cloud amount, and further relationships were developed giving the errors as functions of global radiation, solar elevation, and the fraction of sky obscured by high cloud and by opaque (low and middle level) cloud. When these relationships were used to adjust the first estimates of diffuse radiation, there was a considerable reduction in the number of large errors

Estimating the global terrestrial carbon flux with high accuracy and high resolution is important for understanding global environmental change. Furthermore, estimates of the global spatiotemporal distribution may contribute to political and social activities such as REDD+. Our purpose is to reveal the current state of terrestrial carbon fluxes worldwide on a decadal scale. A satellite-based diagnostic biosphere model is well suited to this purpose, because it observes the present global land-surface condition uniformly at regular time intervals. In this study, we estimated global terrestrial carbon fluxes on 1 km grids using the terrestrial biosphere model BEAMS, evaluated the new carbon flux estimates at various spatial scales, and examined the transition of forest carbon stocks in some regions. Because BEAMS requires high-resolution meteorological and satellite data as input, we produced 1 km interpolated data using a kriging method. The data used in this study were JRA-55, GPCP, and GOSAT L4B atmospheric CO2 data as meteorological inputs, and MODIS land products as land-surface satellite data. Interpolation was applied to the meteorological data, whose native resolution is insufficient, but not to the MODIS data. We evaluated the new carbon flux estimates at the point scale against flux tower measurements (FLUXNET2015 datasets), using data from 166 sites classified by vegetation type (DBF, EBF, ENF, mixed forests, grasslands, croplands, shrublands, savannas, wetlands). At the global scale, the BEAMS estimates underestimated both carbon uptake and release compared with the flux measurements. The monthly variations of NEP showed relatively high correlations for DBF and mixed forests, but the correlation coefficients for EBF, ENF, and grasslands were less than 0.5. Among the meteorological factors, air temperature and solar radiation showed
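
The point-scale evaluation amounts to correlating monthly model NEP against tower NEP. A hedged sketch, with synthetic series standing in for BEAMS output and FLUXNET2015 data:

```python
import numpy as np

# Pearson correlation of monthly NEP between a model and a flux tower.
# Both series below are synthetic: an idealised seasonal cycle for the
# tower and a damped, noisy model response.
rng = np.random.default_rng(42)
months = np.arange(12)
nep_tower = np.sin(2 * np.pi * months / 12) + 0.1          # tower NEP
nep_model = 0.7 * nep_tower + rng.normal(0, 0.2, 12)       # model NEP

r = np.corrcoef(nep_model, nep_tower)[0, 1]
print(f"monthly NEP correlation: r = {r:.2f}")
```

In the study this statistic is computed per vegetation type; values below 0.5 (as reported for EBF, ENF, and grasslands) indicate a poorly captured seasonal cycle.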

The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimates and turn them into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system, it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is
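
An exposure-based fatality estimate of the kind PAGER produces can be sketched as a sum over shaking-intensity bins. The populations and fatality rates below are illustrative assumptions, not PAGER's calibrated, country-specific rates:

```python
# Expected fatalities ~= sum over intensity bins of
# (exposed population x fatality rate at that intensity).
exposure = {            # MMI intensity -> exposed population (hypothetical)
    "VI": 2_000_000,
    "VII": 500_000,
    "VIII": 100_000,
    "IX": 20_000,
}
fatality_rate = {       # MMI intensity -> assumed fatality rate
    "VI": 0.0,
    "VII": 1e-4,
    "VIII": 1e-3,
    "IX": 1e-2,
}
fatalities = sum(exposure[i] * fatality_rate[i] for i in exposure)
print(f"expected fatalities: {fatalities:.0f}")
```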

Artificial intelligence techniques, such as fuzzy logic and neural networks, have been used for estimating hourly global radiation from satellite images. The models were fitted to measured global irradiance data from 15 Spanish terrestrial stations, using both satellite imaging data and terrestrial information from the years 1994, 1995 and 1996. The results of these artificial intelligence models were compared to a multivariate regression based upon the Heliosat I model. Generally better behaviour was observed for the artificial intelligence models. (author)

The relationship between image features and scene structure is central to the study of human visual perception and computer vision, but many of the specifics of real-world layout perception remain unknown. We do not know which image features are relevant to perceiving layout properties, or whether those features provide the same information for every type of image. Furthermore, we do not know the spatial resolutions required for perceiving different properties. This paper describes an experiment and a computational model that provide new insights into these issues. Humans can perceive global spatial layout properties, such as dominant depth, openness, and perspective, from a single image. This work describes an algorithm that reliably predicts human layout judgments. The model's predictions are general, not specific to the observers it was trained on. Analysis reveals that the optimal spatial resolutions for determining layout vary with the content of the space and the property being estimated: openness is best estimated at high resolution, depth at medium resolution, and perspective at low resolution. Given the reliability and simplicity of estimating the global layout of real-world environments, this model could help resolve perceptual ambiguities encountered by more detailed scene reconstruction schemes.

Three empirical models suggested by different investigators for estimating monthly mean daily global radiation on a horizontal surface are compared statistically to test their universal applicability. The models compared are those suggested by Rietveld, by Glover and McCulloch, and by Gopinathan. The models are compared by calculating the root mean square error, mean bias error and mean relative percentage error. The model suggested by Gopinathan yields the best results in terms of all three statistics; the model by Rietveld is second best, and the model by Glover and McCulloch comes in third place. However, the differences in the magnitude of the errors among the three models are very small, and all three models can be considered accurate for global radiation estimation at any location in the world
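
The three comparison statistics can be written compactly in their common forms; the sample values below are hypothetical:

```python
import numpy as np

# RMSE, MBE and MPE as commonly defined in solar-radiation model
# validation: H_c are calculated values, H_m are measured values.
def rmse(H_c, H_m):
    return np.sqrt(np.mean((np.asarray(H_c) - np.asarray(H_m)) ** 2))

def mbe(H_c, H_m):
    return np.mean(np.asarray(H_c) - np.asarray(H_m))

def mpe(H_c, H_m):
    H_c, H_m = np.asarray(H_c), np.asarray(H_m)
    return np.mean((H_c - H_m) / H_m) * 100.0

measured = np.array([20.1, 22.4, 25.0, 23.2])   # e.g. MJ m-2 day-1
modelled = np.array([19.8, 23.0, 24.5, 23.6])
print(rmse(modelled, measured), mbe(modelled, measured), mpe(modelled, measured))
```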

OBJECTIVE: To validate the estimates of Global Burden of Disease (GBD) due to congenital anomaly for Europe by comparing infant mortality data collected by EUROCAT registries with the WHO Mortality Database, and by assessing the significance of stillbirths and terminations of pregnancy for fetal … the burden of disease due to congenital anomaly, and thus declining YLL over time may obscure lack of progress in primary, secondary and tertiary prevention.

Herpes simplex virus type 1 (HSV-1) commonly causes orolabial ulcers, while HSV-2 commonly causes genital ulcers. However, HSV-1 is an increasing cause of genital infection. Previously, the World Health Organization estimated the global burden of HSV-2 for 2003 and for 2012; the global burden of HSV-1 has not been estimated. We fitted a constant-incidence model to pooled HSV-1 prevalence data from literature searches for six World Health Organization regions and used 2012 population data to derive global numbers of 0-49-year-olds with prevalent and incident HSV-1 infection. To estimate genital HSV-1, we applied values for the proportion of incident infections that are genital. We estimated that 3709 million people (range: 3440-3878 million) aged 0-49 years had prevalent HSV-1 infection in 2012 (67%), with the highest prevalence in Africa, South-East Asia and the Western Pacific. Assuming 50% of incident infections among 15-49-year-olds are genital, an estimated 140 million (range: 67-212 million) people had prevalent genital HSV-1 infection, most of which occurred in the Americas, Europe and the Western Pacific. The global burden of HSV-1 infection is huge. The genital HSV-1 burden can be substantial but varies widely by region. Future control efforts, including the development of HSV vaccines, should consider the epidemiology of HSV-1 in addition to HSV-2, and especially the relative contribution of HSV-1 to genital infection.
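
A constant-incidence seroprevalence model of the kind fitted here has a simple closed form: with a constant per-year hazard λ, the proportion ever infected by age a is P(a) = 1 − exp(−λa). A sketch with illustrative numbers (the 67% figure echoes the abstract's global prevalence, but applying it at age 25 here is purely for demonstration):

```python
import math

# Under a constant per-year infection hazard lam, the proportion ever
# infected by age a is P(a) = 1 - exp(-lam * a). Given one observed
# prevalence at one age, lam follows in closed form.
def hazard_from_prevalence(prev, age):
    return -math.log(1.0 - prev) / age

def prevalence(lam, age):
    return 1.0 - math.exp(-lam * age)

lam = hazard_from_prevalence(0.67, 25)   # e.g. 67% infected by age 25 (illustrative)
print(f"implied incidence hazard: {lam:.3f}/year")
print(f"predicted prevalence at age 49: {prevalence(lam, 49):.2f}")
```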

In this work, the efficiency of a radiative transfer model in estimating annual global solar radiation has been evaluated over different locations in Galicia, Spain, during clear-sky periods. Because of its quantitative significance, special attention is focused on the influence of visibility on global radiation. By comparing estimated and measured global solar radiation throughout 2002, a typical annual visibility series was obtained for every location. These visibility values were analysed to identify patterns and typical values that could be used to estimate global solar radiation in a different year. Validation was performed for 2003, yielding an annual estimate differing by less than 10% from the measured value. (Author)

Overfishing threatens coral reefs worldwide, yet there is no reliable estimate of the number of reef fishers globally. We address this data gap by quantifying the number of reef fishers on a global scale using two approaches: the first estimates reef fishers as a proportion of the total number of marine fishers in a country, based on the ratio of reef-related to total marine fish landed values; the second estimates reef fishers as a function of coral reef area, rural coastal population, and fishing pressure. In total, we find that there are 6 million reef fishers in 99 reef countries and territories worldwide, of whom at least 25% are reef gleaners. Our estimates are an improvement over most existing fisher population statistics, which tend to omit gleaners and reef fishers. Our results suggest that slightly over a quarter of the world's small-scale fishers fish on coral reefs, and half of all coral reef fishers are in Southeast Asia. Coral reefs evidently support the socio-economic well-being of numerous coastal communities. By quantifying the number of people employed as reef fishers, we provide decision-makers with an important input into planning for sustainable coral reef fisheries at the appropriate scale.
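
The first approach reduces to a proportional allocation; a sketch with hypothetical country inputs:

```python
# Approach 1: allocate a country's marine fishers to reefs in proportion
# to the reef-related share of landed value. All inputs are hypothetical
# placeholders, not the study's data.
def reef_fishers_value_share(total_marine_fishers, reef_landed_value, total_landed_value):
    return total_marine_fishers * reef_landed_value / total_landed_value

est = reef_fishers_value_share(
    total_marine_fishers=120_000,
    reef_landed_value=30e6,    # USD, reef-related landings
    total_landed_value=90e6,   # USD, all marine landings
)
print(f"estimated reef fishers: {est:.0f}")
```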

In current damage detection methods, localization and severity estimation are often treated separately. Severity is commonly estimated using a fracture mechanics approach, whose main disadvantage is reliance on empirically deduced relations. In this paper, a damage severity estimator based on the global stiffness reduction is proposed. This feature is computed from the deflections of the intact and the damaged beam, respectively. The damage has the greatest effect when located where the bending moment achieves its maxima; if the damage is positioned elsewhere on the beam, its effect becomes smaller, because the stress is produced by a diminished bending moment. It is shown that the global stiffness reduction produced by a crack is the same for all beams with a similar cross-section, regardless of the boundary conditions. Two mathematical relations are derived: one indicating the severity and another indicating the effect of removing damage from the beam. Measurements on damaged beams with different boundary conditions and cross-sections were carried out, and the location and severity were found using the proposed relations. These comparisons prove that the proposed approach can accurately compute the severity estimator. (paper)
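
One plausible reading of such a severity estimator is the relative loss of global stiffness inferred from deflections under the same load; the exact definition used in the paper may differ, so treat this as an assumed illustration:

```python
# Global stiffness-reduction severity from static deflections of the
# intact and damaged beam under the same (unit) load. This specific
# definition -- relative stiffness loss -- is an assumption for
# illustration, not necessarily the paper's formula.
def severity(deflection_intact, deflection_damaged):
    k_intact = 1.0 / deflection_intact     # stiffness ~ load / deflection
    k_damaged = 1.0 / deflection_damaged
    return (k_intact - k_damaged) / k_intact

s = severity(2.0e-3, 2.5e-3)   # deflections in metres, unit load assumed
print(f"global stiffness reduction: {s:.0%}")
```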

The International Year of Astronomy 2009 (IYA2009) is a global collaboration between nations and organisations for peaceful purposes - the search for our cosmic origin, a common heritage that connects everyone. The science of astronomy represents millennia of collaboration across all boundaries: geographic, gender, age, culture and race, in accordance with the principles of the UN Charter. 1 January 2009 will mark the beginning of the IYA2009 in the eyes of the public, but this immense worldwide science outreach and education event began more than six years earlier, with the IAU's initiative in 2003. The IYA2009 aims to unite nations under the umbrella of astronomy and science, while at the same time acknowledging cultural differences and national and regional particularities. Never before has such a network of scientists, amateur astronomers, educators, journalists and scientific institutions come together. When the IYA2009 officially kicks off in Paris on 15 January 2009, it is estimated that more than 5000 people will be directly involved in the organisation of IYA2009 activities across the globe. During this talk we will outline the status of the principal projects and activities that make up the Year.

We consider the parabolic system $u_{t}-\Delta u = u^{r}v^{p}$, $v_{t}-\Delta v = u^{q}v^{s}$ in $\Omega\times(0,\infty)$, complemented by the homogeneous Dirichlet boundary conditions and the initial conditions $(u,v)(\cdot,0) = (u_{0},v_{0})$ in $\Omega$, where $\Omega$ is a smooth bounded domain in $\mathbb{R}^{N}$ and $u_{0},v_{0}\in L^{\infty}(\Omega)$ are nonnegative functions. We find conditions on $p,q,r,s$ guaranteeing a priori estimates of nonnegative classical global solutions. More precisely, every such solution is bounded by a constant depending on a suitable norm of the initial data. Our proofs are based on bootstrap in weighted Lebesgue spaces, universal estimates of auxiliary functions and estimates of the Dirichlet heat kernel.

Retrievals of falling snow from space represent an important data set for understanding and linking the Earth's atmospheric, hydrological, and energy cycles. Estimates of falling snow must be captured to obtain the true global precipitation water cycle, snowfall accumulations are required for hydrological studies, and without knowledge of the frozen particles in clouds one cannot adequately understand the energy and radiation budgets. This work focuses on comparing the first stable falling snow retrieval products (released May 2017) for the Global Precipitation Measurement (GPM) Core Observatory (GPM-CO), which was launched in February 2014 and carries both an active dual-frequency (Ku- and Ka-band) precipitation radar (DPR) and a passive microwave radiometer (the GPM Microwave Imager, GMI). Five separate GPM-CO falling snow retrieval products are analyzed: DPR Matched (Ka+Ku) Scan, DPR Normal Scan (Ku), DPR High Sensitivity Scan (Ka), combined DPR+GMI, and GMI. While satellite-based remote sensing provides global coverage of falling snow events, the science is relatively new, the different on-orbit instruments do not capture all snow rates equally, and retrieval algorithms differ. Thus, a detailed comparison among the GPM-CO products elucidates the advantages and disadvantages of the retrievals. GPM and CloudSat global snowfall evaluation exercises are natural investigative pathways to explore, but caution must be exercised when analyzing these datasets for comparative purposes. This work includes outlining the challenges associated with comparing GPM-CO to CloudSat satellite snow estimates due to their different sampling, algorithms, and instrument capabilities. We will highlight some factors and assumptions that can be altered or statistically normalized and applied in an effort to make comparisons between GPM and CloudSat global satellite falling snow products as equitable as possible.

We evaluate black carbon (BC) model predictions from the AeroCom model intercomparison project by considering the diversity among year 2000 model simulations and comparing model predictions with available measurements. These model-measurement intercomparisons include BC surface and aircraft concentrations, aerosol absorption optical depth (AAOD) retrievals from AERONET and the Ozone Monitoring Instrument (OMI), and BC column estimations based on AERONET. In regions other than Asia, most models are biased high compared to surface concentration measurements. However, compared with (column) AAOD or BC burden retrievals, the models are generally biased low. The average ratio of model to retrieved AAOD is less than 0.7 in South American and 0.6 in African biomass burning regions; both of these regions lack surface concentration measurements. In Asia the average model-to-observed ratio is 0.7 for AAOD and 0.5 for BC surface concentrations. Compared with aircraft measurements over the Americas at latitudes between 0° and 50° N, the average model is a factor of 8 larger than observed, and most models exceed the measured BC standard deviation in the mid to upper troposphere. At higher latitudes the average model-to-aircraft BC ratio is 0.4, and models underestimate the observed BC loading in the lower and middle troposphere associated with springtime Arctic haze. Low model bias for AAOD but overestimation of surface and upper-atmospheric BC concentrations at lower latitudes suggests that most models underestimate BC absorption and should improve estimates of refractive index, particle size, and the optical effects of BC coating. Retrieval uncertainties and/or differences in model diagnostic treatment may also contribute to the model-measurement disparity. The largest AeroCom model diversity occurred in northern Eurasia and the remote Arctic, regions influenced by anthropogenic sources. Changing emissions, aging, removal, or optical properties within a single model

Accurately estimating vegetation productivity is important in research on terrestrial ecosystems, carbon cycles and climate change. Eight-day gross primary production (GPP) and annual net primary production (NPP) are contained in MODerate Resolution Imaging Spectroradiometer (MODIS) products (MOD17), which are considered the first operational datasets for monitoring global vegetation productivity. However, cloud-contaminated MODIS leaf area index (LAI) and Fraction of Photosynthetically Active Radiation (FPAR) retrievals may introduce considerable errors into the MODIS GPP and NPP products. In this paper, global eight-day GPP and eight-day NPP were first estimated based on Global LAnd Surface Satellite (GLASS) LAI and FPAR products. Then, the GPP and NPP estimates were validated against FLUXNET GPP data and BigFoot NPP data and compared with the MODIS GPP and NPP products. Compared with MODIS GPP, the time series of estimated GLASS GPP was more temporally continuous and spatially complete, with smoother trajectories. Validated against FLUXNET GPP and BigFoot NPP, the estimated GLASS GPP and NPP achieved higher precision for most vegetation types.
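
GPP estimation from LAI/FPAR products typically follows a light-use-efficiency form, GPP = ε_max · f(T) · g(VPD) · FPAR · PAR; whether this is the exact formulation applied to the GLASS inputs is an assumption here, and all numbers are placeholders:

```python
# Illustrative MOD17-style light-use-efficiency GPP calculation for one
# 8-day composite of a single pixel. The scalars and constants are
# placeholders, not the MOD17 biome-specific parameters.
def gpp(epsilon_max, t_scalar, vpd_scalar, fpar, par):
    return epsilon_max * t_scalar * vpd_scalar * fpar * par

value = gpp(epsilon_max=1.0,   # gC/MJ, maximum light-use efficiency
            t_scalar=0.9,      # temperature down-regulation, 0..1
            vpd_scalar=0.8,    # vapour-pressure-deficit down-regulation, 0..1
            fpar=0.6,          # fraction of absorbed PAR (from FPAR product)
            par=80.0)          # MJ/m^2 of PAR per 8-day period
print(f"GPP: {value:.1f} gC/m^2 per 8 days")
```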

Religious affiliation influences societal practices regarding death and dying, including palliative care, religiously acceptable health service procedures, funeral rites and beliefs about an afterlife. We aimed to estimate and project religious affiliation at the time of death globally, as this information has been lacking. We compiled data on demographic information and religious affiliation from more than 2500 surveys, registers and censuses covering 198 nations/territories. We present estimates of religious affiliation at the time of death as of 2010 and projections up to and including 2060, taking into account trends in mortality, religious conversion, intergenerational transmission of religion, differential fertility, and gross migration flows, by age and sex. We find that Christianity continues to be the most common religion at death, although its share will fall from 37% to 31% of global deaths between 2010 and 2060. The share of individuals identifying as Muslim at the time of death increases from 21% to 24%. The share of the religiously unaffiliated will peak at 17% in 2035, followed by a slight decline thereafter. In specific regions, such as Europe, the unaffiliated share will continue to rise, from 14% to 21% over the period. Religious affiliation at the time of death is changing globally, with distinct regional patterns. This could affect spatial variation in healthcare and social customs relating to death and dying.

The objective of this work was the determination of the "a" and "b" constants of the Angstrom linear model in order to estimate the global solar radiation in Lavras, MG. The work was carried out at the Climatological Station of Lavras (ECP/INMET/UFLA), at the Federal University of Lavras, from December 2001 to November 2002, using daily insolation data and daily records of global solar radiation. The "a" and "b" constants, which express the atmospheric transmittance, were obtained by regression analysis of those data. The resulting equation, Qg/Qt = 0.23 + 0.49(n/N), presented a determination coefficient of 0.89. The obtained constants are smaller than those suggested by recommendations based on the local latitude. According to the results, it is possible to adopt the values 0.23 and 0.49 as the "a" and "b" constants in the Angstrom equation to estimate the global solar radiation in Lavras, MG. (author)
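
The regression step can be reproduced in a few lines; the data below are synthetic, generated around the reported constants (a = 0.23, b = 0.49), and the fit recovers them:

```python
import numpy as np

# Linear regression of the clearness index Qg/Qt on relative sunshine
# n/N to recover the Angstrom constants a (intercept) and b (slope).
# Synthetic data, not the Lavras observations.
rng = np.random.default_rng(1)
n_over_N = rng.uniform(0.1, 0.9, 365)                   # relative sunshine
kt = 0.23 + 0.49 * n_over_N + rng.normal(0, 0.02, 365)  # clearness index

b, a = np.polyfit(n_over_N, kt, 1)                      # slope, intercept
print(f"a = {a:.2f}, b = {b:.2f}")
```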

Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a globally constant BR, largely due to the lack of a functional description for BR. In this study, we redefined BR as the ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data from 79 research sites located at latitudes ranging from ~3°S to ~70°N. Results showed that the mean annual ER rate closely matches the ER rate at the mean annual temperature. Incorporating site-specific BR into a global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual temperature provides a functional basis for defining BR.
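
With BR defined at the mean annual temperature (MAT), a Q10-type ER model takes the form ER(T) = BR · Q10^((T − MAT)/10); the Q10 value and site numbers below are assumptions for illustration:

```python
# ER at temperature T under a Q10 response anchored at the mean annual
# temperature: by construction, ER(MAT) = BR. BR, MAT and Q10 are
# illustrative values, not fitted site parameters.
def ecosystem_respiration(T, BR, MAT, Q10=2.0):
    return BR * Q10 ** ((T - MAT) / 10.0)

BR, MAT = 2.5, 8.0          # gC m-2 day-1 at an assumed MAT of 8 degC
print(ecosystem_respiration(MAT, BR, MAT))             # at MAT, ER equals BR
print(ecosystem_respiration(18.0, BR, MAT))            # one Q10 step warmer
```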

Female sex workers (FSWs) are at high risk of HIV infection. Our objective was to determine the proportion of HIV prevalence in the general female adult population that is attributable to the occupational exposure of female sex work through unprotected sexual intercourse. Population attributable fractions of HIV prevalence due to female sex work were estimated for 2011. A systematic search was conducted to retrieve the required input data from available sources. Data gaps in HIV prevalence among FSWs for 2011 were filled using multilevel modeling and multivariate linear regression. The fraction of HIV attributable to female sex work was estimated as the excess HIV burden in FSWs after deducting the HIV burden in FSWs due to injecting drug use. An estimated fifteen percent of HIV in the general female adult population is attributable to (unsafe) female sex work. The region with the highest attributable fraction is Sub-Saharan Africa, but the burden is also substantial for the Caribbean, Latin America and South and Southeast Asia. We estimate that 106,000 deaths from HIV are a result of female sex work globally, 98,000 of which occur in Sub-Saharan Africa. If HIV prevalence in other population groups originating from sexual contact with FSWs had been considered, the overall attributable burden would probably be much larger. Female sex work is an important contributor to HIV transmission and the global HIV burden. Effective HIV prevention measures exist and have been successfully targeted at key populations in many settings; these must be scaled up. FSWs suffer from a high HIV burden and are a crucial core population for HIV transmission. Surveillance, prevention and treatment of HIV in FSWs should benefit both this often neglected vulnerable group and the general population.
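
Turning an attributable fraction into an attributable burden is a single multiplication; the total-death input below is hypothetical, chosen only to make the arithmetic visible (the study's actual estimate is 106,000 deaths):

```python
# Attributable burden = population attributable fraction x total burden.
# The 15% PAF comes from the abstract; the total-death figure is a
# hypothetical placeholder.
def attributable_burden(paf, total_burden):
    return paf * total_burden

deaths = attributable_burden(paf=0.15, total_burden=700_000)
print(f"attributable deaths: {deaths:.0f}")
```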

Estimates of the global radiative forcing by line-shaped contrails differ mainly due to the large uncertainty in contrail optical depth. Most contrails are optically thin, so that their radiative forcing is roughly proportional to their optical depth and increases with contrail coverage. In recent assessments, the best estimate of mean contrail radiative forcing was significantly reduced, because global climate model simulations pointed at lower optical depth values than earlier studies. We revise these estimates by comparing the probability distribution of contrail optical depth diagnosed with a climate model with the distribution derived from a microphysical, cloud-scale model constrained by satellite observations over the United States. By assuming that the optical depth distribution from the cloud model is more realistic than that from the climate model, and by taking the difference between the observed and simulated optical depth over the United States as globally representative, we quantify uncertainties in the climate model's diagnostic contrail parameterization. Revising the climate model results accordingly increases the global mean radiative forcing estimate for line-shaped contrails by a factor of 3.3, from 3.5 mW/m² to 11.6 mW/m² for the year 1992. Furthermore, the satellite observations and the cloud model point at higher global mean optical depth of detectable contrails than often assumed in radiative transfer (off-line) studies. Therefore, we correct estimates of contrail radiative forcing from off-line studies as well. We suggest that the global net radiative forcing of line-shaped persistent contrails is in the range 8-20 mW/m² for the air traffic in the year 2000.
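
The revision rests on the thin-contrail approximation that forcing scales linearly with optical depth, so the correction is a simple rescaling:

```python
# If contrails are optically thin, radiative forcing scales roughly
# linearly with optical depth, so the climate-model estimate is
# multiplied by the observed/simulated optical-depth ratio.
rf_model = 3.5        # mW/m^2, climate-model estimate for 1992
correction = 3.3      # observed-to-simulated optical-depth ratio
rf_revised = rf_model * correction
print(f"revised forcing: {rf_revised:.2f} mW/m^2")
```

The abstract's 11.6 mW/m² follows from rounding 3.5 × 3.3 = 11.55.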

Multiple linear regression models were developed to estimate monthly mean daily sunshine hours using four parameters over an eleven-year period (1997 to 2007) for Warri, Nigeria (latitude 5° 34′ 21.0″ N); the parameters are relative humidity, maximum and minimum temperature, rainfall and wind speed.
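
The regression setup can be sketched with ordinary least squares; all data below are synthetic placeholders for the Warri observations:

```python
import numpy as np

# Monthly sunshine hours regressed on meteorological predictors via
# ordinary least squares. Predictor ranges and coefficients are
# synthetic, chosen only to exercise the fit.
rng = np.random.default_rng(7)
n = 132                                    # 11 years x 12 months
X = np.column_stack([
    rng.uniform(60, 95, n),                # relative humidity (%)
    rng.uniform(22, 34, n),                # temperature (degC)
    rng.uniform(0, 400, n),                # rainfall (mm)
    rng.uniform(1, 5, n),                  # wind speed (m/s)
    np.ones(n),                            # intercept
])
beta_true = np.array([-0.05, 0.15, -0.004, 0.2, 5.0])
y = X @ beta_true + rng.normal(0, 0.3, n)  # sunshine hours per day

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(beta, 3))
```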

Global cropland net primary production (NPP) has tripled over the last 50 years, contributing 17-45% to the increase in the seasonal amplitude of global atmospheric CO2. Although many regional-scale comparisons have been made between statistical data and modeling results, long-term national comparisons across global croplands are scarce due to the lack of detailed spatiotemporal management data. Here, we conducted a simulation study of global cropland NPP from 1961 to 2010 using a process-based model called Vegetation-Global Atmosphere-Soil (VEGAS) and compared the results with Food and Agriculture Organization of the United Nations (FAO) statistical data on both continental and country scales. According to the FAO data, global cropland NPP was 1.3, 1.8, 2.2, 2.6, 3.0, and 3.6 PgC yr-1 in the 1960s, 1970s, 1980s, 1990s, 2000s, and 2010s, respectively. The VEGAS model captured these major trends on global and continental scales. NPP increased most notably in the US Midwest, western Europe, and the North China Plain, and increased modestly in Africa and Oceania. However, significant biases remained in some regions such as Africa and Oceania, especially in temporal evolution. This finding is not surprising, as VEGAS is the first global carbon cycle model with full parameterization representing the Green Revolution. To improve model performance for the major regions, we modified the default values of management intensity, which represent regional differences in the agricultural Green Revolution, to better match the FAO statistical data at the continental level and for selected countries. Across all the selected countries, the updated results reduced the RMSE from 19.0 to 10.5 TgC yr-1 (approximately a 45% decrease). The results suggest that these regional differences in model parameterization are due to differences in socioeconomic development. To better explain past changes and predict future trends, it is important to calibrate key parameters on regional

Models such as the Angstroem-Prescott equation are used to estimate global solar radiation from sunshine duration. In the literature, researchers investigate either the goodness of the model itself or the goodness of the estimation of global solar radiation, based on a set of statistical parameters such as R2, RMSE, MBE, MABE, MPE and MAPE. If the former is the objective, then the statistical analysis should naturally be based on H/Ho vs. S/So (the ratio of daily solar radiation to extraterrestrial daily solar radiation versus the ratio of sunshine duration to day length). If the latter is investigated, then the statistical analysis should be based on Hc vs. Hm (calculated versus measured daily solar radiation). A literature survey undertaken in the present article showed that these two data sets are apt to be confused, with the statistical parameters used to assess the estimation model drawn from the wrong data set. The statistical parameters are derived from first principles for both data sets, and the inconsistencies caused by this confusion and other factors are exposed. A case study of the estimation models and of global solar radiation estimation from sunshine duration is presented using five different models (linear, quadratic, cubic, logarithmic and exponential), the most common in the literature, based on 6 years of measured hourly global solar radiation data
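Whichever data set they are applied to, the statistical parameters named above have standard definitions. A sketch of those definitions, applied to invented measured/calculated daily-radiation pairs, is:

```python
def error_stats(measured, calculated):
    """Common goodness-of-fit statistics used in solar-radiation studies."""
    n = len(measured)
    diffs = [c - m for c, m in zip(calculated, measured)]
    mbe = sum(diffs) / n                                   # mean bias error
    rmse = (sum(d * d for d in diffs) / n) ** 0.5          # root mean square error
    mabe = sum(abs(d) for d in diffs) / n                  # mean absolute bias error
    mpe = sum(d / m for d, m in zip(diffs, measured)) / n * 100.0   # mean % error
    mape = sum(abs(d / m) for d, m in zip(diffs, measured)) / n * 100.0
    mean_m = sum(measured) / n
    ss_res = sum(d * d for d in diffs)
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    r2 = 1.0 - ss_res / ss_tot                             # coefficient of determination
    return {"MBE": mbe, "RMSE": rmse, "MABE": mabe,
            "MPE": mpe, "MAPE": mape, "R2": r2}

# Invented daily global radiation values (MJ m-2): measured vs. calculated
H_m = [18.2, 20.1, 15.4, 22.3, 19.0]
H_c = [17.8, 20.6, 15.9, 21.7, 19.4]
stats = error_stats(H_m, H_c)
```

The article's point is not how to compute these quantities but which pairs to feed them: (H/Ho, S/So) when judging the model, (Hc, Hm) when judging the estimates.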

Accurate estimation of the amount of horizontal global solar radiation for a particular field is an important input for decision processes in solar radiation investments. In this article, we focus on the estimation of yearly mean daily horizontal global solar radiation by using an approach that utilizes fuzzy regression functions with support vector machines (FRF-SVM). This approach is not seriously affected by outlier observations and does not suffer from the over-fitting problem. To demonstrate the utility of the FRF-SVM approach in the estimation of horizontal global solar radiation, we conducted an empirical study over a dataset collected in Turkey and applied the FRF-SVM approach with several kernel functions. We then compared the estimation accuracy of the FRF-SVM approach to an adaptive neuro-fuzzy system and a coplot supported-genetic programming approach. We observe that the FRF-SVM approach with a Gaussian kernel function is affected neither by outliers nor by the over-fitting problem, and gives the most accurate estimates of horizontal global solar radiation among the applied approaches. Consequently, the use of hybrid fuzzy functions and support vector machine approaches is found beneficial in long-term forecasting of horizontal global solar radiation over a region with complex climatic and terrestrial characteristics. - Highlights: • A fuzzy regression functions with support vector machines approach is proposed. • The approach is robust against outlier observations and the over-fitting problem. • Estimation accuracy of the model is superior to several existent alternatives. • A new solar radiation estimation model is proposed for the region of Turkey. • The model is useful under complex terrestrial and climatic conditions.

For this study, we developed a new statistical model to estimate the daily accumulated global solar radiation at the earth's surface and used the model to generate a high-resolution climate change scenario of the radiation field in Japan. The statistical model mainly relies on precipitable water vapor, calculated from surface air temperature and relative humidity, to estimate seasonal changes in global solar radiation. To estimate daily radiation fluctuations, the model uses either the diurnal temperature range or relative humidity. The diurnal temperature range, calculated from the daily maximum and minimum temperatures, and relative humidity are general outputs of most climate models, and pertinent observation data are comparatively easy to access. The statistical model performed well when estimating the monthly mean value, daily fluctuation statistics, and regional differences in the radiation field in Japan. To project the change in the radiation field for the years 2081 to 2100, we applied the statistical model to the climate change scenario of a high-resolution Regional Climate Model with a 20-km mesh size (RCM20), developed at the Meteorological Research Institute based on the Special Report on Emissions Scenarios (SRES) A2 scenario. The projected change shows the following tendency: global solar radiation will increase in the warm season and decrease in the cool season in many areas of Japan, indicating that global warming may cause changes in the radiation field in Japan. The generated climate change scenario for the radiation field is linked to long-term and short-term changes in air temperature and relative humidity obtained from the RCM20 and, consequently, is expected to complement the RCM20 datasets for an impact assessment study in the agricultural sector
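The paper's own model is not reproduced here, but a well-known illustration of how a diurnal temperature range can stand in for daily global solar radiation is the Hargreaves-Samani form, sketched below with an illustrative coefficient:

```python
import math

def hargreaves_radiation(ra, tmax, tmin, kt=0.17):
    """Hargreaves-Samani-type estimate of daily global solar radiation:
    H = kt * sqrt(Tmax - Tmin) * Ra, where Ra is extraterrestrial radiation
    (MJ m-2 d-1) and kt an empirical coefficient (0.17 is illustrative;
    typical published values are around 0.16 inland and 0.19 near coasts)."""
    return kt * math.sqrt(tmax - tmin) * ra

# Illustrative mid-latitude summer day: Ra = 30 MJ/m2/d, Tmax 30 C, Tmin 20 C
h = hargreaves_radiation(30.0, 30.0, 20.0)
```

The intuition matches the abstract's: a large day-night temperature swing indicates clear skies and hence more surface radiation, so a smaller diurnal range yields a smaller estimate.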

A series of major scientific programs carried out over the past 40 years has greatly increased understanding of our global environment and has led to the present concern over global change. Each program responded to a specific and urgent scientific need or opportunity. In each case, institutions and resources were created that provided the foundation for later programs. Increased scientific understanding has exposed threats to future welfare and has raised serious policy implications for governments. Institutions for responding to global policy issues need to be created or strengthened. Recommendations for better procedures and institutional structures are provided in this article. 39 refs

This paper examines global carbon dioxide (CO 2 ) efficiency by employing a stochastic cost frontier analysis of about 170 countries in 1997 and 2007. The main contribution lies in providing a new approach to environmental efficiency estimation, in which the efficiency estimates quantify the distance from the policy objective of minimum emissions. We are able to examine a very large pool of nations and provide country-wise efficiency estimates. We estimate three econometric models, corresponding with alternative interpretations of the Cancun vision (Conference of the Parties 2011). The models reveal progress in global environmental efficiency during a preceding decade. The estimates indicate vast differences in efficiency levels, and efficiency changes across countries. The highest efficiency levels are observed in Africa and Europe, while the lowest are clustered around China. The largest efficiency gains were observed in central and eastern Europe. CO 2 efficiency also improved in the US and China, the two largest emitters, but their ranking in terms of CO 2 efficiency deteriorated. Policy implications are discussed. - Highlights: ► We estimateglobal environmental efficiency in line with the Cancun vision, using a stochastic cost frontier. ► The study covers 170 countries during a 10 year period, ending in 2007. ► The biggest improvements occurred in Europe, and efficiency falls in South America. ► The efficiency ranking of US and China, the largest emitters, deteriorated. ► In 2007, highest efficiency was observed in Africa and Europe, and the lowest around China.

The existing estimate of the global burden of latent TB infection (LTBI) as "one-third" of the world population is nearly 20 y old. Given the importance of controlling LTBI as part of the End TB Strategy for eliminating TB by 2050, changes in demography and scientific understanding, and progress in TB control, it is important to re-assess the global burden of LTBI. We constructed trends in annual risk of infection (ARI) for countries between 1934 and 2014 using a combination of direct estimates of ARI from LTBI surveys (131 surveys from 1950 to 2011) and indirect estimates of ARI calculated from World Health Organisation (WHO) estimates of smear-positive TB prevalence from 1990 to 2014. Gaussian process regression was used to generate ARIs for country-years without data and to represent uncertainty. Estimated ARI time-series were applied to the demography in each country to calculate the number and proportions of individuals infected, recently infected (infected within 2 y), and recently infected with isoniazid (INH)-resistant strains. Resulting estimates were aggregated by WHO region. We estimated the contribution of existing infections to TB incidence in 2035 and 2050. In 2014, the global burden of LTBI was 23.0% (95% uncertainty interval [UI]: 20.4%-26.4%), amounting to approximately 1.7 billion people. The WHO South-East Asia, Western-Pacific, and Africa regions had the highest prevalence and accounted for around 80% of those with LTBI. Prevalence of recent infection was 0.8% (95% UI: 0.7%-0.9%) of the global population, amounting to 55.5 (95% UI: 48.2-63.8) million individuals currently at high risk of TB disease, of which 10.9% (95% UI: 10.2%-11.8%) was isoniazid-resistant. Current LTBI alone, assuming no additional infections from 2015 onwards, would be expected to generate TB incidences in the region of 16.5 per 100,000 per year in 2035 and 8.3 per 100,000 per year in 2050. Limitations included the quantity and methodological heterogeneity of direct ARI
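The gap-filling step, Gaussian process regression over sparse country-year observations, can be sketched as follows. The RBF kernel, length-scale, noise level, and ARI survey points are all illustrative assumptions, not the paper's values:

```python
import math

def rbf(a, b, length=10.0, var=1.0):
    """Squared-exponential covariance between two inputs (years)."""
    return var * math.exp(-0.5 * ((a - b) / length) ** 2)

def gp_mean(x_star, xs, ys, noise=1e-4):
    """Posterior mean of a GP with RBF kernel: k*^T (K + noise*I)^-1 y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    # Solve K alpha = y by Gaussian elimination with partial pivoting
    A = [row[:] + [ys[i]] for i, row in enumerate(K)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * c for a, c in zip(A[r], A[col])]
    alpha = [A[i][n] / A[i][i] for i in range(n)]
    return sum(alpha[i] * rbf(x_star, xs[i]) for i in range(n))

# Invented annual-risk-of-infection (ARI) survey points for one country
years = [1950.0, 1970.0, 1990.0, 2010.0]
ari = [0.030, 0.020, 0.012, 0.008]
filled = gp_mean(1980.0, years, ari)   # a country-year without survey data
```

A full treatment would also compute the posterior variance, which is what lets the method "represent uncertainty" for the unobserved country-years.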

The availability of hourly solar radiation data is very important for applications utilizing solar energy and for climate and environmental studies. The aim of this work is to use a simple model for estimating hourly global solar radiation under clear-sky conditions in Iraq. Calculations were compared with measurements obtained from a local station in Baghdad city and from Meteosat satellite data for different locations in Iraq. The statistical tests of mean bias error (MBE), root mean square error (RMSE) and the t-test were used to evaluate the performance of the model. Results indicated that a fairly good agreement exists between calculated and measured values for all locations in Iraq. Since the model is independent of any meteorological variable, it would be of practical use for rural areas where no meteorological data are available.
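One simple clear-sky formulation that, like the model described, needs no meteorological input is the Haurwitz model driven by standard solar geometry; the sketch below uses it purely for illustration (it is not necessarily the paper's model):

```python
import math

def cos_zenith(lat_deg, day_of_year, solar_hour):
    """Cosine of the solar zenith angle from standard solar geometry."""
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    lat = math.radians(lat_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    return (math.sin(lat) * math.sin(decl)
            + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))

def clear_sky_ghi(lat_deg, day_of_year, solar_hour):
    """Haurwitz-type clear-sky global horizontal irradiance (W m-2):
    G = 1098 * cos(Z) * exp(-0.057 / cos(Z))."""
    mu = cos_zenith(lat_deg, day_of_year, solar_hour)
    if mu <= 0:
        return 0.0            # sun below the horizon
    return 1098.0 * mu * math.exp(-0.057 / mu)

# Baghdad lies at roughly 33.3 deg N; noon near the June solstice
noon = clear_sky_ghi(33.3, 172, 12.0)
```

Because the only inputs are latitude, day of year, and solar hour, such a model can be evaluated anywhere, which is exactly the property the abstract highlights for data-sparse rural areas.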

Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components in net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress in measuring top of atmosphere energy budget has been made, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty in measuring from space, and the uncertainty in algorithm parameters. To overcome the weakness of single source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual
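Minimizing the variance of the error of a weighted combination of independent, unbiased estimates leads to inverse-variance weights. A minimal sketch, with invented datasets and error variances, is:

```python
def merge_estimates(values, variances):
    """Inverse-variance weighted average: the minimum-variance unbiased
    combination of independent estimates of the same quantity."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / total
    merged_variance = 1.0 / total      # always below the best single input
    return mean, merged_variance

# Invented surface net radiation estimates (W m-2) for one grid cell:
# a satellite retrieval, a reanalysis field, and a station interpolation,
# with error variances judged against in-situ (e.g. GEBA-type) sites
values = [92.0, 88.0, 95.0]
variances = [16.0, 25.0, 9.0]
mean, var = merge_estimates(values, variances)
```

The merged variance is smaller than any input variance, which is why combining complementary radiation datasets can beat each single source.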

Current statistical methods of reconstructing the climate of the last centuries are based on statistical models linking climate observations (temperature, sea-level-pressure) and proxy-climate data (tree-ring chronologies, ice-cores isotope concentrations, varved sediments, etc.). These models are calibrated in the instrumental period, and the longer time series of proxy data are then used to estimate the past evolution of the climate variables. Using such methods the global mean temperature of the last 600 years has been recently estimated. In this work this method of reconstruction is tested using data from a very long simulation with a climate model. This testing allows to estimate the errors of the estimations as a function of the number of proxy data and the time scale at which the estimations are probably reliable. (orig.)

Human resources are consistently cited as a leading contributor to health care costs; however, the limited availability of internationally comparable data on health worker earnings for all countries is a challenge for estimating the costs of health care services. This paper describes an econometric model, using cross-sectional earnings data from the International Labour Organization (ILO), that the World Health Organization's (WHO) Choosing Interventions that are Cost-Effective (CHOICE) programme has used to prepare estimates of health worker earnings (in 2010 USD) for all WHO member states. The ILO data contained 324 observations of earnings data across 4 skill levels for 193 countries. Using these data, along with the assumption that data were missing not at random, we used a Heckman two-stage selection model to estimate earnings for each of the 4 skill levels for all WHO member states. It was possible to develop a prediction model for health worker earnings for all countries for which GDP data were available. Health worker earnings vary both within countries, due to skill level, and across countries. As a multiple of GDP per capita, earnings show a negative correlation with GDP: that is, lower-income countries pay their health workers relatively more than higher-income countries do. Limited data on health worker earnings are a limiting factor in estimating the costs of global health programmes. It is hoped that these estimates will support robust health care intervention costings and projections of resource needs over the Sustainable Development Goal period.
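The mechanics of the Heckman correction can be sketched as follows: a stage-1 probit yields a selection index z_i for each country, and the inverse Mills ratio lambda(z_i) enters the stage-2 earnings regression as an extra regressor. The indices below are assumed for illustration, not estimated from the ILO data:

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def inverse_mills(z):
    """lambda(z) = phi(z) / Phi(z): the Heckman selection-correction term
    appended as a regressor in the stage-2 earnings equation."""
    return norm_pdf(z) / norm_cdf(z)

# Assumed stage-1 probit indices for three hypothetical countries; a larger
# z means a higher modeled probability that earnings were reported to the ILO
z_indices = {"A": -1.0, "B": 0.0, "C": 1.5}
mills = {k: inverse_mills(z) for k, z in z_indices.items()}
```

Countries least likely to report (low z) get the largest correction term, which is how the approach accounts for data that are missing not at random.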

Forest conservation efforts are increasingly being implemented at the scale of sub-national jurisdictions in order to mitigate global climate change and provide other ecosystem services. We see an urgent need for robust estimates of historic forest carbon emissions at this scale, as the basis for credible measures of climate and other benefits achieved. Despite the arrival of a new generation of global datasets on forest area change and biomass, confusion remains about how to produce credible jurisdictional estimates of forest emissions. We demonstrate a method for estimating the relevant historic forest carbon fluxes within the Regency of Berau in eastern Borneo, Indonesia. Our method integrates the best available global and local datasets and includes a comprehensive analysis of uncertainty at the regency scale. We find that Berau generated 8.91 ± 1.99 million tonnes of net CO2 emissions per year during 2000-2010. Berau is an early frontier landscape where gross emissions are 12 times higher than gross sequestration, yet most (85%) of Berau's original forests are still standing. The majority of net emissions were due to conversion of native forests to unspecified agriculture (43% of total), oil palm (28%), and fiber plantations (9%). Most of the remainder was due to legal commercial selective logging (17%). Our overall uncertainty estimate offers an independent basis for assessing three other estimates for Berau, two of which were above the upper end of our uncertainty range. We emphasize the importance of including an uncertainty range for all parameters of the emissions equation to generate a comprehensive uncertainty estimate, which has not been done before. We believe comprehensive estimates of carbon flux uncertainty are increasingly important as national and international institutions are challenged with comparing alternative estimates and identifying a credible range of historic emissions values.
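Comprehensive uncertainty propagation through an emissions equation is commonly done by Monte Carlo sampling of every parameter (activity data and emission factors alike). The sketch below uses invented means and standard deviations, not Berau's values:

```python
import random
import statistics

random.seed(42)

def mc_net_emissions(n=20000):
    """Monte Carlo propagation: sample every parameter of a toy emissions
    equation (area converted * emission factor, minus sequestration) to
    obtain a mean and a 95% uncertainty interval for net emissions."""
    results = []
    for _ in range(n):
        # Invented annual activity data (ha/yr) times emission factors (tCO2/ha)
        ag = random.gauss(12000, 1500) * random.gauss(330, 40)
        palm = random.gauss(9000, 1000) * random.gauss(280, 35)
        logging = random.gauss(30000, 4000) * random.gauss(50, 10)
        seq = random.gauss(700000, 150000)     # gross sequestration, tCO2/yr
        results.append(ag + palm + logging - seq)
    results.sort()
    mean = statistics.fmean(results)
    lo, hi = results[int(0.025 * n)], results[int(0.975 * n)]
    return mean, lo, hi

mean, lo, hi = mc_net_emissions()
```

Sampling every parameter, rather than only the deforested area, is what makes the resulting interval "comprehensive" in the sense the abstract argues for.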

Brain drain, the international migration of scientists in search of better opportunities, has been a long-standing concern, but quantitative measurements are uncommon and limited to specific countries or disciplines. We need to understand brain drain at a global level and estimate the extent to which scientists born in countries with low opportunities never realize their potential. Data on 1523 of the most highly cited scientists for 1981-1999 are analyzed. Overall, 31.9% of these scientists did not reside in the country where they were born (range 18.1-54.6% across 21 different scientific fields). There was great variability across developed countries in the proportions of foreign-born resident scientists and emigrating scientists. Countries without a critical mass of native scientists lost most of their scientists to migration. This loss occurred in both developed and developing countries. Adjusting for population and using the U.S. as reference, the number of highly cited native-born scientists was at least 75% of the expected number in only 8 countries other than the U.S. It is estimated that approximately 94% of the expected top scientists worldwide have been unable to realize their potential due to various adverse conditions. This scientific deficit is only likely to help perpetuate those adverse conditions.

The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimation of the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level.
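The hazard- and incidence-based DALY arithmetic at the core of this framework can be sketched with invented numbers (the actual R package runs a full probabilistic assessment rather than a point calculation):

```python
def dalys(deaths, life_expectancy_at_death, cases, duration_years,
          disability_weight):
    """Incidence-based DALYs = YLL + YLD, without age weighting or
    discounting (in line with current burden-of-disease practice).
    YLL: years of life lost to premature mortality.
    YLD: years lived with disability."""
    yll = deaths * life_expectancy_at_death
    yld = cases * duration_years * disability_weight
    return yll + yld, yll, yld

# Invented foodborne hazard in one region
total, yll, yld = dalys(deaths=1200, life_expectancy_at_death=35.0,
                        cases=500000, duration_years=0.02,
                        disability_weight=0.2)

# Source attribution: scale by the (assumed) proportion transmitted by food
foodborne = total * 0.4
```

In the FERG workflow each input above is itself a distribution, so the burden and its attribution come out as uncertainty intervals rather than single numbers.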

We discuss the high skill of real-time forecasts of global surface temperature a year ahead issued by the UK Met Office, and their scientific background. Although this is a forecasting and not a formal attribution study, we show that the main instrumental global annual surface temperature data sets since 1891 are structured consistently with a set of five physical forcing factors except during and just after the Second World War. Reconstructions use a multiple application of cross-validated linear regression to minimise artificial skill, allowing time-varying uncertainties in the contribution of each forcing factor to global temperature to be assessed. Mean cross-validated reconstructions for the data sets have total correlations in the range 0.93-0.95, interannual correlations in the range 0.72-0.75 and root mean squared errors near 0.06 °C, consistent with observational uncertainties. Three transient runs of the HadCM3 coupled model for 1888-2002 demonstrate quite similar reconstruction skill from similar forcing factors defined appropriately for the model, showing that skilful use of our technique is not confined to observations. The observed reconstructions show that the Atlantic Multidecadal Oscillation (AMO) likely contributed to the re-commencement of global warming between 1976 and 2010 and to the global cooling observed immediately beforehand in 1965-1976. The slowing of global warming in the last decade is likely to be largely due to a phase-delayed response to the downturn in the solar cycle since 2001-2, with no net ENSO contribution. The much reduced trend in 2001-10 is similar in size to other weak decadal temperature trends observed since global warming resumed in the 1970s. The causes of variations in decadal trends can be mostly explained by variations in the strength of the forcing factors. Eleven real-time forecasts of global mean surface temperature for the year ahead for 2000-2010, based on broadly similar methods, provide an independent test of the

The Angstrom equation H = Ho(a + b S/So) has been fitted using the least-squares method to the global irradiation and sunshine duration data of 31 Italian locations for the period 1965-1974. Three more linear equations have also each been fitted to the same data: i) H' = Ho(a + b S/So), obtained by incorporating the effect of multiple reflections between the earth's surface and the atmosphere; ii) H = Ho(a + b S/So'), obtained by incorporating the effect of the sunshine recorder chart not burning when the elevation of the sun is less than 5 deg.; and iii) H' = Ho(a + b S/So'), obtained by incorporating both of the above effects simultaneously. Good correlations, with correlation coefficients around 0.9 or more, are obtained for most of the locations with all four equations. Substantial spatial scatter is obtained in the values of the regression parameters. The use of any of the three latter equations offers no advantage over the simpler Angstrom equation; it neither reduces the spatial scatter in the values of the regression parameters nor yields better correlation. The computed values of the regression parameters in the Angstrom equation yield estimates of the global irradiation that are on average within ±4% of the measured values for most of the locations. (author)
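Fitting the Angstrom regression parameters a and b is ordinary least squares on the ratio pairs (S/So, H/Ho); the station values below are invented for illustration:

```python
def fit_angstrom(s_ratio, h_ratio):
    """Least-squares fit of H/Ho = a + b * (S/So)."""
    n = len(s_ratio)
    mx = sum(s_ratio) / n
    my = sum(h_ratio) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(s_ratio, h_ratio))
         / sum((x - mx) ** 2 for x in s_ratio))
    a = my - b * mx
    return a, b

# Invented monthly means for one station: sunshine fraction S/So
# and clearness index H/Ho
s = [0.35, 0.45, 0.55, 0.60, 0.70, 0.75]
h = [0.40, 0.46, 0.51, 0.54, 0.59, 0.62]
a, b = fit_angstrom(s, h)
est = [a + b * x for x in s]   # estimated H/Ho at each sunshine fraction
```

The fitted a and b here land in the range commonly reported for Angstrom-type regressions; the abstract's point is that these parameters scatter substantially from station to station.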

The millennium development goals triggered an increased demand for data on child and maternal mortalities for monitoring progress. With the advent of the sustainable development goals and growing evidence of an epidemiological transition toward non-communicable diseases, policymakers need data on mortality and disease trends and distribution to inform effective policies and support monitoring progress. Where there are limited capacities to produce national health estimates (NHEs), global health estimates (GHEs) can fill gaps for global monitoring and comparisons. This paper discusses lessons learned from Thailand's burden of disease (BOD) study on capacity development on NHEs and discusses the contributions and limitations of GHEs in informing policies at the country level. Through training and technical support by external partners, capacities are gradually strengthened and institutionalized to enable regular updates of BOD at national and subnational levels. Initially, the quality of cause-of-death reporting in death certificates was inadequate, especially for deaths occurring in the community. Verbal autopsies were conducted, using domestic resources, to determine probable causes of deaths occurring in the community. This method helped to improve the estimation of years of life lost. Since the achievement of universal health coverage in 2002, the quality of clinical data on morbidities has also considerably improved. There are significant discrepancies between the Global Burden of Disease 2010 study estimates for Thailand and the 1999 nationally generated BOD, especially for years of life lost due to HIV/AIDS, and the ranking of priority diseases. National ownership of NHEs and an effective interface between researchers and decision-makers contribute to enhanced country policy responses, whereas subnational data are intended to be used by various subnational partners. Although GHEs contribute to benchmarking country achievement compared with global health

This paper presents an innovative hybrid approach for the estimation of the solar global radiation. New prediction equations were developed for the global radiation using an integrated search method of genetic programming (GP) and simulated annealing (SA), called GP/SA. The solar radiation was formulated in terms of several climatological and meteorological parameters. Comprehensive databases containing monthly data collected for 6 years in two cities of Iran were used to develop GP/SA-based models. Separate models were established for each city. The generalization of the models was verified using a separate testing database. A sensitivity analysis was conducted to investigate the contribution of the parameters affecting the solar radiation. The derived models make accurate predictions of the solar global radiation and notably outperform the existing models. -- Highlights: ► A hybrid approach is presented for the estimation of the solar global radiation. ► The proposed method integrates the capabilities of GP and SA. ► Several climatological and meteorological parameters are included in the analysis. ► The GP/SA models make accurate predictions of the solar global radiation.

Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the

This paper develops two fisheries models in order to estimate the effect of global warming (GW) on firm value. GW is defined as an increase in the average temperature of the Earth's surface as a result of CO2 emissions. It is assumed that (i) GW exists, and (ii) higher temperatures negatively affect biomass. The literature on biology and GW supporting these two crucial assumptions is reviewed. The main argument presented is that temperature increase has two effects on biomass, both of which have an impact on firm value. First, higher temperatures cause biomass to oscillate. To measure the effect of biomass oscillation on firm value, the model in [1] is modified to include water temperature as a variable. The results indicate that a 1 to 20% variation in biomass causes firm value to fall by 6 to 44%, respectively. Second, higher temperatures reduce biomass, and a modification of the model in [2] reveals that an increase in temperature anomaly of between +1 and +8°C causes fishing firm value to decrease by 8 to 10%.

Wilderness areas in the world are threatened by the environmental impacts of the growing global human population. This study estimates the impact of birth rate on the future surface area of biodiverse wilderness and on the proportion of this area without major extinctions. Four drivers are considered: (1) human population growth, (2) agricultural efficiency, (3) groundwater drawdown by irrigation, and (4) non-agricultural space used by humans (buildings, gardens, roads, etc.). This study indicates that the surface area of biodiverse unmanaged land will shrink by about 5.4% between 2012 and 2050, and that the biodiverse land without major extinctions will shrink by about 10.5%. These percentages are based on a commonly used population trajectory which assumes that birth rates across the globe will fall in a similar way as has occurred in the past in many developed countries. Future birth rates are, however, very uncertain. Plausible future birth rates lower than the expected rates lead to much smaller reductions in the surface area of biodiverse unmanaged land (0.7% as opposed to 5.4%) and a reduction in the biodiverse land without major extinctions of about 5.6% (as opposed to 10.5%). This indicates that birth rate is an important factor influencing the quality and quantity of wilderness remaining in the future.

The burden of chronic obstructive pulmonary disease (COPD) across many world regions is high. We aim to estimate COPD prevalence and number of disease cases for the years 1990 and 2010 across world regions based on the best available evidence in publicly accessible scientific databases. We conducted a systematic search of Medline, EMBASE and Global Health for original, population-based studies providing spirometry-based prevalence rates of COPD across the world from January 1990 to December 2014. Random effects meta-analysis was conducted on extracted crude prevalence rates of COPD, with overall summaries of the meta-estimates (and confidence intervals) reported separately for World Health Organization (WHO) regions, the World Bank's income categories and settings (urban and rural). We developed a meta-regression epidemiological model that we used to estimate the prevalence of COPD in people aged 30 years or more. Our search returned 37 472 publications. A total of 123 studies based on a spirometry-defined prevalence were retained for the review. From the meta-regression epidemiological model, we estimated about 227.3 million COPD cases in the year 1990 among people aged 30 years or more, corresponding to a global prevalence of 10.7% (95% confidence interval (CI) 7.3%–14.0%) in this age group. The number of COPD cases increased to 384 million in 2010, with a global prevalence of 11.7% (8.4%–15.0%). This increase of 68.9% was mainly driven by global demographic changes. Across WHO regions, the highest prevalence was estimated in the Americas (13.3% in 1990 and 15.2% in 2010), and the lowest in South East Asia (7.9% in 1990 and 9.7% in 2010). The percentage increase in COPD cases between 1990 and 2010 was highest in the Eastern Mediterranean region (118.7%), followed by the African region (102.1%), while the European region recorded the lowest increase (22.5%). In 1990, we estimated about 120.9 million COPD cases among urban dwellers
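The random-effects pooling step can be sketched with the DerSimonian-Laird estimator, a standard choice for this kind of meta-analysis (the abstract does not name its estimator, so this is an assumption); the study prevalences and variances below are invented:

```python
def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method.
    tau2 is the estimated between-study variance added to each
    within-study variance before re-weighting."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))  # Cochran's Q
    df = len(estimates) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se, tau2

# Invented spirometry-based COPD prevalence estimates (proportions)
prev = [0.08, 0.12, 0.16, 0.09, 0.14]
var = [0.0004, 0.0006, 0.0009, 0.0005, 0.0007]
pooled, se, tau2 = dersimonian_laird(prev, var)
```

When between-study heterogeneity is present (tau2 > 0), the random-effects weights are flattened relative to fixed-effect weights, so no single large study dominates the pooled prevalence.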

The Fuel Economy Guide is published by the U.S. Department of Energy as an aid to consumers considering the purchase of a new vehicle. The Guide lists estimates of miles per gallon (mpg) for each vehicle available for the new model year. These estimates are provided by the U.S. Environmental Protection Agency in compliance with Federal Law. By using this Guide, consumers can estimate the average yearly fuel cost for any vehicle. The Guide is intended to help consumers compare the fuel economy of similarly sized cars, light duty trucks and special purpose vehicles.

Today's global tobacco epidemic may represent one of the first instances of the globalization of a noninfectious cause of disease. This article reviews the first century of the global tobacco epidemic, its current and projected future course, and the steps in progress to end it. In the United States and many countries of Western Europe, tobacco consumption peaked during the 1960s and 1970s and declined as tobacco control programs were initiated, motivated by the evidence indicting smoking as a leading cause of disease. Despite these policy advances and the subsequent reductions in tobacco consumption, the global tobacco epidemic continued to grow exponentially in the later years of the twentieth century, as multinational tobacco companies sought new markets to replace those shrinking in high-income countries. In response, between 2000 and 2004, the World Health Organization developed its first public health treaty, the Framework Convention on Tobacco Control, which entered into force in 2005. An accompanying package of interventions has since been implemented. New approaches to tobacco control, including plain packaging and single representation of brands, have been implemented by Australia and Uruguay, respectively, but have been challenged by the tobacco industry.

Global population is expected to reach 9 billion people by the year 2050, causing increased demands for water and potential threats to human security. This study attempts to frame the overpopulation problem through a hydrological resources lens by hypothesizing that observed groundwater trends should be directly attributed to human water consumption. This study analyzes the relationships between available blue water, population, and cropland area on a global scale. Using satellite data from NASA's Gravity Recovery and Climate Experiment (GRACE) along with land surface model data from the Global Land Data Assimilation System (GLDAS), a global groundwater depletion trend is isolated, the validity of which has been verified in many regional studies. By using the inherent distributions of these relationships, we estimate the regional populations that have exceeded their local hydrological carrying capacity. Globally, these populations sum to ~3.5 billion people who are living in presently water-stressed or potentially water-scarce regions, and we estimate total cropland is exceeding a sustainable threshold by about 80 million km^2. Key study areas such as the North China Plain, northwest India, and Mexico City were qualitatively chosen for further analysis of regional water resources and policies, based on our distributions of water stress. These case studies are used to verify the groundwater level changes seen in the GRACE trend. The many populous, arid regions of the world have already begun to experience the strains of high water demand. It will take a global cooperative effort of improving domestic and agricultural use efficiency, and summoning the political will to prioritize environmental issues, to adapt to a thirstier planet. [Figure: Global Groundwater Depletion Trend (Mar 2003–Dec 2011)]
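The isolation step described above (a groundwater trend obtained by subtracting GLDAS-modeled surface stores from the GRACE total water storage signal) can be sketched with synthetic data. All arrays and coefficients below are illustrative placeholders, not values from the study:

```python
import numpy as np

# Hypothetical monthly anomalies (cm equivalent water height) for one grid cell.
# GRACE observes total water storage (TWS); GLDAS supplies the surface components.
months = np.arange(48)  # four years of monthly data
tws = -0.05 * months + np.sin(2 * np.pi * months / 12)  # trend + seasonal cycle
soil_moisture = 0.8 * np.sin(2 * np.pi * months / 12)   # GLDAS soil moisture
snow = 0.1 * np.sin(2 * np.pi * months / 12)            # GLDAS snow water

# Groundwater anomaly = TWS minus the modeled surface stores.
groundwater = tws - soil_moisture - snow

# A least-squares linear fit isolates the depletion trend (cm per month).
trend = np.polyfit(months, groundwater, 1)[0]
print(f"groundwater trend: {trend:.3f} cm/month")
```

With the seasonal cycle largely removed by the subtraction, the fitted slope recovers the imposed depletion rate.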

Recent attention has focused on the high rates of annual carbon sequestration in vegetated coastal ecosystems (marshes, mangroves, and seagrasses) that may be lost with habitat destruction ('conversion'). Relatively unappreciated, however, is that conversion of these coastal ecosystems also impacts very large pools of previously sequestered carbon. Residing mostly in sediments, this 'blue carbon' can be released to the atmosphere when these ecosystems are converted or degraded. Here we provide the first global estimates of this impact and evaluate its economic implications. Combining the best available data on global area, land-use conversion rates, and near-surface carbon stocks in each of the three ecosystems, using an uncertainty-propagation approach, we estimate that 0.15-1.02 Pg (billion tons) of carbon dioxide are being released annually, several times higher than previous estimates that account only for lost sequestration. These emissions are equivalent to 3-19% of those from deforestation globally, and result in economic damages of US$6-42 billion annually. The largest sources of uncertainty in these estimates stem from limited certitude in global area and rates of land-use conversion, but research is also needed on the fates of ecosystem carbon upon conversion. Currently, carbon emissions from the conversion of vegetated coastal ecosystems are not included in emissions accounting or carbon market protocols, but this analysis suggests they may be disproportionately important to both. Although the relevant science supporting these initial estimates will need to be refined in coming years, it is clear that policies encouraging the sustainable management of coastal ecosystems could significantly reduce carbon emissions from the land-use sector, in addition to sustaining the well-recognized ecosystem services of coastal habitats.
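The uncertainty-propagation approach mentioned above can be illustrated with a simple Monte Carlo sketch: draw area, conversion rate, and carbon stock from plausible ranges for each ecosystem and propagate them through the emission formula. The ranges below are illustrative round numbers, not the paper's inputs, and the sketch assumes the full near-surface stock is released on conversion:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo draws

# Illustrative ranges per ecosystem: global area (Mha), annual conversion
# rate (fraction/yr), and near-surface carbon stock (Mg C/ha).
ecosystems = {
    "marsh":    dict(area=(2.2, 40.0),  rate=(0.010, 0.020), stock=(200, 300)),
    "mangrove": dict(area=(13.8, 15.2), rate=(0.007, 0.030), stock=(250, 450)),
    "seagrass": dict(area=(17.7, 60.0), rate=(0.004, 0.026), stock=(100, 200)),
}

total_co2 = np.zeros(n)
for p in ecosystems.values():
    area = rng.uniform(*p["area"], n) * 1e6       # ha
    rate = rng.uniform(*p["rate"], n)             # fraction converted per year
    stock = rng.uniform(*p["stock"], n)           # Mg C per ha
    total_co2 += area * rate * stock * 3.67 / 1e9  # C -> CO2, Mg -> Pg

lo, hi = np.percentile(total_co2, [2.5, 97.5])
print(f"mean {total_co2.mean():.2f} Pg CO2/yr (95% range {lo:.2f}-{hi:.2f})")
```

The spread of the resulting distribution, rather than a single point value, is what the reported 0.15-1.02 Pg range expresses.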

Satellite technologies have facilitated a recent boom in high resolution, large-scale biomass estimation and mapping. These data are the input into a wide range of global models and are becoming the gold standard for required national carbon (C) emissions reporting. Yet their geographical and/or thematic scope may exclude some or all parts of a given country or region. Most datasets tend to focus exclusively on forest biomass. Grasslands and shrublands generally store less C than forests but cover nearly twice as much global land area and may represent a significant portion of a given country's biomass C stock. To address these shortcomings, we set out to create synthetic, global above- and below-ground biomass maps that combine recently released satellite-based data of standing forest biomass with novel estimates for non-forest biomass stocks that are typically neglected. For forests we integrated existing publicly available regional, global and biome-specific biomass maps and modeled below-ground biomass using empirical relationships described in the literature. For grasslands, we developed models for both above- and below-ground biomass based on NPP, mean annual temperature and precipitation to extrapolate field measurements across the globe. Shrubland biomass was extrapolated from existing regional biomass maps using environmental factors to generate the first global estimate of shrub biomass. Our new synthetic map of global biomass carbon circa 2010 represents an update to the IPCC Tier-1 Global Biomass Carbon Map for the Year 2000 (Ruesch and Gibbs, 2008) using the best data currently available. In the absence of a single seamless remotely sensed map of global biomass, our synthetic map provides the only globally-consistent source of comprehensive biomass C data and is valuable for land change analyses, carbon accounting, and emissions modeling.
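The grassland approach described above (fit an empirical model of biomass against NPP, temperature, and precipitation, then extrapolate to unsampled grid cells) can be sketched as a least-squares regression. All variable names, coefficients, and values here are hypothetical, not the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_plots = 200

# Hypothetical field plots: NPP (g C/m2/yr), mean annual temperature (C),
# mean annual precipitation (mm), and measured biomass (Mg C/ha).
npp = rng.uniform(100, 800, n_plots)
mat = rng.uniform(-5, 25, n_plots)
map_ = rng.uniform(200, 1500, n_plots)
biomass = 0.004 * npp + 0.02 * mat + 0.001 * map_ + rng.normal(0, 0.2, n_plots)

# Fit the empirical model by ordinary least squares.
X = np.column_stack([npp, mat, map_, np.ones(n_plots)])
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)

# Extrapolate to one hypothetical unsampled global grid cell.
grid_cell = np.array([[400.0, 10.0, 800.0, 1.0]])
pred = (grid_cell @ coef).item()
print(f"predicted biomass: {pred:.2f} Mg C/ha")
```

In the actual mapping workflow the same fitted coefficients would be applied to every grassland grid cell's climate and NPP values.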

This report was prepared by the Applied Research Corporation (ARC), College Station, Texas, under subcontract to Pacific Northwest Laboratory (PNL) as part of a global climate studies task. The task supports site characterization work required for the selection of a potential high-level nuclear waste repository and is part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work is under the overall direction of the Office of Civilian Radioactive Waste Management (OCRWM), US Department of Energy Headquarters, Washington, DC. The scope of the report is to present the results of the third year's work on the atmospheric modeling part of the global climate studies task. The development and testing of computer models and initial results are discussed. The appendices contain several studies that provide supporting information and guidance to the modeling work and further details on computer model development. Complete documentation of the models, including user information, will be prepared under separate reports and manuals.

VLSI implementation of gradient-based global motion estimation (GME) faces two main challenges: irregular data access and a high off-chip memory bandwidth requirement. We previously proposed a fast GME method that reduces computational complexity by choosing a certain number of small patches containing corners and using them in a gradient-based framework. A hardware architecture is designed to implement this method and further reduce the off-chip memory bandwidth requirement. On-chip memories are used to store coordinates of the corners and template patches, while the Gaussian pyramids of both the template and reference frame are stored in off-chip SDRAMs. By performing the geometric transform only on the coordinates of the center pixel of a 3-by-3 patch in the template image, a 5-by-5 area containing the warped 3-by-3 patch in the reference image is extracted from the SDRAMs by burst read. Patch-based and burst-mode data access helps to keep the off-chip memory bandwidth requirement at the minimum. Although patch size varies at different pyramid levels, all patches are processed in terms of 3-by-3 patches, so the utilization of the patch-processing circuit reaches 100%. FPGA implementation results show that the design utilizes 24,080 bits of on-chip memory, and for a 352x288 sequence at 60 Hz the off-chip bandwidth requirement is only 3.96 Mbyte/s, compared with 243.84 Mbyte/s for the original gradient-based GME method. This design can be used in applications like video codecs, video stabilization, and super-resolution, where real-time GME is a necessity and a minimum memory bandwidth requirement is appreciated.
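The key bandwidth-saving trick above is that only the patch-center coordinate is warped, after which one contiguous 5-by-5 window (guaranteed to contain the warped 3-by-3 patch) is burst-read. A software sketch of that addressing scheme, with an illustrative affine global-motion model in place of the hardware's transform unit:

```python
import numpy as np

def warp_center_and_extract(ref, cx, cy, affine):
    """Warp only the patch-center coordinate with a 2x3 affine global-motion
    matrix [[a, b, tx], [c, d, ty]], then read the 5x5 window centered on the
    rounded warped position -- one contiguous (burst-friendly) access."""
    x = affine[0, 0] * cx + affine[0, 1] * cy + affine[0, 2]
    y = affine[1, 0] * cx + affine[1, 1] * cy + affine[1, 2]
    xi, yi = int(round(x)), int(round(y))
    x0, y0 = max(xi - 2, 0), max(yi - 2, 0)  # clip low side; slicing clips high
    return ref[y0:y0 + 5, x0:x0 + 5]

# A 352x288 reference frame with synthetic pixel values.
ref = (np.arange(288 * 352) % 251).reshape(288, 352)
affine = np.array([[1.0, 0.0, 3.2],
                   [0.0, 1.0, -1.7]])  # near-identity translation (illustrative)
win = warp_center_and_extract(ref, 100, 80, affine)
print(win.shape)  # one 5x5 burst instead of scattered per-pixel reads
```

In hardware the same idea maps each 5-by-5 window to a single SDRAM burst read, which is where the bandwidth reduction comes from.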

The budget estimates for Salaries and Expenses for FY 1989 provide for obligations of $450,000,000 to be funded in total by a new appropriation. The sum appropriated shall be reduced by the amount of revenues received during fiscal year 1989 from licensing fees, inspection services, and other services and collections, excluding those monies received for the cooperative nuclear safety research program, services rendered to foreign governments and international organizations, and the material and information access authorization programs, so as to result in a final fiscal year 1989 appropriation estimated at not more than $247,500,000.

Highlights:
• Precise horizontal global solar radiation estimation models are proposed for Turkey.
• Genetic programming technique is used to construct the models.
• Robust coplot analysis is applied to reduce the impact of outlier observations.
• Better estimation and prediction properties are observed for the models.

Abstract: Renewable energy sources have been attracting more and more attention of researchers due to the diminishing and harmful nature of fossil energy sources. Because of the importance of solar energy as a renewable energy source, an accurate determination of significant covariates and their relationships with the amount of global solar radiation reaching the Earth is a critical research problem. There are numerous meteorological and terrestrial covariates that can be used in the analysis of horizontal global solar radiation. Some of these covariates are highly correlated with each other. It is possible to find a large variety of linear or non-linear models to explain the amount of horizontal global solar radiation. However, models that explain the amount of global solar radiation with the smallest set of covariates should be obtained. In this study, use of the robust coplot technique to reduce the number of covariates before going forward with advanced modelling techniques is considered. After reducing the dimensionality of model space, yearly and monthly mean daily horizontal global solar radiation estimation models for Turkey are built by using the genetic programming technique. It is observed that application of robust coplot analysis is helpful for building precise models that explain the amount of global solar radiation with the minimum number of covariates without suffering from outlier observations and the multicollinearity problem. Consequently, over a dataset of Turkey, precise yearly and monthly mean daily global solar radiation estimation models are introduced using the model spaces obtained by robust coplot technique and

in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiation over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can't do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of 'run-of-the-river' water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewage overflows. Infrastructure investment needs have been put at $5 trillion nationally. Global change and water resources c

We report here a first-of-its-kind analysis of the potential for intensification of global grazing systems. Intensification is calculated using the statistical yield gap methodology developed previously by others (Mueller et al 2012 and Licker et al 2010) for global crop systems. Yield gaps are estimated by binning global pasture land area into 100 equal-area bins of similar climate (defined by ranges of rainfall and growing degree days). Within each bin, grid cells of pastureland are ranked from lowest to highest productivity. The global intensification potential is defined as the sum of global production across all bins at a given percentile ranking (e.g. performance at the 90th percentile) divided by the total current global production. The previous yield gap studies focused on crop systems because productivity data on these systems are readily available. Nevertheless, global crop land represents only one-third of total global agricultural land, while pasture systems account for the remaining two-thirds. Thus, it is critical to conduct the same kind of analysis on what is the largest human use of land on the planet: pasture systems. In 2013, Herrero et al announced the completion of a geospatial data set that augmented the animal census data with data and modeling about production systems and overall food productivity (Herrero et al, PNAS 2013). With this data set, it is now possible to apply yield gap analysis to global pasture systems. We used the Herrero et al data set to evaluate yield gaps for meat and milk production from pasture based systems for cattle, sheep and goats. The figure included with this abstract shows the intensification potential for kcal per hectare per year of meat and milk from global cattle, sheep and goats as a function of increasing levels of performance. Performance is measured as the productivity achieved at a given ranked percentile within each bin. We find that if all pasture land were raised to their 90th percentile of
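The binning-and-percentile calculation described above can be sketched directly: group cells by climate bin, raise every cell in a bin to the bin's chosen productivity percentile, and compare against current production. The bin assignments and productivity values below are synthetic, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000

# Hypothetical pasture grid cells: a climate-bin id (0-99) and a current
# productivity per cell (kcal per hectare per year, arbitrary scale).
bins = rng.integers(0, 100, size=n_cells)
production = rng.lognormal(mean=6.0, sigma=0.8, size=n_cells)

def intensification_potential(bins, production, percentile):
    """Within each climate bin, raise every cell to the bin's productivity at
    the given percentile; return the ratio to current total production."""
    potential = 0.0
    for b in np.unique(bins):
        cells = production[bins == b]
        potential += np.percentile(cells, percentile) * cells.size
    return potential / production.sum()

ratio = intensification_potential(bins, production, 90)
print(f"90th-percentile intensification: {ratio:.2f}x current production")
```

Each cell is assumed to have equal area here; the actual analysis weights cells by pasture area within each bin.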

A modelling experiment has been conceived to assess the impact of transport model errors on methane emissions estimated in an atmospheric inversion system. Synthetic methane observations, obtained from 10 different model outputs from the international TransCom-CH4 model inter-comparison exercise, are combined with a prior scenario of methane emissions and sinks, and integrated into the three-component PYVAR-LMDZ-SACS (PYthon VARiational-Laboratoire de Météorologie Dynamique model with Zooming capability-Simplified Atmospheric Chemistry System) inversion system to produce 10 different methane emission estimates at the global scale for the year 2005. The same methane sinks, emissions and initial conditions have been applied to produce the 10 synthetic observation datasets. The same inversion set-up (statistical errors, prior emissions, inverse procedure) is then applied to derive flux estimates by inverse modelling. Consequently, only differences in the modelling of atmospheric transport may cause differences in the estimated fluxes. In our framework, we show that transport model errors lead to a discrepancy of 27 Tg yr−1 at the global scale, representing 5% of total methane emissions. At continental and annual scales, transport model errors are proportionally larger than at the global scale, with errors ranging from 36 Tg yr−1 in North America to 7 Tg yr−1 in Boreal Eurasia (from 23 to 48%, respectively). At the model grid-scale, the spread of inverse estimates can reach 150% of the prior flux. Therefore, transport model errors contribute significantly to overall uncertainties in emission estimates by inverse modelling, especially when small spatial scales are examined. Sensitivity tests have been carried out to estimate the impact of the measurement network and the advantage of higher horizontal resolution in transport models. The large differences found between methane flux estimates inferred in these different configurations highly
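The headline numbers above (a 27 Tg yr−1 spread, about 5% of global emissions) are simply the range across the 10 transport-model-specific flux estimates relative to their mean. A minimal sketch with invented ensemble values chosen to reproduce those proportions (these are not the TransCom results themselves):

```python
import numpy as np

# Hypothetical global CH4 flux estimates (Tg/yr) from 10 inversions that
# differ only in the atmospheric transport model used.
fluxes = np.array([525, 531, 536, 540, 543, 545, 548, 550, 549, 552])

spread = fluxes.max() - fluxes.min()          # transport-driven discrepancy
rel = 100 * spread / fluxes.mean()            # as a share of mean emissions
print(f"spread: {spread} Tg/yr ({rel:.1f}% of mean global emissions)")
```

The same range-over-mean diagnostic, applied per continent or per grid cell, is what yields the proportionally larger regional errors quoted above.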

The Fuel Economy Guide is published by the U.S. Department of Energy as an aid to consumers considering the purchase of a new vehicle. The Guide lists estimates of miles per gallon (mpg) for each vehicle available for the new model year. These estimates are provided by the U.S. Environmental Protection Agency in compliance with Federal Law. By using this Guide, consumers can estimate the average yearly fuel cost for any vehicle. The Guide is intended to help consumers compare the fuel economy of similarly sized cars, light duty trucks and special purpose vehicles. The vehicles listed have been divided into three classes of cars, three classes of light duty trucks, and three classes of special purpose vehicles.

This chapter contributes to the debate around the net-zero energy concept from a global perspective. By means of comprehensive modelling, it analyses how much global building energy consumption could be reduced through utilisation of building-integrated solar energy technologies and energy-efficiency improvements. Valuable insights on the locations and building types in which it is feasible to achieve a net-zero level of energy performance through solar energy utilisation are presented in world maps.

Long-term persistence (LTP) of annual river runoff is a topic of ongoing hydrological research, due to its implications for water resources management. Here, we estimate its strength, measured by the Hurst coefficient H, in 696 annual, globally distributed, streamflow records with at least 80 years of data. We use three estimation methods (maximum likelihood estimator, Whittle estimator and least squares variance), resulting in similar mean values of H close to 0.65. Subsequently, we explore potential factors influencing H by two linear (Spearman's rank correlation, multiple linear regression) and two non-linear (self-organizing maps, random forests) techniques. Catchment area is found to be crucial for medium to larger watersheds, while climatic controls, such as aridity index, have higher impact on smaller ones. Our findings indicate that long-term persistence is weaker than found in other studies, suggesting that enhanced LTP is encountered in large-catchment rivers, where the effect of spatial aggregation is more intense. However, we also show that the estimated values of H can be reproduced by a short-term persistence stochastic model such as an auto-regressive AR(1) process. A direct consequence is that some of the most common methods for the estimation of the H coefficient might not be suitable for discriminating short- and long-term persistence even in long observational records.
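The least-squares variance estimator mentioned above, and the paper's caveat that an AR(1) process can mimic LTP, can both be demonstrated in a few lines. The variance of k-aggregated means of an LTP series scales as k^(2H-2), so H is recovered from a log-log slope. The scales and series lengths below are illustrative choices:

```python
import numpy as np

def hurst_variance(x, scales=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance (least squares variance) estimator of H: fit the
    log-log slope of Var(k-aggregated means) against k, then H = 1 + slope/2."""
    logk, logv = [], []
    for k in scales:
        n = len(x) // k
        agg = x[:n * k].reshape(n, k).mean(axis=1)
        logk.append(np.log(k))
        logv.append(np.log(agg.var()))
    slope = np.polyfit(logk, logv, 1)[0]
    return 1 + slope / 2

rng = np.random.default_rng(2)
white = rng.normal(size=20_000)   # no persistence: H should come out near 0.5
ar1 = np.zeros(20_000)            # AR(1): short-term persistence only
for t in range(1, 20_000):
    ar1[t] = 0.5 * ar1[t - 1] + white[t]

print(f"H(white) ~ {hurst_variance(white):.2f}, H(AR1) ~ {hurst_variance(ar1):.2f}")
```

Over these finite aggregation scales the AR(1) series yields an inflated H well above 0.5 even though its true long-term persistence is nil, which is exactly the discrimination problem the abstract raises.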

This paper estimates the marginal, total, and average cost and effectiveness of carbon taxes applied either by Organization for Economic Cooperation and Development (OECD) members alone, or as part of a global cooperative strategy, to reduce potential future emissions and their direct implications for employment in the US coal industry. Two sets of cases are examined: one in which the OECD members act alone, and another in which the world acts in concert. In each case set, taxes are examined which achieve four alternative levels of emissions reduction: halve the rate of emissions growth, no emissions growth, a 20% reduction from 1988 levels, and a 50% reduction from 1988 levels. For the global cooperation case, carbon tax rates of $32, $113, $161, and $517 per metric ton of carbon (mtC) were needed in the year 2025 to achieve the objectives. Total costs were respectively $40, $178, $253, and $848 billions of 1990 US dollars per year in the year 2025. Average costs were $32, $55, $59, and $135 per mtC. Costs were significantly higher in the cases in which the OECD member states acted alone. OECD member states, acting alone, could not reduce global emissions by 50% or 20% relative to 1988, given reference case assumptions regarding developing and centrally planned nations' economic growth.
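Since average cost is total cost divided by tons abated, the figures quoted above imply the annual abatement in each global-cooperation case. A quick back-of-envelope check (the dollar figures are taken from the abstract; the case labels are ours):

```python
# Global-cooperation cases in 2025: marginal tax ($/tC), total cost
# (billions of 1990 US$ per year), and average cost ($/tC).
cases = {
    "halve emissions growth": dict(tax=32, total_cost_b=40, avg_cost=32),
    "no emissions growth": dict(tax=113, total_cost_b=178, avg_cost=55),
    "-20% vs 1988": dict(tax=161, total_cost_b=253, avg_cost=59),
    "-50% vs 1988": dict(tax=517, total_cost_b=848, avg_cost=135),
}

for name, c in cases.items():
    # ($1e9/yr) / ($/tC) = 1e9 tC/yr, i.e. GtC abated per year.
    abated_gtc = c["total_cost_b"] / c["avg_cost"]
    print(f"{name}: ~{abated_gtc:.2f} GtC/yr abated "
          f"at a ${c['tax']}/tC marginal tax")
```

The widening gap between the marginal tax and the average cost in the stricter cases reflects the rising marginal cost of successive abatement.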

Tree cover, defined structurally as the proportional, vertically projected area of vegetation (including leaves, stems, branches, etc.) of woody plants above a given height, affects terrestrial energy and water exchanges, photosynthesis and transpiration, net primary production, and carbon and nutrient fluxes. Tree cover provides a measurable attribute upon which forest cover may be defined. Changes in tree cover over time can be used to monitor and retrieve site-specific histories of forest disturbance, succession, and degradation. Measurements of Earth's tree cover have been produced at regional, national, and global extents. However, most representations are static, and those for which multiple time periods have been produced are neither intended nor adequate for consistent, long-term monitoring. Moreover, although a substantial proportion of change has been shown to occur at resolutions below 250 m, existing long-term, Landsat-resolution datasets are either produced as static layers or with annual, five- or ten-year temporal resolution. We have developed an algorithm to retrieve seamless and consistent, sub-hectare resolution estimates of tree-canopy cover from optical and radar satellite data sources (e.g., Landsat, Sentinel-2, and ALOS-PALSAR). Our approach to estimation enables assimilation of multiple data sources and produces estimates of both cover and its uncertainty at the scale of pixels. It generated the world's first Landsat-based percent tree cover dataset in 2013. Our previous algorithms are being adapted to produce prototype percent-tree and water-cover layers globally in 2000, 2005, and 2010, as well as annually over North and South America from 2010 to 2015, from passive-optical (Landsat and Sentinel-2) and SAR measurements. Generating a global, annual dataset is beyond the scope of this support; however, North and South America represent all of the world's major biomes and so offer the complete global range of environmental sources of error and

Illness and death from diseases caused by contaminated food are a constant threat to public health and a significant impediment to socio-economic development worldwide. To measure the global and regional burden of foodborne disease (FBD), the World Health Organization (WHO) established… different burdens of FBD, with the greatest falling on the subregions in Africa, followed by the subregions in South-East Asia and the Eastern Mediterranean D subregion. Some hazards, such as non-typhoidal S. enterica, were important causes of FBD in all regions of the world, whereas others, such as certain… parasitic helminths, were highly localised. Thus, the burden of FBD is borne particularly by children under five years old, although they represent only 9% of the global population, and by people living in low-income regions of the world. These estimates are conservative, i.e., underestimates rather than…

common in adults. For the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015), we estimated the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national scale over the period of 1990 to 2015. METHODS: We estimated… incidence and prevalence by age, sex, cause, year, and geography with a wide range of updated and standardised analytical procedures. Improvements from GBD 2013 included the addition of new data sources, updates to literature reviews for 85 causes, and the identification and inclusion of additional studies… causes of years lived with disability (YLDs) on a global basis. NCDs accounted for 18 of the leading 20 causes of age-standardised YLDs on a global scale. Where rates were decreasing, the rate of decrease for YLDs was slower than that of years of life lost (YLLs) for nearly every cause included in our…

Carbonaceous aerosols significantly affect global radiative forcing and climate through absorption and scattering of sunlight. Black carbon (BC) and brown carbon (BrC) are light-absorbing carbonaceous aerosols. The global distribution and climate effect of BrC are uncertain. A recent study suggests that BrC absorption is comparable to BC in the upper troposphere over biomass burning regions and that the resulting heating tends to stabilize the atmosphere. Yet current climate models do not include proper treatments of BrC. In this study, we derived a BrC global biomass burning emission inventory from the Global Fire Emissions Database 4 (GFED4) and developed a BrC module, including BrC bleaching, in the Community Atmosphere Model version 5 (CAM5) of the Community Earth System Model (CESM). The model simulations compared well with BrC observations from the Studies of Emissions, Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) and Deep Convective Clouds and Chemistry Project (DC-3) campaigns. Model results suggested that BrC in the upper troposphere due to convective transport is as important an absorber as BC globally. Upper tropospheric BrC radiative forcing is particularly significant over the tropics, affecting atmospheric stability and the Hadley circulation.

The upcoming GNSS Galileo, with its new satellite geometry and frequency plan, will not only bring many benefits for navigation and positioning but also help to improve ionosphere delay estimation. This paper investigates ionosphere estimation with Galileo and compares it with the results from

We use geodetic data taken over four years with the Global Positioning System (GPS) to estimate: (1) motion between six major plates and (2) motion relative to these plates of ten sites in plate boundary zones. The degree of consistency between geodetic velocities and rigid plates requires the (one-dimensional) standard errors in horizontal velocities to be approx. 2 mm/yr. Each of the 15 angular velocities describing motion between plate pairs that we estimate with GPS differs insignificantly from the corresponding angular velocity in global plate motion model NUVEL-1A, which averages motion over the past 3 m.y. The motion of the Pacific plate relative to both the Eurasian and North American plates is observed to be faster than predicted by NUVEL-1A, supporting the inference from Very Long Baseline Interferometry (VLBI) that motion of the Pacific plate has sped up over the past few m.y. The Eurasia-North America pole of rotation is estimated to be north of NUVEL-1A, consistent with the independent hypothesis that the pole has recently migrated northward across northeast Asia to near the Lena River delta. Victoria, which lies above the main thrust at the Cascadia subduction zone, moves relative to the interior of the overriding plate at 30% of the velocity of the subducting plate, reinforcing the conclusion that the thrust there is locked beneath the continental shelf and slope.

Full Text Available Daniel Low-Beer and colleagues provide a response from The Global Fund on the PLOS Medicine article by David McCoy and colleagues critiquing their lives saved assessment models. Please see later in the article for the Editors' Summary.

Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models sti...

This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach had been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. The discrete kernel estimation is now known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented with its asymptotic convergence rate. Simulations on a test function and a real case study from agriculture show that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.
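The ANOVA identity behind such sensitivity indices can be sketched directly: for a discrete input X_i, the first-order index is S_i = Var(E[Y|X_i]) / Var(Y). The sketch below uses a plain conditional-mean Monte Carlo estimator on an invented toy model, not the paper's kernel estimator.

```python
# First-order Sobol' sensitivity index for a discrete input via the ANOVA
# identity S_i = Var(E[Y|X_i]) / Var(Y). Toy model, illustrative only.
import random
random.seed(0)

def model(x_discrete, x_cont):
    return 2.0 * x_discrete + x_cont  # toy model with one discrete, one continuous input

n = 200_000
xs_d = [random.choice([0, 1, 2]) for _ in range(n)]     # discrete input, 3 levels
xs_c = [random.gauss(0.0, 1.0) for _ in range(n)]       # continuous input
ys = [model(d, c) for d, c in zip(xs_d, xs_c)]

mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n

# group outputs by discrete level, then take variance of conditional means
levels = {}
for d, y in zip(xs_d, ys):
    levels.setdefault(d, []).append(y)
cond_means = [sum(v) / len(v) for v in levels.values()]
weights = [len(v) / n for v in levels.values()]
var_cond = sum(w * (m - mean_y) ** 2 for w, m in zip(weights, cond_means))

s_first = var_cond / var_y  # analytic value here: (8/3) / (8/3 + 1) ~= 0.727
print(round(s_first, 2))
```

For this toy model the index is known analytically (Var(2·X_d) = 8/3, Var(Y) = 8/3 + 1), so the estimate can be checked directly; the kernel approach in the paper replaces the raw conditional means with smoothed ones.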

The Construction of Facilities (CoF) appropriation provides contractual services for the repair, rehabilitation, and modification of existing facilities; the construction of new facilities and the acquisition of related collateral equipment; the acquisition or condemnation of real property; environmental compliance and restoration activities; the design of facilities projects; and advanced planning related to future facilities needs. Fiscal year 1994 budget estimates are broken down according to facility location of project and by purpose.

Full Text Available This study explores the estimation of land surface temperature (LST) for the globe from Landsat 5, 7 and 8 thermal infrared sensors, using different surface emissivity sources. A single channel algorithm is used for consistency among the estimated LST products, whereas the option of using emissivity from different sources provides flexibility for the algorithm’s implementation to any area of interest. The Google Earth Engine (GEE), an advanced earth science data and analysis platform, allows the estimation of LST products for the globe, covering the time period from 1984 to present. To evaluate the method, the estimated LST products were compared against two reference datasets: (a) LST products derived from ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), as higher-level products based on the temperature-emissivity separation approach; (b) Landsat LST data that have been independently produced, using different approaches. An overall RMSE (root mean square error) of 1.52 °C was observed and it was confirmed that the accuracy of the LST product is dependent on the emissivity; different emissivity sources provided different LST accuracies, depending on the surface cover. The LST products, for the full Landsat 5, 7 and 8 archives, are estimated “on-the-fly” and are available on-line via a web application.
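The emissivity dependence noted above can be illustrated with the standard Artis and Carnahan (1982) single-channel emissivity correction, which converts brightness temperature to LST; this is a common textbook approximation, not necessarily the exact algorithm of the study, and the band-10 effective wavelength is an assumed value.

```python
# Single-channel emissivity correction: LST = BT / (1 + (lambda*BT/rho) * ln(eps)),
# with rho = h*c/k_B ~= 1.438e-2 m K. Illustrative sketch only.
import math

def lst_from_bt(bt_kelvin, emissivity, wavelength_um=10.895):
    """Approximate LST (K) from at-sensor brightness temperature and emissivity.

    wavelength_um defaults to an assumed Landsat 8 band-10 effective wavelength.
    """
    rho = 1.438e-2                      # h*c/k_B in m*K
    lam = wavelength_um * 1e-6          # wavelength in meters
    return bt_kelvin / (1.0 + (lam * bt_kelvin / rho) * math.log(emissivity))

# A 2% drop in emissivity shifts the retrieved LST by more than a degree:
print(round(lst_from_bt(300.0, 0.98) - 273.15, 2))  # ~28.2 C
print(round(lst_from_bt(300.0, 1.00) - 273.15, 2))  # 26.85 C (no correction)
```

Since ln(ε) < 0 for ε < 1, lower emissivity always raises the retrieved LST above the brightness temperature, which is why the choice of emissivity source drives the accuracy differences reported above.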

in the first half of the 20th century. Decreased summer runoff affects water supply for agriculture, domestic water supply, cooling needs for thermoelectric power generation, and ecosystem needs. In addition to the reduced volume of streamflow during warm summer months, less water results in elevated stream temperature, which also has significant effects on cooling of power generating facilities and on aquatic ecosystem needs. We are now required to include fish and other aquatic species in negotiations over how much water to leave in the river, rather than, as in the past, how much water we could remove from a river. Additionally, we must pay attention to the quality of that water, including its temperature. This is driven in the US by the Endangered Species Act and the Clean Water Act. Furthermore, we must now better understand and manage the whole hydrograph and the influence of hydrologic variability on aquatic ecosystems. Man has trimmed the tails off the probability distribution of flows. We need to understand how to put the tails back on but can’t do that without improved understanding of aquatic ecosystems. Sea level rise presents challenges for fresh water extraction from coastal aquifers as they are compromised by increased saline intrusion. A related problem faces users of ‘run-of-the-river’ water-supply intakes that are threatened by a salt front that migrates further upstream because of higher sea level. We face significant challenges with water infrastructure. The U.S. has among the highest quality drinking water in the world piped to our homes. However, our water and sewage treatment plants and water and sewer pipelines have not had adequate maintenance or investment for decades. The US Environmental Protection Agency estimates that there are up to 3.5M illnesses per year from recreational contact with sewage from sanitary sewage overflows. Infrastructure investment needs have been put at $5 trillion nationally. Global change and water resources

In this work we investigate three techniques for estimation of the non-linear phase present due to defocus in optical coherence tomography, and apply them with the angular spectrum method. The techniques are: least squares fitting of the unwrapped phase of the angular spectrum, iterative optimization, and sub-aperture correlations. The estimated phase of a single en-face image is used to extrapolate the non-linear phase at all depths, which in the end can be used to correct the entire 3-D tomogram, and any other tomogram from the same system.

The data on sources and levels of 85Kr contamination of the biosphere are presented on the basis of a generalization and analysis of the literature. The potential irradiation doses for people are calculated, and a biological estimation of the hazard of 85Kr accumulation in the atmosphere up to 2050 is given, taking into account the prospects for development of nuclear power engineering. The estimation is based on the radionuclide's blastomogenic and genetic effects. The conclusion is made that the prospects for development of nuclear power engineering will not lead to any significant increase in the number of malignant tumors and genetic abnormalities caused by 85Kr radiation compared with their natural frequency.

Based on a 20-year (1991-2010) simulation of dust aerosol deposition with the global climate model CAM5.1 (Community Atmosphere Model, version 5.1), the spatial and temporal variations of dust aerosol deposition were analyzed using climate statistical methods. The results indicated that the annual amount of global dust aerosol deposition was approximately 1161 ± 31 Mt, with a decreasing trend and an interannual variation range of 2.70% over 1991-2010. The 20-year average ratio of global dust dry to wet deposition was 1.12, with an interannual variation of 2.24%, showing that dry deposition of dust aerosol exceeded wet deposition. High dry deposition was centered over continental deserts and surrounding regions, while wet deposition was the dominant deposition process over the North Atlantic, North Pacific and northern Indian Ocean. Furthermore, both dry and wet deposition presented a zonal distribution. To examine the regional changes of dust aerosol deposition on land and sea areas, we chose the North Atlantic, Eurasia, northern Indian Ocean, North Pacific and Australia to analyze the interannual and seasonal variations of dust deposition and the dry-to-wet deposition ratio. The deposition amounts of each region showed interannual fluctuations, with the largest variation range at around 26.96% in the northern Indian Ocean area, followed by the North Pacific (16.47%), Australia (9.76%), North Atlantic (9.43%) and Eurasia (6.03%). The northern Indian Ocean also had the greatest amplitude of interannual variation in the dry-to-wet deposition ratio, at 22.41%, followed by the North Atlantic (9.69%), Australia (6.82%), North Pacific (6.31%) and Eurasia (4.36%). Dust aerosol deposition presented a seasonal cycle, with typically strong deposition in spring and summer and weak deposition in autumn and winter. The dust deposition over the northern Indian Ocean exhibited the greatest seasonal change range at about 118.00%, while the North Atlantic showed the lowest seasonal

Mar 21, 2018 ... One of the key a priori estimates in the theory of second-order elliptic .... It is well known that the maximal functions satisfy strong p–p .... Here we prove the following auxiliary result, which will be a crucial ingredient in the proof.

Accurate estimation of Daily Global Solar Radiation (DGSR) has been a major goal for solar energy applications. In this paper we show the possibility of developing a simple model based on Support Vector Regression (SVM-R), which could be used to estimate DGSR on the horizontal surface in Algeria based only on sunshine ratio as input. The SVM model has been developed and tested using a data set recorded over three years (2005-2007). The data were collected at the Applied Research Unit for Renewable Energies (URAER) in Ghardaïa city. The data collected between 2005 and 2006 are used to train the model while the 2007 data are used to test the performance of the selected model. The measured and the estimated values of DGSR were compared statistically during the testing phase using the Root Mean Square Error (RMSE), relative Root Mean Square Error (rRMSE), and correlation coefficient (r2), which amount to 1.59 MJ/m2, 8.46% and 97.4%, respectively. The obtained results show that the SVM-R is highly qualified for DGSR estimation using only sunshine ratio.
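The single-input regression setup described above can be sketched with scikit-learn's SVR; the data here are synthetic (an Angström-type linear relation plus noise), standing in for the URAER measurements, and all hyperparameters are illustrative assumptions.

```python
# Sketch of SVM regression mapping daily sunshine ratio -> DGSR (MJ/m^2).
# Synthetic data only; not the study's measurements or tuned model.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
sunshine_ratio = rng.uniform(0.1, 1.0, 200)            # S/S0, hypothetical
dgsr = 30.0 * (0.25 + 0.5 * sunshine_ratio)            # Angstrom-type synthetic target
dgsr += rng.normal(0.0, 0.5, sunshine_ratio.size)      # measurement noise

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)         # assumed hyperparameters
model.fit(sunshine_ratio.reshape(-1, 1), dgsr)

pred = model.predict(np.array([[0.8]]))                # estimate DGSR for S/S0 = 0.8
print(pred)
```

In practice one would split the data by year as in the abstract (train on 2005-2006, test on 2007) and report RMSE, rRMSE and r2 on the held-out year.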

The Interstellar Boundary Explorer (IBEX) has now operated in space for 7 years and returned nearly continuous observations that have led to scientific discoveries and reshaped our entire understanding of the outer heliosphere and its interaction with the local interstellar medium. Here we extend prior work, adding the 2014–2015 data for the first time, and examine, validate, initially analyze, and provide a complete 7-year set of Energetic Neutral Atom (ENA) observations from ∼0.1 to 6 keV. The data, maps, and documentation provided here represent the 10th major release of IBEX data and include improvements to various prior corrections to provide the citable reference for the current version of IBEX data. We are now able to study time variations in the outer heliosphere and interstellar interaction over more than half a solar cycle. We find that the Ribbon has evolved differently than the globally distributed flux (GDF), with a leveling off and partial recovery of ENAs from the GDF, owing to solar wind output flattening and recovery. The Ribbon has now also lost its latitudinal ordering, which reflects the breakdown of solar minimum solar wind conditions and exhibits a greater time delay than for the surrounding GDF. Together, the IBEX observations strongly support a secondary ENA source for the Ribbon, and we suggest that this be adopted as the nominal explanation of the Ribbon going forward.

Full Text Available A data assimilation system has been developed to estimate global nitrogen oxides (NOx) emissions using OMI tropospheric NO2 columns (DOMINO product) and a global chemical transport model (CTM), the Chemical Atmospheric GCM for Study of Atmospheric Environment and Radiative Forcing (CHASER). The data assimilation system, based on an ensemble Kalman filter approach, was applied to optimize daily NOx emissions with a horizontal resolution of 2.8° during the years 2005 and 2006. The background error covariance estimated from the ensemble CTM forecasts explicitly represents non-direct relationships between the emissions and tropospheric columns caused by atmospheric transport and chemical processes. In comparison to the a priori emissions based on bottom-up inventories, the optimized emissions were higher over eastern China, the eastern United States, southern Africa, and central-western Europe, suggesting that the anthropogenic emissions are mostly underestimated in the inventories. In addition, the seasonality of the estimated emissions differed from that of the a priori emissions over several biomass burning regions, with a large increase over Southeast Asia in April and over South America in October. The data assimilation results were validated against independent data: SCIAMACHY tropospheric NO2 columns and vertical NO2 profiles obtained from aircraft and lidar measurements. The emission correction greatly improved the agreement between the simulated and observed NO2 fields; this implies that the data assimilation system efficiently derives NOx emissions from concentration observations. We also demonstrated that biases in the satellite retrieval and model settings used in the data assimilation largely affect the magnitude of estimated emissions. These dependences should be carefully considered for better understanding NOx sources from top-down approaches.
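The core of such a top-down system is the ensemble Kalman filter analysis step, which updates emission factors from the mismatch between simulated and observed columns. A minimal sketch follows; in the real system the observation operator is the CTM mapping emissions to NO2 columns, whereas here it is the identity for brevity, and every number is invented.

```python
# Toy perturbed-observation EnKF analysis step: a prior ensemble of emission
# scaling factors (around 1.0) is updated toward observations of a "truth"
# of 1.2. Illustrative only; not the CHASER/DOMINO system.
import numpy as np

rng = np.random.default_rng(1)
n, n_ens = 5, 64
truth = np.full(n, 1.2)                            # "true" emission scaling factors

X = 1.0 + 0.2 * rng.standard_normal((n, n_ens))    # prior ensemble (mean ~1.0)
R = 0.05 ** 2 * np.eye(n)                          # observation error covariance
y = truth + 0.05 * rng.standard_normal(n)          # one set of observations

Xa = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
P = Xa @ Xa.T / (n_ens - 1)                        # sample background covariance
K = P @ np.linalg.inv(P + R)                       # Kalman gain (H = identity here)

Y = y[:, None] + 0.05 * rng.standard_normal((n, n_ens))  # perturbed observations
X_post = X + K @ (Y - X)                           # analysis ensemble

print(X.mean(), X_post.mean())                     # posterior mean pulled toward 1.2
```

Because the prior spread (0.2) is large relative to the observation error (0.05), the gain is close to one and the analysis mean lands near the observed value; the ensemble-sampled covariance P is what lets the real system capture the indirect emission-column relationships mentioned above.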

Full Text Available BACKGROUND: Envenoming resulting from snakebites is an important public health problem in many tropical and subtropical countries. Few attempts have been made to quantify the burden, and recent estimates all suffer from the lack of an objective and reproducible methodology. In an attempt to provide an accurate, up-to-date estimate of the scale of the global problem, we developed a new method to estimate the disease burden due to snakebites. METHODS AND FINDINGS: The global estimates were based on regional estimates that were, in turn, derived from data available for countries within a defined region. Three main strategies were used to obtain primary data: electronic searching for publications on snakebite, extraction of relevant country-specific mortality data from databases maintained by United Nations organizations, and identification of grey literature by discussion with key informants. Countries were grouped into 21 distinct geographic regions that are as epidemiologically homogenous as possible, in line with the Global Burden of Disease 2005 study (Global Burden Project of the World Bank). Incidence rates for envenoming were extracted from publications and used to estimate the number of envenomings for individual countries; if no data were available for a particular country, the lowest incidence rate within a neighbouring country was used. Where death registration data were reliable, reported deaths from snakebite were used; in other countries, deaths were estimated on the basis of observed mortality rates and the at-risk population. We estimate that, globally, at least 421,000 envenomings and 20,000 deaths occur each year due to snakebite. These figures may be as high as 1,841,000 envenomings and 94,000 deaths. Based on the fact that envenoming occurs in about one in every four snakebites, between 1.2 million and 5.5 million snakebites could occur annually. CONCLUSIONS: Snakebites cause considerable morbidity and mortality worldwide. The
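The country-level extrapolation logic described in the methods (incidence rate times population, with a neighbouring country's lowest rate used where data are missing) can be sketched as follows; the country names, populations, and rates are invented for illustration.

```python
# Toy reconstruction of the regional extrapolation: envenomings per country =
# population x incidence rate; countries without data borrow the lowest rate
# among neighbours with data. All figures are hypothetical.
countries = {
    "A": {"pop": 50e6, "incidence_per_100k": 30.0},
    "B": {"pop": 20e6, "incidence_per_100k": None},   # no published data
    "C": {"pop": 10e6, "incidence_per_100k": 12.0},
}

known_rates = [c["incidence_per_100k"] for c in countries.values()
               if c["incidence_per_100k"] is not None]
fallback = min(known_rates)  # lowest rate among neighbours with data (conservative)

total = sum(
    c["pop"] / 1e5 * (c["incidence_per_100k"] if c["incidence_per_100k"] is not None
                      else fallback)
    for c in countries.values()
)
print(int(total))  # regional envenomings per year for this toy region
```

Using the lowest neighbouring rate as the fallback is what makes the published totals conservative lower bounds, as the abstract notes.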

The geometry of real-world objects can be described by Minkowski tensors. Algorithms have been suggested to approximate Minkowski tensors when only a binary image of the object is available. This paper presents implementations of two such algorithms. The theoretical convergence properties are confirmed by simulations on test sets, and recommendations for input arguments of the algorithms are given. For increasing resolutions, we obtain more accurate estimators for the Minkowski tensors. Digitisations of more complicated objects are shown to require higher resolutions.

Calculating global solar irradiation on a tilted plane from global horizontal irradiation only is a difficult task, especially when the time step is small and the data are not averaged. We used an Artificial Neural Network (ANN) to realize this conversion. The ANN is optimized and tested on the basis of five years of solar data; the accuracy of the optimal configuration is around 6% for the rRMSE (relative root mean square error) and around 3.5% for the rMAE (relative mean absolute error), i.e. a better performance than the empirical correlations available in the literature. -- Highlights: ► ANN (Artificial Neural Network) methodology applied to hourly global solar irradiation in order to estimate tilted irradiations. ► Model validation with more than 23,000 data points. ► Comparison with “conventional” models. ► The precision of the results is better than with empirical correlations. ► Around 6% for the rRMSE and around 3.5% for the rMAE.

While the Emissions Database for Global Atmospheric Research (EDGAR) focuses on global estimates for the full set of anthropogenic activities, the Land Use, Land-Use Change and Forestry (LULUCF) sector might be the most diverse and most challenging to cover consistently for all countries of the world. Parties to the United Nations Framework Convention on Climate Change (UNFCCC) are required to provide periodic estimates of greenhouse gas (GHG) emissions, following the latest approved methodological guidance by the Intergovernmental Panel on Climate Change (IPCC). The current study aims to consistently estimate the carbon (C) stock changes from living forest biomass for all countries of the world, in order to complete the LULUCF sector in EDGAR. In order to derive comparable estimates for developing and developed countries, it is crucial to use a single methodology with global applicability. Data for developing countries are generally poor, such that only the Tier 1 methods from either the IPCC Good Practice Guide for Land Use, Land-Use Change and Forestry (GPG-LULUCF) 2003 or the IPCC 2006 Guidelines can be applied to these countries. For this purpose, we applied the IPCC Tier 1 method at global level following both IPCC GPG-LULUCF 2003 and IPCC 2006, using spatially coarse activity data (i.e. area, obtained by combining two different global forest maps: the Global Land Cover map and the eco-zones subdivision of the Global Ecological Zone (GEZ) map) in combination with the IPCC default C stocks and C stock change factors. Results for the C stock changes were calculated separately for gains, harvest, fires (Global Fire Emissions Database version 3, GFEDv.3) and net deforestation for the years 1990, 2000, 2005 and 2010. At the global level, results obtained with the two sets of IPCC guidance differed by about 40%, due to different assumptions and default factors. The IPCC Tier 1 method unavoidably introduced high uncertainties due to the "globalization" of parameters. When the

We have produced annual estimates of national and global gas flaring and gas flaring efficiency from 1994 through 2008 using low light imaging data acquired by the Defense Meteorological Satellite Program (DMSP). Gas flaring is a widely used practice for the disposal of associated gas in oil production and processing facilities where there is insufficient infrastructure for utilization of the gas (primarily methane). Improved utilization of the gas is key to reducing global carbon emissions to the atmosphere. The DMSP estimates of flared gas volume are based on a calibration developed with a pooled set of reported national gas flaring volumes and data from individual flares. Flaring efficiency was calculated as the volume of flared gas per barrel of crude oil produced. Global gas flaring has remained largely stable over the past fifteen years, in the range of 140 to 170 billion cubic meters (BCM). Global flaring efficiency was in the range of seven to eight cubic meters per barrel from 1994 to 2005 and declined to 5.6 m³ per barrel by 2008. The 2008 gas flaring estimate of 139 BCM represents 21% of the natural gas consumption of the USA, with a potential retail market value of 68 billion USD. The 2008 flaring added more than 278 million metric tons of carbon dioxide equivalent (CO₂e) into the atmosphere. The DMSP estimated gas flaring volumes indicate that global gas flaring has declined by 19% since 2005, led by gas flaring reductions in Russia and Nigeria, the two countries with the highest gas flaring levels. The flaring efficiency of both Russia and Nigeria improved from 2005 to 2008, suggesting that the reductions in gas flaring are likely the result of either improved utilization of the gas, reinjection, or direct venting of gas into the atmosphere, although the effect of uncertainties in the satellite data cannot be ruled out. It is anticipated that the capability to estimate gas flaring volumes based on satellite data will spur improved utilization of gas that
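The efficiency metric above is a simple ratio, which makes it easy to check the internal consistency of the quoted 2008 figures; the sketch below backs out the crude production implied by 139 BCM flared at 5.6 m³ per barrel (the production figure itself is not stated in the abstract).

```python
# Consistency check of the 2008 numbers: flaring efficiency = flared volume
# per barrel of crude produced, so implied production = volume / efficiency.
flared_m3 = 139.0e9             # 139 BCM flared in 2008
efficiency_m3_per_bbl = 5.6     # reported 2008 global flaring efficiency

implied_barrels = flared_m3 / efficiency_m3_per_bbl
print(f"{implied_barrels / 1e9:.1f} billion barrels of crude produced (implied)")
```

The implied production of roughly 25 billion barrels per year is of the right order for global crude output in 2008, which supports the internal consistency of the reported volume and efficiency.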

An analysis of the cost of each component of a Small Hydroelectric Power Plant, taking into account the real costs of these projects, is shown. It also presents a global equation which allows a preliminary cost estimate for each construction. (author)

Data collected over six days from a worldwide Global Positioning System (GPS) tracking network during the Epoch '92 campaign are used to estimate variations of the Earth's pole position every 30 minutes.

At the request of Senator Conrad, the Congressional Budget Office (CBO) has estimated the costs of military operations in Iraq and Afghanistan and other operations associated with the global war on terrorism (GWOT...

In 4 studies, the authors examined the prediction derived from construal level theory (CLT) that higher level of perceptual construal would enhance estimated egocentric psychological distance. The authors primed participants with global perception, local perception, or both (the control condition).

Full Text Available Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week.
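The basic subsampling bias that such scaling methods correct can be demonstrated on a toy random graph: the mean degree measured inside a node subsample underestimates the true mean degree, and dividing by the sampling fraction recovers it. This is the simplest possible scaling correction, not the paper's full method, and all graph parameters are invented.

```python
# Node subsampling bias: an edge survives only if both endpoints are sampled,
# so each node's observed degree is ~ sample_frac x its true degree.
import random
random.seed(42)

n, p = 500, 0.02  # Erdos-Renyi-style graph: n nodes, edge probability p
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < p]

def mean_degree(node_set):
    deg = {v: 0 for v in node_set}
    for i, j in edges:
        if i in node_set and j in node_set:   # count only edges inside the sample
            deg[i] += 1
            deg[j] += 1
    return sum(deg.values()) / len(node_set)

full = mean_degree(set(range(n)))                          # true mean degree (~10)
sample_frac = 0.3
sample = set(random.sample(range(n), int(n * sample_frac)))
observed = mean_degree(sample)                             # biased low (~3)
corrected = observed / sample_frac                         # each neighbour kept w.p. sample_frac
print(full, observed, corrected)
```

The full degree distribution requires more care than the mean (missing low-degree nodes entirely, binomial thinning of each degree), which is where the transparent scaling methods described above come in.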

A daily, high-resolution, global fire emissions model has been built to estimate emissions from open burning for air quality modeling applications: The Fire INventory from NCAR (FINN version 1). The model framework uses daily fire detections from the MODIS instruments and updated emission factors, specifically for speciated non-methane organic compounds (NMOC). Global...

Industrially produced N-fertilizer is essential to the production of cereals that supports current and projected human populations. We constructed a top-down global N budget for maize, rice, and wheat for a 50-year period (1961 to 2010). Cereals harvested a total of 1551 Tg of N, of which 48% was supplied through fertilizer-N and 4% came from net soil depletion. An estimated 48% (737 Tg) of crop N, equal to 29, 38, and 25 kg ha(-1) yr(-1) for maize, rice, and wheat, respectively, is contributed by sources other than fertilizer- or soil-N. Non-symbiotic N2 fixation appears to be the major source of this N, which is 370 Tg or 24% of total N in the crop, corresponding to 13, 22, and 13 kg ha(-1) yr(-1) for maize, rice, and wheat, respectively. Manure (217 Tg or 14%) and atmospheric deposition (96 Tg or 6%) are the other sources of N. Crop residues and seed contribute marginally. Our scaling-down approach to estimate the contribution of non-symbiotic N2 fixation is robust because it focuses on global quantities of N in sources and sinks that are easier to estimate, in contrast to estimating N losses per se, because losses are highly soil-, climate-, and crop-specific.
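The budget percentages quoted above can be cross-checked against the total, which is the kind of source/sink bookkeeping the scaling-down approach relies on; the figures below are taken directly from the abstract.

```python
# Consistency check of the 50-year cereal N budget quoted above
# (total crop N harvested = 1551 Tg over 1961-2010).
total_n = 1551.0         # Tg N in harvested maize, rice, and wheat
other_sources = 737.0    # non-fertilizer, non-soil sources (stated: 48%)
non_symbiotic = 370.0    # non-symbiotic N2 fixation (stated: 24%)
manure = 217.0           # (stated: 14%)
deposition = 96.0        # atmospheric deposition (stated: 6%)

print(round(other_sources / total_n * 100))   # ~48
print(round(non_symbiotic / total_n * 100))   # ~24
print(round(manure / total_n * 100))          # ~14
print(round(deposition / total_n * 100))      # ~6
```

The residual of "other sources" not covered by fixation, manure, and deposition (737 − 370 − 217 − 96 = 54 Tg) is what the abstract attributes marginally to crop residues and seed.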

Rivers deliver large amounts of fresh water, nutrients, and other terrestrially derived materials to the coastal ocean. Where inputs accumulate on the shelf, harmful effects such as hypoxia and eutrophication can result. In contrast, where export to the open ocean is efficient, riverine inputs contribute to global biogeochemical budgets. Assessing the fate of riverine inputs is difficult on a global scale. Global ocean models are generally too coarse to resolve the relatively small scale features of river plumes. High-resolution regional models have been developed for individual river plume systems, but it is impractical to apply this approach globally to all rivers. Recently, generalized parameterizations have been proposed to estimate the export of riverine fresh water to the open ocean (Izett & Fennel, 2018, https://doi.org/10.1002/2017GB005667; Sharples et al., 2017, https://doi.org/10.1002/2016GB005483). Here the relationships of Izett and Fennel (2018) are used to derive global estimates of open-ocean export of fresh water and dissolved inorganic silicate, dissolved organic carbon, and dissolved organic and inorganic phosphorus and nitrogen. We estimate that only 15-53% of riverine fresh water reaches the open ocean directly in river plumes; nutrient export is even less efficient because of processing on continental shelves. Due to geographic differences in riverine nutrient delivery, dissolved silicate is the most efficiently exported to the open ocean (7-56.7%), while dissolved inorganic nitrogen is the least efficiently exported (2.8-44.3%). These results are consistent with previous estimates and provide a simple way to parameterize export to the open ocean in global models.

Global ethanol fuel consumption has increased exponentially over the last two decades and the US plans to double annual renewable fuel production in the next five years as required by the Renewable Fuel Standard. Regardless of the technology or feedstock used to produce the renewable fuel, the primary end product will be ethanol. Increasing ethanol fuel consumption will have an impact on the oxidizing capacity of the atmosphere and increase atmospheric concentrations of the secondary pollutant peroxyacetyl nitrate as well as a variety of VOCs with relatively high ozone reactivities (e.g. ethanol, formaldehyde, acetaldehyde). Despite these documented effects of ethanol emissions on atmospheric chemistry, current global atmospheric ethanol budget models have large uncertainties in the magnitude of ethanol sources and sinks. The presented work investigates the global wet deposition sink by providing the first estimate of the global wet deposition flux of ethanol (2.4 ± 1.6 Tg/yr) based on empirical wet deposition data (219 samples collected at 12 locations). This suggests the wet deposition sink removes between 6 and 17% of atmospheric ethanol annually. Concentrations of ethanol in marine wet deposition (25 ± 6 nM) were an order of magnitude less than in the majority of terrestrial deposition (345 ± 280 nM). Terrestrial deposition collected in locations impacted by high local sources of biofuel usage and locations downwind from ethanol distilleries was an order of magnitude higher in ethanol concentration (3090 ± 448 nM) compared to deposition collected in terrestrial locations not impacted by these sources. These results indicate that wet deposition of ethanol is heavily influenced by local sources and ethanol emission impacts on air quality may be more significant in highly populated areas. As established and developing countries continue to rapidly increase ethanol fuel consumption and subsequent emissions, understanding the magnitude of all ethanol sources and

Describes events and factors contributing to a global orientation for community colleges, including external and internal forces, the role of university professors, agencies and professional organizations, foreign students, and the influence of staff and student travel. Possibilities and requirements of continued global development are discussed.…

An optimized artificial neural network ensemble model is built to estimate daily global solar radiation over large areas. The model uses clear-sky estimates and satellite images as input variables. Unlike most studies using satellite imagery based on visible channels, our model also exploits all information within infrared channels of the Meteosat 9 satellite. A genetic algorithm is used to optimize selection of model inputs, for which twelve are selected – eleven 3-km Meteosat 9 channels and one clear-sky term. The model is validated in Andalusia (Spain) from January 2008 through December 2008. Measured data from 83 stations across the region are used, 65 for training and 18 independent ones for testing the model. At the latter stations, the ensemble model yields an overall root mean square error of 6.74% and correlation coefficient of 99%; the generated estimates are relatively accurate and errors spatially uniform. The model yields reliable results even on cloudy days, improving on current models based on satellite imagery. - Highlights: • Daily solar radiation data are generated using an artificial neural network ensemble. • Eleven Meteosat channels observations and a clear sky term are used as model inputs. • Model exploits all information within infrared Meteosat channels. • Measured data for a year from 83 ground stations are used. • The proposed approach has better performance than existing models on a daily basis
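
The validation statistics quoted above (an overall relative RMSE of 6.74% and a correlation coefficient of 99%) are conventionally computed as follows; the function names and toy series are illustrative, not from the study:

```python
import numpy as np

def relative_rmse(measured, estimated):
    """RMSE expressed as a percentage of the mean measured value."""
    measured = np.asarray(measured, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    rmse = np.sqrt(np.mean((estimated - measured) ** 2))
    return 100.0 * rmse / measured.mean()

def correlation(measured, estimated):
    """Pearson correlation coefficient between measured and estimated series."""
    return np.corrcoef(measured, estimated)[0, 1]

# Toy daily-radiation series (MJ m-2 day-1); values are illustrative only.
obs = np.array([12.0, 18.5, 21.0, 9.5, 15.0])
est = np.array([12.5, 18.0, 20.0, 10.0, 15.5])
print(relative_rmse(obs, est), correlation(obs, est))
```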

Nitrous oxide (N2O) levels have been steadily increasing in the atmosphere over the past few decades at a rate of approximately 0.3% per year. This trend is of major concern as N2O is both a long-lived Greenhouse Gas (GHG) and an Ozone Depleting Substance (ODS), as it is a precursor of NO and NO2, which catalytically destroy ozone in the stratosphere. Recently, N2O emissions have been recognised as the most important ODS emissions and are now of greater importance than emissions of CFCs. The growth in atmospheric N2O is predominantly due to the enhancement of surface emissions by human activities, most notably the intensification and proliferation of agriculture since the mid-19th century, which has been accompanied by the increased input of reactive nitrogen to soils and has resulted in significant perturbations to the natural N-cycle and emissions of N2O. There exist two approaches for estimating N2O emissions, the so-called 'bottom-up' and 'top-down' approaches. Top-down approaches, based on the inversion of atmospheric measurements, require an estimate of the loss of N2O via photolysis and oxidation in the stratosphere. Uncertainties in the loss magnitude contribute uncertainties of 15 to 20% to the global annual surface emissions, complicating direct comparisons between bottom-up and top-down estimates. In this study, we present a novel inversion framework for the simultaneous optimization of N2O surface emissions and the magnitude of the loss, which avoids errors in the emissions due to incorrect assumptions about the lifetime of N2O. We use a Bayesian inversion with a variational formulation (based on 4D-Var) in order to handle very large datasets. N2O fluxes are retrieved at 4-weekly resolution over a global domain with a spatial resolution of 3.75° × 2.5° longitude by latitude. The efficacy of the simultaneous optimization of emissions and losses is tested using a global synthetic dataset, which mimics the available atmospheric data. Lastly, using real

Full Text Available Insufficient data exist for accurate estimation of global nutrient supplies. Commonly used global datasets contain key weaknesses: (1) data with global coverage, such as the FAO food balance sheets, lack specific information about many individual foods and provide no information on micronutrient supplies or heterogeneity among subnational populations, while (2) household surveys provide a closer approximation of consumption, but are often not nationally representative, do not commonly capture many foods consumed outside of the home, and only provide adequate information for a few select populations. Here, we attempt to improve upon these datasets by constructing a new model--the Global Expanded Nutrient Supply (GENuS) model--to estimate nutrient availabilities for 23 individual nutrients across 225 food categories for thirty-four age-sex groups in nearly all countries. Furthermore, the model provides historical trends in dietary nutritional supplies at the national level using data from 1961-2011. We determine supplies of edible food by expanding the food balance sheet data using FAO production and trade data to increase food supply estimates from 98 to 221 food groups, and then estimate the proportion of major cereals being processed to flours to increase to 225. Next, we estimate intake among twenty-six demographic groups (ages 20+, both sexes) in each country by using data taken from the Global Dietary Database, which uses nationally representative surveys to relate national averages of food consumption to individual age- and sex-groups; for children and adolescents, where GDD data do not yet exist, average calorie-adjusted amounts are assumed. Finally, we match food supplies with nutrient densities from regional food composition tables to estimate nutrient supplies, running Monte Carlo simulations to find the range of potential nutrient supplies provided by the diet. To validate our new method, we compare the GENuS estimates of nutrient supplies against
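
The Monte Carlo step described above — propagating uncertain nutrient densities through fixed food supplies to obtain a range of nutrient supplies — can be sketched as follows. The foods, densities, and uncertainties are invented for illustration; they are not GENuS values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-capita supplies (g/day) and iron densities (mg per 100 g),
# with uncertainty expressed as a standard deviation on each density.
supply_g  = {"wheat flour": 250.0, "rice": 120.0, "lentils": 40.0}
iron_mean = {"wheat flour": 3.6,   "rice": 0.8,   "lentils": 6.5}
iron_sd   = {"wheat flour": 0.4,   "rice": 0.1,   "lentils": 0.7}

n_draws = 10_000
totals = np.zeros(n_draws)
for food, grams in supply_g.items():
    # Sample a plausible nutrient density for each Monte Carlo draw.
    density = rng.normal(iron_mean[food], iron_sd[food], n_draws)
    totals += grams / 100.0 * density  # mg iron/day contributed by this food

lo, med, hi = np.percentile(totals, [2.5, 50, 97.5])
print(f"iron supply: {med:.1f} mg/day (95% interval {lo:.1f}-{hi:.1f})")
```

The same loop, repeated per nutrient and per demographic group, yields the supply ranges the abstract refers to.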

Background: The effect of ambient air pollution on global variations and trends in asthma prevalence is unclear. Objectives: Our goal was to investigate community-level associations between asthma prevalence data from the International Study of Asthma and Allergies in Childhood (ISAAC) and satellite-based estimates of particulate matter with aerodynamic diameter < 2.5 μm (PM2.5) and nitrogen dioxide (NO2), and modeled estimates of ozone. Methods: We assigned satellite-based estimates of PM2.5 and NO2 at a spatial resolution of 0.1° × 0.1° and modeled estimates of ozone at a resolution of 1° × 1° to 183 ISAAC centers. We used center-level prevalence of severe asthma as the outcome and multilevel models to adjust for gross national income (GNI) and center- and country-level sex, climate, and population density. We examined associations (adjusting for GNI) between air pollution and asthma prevalence over time in centers with data from ISAAC Phase One (mid-1990s) and Phase Three (2001-2003). Results: For the 13- to 14-year age group (128 centers in 28 countries), the estimated average within-country change in center-level asthma prevalence per 100 children per 10% increase in center-level PM2.5 and NO2 was -0.043 [95% confidence interval (CI): -0.139, 0.053] and 0.017 (95% CI: -0.030, 0.064), respectively. For ozone, the estimated change in prevalence per part per billion by volume was -0.116 (95% CI: -0.234, 0.001). Equivalent results for the 6- to 7-year age group (83 centers in 20 countries), though slightly different, were not significantly positive. For the 13- to 14-year age group, the change in center-level asthma prevalence over time per 100 children per 10% increase in PM2.5 from Phase One to Phase Three was -0.139 (95% CI: -0.347, 0.068). The corresponding association with ozone (per ppbV) was -0.171 (95% CI: -0.275, -0.067). Conclusion: In contrast to reports from within-community studies of individuals exposed to traffic pollution, we did not find

We present an analytical method for the continuous in situ measurement of nitrogen trifluoride (NF3) - an anthropogenic gas with a global warming potential of ~16,800 over a 100-year time horizon. NF3 is not included in national emissions reporting inventories under the United Nations Framework Convention on Climate Change (UNFCCC). However, it is a rapidly emerging greenhouse gas due to emission from a growing number of manufacturing facilities with increasing output and modern end-use applications, namely in microcircuit etching and in production of flat panel displays and thin-film photovoltaic cells. Despite success in measuring the most volatile long-lived halogenated species such as CF4, the Medusa preconcentration GC/MS system of Miller et al. (2008) is unable to detect NF3 under remote operation. Using altered techniques of gas separation and chromatography after initial preconcentration, we are now able to make continuous atmospheric measurements of NF3 with average precisions NF3 produced. Emission factors are shown to have declined over the last decade; however, rising production and end-use have caused the average global atmospheric concentration to double between 2005 and 2011, i.e. half the atmospheric NF3 present today originates from emissions after 2005. Finally, we show the first continuous in situ measurements from La Jolla, California, illustrating how global deployment of our technique could improve the temporal and spatial scale of NF3 'top-down' emission estimates over the coming years. These measurements will be important for independent verification of emissions should NF3 be regulated under a new climate treaty.

Surface global solar radiation (GSR) is the primary renewable energy in nature. Geostationary satellite data are used to map GSR in many inversion algorithms in which ground GSR measurements merely serve to validate the satellite retrievals. In this study, a simple algorithm with artificial neural network (ANN) modeling is proposed to explore the non-linear physical relationship between ground daily GSR measurements and Multi-functional Transport Satellite (MTSAT) all-channel observations in an effort to fully exploit information contained in both data sets. Singular value decomposition is implemented to extract the principal signals from satellite data and a novel method is applied to enhance ANN performance at high altitude. A three-layer feed-forward ANN model is trained with one year of daily GSR measurements at ten ground sites. This trained ANN is then used to map continuous daily GSR for two years, and its performance is validated at all 83 ground sites in China. The evaluation result demonstrates that this algorithm can quickly and efficiently build the ANN model that estimates daily GSR from geostationary satellite data with good accuracy in both space and time. -- Highlights: → A simple and efficient algorithm to estimate GSR from geostationary satellite data. → ANN model fully exploits both the information from satellite and ground measurements. → Good performance of the ANN model is comparable to that of the classical models. → Surface elevation and infrared information enhance GSR inversion.

For the last three years, a new ESA Data User Element (DUE) project has focused on creating improved knowledge about the Essential Climate Variable Biomass. The main purpose of the DUE GlobBiomass project is to better characterize and reduce uncertainties of AGB estimates by developing an innovative synergistic mapping approach in five regional sites (Sweden, Poland, Mexico, Kalimantan, South Africa) for the epochs 2005, 2010 and 2015, and in one global map for the year 2010. The project team includes leading Earth Observation experts of Europe and is linked through Partnership Agreements with further national bodies from Brazil, Canada, China, Russia and South Africa. GlobBiomass has demonstrated how EO data can be integrated with in situ measurements and ecological understanding to provide improved biomass estimates that can be effectively exploited by users. The target users were mainly drawn from the climate and carbon cycle modelling communities and included users concerned with carbon emissions and uptake due to biomass changes within initiatives such as REDD+. GlobBiomass provides a harmonised structure that can be exploited to address user needs for biomass information, and is capable of being progressively refined as new data and methods become available. This presentation will give an overview of the technical prerequisites and final results of the GlobBiomass project.

For improving the estimation of the spatio-temporal dynamics of the terrestrial carbon cycle, a new time series of the leaf area index (LAI) is generated for the global land surface at 8 km resolution from 1981 to 2012 by combining AVHRR and MODIS satellite data. This product differs from existing LAI products in the following two aspects: (1) the non-random spatial distribution of leaves within the canopy is considered, and (2) the seasonal variation of the vegetation background is included. The non-randomness of the leaf spatial distribution in the canopy is considered using a second vegetation structural parameter, the clumping index (CI), which quantifies the deviation of the leaf spatial distribution from the random case. Using the MODIS Bidirectional Reflectance Distribution Function product, a global map of CI is produced at 500 m resolution. In our LAI algorithm, CI is used to convert the effective LAI obtained from mono-angle remote sensing into the true LAI; otherwise LAI would be considerably underestimated. The vegetation background is soil in crop, grass, and shrub cover, but includes soil, grass, moss, and litter in forests. Through processing a large volume of MISR data from 2000 to 2010, monthly red and near-infrared reflectances of the vegetation background are mapped globally at 1 km resolution. This new LAI product has been validated extensively using ground-based LAI measurements distributed globally. In carbon cycle modeling, the use of CI in addition to LAI allows for accurate separation of sunlit and shaded leaves as an important step in terrestrial photosynthesis and respiration modeling. Carbon flux measurements at over 100 sites across the globe are used to validate an ecosystem model named Boreal Ecosystem Productivity Simulator (BEPS). The validated model is run globally at 8 km resolution for the period from 1981 to 2012 using the LAI product and other spatial datasets. The modeled results suggest that changes in vegetation structure as quantified
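
The conversion mentioned above, from mono-angle effective LAI to true LAI using the clumping index (CI), is commonly written as LAI_true = LAI_eff / CI. A small sketch (the numeric example is illustrative, not from the product):

```python
def true_lai(effective_lai, clumping_index):
    """Convert effective LAI from mono-angle remote sensing to true LAI.

    The clumping index CI (0 < CI <= 1) quantifies the deviation of the leaf
    spatial distribution from the random case; CI < 1 means foliage is
    clumped, so effective LAI underestimates true LAI.
    """
    if not 0.0 < clumping_index <= 1.0:
        raise ValueError("clumping index must lie in (0, 1]")
    return effective_lai / clumping_index

# A clumped conifer canopy: effective LAI 2.4 with CI 0.6 gives true LAI ~4.0,
# while a randomly distributed canopy (CI = 1) is unchanged.
print(true_lai(2.4, 0.6))
print(true_lai(2.4, 1.0))
```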

Full Text Available The epidemiology of malaria makes surveillance-based methods of estimating its disease burden problematic. Cartographic approaches have provided alternative malaria burden estimates, but there remains widespread misunderstanding about their derivation and fidelity. The aims of this study are to present a new cartographic technique and its application for deriving global clinical burden estimates of Plasmodium falciparum malaria for 2007, and to compare these estimates and their likely precision with those derived under existing surveillance-based approaches. In seven of the 87 countries endemic for P. falciparum malaria, the health reporting infrastructure was deemed sufficiently rigorous for case reports to be used verbatim. In the remaining countries, the mapped extent of unstable and stable P. falciparum malaria transmission was first determined. Estimates of the plausible incidence range of clinical cases were then calculated within the spatial limits of unstable transmission. A modelled relationship between clinical incidence and prevalence was used, together with new maps of P. falciparum malaria endemicity, to estimate incidence in areas of stable transmission, and geostatistical joint simulation was used to quantify uncertainty in these estimates at national, regional, and global scales. Combining these estimates for all areas of transmission risk resulted in 451 million (95% credible interval 349-552 million) clinical cases of P. falciparum malaria in 2007. Almost all of this burden of morbidity occurred in areas of stable transmission. More than half of all estimated P. falciparum clinical cases and associated uncertainty occurred in India, Nigeria, the Democratic Republic of the Congo (DRC), and Myanmar (Burma), where 1.405 billion people are at risk. Recent surveillance-based methods of burden estimation were then reviewed and discrepancies in national estimates explored. When these cartographically derived national estimates were ranked

Global economic growth at the end of the year strongly predicts returns from a wide spectrum of international assets, such as global, regional, and individual-country stocks, FX, and commodities. Global economic growth at other times of the year does not predict international returns. Low growth...

Nearly 50 years ago, Henry Bent published his groundbreaking article in this "Journal" introducing the "global" formulation of thermodynamics. In the following years, the global formulation was elaborated by Bent and by one of the present authors. The global formulation of the first law focuses on conservation of energy and the recognition that…

Full Text Available The Journal of Global Health (JoGH) is three years old. To assess its impact, we analysed online access to JoGH’s articles using PubMed Central and Google Analytics tools. Moreover, we tracked citations that JoGH received in 2013 using ISI Web of Knowledge℠ and Google Scholar® tools. The 66 items (articles, viewpoints and editorials) published between June 2011 and December 2013 were accessed more than 50 000 times during 2013, from more than 160 countries of the world. Seven among the 13 most accessed papers were focused on global, regional and national epidemiological estimates of important infectious diseases. JoGH articles published in 2011 and 2012 received 77 citations in Journal Citation Reports® (JCR)-indexed journals in 2013 to 24 original research articles, setting our first, unofficial impact factor at 3.208. In addition, JoGH received 11 citations during 2013 to its 12 original research papers published during 2013, resulting in an immediacy index of 0.917. The number of external, non-commissioned submissions that we consider to be of high quality is continuously increasing, leading to JoGH’s current rejection rate of about 80%. The current citation analysis raises favourable expectations for JoGH’s overall impact on the global health community in future years.

The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries. With a multi-method ensemble, it was possible to quantify ‘method uncertainty’ in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

The Millennium Development Goals (MDGs) triggered increased demand for data on child and maternal mortality for monitoring progress. With the advent of the Sustainable Development Goals (SDGs) and growing evidence of an epidemiological transition towards non-communicable diseases, policy makers need data on mortality and disease trends and distribution to inform effective policies and support monitoring progress. Where there are limited capacities to produce national health estimates (NHEs), global health estimates (GHEs) can fill gaps for global monitoring and comparisons. This paper draws lessons learned from Thailand's burden of disease study (BOD) on capacity development for NHEs, and discusses the contributions and limitation of GHEs in informing policies at country level. Through training and technical support by external partners, capacities are gradually strengthened and institutionalized to enable regular updates of BOD at national and sub-national levels. Initially, the quality of cause of death reporting in the death certificates was inadequate, especially for deaths occurring in the community. Verbal autopsies were conducted, using domestic resources, to determine probable causes of deaths occurring in the community. This helped improve the estimation of years of life lost. Since the achievement of universal health coverage in 2002, the quality of clinical data on morbidities has also considerably improved. There are significant discrepancies between the 2010 Global Burden of Diseases (GBD) estimates for Thailand and the 1999 nationally generated BOD, especially for years of life lost due to HIV/AIDS, and the ranking of priority diseases. National ownership of NHEs and effective interfaces between researchers and decision makers contribute to enhanced country policy responses, while sub-national data are intended to be used by various sub-national-level partners. Though GHEs contribute to benchmarking country achievement compared with global health

Objective: To validate the estimates of Global Burden of Disease (GBD) due to congenital anomaly for Europe by comparing infant mortality data collected by EUROCAT registries with the WHO Mortality Database, and by assessing the significance of stillbirths and terminations of pregnancy for fetal anomaly (TOPFA) in the interpretation of infant mortality statistics. Design, setting and outcome measures: EUROCAT is a network of congenital anomaly registries collecting data on live births, fetal deaths from 20 weeks’ gestation and TOPFA. Data from 29 registries in 19 countries were analysed for 2005–2009, including infant mortality (deaths of live births at age <1 year) with congenital anomaly. In 11 EUROCAT countries, average infant mortality with congenital anomaly was 1.1 per 1000 births, with higher rates where TOPFA is illegal (Malta 3.0, Ireland 2.1). The rate of stillbirths with congenital anomaly was 0.6 per 1000. The average TOPFA prevalence was 4.6 per 1000, nearly three times more prevalent than stillbirths and infant deaths combined. TOPFA also impacted on the prevalence of postneonatal survivors with non-lethal congenital anomaly. Conclusions: By excluding TOPFA and stillbirths from GBD years of life lost (YLL) estimates, GBD underestimates the burden of disease due to congenital anomaly, and thus declining YLL over time may obscure lack of progress in primary, secondary and tertiary prevention. PMID:28667189

Full Text Available The demand for more efficient and environmentally benign, non-conventional sources of energy has grown with increasing demands for human comfort. Solar energy is now the ultimate option. In this paper, the instruments used to measure solar radiation at the Innovation Centre, MIT Manipal were connected to a Raspberry Pi so that the data could be accessed remotely. Genetic Algorithms were formulated so that the monthly mean global solar radiation in Manipal could be effectively estimated. Meteorological data such as humidity, temperature, wind speed, etc. were used as inputs to train the networks. A successful link was established between the data loggers and the Raspberry Pi. The data collected by the data loggers are transmitted to the Raspberry Pi, which in turn sends the data to an internal server. The Raspberry Pi can be accessed using any SSH client such as PuTTY. The meteorological data were collected for the years 2010-2014 in order to formulate the Artificial Intelligence models. The validity of the formulated models was checked by comparing the measured data with the estimated data using tools such as RMSE, correlation coefficient, etc. The modelling of solar radiation using GA was carried out in GeneXpro tools version 5.0.

In 4 studies, the authors examined the prediction derived from construal level theory (CLT) that higher level of perceptual construal would enhance estimated egocentric psychological distance. The authors primed participants with global perception, local perception, or both (the control condition). Relative to the control condition, global processing made participants estimate larger psychological distances in time (Study 1), space (Study 2), social distance (Study 3), and hypotheticality (Study 4). Local processing had the opposite effect. Consistent with CLT, all studies show that the effect of global-versus-local processing did emerge when participants estimated egocentric distances, which are distances from the experienced self in the here and now, but did not emerge with temporal distances not from now (Study 1), spatial distances not from here (Study 2), social distances not from the self (Study 3), or hypothetical events that did not involve altering an experienced reality (Study 4).

Proper policies for the prevention or mitigation of the effects of global warming require profound analysis of the costs and benefits of alternative policy strategies. Given the uncertainty about the scientific aspects of the process of global warming, in this paper a sensitivity analysis for the impact of various estimates of costs and benefits of greenhouse gas reduction strategies is carried out to analyze the potential social and economic impacts of climate change

We study nonlinear elliptic problems with nonstandard growth and ellipticity related to an N-function. We establish global Calderón-Zygmund estimates of the weak solutions in the framework of Orlicz spaces over bounded non-smooth domains. Moreover, we prove a global regularity result for asymptotically regular problems which are getting close to the regular problems considered, when the gradient variable goes to infinity.

In the light use efficiency (LUE) approach of estimating the gross primary productivity (GPP), plant productivity is linearly related to absorbed photosynthetically active radiation assuming that plants absorb and convert solar energy into biomass within a maximum LUE (LUEmax) rate, which is assumed to vary conservatively within a given biome type. However, it has been shown that photosynthetic efficiency can vary within biomes. In this study, we used 149 global CO2 flux towers to derive the optimum LUE (LUEopt) under prevailing climate conditions for each tower location, stratified according to model training and test sites. Unlike LUEmax, LUEopt varies according to heterogeneous landscape characteristics and species traits. The LUEopt data showed large spatial variability within and between biome types, so that a simple biome classification explained only 29% of LUEopt variability over 95 global tower training sites. The use of explanatory variables in a mixed effect regression model explained 62.2% of the spatial variability in tower LUEopt data. The resulting regression model was used for global extrapolation of the LUEopt data and GPP estimation. The GPP estimated using the new LUEopt map showed significant improvement relative to global tower data, including a 15% R2 increase and 34% root-mean-square error reduction relative to baseline GPP calculations derived from biome-specific LUEmax constants. The new global LUEopt map is expected to improve the performance of LUE-based GPP algorithms for better assessment and monitoring of global terrestrial productivity and carbon dynamics.
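
The LUE approach described above reduces to a one-line formula, GPP = LUE × fPAR × PAR. A minimal sketch (the function name, units, and toy values are mine, not from the study):

```python
def gpp_lue(par, fpar, lue):
    """Light-use-efficiency estimate of gross primary productivity.

    par  : incident photosynthetically active radiation (MJ m-2 day-1)
    fpar : fraction of PAR absorbed by the canopy (0-1)
    lue  : light use efficiency (g C per MJ of absorbed PAR)
    """
    apar = fpar * par  # absorbed PAR
    return lue * apar  # GPP in g C m-2 day-1

# Illustrative values: 10 MJ of PAR, 60% absorbed, LUE of 1.5 g C/MJ.
print(gpp_lue(10.0, 0.6, 1.5))  # -> 9.0 g C m-2 day-1
```

Replacing a single biome-wide LUEmax constant with the spatially varying LUEopt map amounts to making the `lue` argument a per-pixel value rather than a per-biome one.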

Background Up-to-date evidence about levels and trends in disease and injury incidence, prevalence, and years lived with disability (YLDs) is an essential input into global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013), we estimated these quantities

In this study, we use the MATCH (Model of Atmospheric Transport and Chemistry) model and Kalman filtering techniques to optimally estimate N2O emissions from seven source regions around the globe. The MATCH model was used with NCEP assimilated winds at T62 resolution (192 longitude by 94 latitude surface grid, and 28 vertical levels) from July 1st 1996 to December 31st 2000. The average concentrations of N2O in the lowest four layers of the model were then compared with the monthly mean observations from six national/global networks (AGAGE, CMDL (HATS), CMDL (CCGG), CSIRO, CSIR and NIES) at 48 surface sites. A 12-month-running-mean smoother was applied to both the model results and the observations, because the model was not able to reproduce the very small observed seasonal variations. The Kalman filter was then used to solve for the time-averaged regional emissions of N2O for January 1st 1997 to June 30th 2000. The inversions assume that the model stratospheric destruction rates, which lead to a global N2O lifetime of 130 years, are correct. They also assume normalized emission spatial distributions from each region based on previous studies. We conclude that the global N2O emission flux is about 16.2 TgN/yr, with 34.9 ± 1.7% from South America and Africa, 34.6 ± 1.5% from South Asia, 13.9 ± 1.5% from China/Japan/South East Asia, 8.0 ± 1.9% from all oceans, 6.4 ± 1.1% from North America and North and West Asia, 2.6 ± 0.4% from Europe, and 0.9 ± 0.7% from New Zealand and Australia. The errors here include the measurement standard deviation, calibration differences among the six groups, grid volume/measurement site mismatch errors estimated from the model, and a procedure to account approximately for the modeling errors.
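
The Kalman filtering step referred to above can be illustrated with a minimal measurement update for a static state (regional emissions). Everything here — the sensitivity matrix, prior, and error covariances — is a toy illustration, not the MATCH configuration:

```python
import numpy as np

def kalman_update(x, P, y, H, R):
    """One Kalman filter measurement update for a static state.

    x : prior state (regional emissions), P : prior error covariance
    y : observations (station concentrations), H : sensitivity matrix
    R : observation-error covariance
    """
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_post = x + K @ (y - H @ x)           # updated emissions
    P_post = (np.eye(len(x)) - K @ H) @ P  # updated uncertainty
    return x_post, P_post

# Toy setup: two source regions observed at three sites (values illustrative).
H = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.1, 0.9]])       # ppb per (TgN/yr) sensitivities
x_prior = np.array([5.0, 5.0])   # first-guess emissions, TgN/yr
P_prior = np.eye(2) * 4.0
truth = np.array([7.0, 4.0])
y = H @ truth                    # noise-free synthetic observations
x_post, P_post = kalman_update(x_prior, P_prior, y, H, np.eye(3) * 0.01)
print(x_post)  # pulled from (5, 5) toward the true (7, 4)
```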

This paper presents a preliminary attempt at obtaining an order-of-magnitude estimate of the global burden of disease (GBD) of human infectious diseases associated with swimming/bathing in coastal waters polluted by wastewater, and eating raw or lightly steamed filter-feeding shellfish harvested from such waters. Such diseases will be termed thalassogenic--caused by the sea. Until recently these human health effects have been viewed primarily as local phenomena, not generally included in the world agenda of marine scientists dealing with global marine pollution problems. The massive global scale of the problem can be visualized when one considers that the wastewater and human body wastes of a significant portion of the world's population who reside along the coastline or in the vicinity of the sea are discharged daily, directly or indirectly, into the marine coastal waters, much of it with little or no treatment. Every cubic metre of raw domestic wastewater discharged into the sea can carry millions of infectious doses of pathogenic microorganisms. It is estimated that globally, foreign and local tourists together spend some 2 billion man-days annually at coastal recreational resorts and many are often exposed there to coastal waters polluted by wastewater. Annually some 800 million meals of potentially contaminated filter-feeding shellfish/bivalves and other sea foods, harvested in polluted waters are consumed, much of it raw or lightly steamed. A number of scientific studies have shown that swimmers swallow significant amounts of polluted seawater and can become ill with gastrointestinal and respiratory diseases from the pathogens they ingest. Based on risk assessments from the World Health Organization (WHO) and academic research sources the present study has made an estimate that globally, each year, there are in excess of 120 million cases of gastrointestinal disease and in excess of 50 million cases of more severe respiratory diseases caused by swimming and

Satellite-derived coarse-resolution data are typically used for conducting global analyses. However, forest areas estimated from coarse-resolution maps (e.g., 1 km) inevitably differ from those of a corresponding fine-resolution map (such as a 30 m map) that would be closer to ground truth. A better understanding of how changes in grain size affect area estimation will improve our...
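
The grain-size effect described above can be illustrated with a toy sketch (a synthetic random map coarsened by the majority rule; this is not the study's data or method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 30 m binary forest map (1 = forest), 900 x 900 pixels,
# with fragmented cover occupying ~30% of the area.
fine = (rng.random((900, 900)) < 0.3).astype(int)

def aggregate_majority(binary_map, factor):
    """Coarsen a binary map by `factor`, labelling a coarse cell as forest
    only when more than half of its fine pixels are forest (majority rule)."""
    h, w = binary_map.shape
    blocks = binary_map.reshape(h // factor, factor, w // factor, factor)
    return (blocks.mean(axis=(1, 3)) > 0.5).astype(int)

results = {}
for factor in (10, 30):          # e.g. 300 m and 900 m cells
    results[factor] = aggregate_majority(fine, factor).mean()

# For fragmented cover, majority-rule coarsening shrinks the estimated area.
print(fine.mean(), results)
```

With highly fragmented cover, the coarse maps lose almost all of the forest area; with large contiguous patches the bias can go the other way, which is why grain size matters for area estimation.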

We will present PAR levels beneath the snowpack for the Northern Hemisphere during spring for both cloudy and clear sky conditions for 1983-1987, describe our methods, and provide the first estimate of the NPP for Cladonia species beneath Northern Hemisphere snowpack. This analysis synthesizes 5 years of data; the variability during this period will be used to discuss the influence of global warming on Arctic plant growth prior to snowmelt.

Advances in flood monitoring/forecasting have been constrained by the difficulty of estimating rainfall continuously over space (catchment-, national-, continental-, or even global-scale areas) at flood-relevant time scales. With the recent availability of satellite rainfall estimates at fine time and space resolution, this paper describes a prototype research framework for global flood monitoring by combining real-time satellite observations with a database of global terrestrial characteristics through a hydrologically relevant modeling scheme. Four major components included in the framework are (1) real-time precipitation input from the NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA); (2) a central geospatial database to preprocess the land surface characteristics: water divides, slopes, soils, land use, flow directions, flow accumulation, drainage network, etc.; (3) a modified distributed hydrological model to convert rainfall to runoff and route the flow through the stream network in order to predict the timing and severity of the flood wave; and (4) an open-access web interface to quickly disseminate flood alerts for potential decision-making. Retrospective simulations for 1998-2006 demonstrate that the Global Flood Monitor (GFM) system performs consistently at both station and catchment levels. The GFM website (experimental version) has been running in near real time in an effort to offer a cost-effective solution to the ultimate challenge of building natural disaster early warning systems for the data-sparse regions of the world. The interactive GFM website shows close-up maps of the flood risks overlaid on topography/population or integrated with the Google-Earth visualization tool. One additional capability, which extends forecast lead-time by assimilating QPF into the GFM, also will be implemented in the future.

In this paper we propose the use of global Kalman filters (KFs) to estimate absolute angles of lower limb segments. Standard approaches adopt KFs to improve the performance of inertial sensors based on individual link configurations. In consequence, for a multi-body system like a lower limb exoskeleton, the inertial measurements of one link (e.g., the shank) are not taken into account in other link angle estimations (e.g., the foot). Global KF approaches, on the other hand, correlate the collective contribution of all signals from lower limb segments observed in the state-space model through the filtering process. We present a novel global KF (matricial global KF) relying only on inertial sensor data, and validate both this KF and a previously presented global KF (Markov Jump Linear Systems, MJLS-based KF), which fuses data from inertial sensors and encoders from an exoskeleton. We furthermore compare both methods to the commonly used local KF. The results indicate that the global KFs performed significantly better than the local KF, with average root mean square errors (RMSE) of 0.942° for the MJLS-based KF, 1.167° for the matricial global KF, and 1.202° for the local KFs. Including the data from the exoskeleton encoders also resulted in a significant increase in performance. The results indicate that the current practice of using KFs based on local models is suboptimal. Both the presented KF based on inertial sensor data, as well as our previously presented global approach fusing inertial sensor data with data from exoskeleton encoders, were superior to local KFs. We therefore recommend using global KFs for gait analysis and exoskeleton control.
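
The coupling idea can be sketched with a minimal linear KF (this is not the authors' implementation; the random-walk state model, the noise levels and the encoder measurement model below are all assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# "Global" KF sketch: the state stacks three absolute segment angles, and the
# encoder channels observe *relative* angles, so one measurement corrects
# several segments at once through the covariance.
n = 3                                         # thigh, shank, foot (rad)
F = np.eye(n)                                 # random-walk state model (assumed)
Q = 1e-4 * np.eye(n)                          # process noise (assumed)
H = np.array([[1.0, 0.0, 0.0],                # IMU: absolute thigh angle
              [-1.0, 1.0, 0.0],               # encoder: knee (shank - thigh)
              [0.0, -1.0, 1.0]])              # encoder: ankle (foot - shank)
R = np.diag([0.02, 0.01, 0.01]) ** 2          # measurement noise (assumed)

x, P = np.zeros(n), np.eye(n)
true = np.array([0.3, 0.1, -0.2])             # constant test pose

for _ in range(300):
    z = H @ true + rng.normal(0.0, [0.02, 0.01, 0.01])
    # Predict
    x, P = F @ x, F @ P @ F.T + Q
    # Update: the gain K couples all segments through P and H
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(n) - K @ H) @ P

print(np.round(x, 2))
```

Because H mixes segments, a single knee-encoder measurement updates both the thigh and shank estimates through the gain K, which a local, per-segment KF cannot do.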

Background: The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force

The world's population is growing and demand for food, feed, fiber, and fuel is increasing, placing greater demand on land and its resources for crop production. We review previously published estimates of global scale cropland availability, discuss the underlying assumptions that lead to

Tucker and Townshend (2000) conclude that wall-to-wall coverage is needed to avoid gross errors in estimations of deforestation rates' because tropical deforestation is concentrated along roads and rivers. They specifically question the reliability of the 10% sample of Landsat sensor scenes used in the global remote sensing survey conducted by the Food and...

The global terrestrial carbon cycle largely depends on the spatial pattern of land cover type, which is heterogeneously distributed over regional and global scales. However, most studies aiming to estimate carbon exchanges between ecosystems and the atmosphere have remained at grid resolutions of several tens of kilometers, and the results have not been sufficient to understand the detailed pattern of carbon exchanges at the level of ecological communities. Improving the spatial resolution is clearly necessary to enhance the accuracy of carbon exchange estimates. Moreover, the improvement may contribute to global warming awareness, policy making and other social activities. In this study, we show global terrestrial carbon exchanges (net ecosystem production, net primary production, and gross primary production) at 1 km grid resolution. As methodology for computing the exchanges, we 1) developed a global 1 km grid climate and satellite dataset based on the approach of Setoyama and Sasai (2013); 2) used the satellite-driven biosphere model (Biosphere model integrating Eco-physiological And Mechanistic approaches using Satellite data: BEAMS) (Sasai et al., 2005, 2007, 2011); and 3) simulated the carbon exchanges with the new dataset and BEAMS on a supercomputer with 1280 CPU and 320 GPGPU cores (GOSAT RCF of NIES). As a result, we could develop a globally uniform system for realistically estimating terrestrial carbon exchange and evaluate net ecosystem production at each community level, leading to a highly detailed understanding of terrestrial carbon exchanges.

The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

Renewable energy continued to grow in 2014 against the backdrop of increasing global energy consumption, particularly in developing countries, and a dramatic decline in oil prices during the second half of the year. Despite rising energy use, for the first time in four decades, global carbon emissions associated with energy consumption remained stable in 2014 while the global economy grew; this stabilisation has been attributed to increased penetration of renewable energy and to improvements in energy efficiency. Globally, there is growing awareness that increased deployment of renewable energy (and energy efficiency) is critical for addressing climate change, creating new economic opportunities, and providing energy access to the billions of people still living without modern energy services. Although discussion is limited to date, renewables also are an important element of climate change adaptation, improving the resilience of existing energy systems and ensuring delivery of energy services under changing climatic conditions. Renewable energy provided an estimated 19.1% of global final energy consumption in 2013, and growth in capacity and generation continued to expand in 2014. Heating capacity grew at a steady pace, and the production of bio-fuels for transport increased for the second consecutive year, following a slowdown in 2011-2012. The most rapid growth, and the largest increase in capacity, occurred in the power sector, led by wind, solar PV, and hydropower. Growth has been driven by several factors, including renewable energy support policies and the increasing cost-competitiveness of energy from renewable sources. In many countries, renewables are broadly competitive with conventional energy sources. At the same time, growth continues to be tempered by subsidies to fossil fuels and nuclear power, particularly in developing countries. Although Europe remained an important market and a centre for innovation, activity continued to shift towards other

We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using in part laboratory testing.

The global terrestrial carbon cycle largely depends on the spatial pattern of land cover type, which is heterogeneously distributed over regional and global scales. Many studies have tried to reveal the distribution of carbon exchanges between terrestrial ecosystems and the atmosphere, in order to understand global carbon cycle dynamics, using terrestrial biosphere models, satellite data, inventory data, and so on. However, most studies have remained at grid resolutions of several tens of kilometers, and the results have not been sufficient to understand the detailed pattern of carbon exchanges at the level of ecological communities or to evaluate the carbon stocks of forest ecosystems in each country. Improving the spatial resolution is clearly necessary to enhance the accuracy of carbon exchange estimates. Moreover, the improvement may contribute to global warming awareness, policy making and other social activities. We show global terrestrial carbon exchanges (net ecosystem production, net primary production, and gross primary production) at 1 km grid resolution. The methodology for these estimations is shown in the 2015 AGU FM poster "Estimation of Global 1km-grid Terrestrial Carbon Exchange Part I: Developing Inputs and Modelling". In this study, we evaluated the carbon exchanges in various regions against other approaches. We compared our estimates from the satellite-driven biosphere model (BEAMS) with GOSAT L4A CO2 flux data, NEP retrieved by NICAM, and CarbonTracker 2013 (CT2013) flux data for the period from June 2001 to December 2012. The temporal patterns over this period indicated similar trends among BEAMS, GOSAT, NICAM, and CT2013 in many sub-continental regions. We then estimated the terrestrial carbon exchanges in each country and characterized the temporal patterns of the exchanges in large carbon stock regions.

The objective of this work is to map, analyze and compare the production multipliers and the overflow effect related to the manufacturing sector of motor vehicles, trailers and semi-trailers for 43 countries plus the rest of the world - focusing on Brazil - for the years 2000 and 2014. For this purpose, a theoretical and empirical discussion was held that contemplates theories about Global Value Chains and a retrospective of the Brazilian automobile industry. The work used global input-output analysis to estimate the production multipliers based on data available from WIOD (2017). As its main result, the research indicates an increase in the global production multipliers, with the same occurring for this sector in Brazil. Another important result was the reduction of the overflow effect for Brazil, counter to the global trend; that is, this sector, which is supported by the State, decreased its external dependence
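
The production multipliers mentioned above come from standard input-output analysis; a toy Leontief sketch (a made-up three-sector technical-coefficient matrix, not WIOD data) looks like:

```python
import numpy as np

# Technical coefficients A[i, j]: input from sector i per unit output of j.
A = np.array([[0.10, 0.04, 0.02],
              [0.25, 0.15, 0.08],
              [0.05, 0.10, 0.12]])

# Leontief inverse (I - A)^-1 gives total (direct + indirect) requirements.
L = np.linalg.inv(np.eye(3) - A)

# Column sums: total output generated per unit of final demand in each sector,
# i.e. the production multipliers.
multipliers = L.sum(axis=0)
print(np.round(multipliers, 3))
```

Cross-country databases such as WIOD apply the same algebra, only with a much larger block matrix in which each (country, sector) pair is a row and column.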

Based on measurements of GOME on ESA ERS-2, SCIAMACHY on ESA-ENVISAT, and Ozone Monitoring Instrument (OMI) on the NASA EOS-Aura satellite there is now a unique 11-year dataset of global tropospheric nitrogen dioxide measurements from space. The retrieval approach consists of two steps. The first step is an application of the DOAS (Differential Optical Absorption Spectroscopy) approach which delivers the total absorption optical thickness along the light path (the slant column). For GOME and SCIAMACHY this is based on the DOAS implementation developed by BIRA/IASB. For OMI the DOAS implementation was developed in a collaboration between KNMI and NASA. The second retrieval step, developed at KNMI, estimates the tropospheric vertical column of NO2 based on the slant column, cloud fraction and cloud top height retrieval, stratospheric column estimates derived from a data assimilation approach and vertical profile estimates from space-time collocated profiles from the TM chemistry-transport model. The second step was applied with only minor modifications to all three instruments to generate a uniform 11-year data set. In our talk we will address the following topics: - A short summary of the retrieval approach and results - Comparisons with other retrievals - Comparisons with global and regional-scale models - OMI-SCIAMACHY and SCIAMACHY-GOME comparisons - Validation with independent measurements - Trend studies of NO2 for the past 11 years
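
Schematically, the second retrieval step amounts to subtracting the stratospheric contribution from the DOAS slant column and dividing by a tropospheric air-mass factor; the numbers below are purely illustrative:

```python
# Illustrative conversion of a slant column to a tropospheric vertical column.
scd_total = 8.0e15        # slant column density, molecules/cm^2 (assumed)
vcd_strat = 3.0e15        # stratospheric vertical column from data assimilation
amf_strat = 2.0           # stratospheric air-mass factor (assumed)
amf_trop  = 1.4           # tropospheric AMF from model profile + cloud retrieval

# Remove the stratospheric slant contribution, then convert to a vertical column.
vcd_trop = (scd_total - amf_strat * vcd_strat) / amf_trop
print(vcd_trop)   # molecules/cm^2
```

In the real retrieval the air-mass factors depend on viewing geometry, surface albedo, cloud fraction and the assumed NO2 vertical profile, which is where the TM model profiles enter.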

A procedure is described to estimate bias errors for mean precipitation by using multiple estimates from different algorithms, satellite sources, and merged products. The Global Precipitation Climatology Project (GPCP) monthly product is used as a base precipitation estimate, with other input products included when they are within +/- 50% of the GPCP estimates on a zonal-mean basis (ocean and land separately). The standard deviation s of the included products is then taken to be the estimated systematic, or bias, error. The results allow one to examine monthly climatologies and the annual climatology, producing maps of estimated bias errors, zonal-mean errors, and estimated errors over large areas such as ocean and land for both the tropics and the globe. For ocean areas, where there is the largest question as to absolute magnitude of precipitation, the analysis shows spatial variations in the estimated bias errors, indicating areas where one should have more or less confidence in the mean precipitation estimates. In the tropics, relative bias error estimates (s/m, where m is the mean precipitation) over the eastern Pacific Ocean are as large as 20%, as compared with 10%-15% in the western Pacific part of the ITCZ. An examination of latitudinal differences over ocean clearly shows an increase in estimated bias error at higher latitudes, reaching up to 50%. Over land, the error estimates also locate regions of potential problems in the tropics and larger cold-season errors at high latitudes that are due to snow. An empirical technique to area average the gridded errors (s) is described that allows one to make error estimates for arbitrary areas and for the tropics and the globe (land and ocean separately, and combined). Over the tropics this calculation leads to a relative error estimate for tropical land and ocean combined of 7%, which is considered to be an upper bound because of the lack of sign-of-the-error canceling when integrating over different areas with a
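
The multi-product procedure can be sketched for a single zonal band (the product values below are made up; the real method applies the ±50% screen to zonal means over ocean and land separately):

```python
import numpy as np

gpcp = 3.0                                       # GPCP zonal-mean rate, mm/day
products = np.array([2.7, 3.4, 3.1, 5.1, 2.9])   # other estimates, mm/day

# Keep only products within +/-50% of the GPCP base estimate.
included = products[np.abs(products - gpcp) <= 0.5 * gpcp]

mean = included.mean()
bias_error = included.std(ddof=0)     # s: spread of the included products
relative_error = bias_error / mean    # s/m as used in the text
print(included, round(bias_error, 3), round(relative_error, 3))
```

Here the 5.1 mm/day outlier is screened out, and the standard deviation of the remaining products serves as the estimated systematic (bias) error for the band.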

We applied a land water mass balance equation over 59 major river basins during 2003–2009 to estimate evapotranspiration (ET), using as input terrestrial water storage anomaly (TWSA) data from the GRACE satellites, precipitation and in situ runoff measurements. We found that the terrestrial water storage change cannot be neglected in the estimation of ET on an annual time step, especially in areas with relatively low ET values. We developed a spatial regression model of ET by integrating precipitation, temperature and satellite-derived normalized difference vegetation index (NDVI) data, and used this model to extrapolate the spatio-temporal patterns of changes in ET from 1982 to 2009. We found that the globally averaged land ET is about 604 mm yr−1, with a range of 558–650 mm yr−1. From 1982 to 2009, global land ET was found to increase at a rate of 1.10 mm yr−2, with the Amazon regions and Southeast Asia showing the highest ET increasing trend. Further analyses, however, show that the increase in global land ET mainly occurred between the 1980s and the 1990s. The trend over the 2000s, its magnitude or even the sign of change, substantially depended on the choice of the beginning year. This suggests a non-significant trend in global land ET over the last decade. (letter)
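
The mass balance underlying the ET estimate is simply ET = P − R − dTWS/dt; a one-basin, one-year sketch with made-up numbers:

```python
# Annual land water balance for one basin (all values illustrative, mm/yr):
#   ET = P - R - dTWS/dt
precipitation = 1150.0   # P: basin-averaged precipitation
runoff        = 420.0    # R: discharge from in situ gauges
dtws_dt       = -30.0    # storage change from GRACE TWSA (negative = depletion)

et = precipitation - runoff - dtws_dt
print(et)   # 760.0 mm/yr
```

Ignoring the storage term here would bias ET by 30 mm/yr, which is exactly why the study argues dTWS/dt cannot be neglected at the annual time step in low-ET basins.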

Optimal management approaches can be adopted in order to increase crop productivity and lower the carbon footprint of grain products. The objective of this study was to estimate the carbon (C) footprint and global warming potential of rice production systems. In this experiment, rice production systems (including SRI, improved and conventional) were studied. All activities, field operations and data on production methods at different input rates were monitored and recorded during 2012. Results showed that the average global warming potential across production systems was 2803.25 kg CO2-eq ha−1. The highest and lowest global warming potentials were observed in the SRI and conventional systems, respectively. Global warming potential per unit energy input was lowest in the SRI system and highest in the conventional system, while the SRI and conventional systems had the maximum and minimum global warming potential per unit energy output, respectively. Therefore, the optimal management approach found in SRI resulted in a reduction in GHGs, global warming potential and the carbon footprint.
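
Global warming potential figures like these are typically obtained by aggregating gas emissions into CO2-equivalents with 100-year GWP factors; a sketch with assumed emission rates (the GWP100 values of 28 for CH4 and 265 for N2O follow IPCC AR5 and may differ from the factors used in the study):

```python
# Aggregate field-scale emissions into kg CO2-eq per hectare.
GWP = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}        # 100-yr factors (AR5)
emissions_kg_ha = {"CO2": 1500.0, "CH4": 40.0, "N2O": 0.8}   # assumed rates

gwp_total = sum(emissions_kg_ha[g] * GWP[g] for g in emissions_kg_ha)
print(gwp_total)   # kg CO2-eq per ha
```

Dividing such a total by grain yield gives the carbon footprint per unit product, and dividing by energy input or output gives the per-energy intensities compared across the three systems.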

Atmospheric mass balance analyses suggest that terrestrial carbon (C) storage is increasing, partially abating the atmospheric [CO2] growth rate, although the continued strength of this important ecosystem service remains uncertain. Some evidence suggests that these increases will persist owing to positive responses of vegetation growth (net primary productivity; NPP) to rising atmospheric [CO2] (that is, ‘CO2 fertilization’). Here, we present a new satellite-derived global terrestrial NPP data set, which shows a significant increase in NPP from 1982 to 2011. However, comparison against Earth system model (ESM) NPP estimates reveals a significant divergence, with satellite-derived increases (2.8 ± 1.50%) less than half of ESM-derived increases (7.6 ± 1.67%) over the 30-year period. By isolating the CO2 fertilization effect in each NPP time series and comparing it against a synthesis of available free-air CO2 enrichment data, we provide evidence that much of the discrepancy may be due to an over-sensitivity of ESMs to atmospheric [CO2], potentially reflecting an under-representation of climatic feedbacks and/or a lack of representation of nutrient constraints. Our understanding of CO2 fertilization effects on NPP needs rapid improvement to enable more accurate projections of future C cycle–climate feedbacks; we contend that better integration of modelling, satellite and experimental approaches offers a promising way forward.

At the dawn of the era of high-precision altimetry, before the launch of TOPEX/Poseidon, ocean tides were properly viewed as a source of noise--tidal variations in ocean height would represent a very substantial fraction of what the altimeter measures, and would have to be accurately predicted and subtracted if altimetry were to achieve its potential for ocean and climate studies. But to the extent that the altimetry could be severely contaminated by tides, it also represented an unprecedented global-scale tidal data set. These new data, together with research stimulated by the need for accurate tidal corrections, led to a renaissance in tidal studies in the oceanographic community. In this paper we review contributions of altimetry to tidal science over the past 20 years, emphasizing recent progress. Mapping of tides has now been extended from the early focus on major constituents in the open ocean to include minor constituents, (e.g., long-period tides; non-linear tides in shelf waters, and in the open ocean), and into shallow and coastal waters. Global and spatially local estimates of tidal energy balance have been refined, and the role of internal tide conversion in dissipating barotropic tidal energy is now well established through modeling, altimetry, and in situ observations. However, energy budgets for internal tides, and the role of tidal dissipation in vertical ocean mixing remain controversial topics. Altimetry may contribute to resolving some of these important questions through improved mapping of low-mode internal tides. This area has advanced significantly in recent years, with several global maps now available, and progress on constraining temporally incoherent components. For the future, new applications of altimetry (e.g., in the coastal ocean, where barotropic tidal models remain inadequate), and new mission concepts (studies of the submesoscale with SWOT, which will require correction for internal tides) may bring us full circle, again pushing

Global solar radiation Rg is an important input for crop models to simulate crop responses. Because the scarcity of long and continuous records of Rg is a serious limitation in many countries, Rg is estimated using models. For crop-model application, empirical Rg models that use commonly measured meteorological variables, such as temperature and precipitation, are generally preferred. Although a large number of models of this kind exist, few have been evaluated for conditions in the United States. This study evaluated the performances of 16 empirical, temperature- and/or precipitation-based Rg models for the southeastern United States. By taking into account spatial distribution and data availability, 30 locations in the region were selected and their daily weather data spanning eight years obtained. One-half of the data was used for calibrating the models, and the other half was used for evaluation. For each model, location-specific parameter values were estimated through regressions. Models were evaluated for each location using the root-mean-square error and the modeling efficiency as goodness-of-fit measures. Among the models that use temperature or precipitation as the input variable, the Mavromatis model showed the best performance. The piecewise linear regression based Wu et al. model (WP) performed best not only among the models that use both temperature and precipitation but also among the 16 models evaluated, mainly because it has separate relationships for low and high radiation levels. The modeling efficiency of WP was from ~5% to more than 100% greater than those of the other models, depending on models and locations.
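
The calibrate-then-evaluate procedure can be sketched with a single-coefficient, Hargreaves-type temperature model fitted by least squares on synthetic data (an illustration of the workflow, not one of the 16 evaluated models' exact forms):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily data: Rg = a * Ra * sqrt(Tmax - Tmin) + noise,
# where Ra is extraterrestrial radiation (MJ m-2 d-1).
n = 365
Ra = 25 + 10 * np.sin(np.linspace(0, 2 * np.pi, n))
dT = np.clip(rng.normal(10, 3, n), 1, None)            # daily temp range, deg C
Rg_obs = 0.17 * Ra * np.sqrt(dT) + rng.normal(0, 1, n)  # "observed" Rg

# Location-specific calibration: least-squares estimate of the coefficient a.
x = Ra * np.sqrt(dT)
a = (x @ Rg_obs) / (x @ x)
Rg_mod = a * x

# Goodness-of-fit measures used in the study.
rmse = np.sqrt(np.mean((Rg_mod - Rg_obs) ** 2))
# Modelling efficiency (Nash-Sutcliffe): 1 = perfect, <= 0 = no better than mean.
me = 1 - np.sum((Rg_obs - Rg_mod) ** 2) / np.sum((Rg_obs - Rg_obs.mean()) ** 2)
print(round(a, 3), round(rmse, 2), round(me, 2))
```

In the study this calibration is repeated per location on one half of the eight-year record, with RMSE and modeling efficiency computed on the held-out half.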

in adults. For the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015), we estimated the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national scale over the period of 1990 to 2015. Methods: We estimated incidence … and prevalence by age, sex, cause, year, and geography with a wide range of updated and standardised analytical procedures. Improvements from GBD 2013 included the addition of new data sources, updates to literature reviews for 85 causes, and the identification and inclusion of additional studies published up … of years lived with disability (YLDs) on a global basis. NCDs accounted for 18 of the leading 20 causes of age-standardised YLDs on a global scale. Where rates were decreasing, the rate of decrease for YLDs was slower than that of years of life lost (YLLs) for nearly every cause included in our analysis …

Solar radiation management (SRM) is increasingly considered an option for managing global temperatures, yet the economic impacts of ameliorating climatic changes by scattering sunlight back to space remain largely unknown. Though SRM may increase crop yields by reducing heat stress, its impacts from concomitant changes in available sunlight have never been empirically estimated. Here we use the volcanic eruptions that inspired modern SRM proposals as natural experiments to provide the first estimates of how the stratospheric sulfate aerosols (SS) created by the eruptions of El Chichon and Pinatubo altered the quantity and quality of global sunlight, how those changes in sunlight impacted global crop yields, and the total effect that SS may have on yields in an SRM scenario when the climatic and sunlight effects are jointly considered. We find that the sunlight-mediated impact of SS on yields is negative for both C4 (maize) and C3 (soy, rice, wheat) crops. Applying our yield model to a geoengineering scenario using SS-based SRM from 2050-2069, we find that SRM damages due to scattering sunlight are roughly equal in magnitude to SRM benefits from cooling. This suggests that SRM - if deployed using SS similar to those emitted by the volcanic eruptions it seeks to mimic - would attenuate little of the damages from climate change to global agriculture on net. Our approach could be extended to study SRM impacts on other global systems, such as human health or ecosystem function.

The future requirements for natural uranium are mainly dependent on the future growth of nuclear energy generation and the types of reactors operated to provide that energy. These topics were examined extensively by the International Nuclear Fuel Cycle Evaluation (INFCE). The resulting projections of nuclear power plant capacity and estimated requirements for natural uranium, other nuclear raw materials and fuel cycle services were presented in the final report of INFCE. The projections from INFCE are the most recent results published by an international body, and can therefore be taken as the most authoritative estimates presently available. The INFCE results have been reviewed in the light of latest trends in national nuclear power capacity figures, and a sub-set of the INFCE results are used as the basis for the demand estimates presented in this article. The principal criteria involved in the selection of this sub-set are the nuclear power growth estimates and the reactor and fuel cycle strategies. These criteria are discussed in the following sections

Vegetation fires are a complex phenomenon and have a range of global impacts including influences on climate. Even though fire is a necessary disturbance for the maintenance of some ecosystems, a range of anthropogenically deleterious consequences are associated with it, such as damage to assets and infrastructure, loss of life, as well as degradation to air quality leading to negative impacts on human health. Estimating carbon emissions from fire relies on a carbon mass balance technique which has evolved with two different interpretations in the fire emissions community. Databases reporting global fire emissions estimates use an approach based on `consumed biomass' which is an approximation to the biogeochemically correct `burnt carbon' approach. Disagreement between the two methods occurs because the `consumed biomass' accounting technique assumes that all burnt carbon is volatilized and emitted. By undertaking a global review of the fraction of burnt carbon emitted to the atmosphere, we show that the `consumed biomass' accounting approach overestimates global carbon emissions by 4.0%, or 100 Teragrams, annually. The required correction is significant and represents 9% of the net global forest carbon sink estimated annually. To correctly partition burnt carbon between that emitted to the atmosphere and that remaining as a post-fire residue requires the post-burn carbon content to be estimated, which is quite often not undertaken in atmospheric emissions studies. To broaden our understanding of ecosystem carbon fluxes, it is recommended that the change in carbon content associated with burnt residues be accounted for. Apart from correctly partitioning burnt carbon between the emitted and residue pools, it enables an accounting approach which can assess the efficacy of fire management operations targeted at sequestering carbon from fire. These findings are particularly relevant for the second commitment period for the Kyoto protocol, since improved landscape fire
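
The difference between the two accounting approaches reduces to whether the post-fire residue is subtracted before reporting emissions; the numbers below are illustrative (the text's global correction of ~100 Tg corresponds to roughly 4% of burnt carbon):

```python
# 'Consumed biomass' accounting assumes all burnt carbon is volatilised;
# the 'burnt carbon' approach keeps the unemitted post-fire residue.
burnt_carbon = 2500.0          # Tg C/yr entering combustion (assumed)
emitted_fraction = 0.96        # fraction actually volatilised (text: ~4% remains)

emitted = burnt_carbon * emitted_fraction       # what reaches the atmosphere
residue = burnt_carbon - emitted                # char/ash carbon left on site
print(emitted, residue)
```

Under 'consumed biomass' accounting the full 2500 Tg would be reported as emitted, so the residue term is exactly the overestimate the authors quantify.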

Incoming shortwave solar radiation is an important parameter in environmental applications. A detailed spatial and temporal analysis of global solar radiation on the earth's surface is needed in many applications, ranging from solar energy uses to the study of agricultural, forest and biological processes. At local scales, topography is the most important factor in the distribution of solar radiation on the surface. Variability in elevation, surface orientation and obstruction by surrounding terrain is a source of large local differences in insolation and, consequently, in other variables such as ground temperature. For this reason, several models based on GIS techniques have recently been developed that integrate topography to obtain the solar radiation on the surface. In this work, global radiation is analyzed with the Solar Analyst, a model implemented in ArcView that computes the topographic parameters (altitude, latitude, slope, orientation (azimuth) and shadow effects). Solar Analyst uses the diffuse fraction and the transmittance as input parameters. These parameters are not usually available from radiometric networks in mountainous areas. In this work, a method to obtain both parameters from global radiation is proposed. Global radiation data from two networks of radiometric stations are used: one located in Sierra Magina Natural Park (Spain) with 11 stations and another located in the surroundings of Sierra Nevada Natural Park (Spain) with 14 stations. Daily solar irradiation is calculated from a digital terrain model (DTM), the daily diffuse fraction, K, and the daily atmospheric transmittivity, τ. Results provided by the model have been compared with measured values. An overestimation is observed at high elevations, whereas low altitudes show underestimation. The best performance was reported during the summer months, and the worst results were obtained during winter. Finally, a yearly global solar irradiation map has been
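The two inputs the paper derives from global radiation, transmittance and diffuse fraction, are commonly related through the clearness index. As an illustration of the kind of relationship involved (not the method proposed in the paper), the widely used Erbs et al. (1982) hourly correlation can be sketched as:

```python
def clearness_index(h_global, h_extraterrestrial):
    """Atmospheric transmittance proxy: ratio of surface global irradiation
    to the corresponding extraterrestrial (top-of-atmosphere) value."""
    return h_global / h_extraterrestrial

def diffuse_fraction_erbs(kt):
    """Erbs et al. (1982) hourly diffuse-fraction correlation, shown purely
    for illustration; the paper derives K and tau from its own networks."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165  # very clear skies: small residual diffuse fraction
```

Low clearness (overcast) gives a diffuse fraction near 1; high clearness gives a small, nearly constant diffuse fraction.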

We present two methods for the estimation of main effects in global sensitivity analysis. The methods adopt Satterthwaite's application of random balance designs in regression problems, and extend it to sensitivity analysis of model output for non-linear, non-additive models. Finite as well as infinite ranges for model input factors are allowed. The methods are easier to implement than any other method available for global sensitivity analysis, and reduce significantly the computational cost of the analysis. We test their performance on different test cases, including an international benchmark on safety assessment for nuclear waste disposal originally carried out by OECD/NEA.
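The "main effect" of an input factor is the first-order sensitivity index S_i = Var(E[f | x_i]) / Var(f). The brute-force grid evaluation below illustrates the quantity being estimated on a simple additive function with known indices; it is not the random-balance-design estimator proposed in the paper, which needs far fewer model evaluations:

```python
def first_order_indices_2d(f, n=200):
    """Brute-force main effects S_i = Var(E[f|x_i]) / Var(f) for a function
    of two independent uniform [0,1] inputs, on an n-by-n midpoint grid.
    Illustrative only: the paper's estimator avoids this O(n^2) cost."""
    pts = [(k + 0.5) / n for k in range(n)]
    vals = [[f(x1, x2) for x2 in pts] for x1 in pts]
    flat = [v for row in vals for v in row]
    mean = sum(flat) / len(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    m1 = [sum(row) / n for row in vals]                              # E[f|x1]
    m2 = [sum(vals[j][k] for j in range(n)) / n for k in range(n)]   # E[f|x2]
    v1 = sum((m - mean) ** 2 for m in m1) / n
    v2 = sum((m - mean) ** 2 for m in m2) / n
    return v1 / var, v2 / var

# Additive test function f = x1 + 2*x2: exact indices are S1 = 0.2, S2 = 0.8.
s1, s2 = first_order_indices_2d(lambda x1, x2: x1 + 2.0 * x2)
```

For additive models the main effects sum to 1; non-additive models leave a remainder attributable to interactions.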

Interception of precipitation by forest canopies plays an important role in its partitioning into evaporation, transpiration and runoff. Field observations show that arboreal lichens and bryophytes can substantially enhance forests' precipitation storage and evaporation. However, representations of canopy interception in global land surface models currently ignore arboreal lichen and bryophyte contributions. This study uses the lichen and bryophyte model (LiBry) to provide the first process-based modelling estimate of these organisms' contributions to canopy water storage and evaporation. Including arboreal poikilohydric organisms increased the global mean forest water storage capacity from 0.87 mm to 1.33 mm. Global forest canopy evaporation of intercepted precipitation was also greatly enhanced, by 44%. The ratio of total to bare-canopy evaporation exceeded 2 in many forested regions. This altered global patterns in canopy water storage, evaporation, and ultimately the proportion of rainfall evaporated. A sensitivity analysis was also performed. The results indicate that rainfall interception is of larger magnitude than previously reported by global land surface modelling work because of the important role of lichens and bryophytes in rainfall interception.
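The mechanism these results act on can be sketched as a single-event canopy "bucket". This is a minimal illustration using the study's two global-mean storage capacities; the evaporative demand and rainfall amounts are hypothetical, and the LiBry scheme itself is far more detailed:

```python
def intercepted_evaporation(rainfall_mm, storage_capacity_mm, evap_demand_mm):
    """Single-event interception bucket: the canopy stores water up to its
    capacity; stored water evaporates up to the atmospheric demand; the
    excess rainfall plus any drained storage reaches the ground."""
    stored = min(rainfall_mm, storage_capacity_mm)
    throughfall = rainfall_mm - stored
    evaporated = min(stored, evap_demand_mm)
    drained = stored - evaporated
    return evaporated, throughfall + drained

# Global-mean capacities from the study: 0.87 mm (bare canopy) vs 1.33 mm
# (with arboreal lichens/bryophytes); 5 mm rain and 2 mm demand are made up.
e_bare, ground_bare = intercepted_evaporation(5.0, 0.87, 2.0)
e_poik, ground_poik = intercepted_evaporation(5.0, 1.33, 2.0)
```

Whenever evaporative demand exceeds storage, the larger capacity translates directly into more evaporated interception and less water reaching the ground, which is how the 0.87 mm to 1.33 mm shift propagates to global evaporation.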

National Oceanic and Atmospheric Administration, Department of Commerce — The American Community Survey (ACS) is an ongoing statistical survey that samples a small percentage of the population every year. These data have been apportioned...

Dengue presents a formidable and growing global economic and disease burden, with around half the world's population estimated to be at risk of infection. There is wide variation and substantial uncertainty in current estimates of dengue disease burden and, consequently, in economic burden estimates. Dengue disease varies across time, geography and persons affected. Variations in the transmission of four different viruses, and interactions among vector density, host immune status, age and pre-existing medical conditions, all contribute to the disease's complexity. This systematic review aims to identify and examine estimates of dengue disease burden and costs, discuss major sources of uncertainty, and suggest next steps to improve estimates. Economic analysis of dengue is mainly concerned with costs of illness, particularly in estimating total episodes of symptomatic dengue. However, national dengue disease reporting systems show great diversity in design and implementation, hindering accurate global estimates of dengue episodes and country comparisons. A combination of immediate, short-, and long-term strategies could substantially improve estimates of the disease and, consequently, of the economic burden of dengue. Suggestions for immediate implementation include refining analysis of currently available data to adjust reported episodes and expanding data collection in empirical studies, such as documenting the number of ambulatory visits before and after hospitalization and including breakdowns by age. Short-term recommendations include merging multiple data sources, such as cohort and surveillance data, to evaluate the accuracy of reporting rates (by health sector, treatment, severity, etc.), and using covariates to extrapolate dengue incidence to locations with no or limited reporting. Long-term efforts aim at strengthening capacity to document dengue transmission using serological methods to systematically analyze and relate to epidemiologic data. As promising tools

In recent months, the World Health Organization (WHO), independent academic researchers, the Lancet and PLoS Medicine journals worked together to improve reporting of population health estimates. The new guidelines for accurate and transparent health estimates reporting (likely to be named GATHER), which are eagerly awaited, represent a helpful move that should benefit the field of global health metrics. Building on this progress and drawing from a tradition of Child Health Epidemiology Reference Group (CHERG)'s successful work model, we would like to propose a new initiative - "Global Health Epidemiology Reference Group" (GHERG). We see GHERG as an informal and entirely voluntary international collaboration of academic groups who are willing to contribute to improving disease burden estimates and respect the principles of the new guidelines - a form of "academic crowd-sourcing". The main focus of GHERG will be to identify the "gap areas" where not much information is available and/or where there is a lot of uncertainty present about the accuracy of the existing estimates. This approach should serve to complement the existing WHO and IHME estimates and to represent added value to both efforts.

Sobol' , I.M. [Institute for Mathematical Modelling of the Russian Academy of Sciences, Moscow (Russian Federation); Tarantola, S. [Joint Research Centre of the European Commission, TP361, Institute of the Protection and Security of the Citizen, Via E. Fermi 1, 21020 Ispra (Italy)]. E-mail: stefano.tarantola@jrc.it; Gatelli, D. [Joint Research Centre of the European Commission, TP361, Institute of the Protection and Security of the Citizen, Via E. Fermi 1, 21020 Ispra (Italy)]. E-mail: debora.gatelli@jrc.it; Kucherenko, S.S. [Imperial College London (United Kingdom); Mauntz, W. [Department of Biochemical and Chemical Engineering, Dortmund University (Germany)

2007-07-15

One of the major settings of global sensitivity analysis is that of fixing non-influential factors, in order to reduce the dimensionality of a model. However, this is often done without knowing the magnitude of the approximation error being produced. This paper presents a new theorem for the estimation of the average approximation error generated when fixing a group of non-influential factors. A simple function where analytical solutions are available is used to illustrate the theorem. The numerical estimation of small sensitivity indices is discussed.
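The approximation error of fixing a factor can be made concrete by direct evaluation on a simple function, the setting the theorem addresses. The sketch below computes E[(f(x) - f(x with x_i fixed))^2] by brute force on a grid; it illustrates the quantity whose average the theorem estimates, not the paper's estimator:

```python
def mean_sq_error_of_fixing(f, fix_index, fix_value, n=200):
    """Average squared error E[(f(x) - f(x with x_i fixed))^2] for a
    function of two uniform [0,1] inputs, over an n-by-n midpoint grid.
    Brute-force illustration of the error incurred by fixing a factor."""
    pts = [(k + 0.5) / n for k in range(n)]
    total = 0.0
    for x1 in pts:
        for x2 in pts:
            x = [x1, x2]
            xf = list(x)
            xf[fix_index] = fix_value
            total += (f(*x) - f(*xf)) ** 2
    return total / n ** 2

# For f = x1 + 2*x2, fixing x2 at its mean 0.5 gives an exact average
# squared error of Var(2*x2) = 4/12 = 1/3 (up to slight grid deflation).
err = mean_sq_error_of_fixing(lambda x1, x2: x1 + 2.0 * x2, 1, 0.5)
```

A factor is safe to fix only when this error is small relative to the output variance, which is exactly the trade-off the paper quantifies for groups of non-influential factors.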

Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

Impervious surface areas around the globe are expanding and significantly altering the surface energy balance, the hydrological cycle and ecosystem services. Many studies have underlined the importance of impervious surface, ranging from hydrological modeling to contaminant transport monitoring and urban development estimation. Therefore, accurate estimation of the global impervious surface is important for both the physical and social sciences. Given the limited coverage of high spatial resolution imagery and ground surveys, using satellite remote sensing and geospatial data to estimate global impervious areas is a practical approach. Based on the previous work of area-weighted imperviousness for the north branch of the Chicago River provided by HDR, this study developed a method to determine the percentage of impervious surface using the latest global land cover categories from multi-source satellite observations, population density and gross domestic product (GDP) data. Percent impervious surface at 30-meter resolution was mapped. We found that 1.33% of the CONUS (105,814 km2) and 0.475% of the land surface (640,370 km2) are impervious surfaces. To test the utility and practicality of the proposed method, National Land Cover Database (NLCD) 2011 percent developed imperviousness for the conterminous United States was used to evaluate our results. The average difference between the derived imperviousness from our method and the NLCD data across the CONUS is 1.14%, and the differences between our results and the NLCD data are within ±1% over 81.63% of the CONUS. The distribution of the global impervious surface map indicates that impervious surfaces are primarily concentrated in China, India, Japan, the USA and Europe, which are highly populated and/or developed regions. This study proposes a straightforward way of mapping global imperviousness, which can provide useful information for hydrologic modeling and other applications.
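The general shape of such a method, a land-cover base weight modulated by socio-economic intensity, can be sketched as follows. Every class weight, scale, and the modifier form below are invented for illustration and are not the study's calibrated values:

```python
# Hypothetical sketch: percent impervious from land cover class, population
# density and GDP. All weights and scales here are illustrative assumptions.

LAND_COVER_IMPERVIOUS_WEIGHT = {
    "urban": 0.65, "cropland": 0.02, "forest": 0.0, "water": 0.0,
}

def percent_impervious(land_cover, pop_density_km2, gdp_per_capita,
                       pop_scale=5000.0, gdp_scale=50000.0):
    """Base imperviousness from land cover class, scaled up toward its
    maximum in densely populated, high-GDP areas."""
    base = LAND_COVER_IMPERVIOUS_WEIGHT.get(land_cover, 0.0)
    modifier = min(1.0, 0.5 * pop_density_km2 / pop_scale
                        + 0.5 * gdp_per_capita / gdp_scale)
    return 100.0 * base * (0.5 + 0.5 * modifier)
```

Natural classes stay at zero regardless of socio-economic inputs, while urban pixels span a range (here 32.5-65%) depending on population density and GDP, mirroring how the study concentrates imperviousness in populated, developed regions.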

Full Text Available BACKGROUND: In its first 8 years, the Global Programme to Eliminate Lymphatic Filariasis (GPELF) achieved an unprecedentedly rapid scale-up: >1.9 billion treatments with anti-filarial drugs (albendazole, ivermectin, and diethylcarbamazine) were provided via yearly mass drug administration (MDA) to a minimum of 570 million individuals living in 48 of the 83 initially identified LF-endemic countries. METHODOLOGY: To assess the health impact that this massive global effort has had, we analyzed the benefits accrued first from preventing or stopping the progression of LF disease, and then from the broader anti-parasite effects ('beyond-LF' benefits) attributable to the use of albendazole and ivermectin. Projections were based on demographic and disease prevalence data from publications of the Population Reference Bureau, The World Bank, and the World Health Organization. RESULTS: Between 2000 and 2007, the GPELF prevented LF disease in an estimated 6.6 million newborns who would otherwise have acquired LF, thus averting in their lifetimes nearly 1.4 million cases of hydrocele, 800,000 cases of lymphedema and 4.4 million cases of subclinical disease. Similarly, 9.5 million individuals--previously infected but without overt manifestations of disease--were protected from developing hydrocele (6.0 million) or lymphedema (3.5 million). These LF-related benefits, by themselves, translate into 32 million DALYs (Disability Adjusted Life Years) averted. Ancillary, 'beyond-LF' benefits from the >1.9 billion treatments delivered by the GPELF were also enormous, especially because of the >310 million treatments given to children and women of childbearing age who received albendazole with/without ivermectin (effectively treating intestinal helminths, onchocerciasis, lice, scabies, and other conditions). These benefits can be described but remain difficult to quantify, largely because of the poorly defined epidemiology of these latter infections. CONCLUSION: The GPELF has

Historic estimates of daily global solar irradiation are often required for climatic impact studies. Regression equations with daily global solar irradiation, H, as the dependent variable and other climatic variables as the independent variables provide a practical way to estimate H at locations where it is not measured. They may also have potential to estimate H before 1953, the year of the first routine H measurements in Canada. This study compares several regression equations for calculating H on the Canadian prairies. Simple linear regression with daily bright sunshine duration as the independent variable accounted for 90% of the variation of H in summer and 75% of the variation of H in winter. Linear regression with the daily air temperature range as the independent variable accounted for 45% of the variation of H in summer and only 6% of the variation of H in winter. Linear regression with precipitation status (wet or dry) as the independent variable accounted for only 35% of the summer-time variation in H, but stratifying other regression analyses into wet and dry days reduced their root-mean-squared errors. For periods with sufficiently dense bright sunshine observations (i.e. after 1960), however, H was more accurately estimated from spatially interpolated bright sunshine duration than from locally observed air temperature range or precipitation status. The daily air temperature range and precipitation status may have utility for estimating H for periods before 1953, when they are the only widely available climatic data on the Canadian prairies. Between 1953 and 1989, a period of large climatic variation, the regression coefficients did not vary significantly between contrasting years with cool-wet, intermediate and warm-dry summers. They should apply equally well earlier in the century. (author)
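The sunshine-duration regression is classically written in Angström-Prescott form, H/H0 = a + b(n/N). The least-squares fit below is a generic sketch of that regression; the coefficients and the synthetic data are illustrative, not the study's prairie values:

```python
def fit_angstrom_prescott(sunshine_fraction, clearness_index):
    """Ordinary least-squares fit of H/H0 = a + b * (n/N), where n/N is the
    relative bright sunshine duration. Generic illustration of the kind of
    regression compared in the study."""
    m = len(sunshine_fraction)
    mx = sum(sunshine_fraction) / m
    my = sum(clearness_index) / m
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(sunshine_fraction, clearness_index))
    sxx = sum((x - mx) ** 2 for x in sunshine_fraction)
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Synthetic data lying exactly on kt = 0.25 + 0.50 * (n/N) (made-up values).
s_frac = [0.1, 0.3, 0.5, 0.7, 0.9]
kt_obs = [0.25 + 0.50 * s for s in s_frac]
a, b = fit_angstrom_prescott(s_frac, kt_obs)
```

The fraction of variance such a fit explains (the R^2) is what the abstract's 90%/75% summer/winter figures refer to.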

Estimates of extinction risk for Amazonian plant and animal species are rare and not often incorporated into land-use policy and conservation planning. We overlay spatial distribution models with historical and projected deforestation to show that at least 36% and up to 57% of all Amazonian tree species are likely to qualify as globally threatened under International Union for Conservation of Nature (IUCN) Red List criteria. If confirmed, these results would increase the number of threatened ...

This paper summarizes initial steps to improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind, and distance to shore and water depth for offshore wind. We estimate a total global wind generation potential of 560 PWh for terrestrial wind, with 90% of the resource classified as low-to-mid quality, and 315 PWh for offshore wind, with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km2 deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.
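Given a capacity factor, the conversion from usable area to annual technical potential follows directly from the stated assumptions (5 MW/km2, 0.95 availability, 90% array efficiency). The site area, exclusion fraction, and capacity factor in the example are hypothetical:

```python
def technical_potential_twh(area_km2, excluded_fraction, net_capacity_factor,
                            density_mw_km2=5.0, availability=0.95,
                            array_eff=0.90, hours_per_year=8760.0):
    """Annual technical wind potential for one resource site, using the
    deployment assumptions stated in the paper. Inputs other than the
    defaults are example values, not results from the study."""
    usable_km2 = area_km2 * (1.0 - excluded_fraction)
    capacity_mw = usable_km2 * density_mw_km2
    energy_mwh = (capacity_mw * hours_per_year * net_capacity_factor
                  * availability * array_eff)
    return energy_mwh / 1e6  # MWh -> TWh

# Hypothetical site: 1000 km2, half excluded, net capacity factor 0.35.
site_twh = technical_potential_twh(1000.0, 0.5, 0.35)
```

Summing this over all sites in a country, binned by capacity-factor class, yields exactly the kind of supply curve the paper constructs.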

Full Text Available We present one of the first estimates of the global distribution of CO2 surface fluxes using total column CO2 measurements retrieved by the SRON-KIT RemoTeC algorithm from the Greenhouse gases Observing SATellite (GOSAT). We derive optimized fluxes from June 2009 to December 2010. We estimate fluxes from surface CO2 measurements to use as baselines for comparing GOSAT data-derived fluxes. Assimilating only GOSAT data, we can reproduce the observed CO2 time series at surface and TCCON sites in the tropics and the northern extra-tropics. In contrast, in the southern extra-tropics GOSAT XCO2 leads to enhanced seasonal cycle amplitudes compared to independent measurements, and we identify this as the result of a land–sea bias in our GOSAT XCO2 retrievals. A bias correction in the form of a global offset between GOSAT land and sea pixels in a joint inversion of satellite and surface measurements of CO2 yields plausible global flux estimates which are more tightly constrained than in an inversion using surface CO2 data alone. We show that assimilating the bias-corrected GOSAT data on top of surface CO2 data (a) reduces the estimated global land sink of CO2, and (b) shifts the terrestrial net uptake of carbon from the tropics to the extra-tropics. It is concluded that while GOSAT total column CO2 measurements provide useful constraints for source–sink inversions, small spatiotemporal biases – beyond what can be detected using current validation techniques – have serious consequences for optimized fluxes, even aggregated over continental scales.
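The simplest form of the bias correction described above, a single global offset between land and sea pixels, can be sketched as a difference of means. The XCO2 values below are made-up numbers for illustration; in the paper the offset is estimated within the joint inversion itself:

```python
def land_sea_offset(land_xco2, sea_xco2):
    """Single global offset between land and sea retrievals, estimated as
    the difference of their means. A crude stand-in for the offset solved
    for inside the joint inversion."""
    mean = lambda values: sum(values) / len(values)
    return mean(land_xco2) - mean(sea_xco2)

land = [391.2, 392.0, 391.6]   # ppm, hypothetical GOSAT land retrievals
sea = [390.2, 391.0, 390.6]    # ppm, hypothetical GOSAT sea retrievals
offset = land_sea_offset(land, sea)
corrected_land = [x - offset for x in land]
```

After correction, the land pixels are shifted onto the sea reference, removing the mean land-sea discrepancy while preserving the spatial pattern within each domain.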

Full Text Available There is an increasing demand for renewable electricity sources, due to the global efforts to reduce CO2 emissions. Despite the promising effects, only a limited amount of electricity is currently produced globally from solar power. In order to help countries realize the importance of tapping into solar energy, it is crucial to reveal the potential amount of electricity that could be thus produced. For this reason, open data were used to produce an interactive web map of the global solar energy potential. For the calculation of the potential, the top-down approach generally used in the literature was modified by introducing a better way of calculating rooftop areas and by accounting for temperature, which strongly reduces PV panel efficiency. Mean annual temperature data were introduced to improve accuracy, and an approach to estimate rooftop and façade areas as a function of GDP was developed. The technically available global solar potential was estimated at about 613 PWh/y. Furthermore, the cost of photovoltaic generation was computed and extremely low values, 0.03-0.2 $/kWh, were derived.
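The temperature correction the authors add to the standard top-down approach amounts to a linear derating of PV output with mean annual temperature. The sketch below uses typical textbook values for panel efficiency, performance ratio, and temperature coefficient; they are not the study's calibrated parameters:

```python
def pv_yield_kwh(roof_area_m2, annual_irradiation_kwh_m2, panel_eff=0.18,
                 performance_ratio=0.75, mean_temp_c=25.0,
                 temp_coeff_per_c=0.004, ref_temp_c=25.0):
    """Annual PV output from a rooftop area, linearly derated for mean
    annual temperature above the reference cell temperature. Coefficient
    values are generic assumptions, not those used in the study."""
    derate = 1.0 - temp_coeff_per_c * max(0.0, mean_temp_c - ref_temp_c)
    return (roof_area_m2 * annual_irradiation_kwh_m2
            * panel_eff * performance_ratio * derate)

cool = pv_yield_kwh(100.0, 1500.0)                    # at the 25 C reference
hot = pv_yield_kwh(100.0, 1500.0, mean_temp_c=35.0)   # 10 C above reference
```

With a 0.4%/C coefficient, a region 10 C above the reference loses 4% of its yield, which is why ignoring temperature systematically overstates potential in hot, sunny regions.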

Estimates of changes in global land mass from GRACE observations can be obtained by two methods, a mascon method and a forward modeling method. However, results from these two methods show inconsistent secular trends. The sea level budget can be adopted to validate the consistency among observations of sea level rise by altimetry, steric change by the Argo project, and mass change by GRACE. Mascon products from JPL, GSFC and CSR are compared here; we find that none of these three products achieves a reconciled sea level budget, while this problem can be solved by a new forward modeling method. We further investigate the origin of this difference, and speculate that it is caused by signal leakage from the ocean mass. It is generally well recognized that land signals leak into the oceans, but leakage also happens the other way around. We stress the importance of correcting for leakage from the ocean in the estimation of global land mass. Based on a reconciled sea level budget, we confirm that global sea level rise accelerated significantly over 2005-2015, as a result of the ongoing global temperature increase.
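The budget closure test used here is simply that altimetric sea level rise should equal the steric (Argo) plus ocean-mass (GRACE) contributions, with the residual measuring inconsistency. The rates below are illustrative round numbers, not the study's values:

```python
def budget_residual(altimetry_mm_yr, steric_mm_yr, mass_mm_yr):
    """Sea level budget residual: total rise minus the sum of steric and
    ocean-mass contributions. Near zero means the budget is reconciled."""
    return altimetry_mm_yr - (steric_mm_yr + mass_mm_yr)

# Hypothetical rates (mm/yr): a closed budget vs. a 0.5 mm/yr gap of the
# kind that an uncorrected ocean-to-land leakage error could produce.
r_closed = budget_residual(3.6, 1.3, 2.3)
r_open = budget_residual(3.6, 1.3, 1.8)
```

A persistent non-zero residual across all three mascon products is what points the authors toward a common error source, the ocean-to-land signal leakage, rather than noise in any single dataset.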

Mangrove forests are highly productive but globally threatened coastal ecosystems, whose role in the carbon budget of the coastal zone has long been debated. Here we provide a comprehensive synthesis of the available data on carbon fluxes in mangrove ecosystems. A reassessment of global mangrove primary production from the literature results in a conservative estimate of ~218 ± 72 Tg C a-1. When using the best available estimates of various carbon sinks (organic carbon export, sediment burial, and mineralization), it appears that >50% of the carbon fixed by mangrove vegetation is unaccounted for. This unaccounted carbon sink is conservatively estimated at ~112 ± 85 Tg C a-1, equivalent in magnitude to ~30-40% of the global riverine organic carbon input to the coastal zone. Our analysis suggests that mineralization is severely underestimated, and that the majority of carbon export from mangroves to adjacent waters occurs as dissolved inorganic carbon (DIC). CO2 efflux from sediments and creek waters and tidal export of DIC appear to be the major sinks. These processes are quantitatively comparable in magnitude to the unaccounted carbon sink in current budgets, but are not yet adequately constrained with the limited published data available so far. Copyright 2008 by the American Geophysical Union.

Accurate information on the temporal and spatial distributions of solar radiation is very important in many scientific fields. In this study, instantaneous solar irradiances on a horizontal surface at 10:30 and 13:30 local time (LT) were calculated from Moderate Resolution Imaging Spectroradiometer (MODIS) atmospheric data products with relatively high spatial resolution using a solar radiation model. These solar irradiances were combined to derive half-hourly averages of solar irradiance (HASI) and daily global solar radiation (GSR) on a horizontal surface using linear interpolation, piecewise linear regression, and quadratic polynomial regression. Compared with field observations, the HASI were estimated accurately when the total cloud fraction (TCF) was 0.6. Overall, the daily GSR estimated in this study was better than that estimated by the Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis of NASA. The daily GSR estimated in this study was underestimated, whereas that of MERRA was overestimated. Combining the daily GSR estimates of this study and MERRA offers a simple and feasible technique for reducing uncertainty in daily GSR estimates. - Highlights: • Daily GSR is integrated from two observations from the MODIS products. • Daily GSR from the MODIS products is underestimated. • Biases were attributed primarily to variations in the total cloud fraction. • Combining daily GSR estimates from MODIS and MERRA increases accuracy.
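Extending two instantaneous overpass irradiances to a daily total requires a diurnal-shape assumption. The half-sine model below is one common such assumption, shown only to make the integration step concrete; the study itself compares linear interpolation and piecewise/quadratic regression, and the times and amplitude here are hypothetical:

```python
import math

def daily_gsr_mj(observations, sunrise_h, sunset_h):
    """Daily GSR (MJ/m2) from a half-sine diurnal irradiance model fitted
    to instantaneous observations given as (local hour, W/m2) pairs. Each
    observation implies an amplitude; their mean is used. A simplified
    stand-in for the interpolation schemes compared in the study."""
    daylength = sunset_h - sunrise_h
    amplitudes = [g / math.sin(math.pi * (t - sunrise_h) / daylength)
                  for t, g in observations]
    g_max = sum(amplitudes) / len(amplitudes)
    # Integral of g_max * sin(pi*x/L) over [0, L] hours is g_max * 2L/pi;
    # convert W*h/m2 to MJ/m2.
    return g_max * (2.0 * daylength / math.pi) * 3600.0 / 1e6

# Hypothetical clear day: true amplitude 800 W/m2, sunrise 06:00, sunset
# 18:00; the two observations mimic the ~10:30 and ~13:30 overpasses.
obs = [(10.5, 800.0 * math.sin(math.pi * 4.5 / 12.0)),
       (13.5, 800.0 * math.sin(math.pi * 7.5 / 12.0))]
gsr = daily_gsr_mj(obs, 6.0, 18.0)
```

Because clouds distort the smooth diurnal shape, this kind of two-sample extrapolation degrades as cloud fraction grows, consistent with the bias attribution in the highlights.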

Motion estimation from digital video is an ill-posed problem that requires a regularization approach. Regularization introduces a smoothness constraint that can reduce the resolution of the velocity estimates. The problem is further complicated for ultrasound (US) videos, where speckle noise levels can be significant. Motion estimation using optical flow models requires tuning several parameters to satisfy the optical flow constraint as well as to set the level of imposed smoothness. Furthermore, except in simulations or mostly unrealistic cases, there is no ground truth to use for validating the velocity estimates. This problem is present in all real video sequences that are used as input to motion estimation algorithms. It is also an open problem in biomedical applications such as motion analysis of US videos of carotid artery (CA) plaques. In this paper, we study the problem of obtaining reliable ultrasound video motion estimates for atherosclerotic plaques for use in clinical diagnosis. A global optimization framework for motion parameter optimization is presented. This framework uses actual carotid artery motions to provide optimal parameter values for a variety of motions and is tested on ten different US videos using two different motion estimation techniques.

energy dissipation is the dominant channel of energy transfer in that year from the solar wind. This is consistent with many results found by other researchers. Keywords: Østgaard's Empirical Relation, Ionospheric Energy Dissipation, Electron Precipitation, Joule Heating.

Accurately assessing how increased global nitrous oxide (N2O) emission has affected the climate system requires a robust estimate of preindustrial N2O emissions, since only the difference between current and preindustrial emissions represents the net driver of anthropogenic climate change. However, large uncertainty exists in previous estimates of preindustrial N2O emissions from the land biosphere, and preindustrial N2O emissions at finer scales, such as the regional, biome, or sector scale, have not yet been well quantified. In this study, we applied a process-based Dynamic Land Ecosystem Model (DLEM) to estimate the magnitude and spatial patterns of preindustrial N2O fluxes at the biome, continental, and global levels as driven by multiple environmental factors. Uncertainties associated with key parameters were also evaluated. Our study indicates that the mean preindustrial N2O emission was approximately 6.20 Tg N yr-1, with an uncertainty range of 4.76 to 8.13 Tg N yr-1. The estimated N2O emission varied significantly at the spatial and biome levels. South America, Africa, and Southern Asia accounted for 34.12, 23.85, and 18.93 %, respectively, together contributing 76.90 % of the global total emission. The tropics were identified as the major source of N2O released into the atmosphere, accounting for 64.66 % of the total emission. Our multi-scale estimates provide a robust reference for assessing the climate forcing of anthropogenic N2O emission from the land biosphere


Background In 1997, the World Health Assembly adopted Resolution 50.29, committing to the elimination of lymphatic filariasis (LF) as a public health problem, subsequently targeted for 2020. The initial estimates were that 1.2 billion people were at-risk for LF infection globally. Now, 13 years after the Global Programme to Eliminate Lymphatic Filariasis (GPELF) began implementing mass drug administration (MDA) against LF in 2000—during which over 4.4 billion treatments have been distributed in 56 endemic countries—it is most appropriate to estimate the impact that the MDA has had on reducing the population at risk of LF. Methodology/Principal Findings To assess GPELF progress in reducing the population at-risk for LF, we developed a model based on defining reductions in risk of infection among cohorts of treated populations following each round of MDA. The model estimates that the number of people currently at risk of infection decreased by 46% to 789 million through 2012. Conclusions/Significance Important progress has been made in the global efforts to eliminate LF, but significant scale-up is required over the next 8 years to reach the 2020 elimination goal. PMID:25411843
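The structure of such a cohort model can be sketched as a remaining-at-risk population shrinking by a fraction with each completed MDA round. The per-round reduction below is a hypothetical single knob; the paper's model tracks country-level cohorts with round-specific coverage:

```python
def population_at_risk(initial_at_risk_millions, rounds_completed,
                       per_round_reduction):
    """Toy cohort model: each completed MDA round removes a fixed fraction
    of the remaining treated population from risk of LF infection. The
    reduction fraction is an illustrative assumption, not a GPELF figure."""
    remaining = initial_at_risk_millions
    for _ in range(rounds_completed):
        remaining *= (1.0 - per_round_reduction)
    return remaining

# Starting from the initial 1.2 billion at risk, a hypothetical ~5%
# reduction per round over 12 rounds cuts the at-risk population by ~46%.
after_12_rounds = population_at_risk(1200.0, 12, 0.05)
```

The compounding form shows why sustained annual rounds matter: most of the projected reduction accrues only after many consecutive MDA cycles, which is the basis of the scale-up argument for reaching the 2020 goal.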

A comprehensive study has recently been completed of the potential regional radiological dose in the Tennessee and Cumberland river basins in the year 2000, resulting from the operation of nuclear facilities. This study, sponsored jointly by the U.S. Energy Research and Development Administration and the Tennessee Valley Authority, was performed by the Hanford Engineering Development Laboratory (HEDL), the Oak Ridge National Laboratory (ORNL), and the Atmospheric Turbulence and Diffusion Laboratory (ATDL). This study considered the operation in the year 2000 of 33,000 MWe of nuclear capacity within the study area, and of 110,000 MWe in adjacent areas, together with supporting nuclear fuel fabrication and reprocessing facilities. Air and water transport models used and methods for calculating nuclide concentrations on the ground are discussed

Full Text Available Permafrost underlies much of Earth's surface and interacts with climate, ecosystems and human systems. It is a complex phenomenon controlled by climate and (sub-)surface properties and reacts to change with variable delay. Heterogeneity and sparse data challenge the modeling of its spatial distribution. Currently, there is no data set that adequately informs global studies of permafrost. The available data set for the Northern Hemisphere is frequently used for model evaluation, but its quality and consistency are difficult to assess. Here, a global model of permafrost extent and a dataset of permafrost zonation are presented and discussed, extending earlier studies by including the Southern Hemisphere, by using consistent data and methods, and by attention to uncertainty and scaling. Established relationships between air temperature and the occurrence of permafrost are re-formulated into a model that is parametrized using published estimates. It is run with high-resolution (<1 km) global elevation data and air temperatures based on the NCAR-NCEP reanalysis and CRU TS 2.0. The resulting data provide more spatial detail and a consistent extrapolation to remote regions, while aggregated values resemble previous studies. The estimated uncertainties affect regional patterns and aggregate numbers, and provide interesting insight. The permafrost area, i.e. the actual surface area underlain by permafrost, north of 60° S is estimated to be 13–18 × 10^6 km2 or 9–14% of the exposed land surface. The global permafrost area including Antarctic and sub-sea permafrost is estimated to be 16–21 × 10^6 km2. The global permafrost region, i.e. the exposed land surface below which some permafrost can be expected, is estimated to be 22 ± 3 × 10^6 km2. A large proportion of this exhibits considerable topography and spatially discontinuous permafrost, underscoring the importance of attention to scaling issues
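The general shape of the air-temperature relationships such a model re-formulates can be sketched as a logistic curve mapping mean annual air temperature (MAAT) to a permafrost probability. The midpoint and steepness below are placeholder values, not the paper's calibrated parameters:

```python
import math

def permafrost_probability(maat_c, midpoint_c=-6.0, steepness=1.0):
    """Logistic relation between mean annual air temperature (deg C) and the
    probability that permafrost occurs. Parameter values are illustrative
    placeholders, not those derived from published estimates in the paper."""
    return 1.0 / (1.0 + math.exp(steepness * (maat_c - midpoint_c)))
```

Very cold sites map to probabilities near 1 and warm sites to near 0; the transition zone around the midpoint is where the distinction between permafrost area, zonation, and region (and hence the scaling issues above) matters most.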

South Carolina has few indigenous energy resources. Most widely known and utilized are hydropower, wood, and solar. Peat is a material composed of partially decomposed organic matter that, after burial for long periods of time, may eventually become coal. Peat is utilized as an energy resource for the production of electricity and for home heating in Europe and the Soviet Union. There are peat deposits in South Carolina, but peat has never been used as an energy resource within the state. This report presents the results of the first two years of a planned four-year study of the quantity and energy potential of peat in South Carolina. In this year's survey two activities were undertaken. The first was to visit highly probable peat deposits to confirm the presence of fuel-grade peat. The second was to survey and characterize in more detail the areas judged to be of highest potential as major resources. The factors carrying the greatest weight in our determination of priority areas were: (1) a description of peat deposits in the scientific literature or from discussions with state and federal soil scientists; (2) mention of organic soils on soil maps or in the literature; and (3) information from farmers and other local citizens.

CEDIGAZ's first estimates confirm the slowdown in the growth of gas supply seen over the past two years. CEDIGAZ expects a moderate 1.1% growth, on a par with the previous year. There was a net slowdown in China's gas demand growth (+8% in 2014, versus 16%/y over 2008-13). The decline in European natural gas consumption worsened (-10%), largely due to mild weather. CIS gas production and consumption declined strongly amidst the Ukraine conflict. US production surged (+5.7%), driven by shale gas. International pipeline trade declined significantly (-4.8%): Russian gas exports were at their lowest in a decade, -13% (-9.7% to Europe, -24% to the CIS), and US net pipeline imports fell 5% (an effect of shale gas). 2014 showed a turnaround on the LNG market after four years of market tightening: additional LNG supply in Asia, combined with weather-related weak demand, led to a dramatic reduction of both European and Asian spot LNG prices. There were positive developments for US LNG projects (Cameron, Cove Point and Freeport all took FID...), which will likely delay other competing LNG projects (Russia, Canada, East Africa). In the short term, global gas demand growth is likely to remain moderate. The European market will continue to suffer from strong competition from coal and renewables, alongside the slowdown in Chinese gas demand growth. Uncertainties remain on the evolution of the well-supplied LNG market and international prices until 2020 (demand in price-sensitive emerging markets...). There is increasing pressure to cut subsidies in emerging markets in order to increase supply for a more viable development of natural gas in the long term. Recent structural, rather than temporary, factors could affect long-term gas demand growth, such as competition with other energy fuels (coal). Energy policies and general environmental regulations will thus be critical factors influencing natural gas demand (China). The Asian market will keep a major influence on the global LNG market.

Highlights: ► The global solar radiation at the Lake Van region is estimated. ► This study is unique for the Lake Van region. ► Solar radiation around Lake Van has its highest value in the east-southeast region. ► The annual average solar energy potential is obtained as 750–2458 kWh/m². ► Results can be used to estimate evaporation. - Abstract: In this study several sunshine-based regression models have been evaluated to estimate monthly average daily global solar radiation on a horizontal surface of the Lake Van region in the Eastern Anatolia region in Turkey, using data obtained from seven different meteorological stations. These models are derived from the Angström–Prescott linear regression model and its derivatives, such as quadratic, cubic, logarithmic and exponential forms. The performance of these regression models was evaluated by comparing the calculated clearness index with the measured clearness index. Several statistical tests were used to check the validity and goodness of fit of the regression models in terms of the coefficient of determination, mean percent error, mean absolute percent error, mean bias error, mean absolute bias error, root mean square error and t-statistic. The results of all the regression models are within acceptable limits according to the statistical tests. However, the best performances are obtained by the cubic regression model for the Bitlis, Gevaş, Hakkari and Muş stations and by the quadratic regression model for the Malazgirt, Tatvan and Van stations for predicting global solar radiation. The spatial distributions of the monthly average daily global solar radiation around the Lake Van region were obtained by interpolation of calculated solar radiation data acquired from the best-fit models of the stations. The annual average solar energy potential for the Lake Van region is obtained as between 750 kWh/m² and 2485 kWh/m², with an annual average of 1610 kWh/m².
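The Angström–Prescott family relates the clearness index H/H₀ to the relative sunshine duration S/S₀, with the linear form H/H₀ = a + b(S/S₀). A minimal least-squares sketch of fitting that linear model (the data points are illustrative, not the station measurements):

```python
# Fit the linear Angstrom-Prescott model H/H0 = a + b*(S/S0) by
# ordinary least squares (pure Python; the data are illustrative).
def fit_linear(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Monthly relative sunshine duration S/S0 and clearness index H/H0
s_ratio = [0.35, 0.45, 0.55, 0.65, 0.75, 0.85]
clearness = [0.42, 0.47, 0.52, 0.57, 0.62, 0.67]

a, b = fit_linear(s_ratio, clearness)
print(round(a, 3), round(b, 3))  # regression coefficients a and b
```

The quadratic, cubic, logarithmic and exponential derivatives mentioned in the abstract replace the linear term with the corresponding function of S/S₀ and are fitted the same way.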

The concept of isohydry/anisohydry describes the degree to which plants regulate their water status, operating from isohydric with strict regulation to anisohydric with less regulation. Though some species-level measures of isohydry/anisohydry exist at a few locations, ecosystem-scale information is still largely unavailable. In this study, we use diurnal observations from active (Ku-band backscatter from QuikSCAT) and passive (X-band vegetation optical depth (VOD) from the Advanced Microwave Scanning Radiometer on EOS Aqua) microwave satellite data to estimate global ecosystem isohydry/anisohydry. Here diurnal observations from both satellites approximate predawn and midday plant canopy water contents, which are used to estimate isohydry/anisohydry. The two independent estimates from radar backscatter and VOD show reasonable agreement at low and middle latitudes but diverge at high latitudes. Grasslands, croplands, wetlands, and open shrublands are more anisohydric, whereas evergreen broadleaf and deciduous broadleaf forests are more isohydric. The direct validation with upscaled in situ species isohydry/anisohydry estimates indicates that the VOD-based estimates have much better agreement than the backscatter-based estimates. The indirect validation with prior knowledge suggests that both estimates are generally consistent in that the vegetation water status of anisohydric ecosystems more closely tracks environmental fluctuations of water availability and demand than that of their isohydric counterparts. However, uncertainties still exist in the isohydry/anisohydry estimates, primarily arising from the remote sensing data and, to a lesser extent, from the methodology. The comprehensive assessment in this study can help us better understand the robustness, limitations, and uncertainties of the satellite-derived isohydry/anisohydry estimates. The ecosystem isohydry/anisohydry has the potential to reveal new insights into spatiotemporal ecosystem response to droughts.
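One common way to place an ecosystem on the isohydry/anisohydry spectrum is to regress midday against predawn water status: a slope near 1 means midday water content tracks predawn conditions (anisohydric), a slope near 0 means tight midday regulation (isohydric). This is a generic sketch of that idea with made-up numbers, not the authors' exact retrieval from backscatter or VOD.

```python
def slope(x, y):
    # Ordinary least-squares slope of y on x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

# Illustrative predawn vs. midday canopy water content (arbitrary units)
predawn = [0.9, 0.8, 0.7, 0.6, 0.5]
midday_aniso = [0.85, 0.74, 0.66, 0.55, 0.44]  # tracks predawn closely
midday_iso = [0.60, 0.59, 0.61, 0.60, 0.59]    # tightly regulated

print(round(slope(predawn, midday_aniso), 2))  # near 1 -> anisohydric
print(round(slope(predawn, midday_iso), 2))    # near 0 -> isohydric
```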

Global solar radiation (GSR) is required in a large number of fields. Many parameterization schemes have been developed to estimate it using routinely measured meteorological variables, since GSR is directly measured at only a limited number of stations. Even so, meteorological stations are sparse, especially in remote areas. Satellite signals (radiance at the top of the atmosphere in most cases) can be used to estimate spatially continuous GSR. However, many existing remote sensing products have a relatively coarse spatial resolution, and their inversion algorithms are too complicated to be mastered by experts in other research fields. In this study, an artificial neural network (ANN) is utilized to build the mathematical relationship between measured monthly-mean daily GSR and several high-level remote sensing products available to the public, including Moderate Resolution Imaging Spectroradiometer (MODIS) monthly averaged land surface temperature (LST), the number of days in which the LST retrieval was performed in a month, the MODIS enhanced vegetation index, and Tropical Rainfall Measuring Mission (TRMM) monthly precipitation. After training, GSR estimates from this ANN are verified against ground measurements at 12 radiation stations. Then, comparisons are performed among three GSR estimates: the one presented in this study, a surface data-based estimate, and a remote sensing product by the Japan Aerospace Exploration Agency (JAXA). Validation results indicate that the ANN-based method presented in this study can estimate monthly-mean daily GSR at a spatial resolution of about 5 km with high accuracy.
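The regression step can be caricatured with a tiny one-hidden-layer network trained by gradient descent, standing in for the ANN that maps remote sensing predictors (scaled LST, vegetation index, precipitation) to scaled monthly-mean daily GSR. The data, network size and training settings below are illustrative, not the study's configuration.

```python
import math, random

random.seed(0)

def init(n_in, n_hid):
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
    w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hid)]
    return w1, w2

def forward(x, w1, w2):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    return sum(w * hi for w, hi in zip(w2, h)), h

def train(data, n_in=3, n_hid=4, lr=0.05, epochs=2000):
    w1, w2 = init(n_in, n_hid)
    for _ in range(epochs):
        for x, y in data:
            out, h = forward(x, w1, w2)
            err = out - y
            for j in range(n_hid):
                grad_h = err * w2[j] * (1 - h[j] ** 2)  # before w2 update
                w2[j] -= lr * err * h[j]
                for i in range(n_in):
                    w1[j][i] -= lr * grad_h * x[i]
    return w1, w2

# Toy samples: (scaled LST, scaled EVI, scaled precip) -> scaled GSR
data = [([0.9, 0.2, 0.1], 0.8), ([0.3, 0.7, 0.6], 0.3),
        ([0.6, 0.5, 0.3], 0.55), ([0.2, 0.8, 0.9], 0.2)]
w1, w2 = train(data)
mse = sum((forward(x, w1, w2)[0] - y) ** 2 for x, y in data) / len(data)
print(round(mse, 4))  # small training error on the toy data
```

In practice a standard ANN library would be used; the point is only the structure: predictors in, GSR out, weights fitted against station measurements.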

High frequency, in situ observations from 11 globally distributed sites for the period 1994–2014 and archived air measurements dating from 1978 onward have been used to determine the global growth rate of 1,1-difluoroethane (HFC-152a, CH₃CHF₂). These observations have been combined with a range of atmospheric transport models to derive global emission estimates in a top-down approach. HFC-152a is a greenhouse gas with a short atmospheric lifetime of about 1.5 years. Si...
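A top-down global emission estimate can be sketched with a one-box mass balance: with atmospheric burden B and lifetime τ, dB/dt = E − B/τ, so E = dB/dt + B/τ. The burden and growth-rate numbers below are illustrative, not the paper's results; only the lifetime (≈1.5 yr) comes from the abstract.

```python
# One-box top-down emission estimate: with burden B (Gg) and lifetime
# tau (yr), mass balance dB/dt = E - B/tau gives E = dB/dt + B/tau.
# Burden and growth rate below are illustrative values only.
def topdown_emissions(burden_gg, growth_gg_per_yr, lifetime_yr=1.5):
    return growth_gg_per_yr + burden_gg / lifetime_yr

print(topdown_emissions(burden_gg=60.0, growth_gg_per_yr=2.0))  # Gg/yr
```

The transport models in the study refine this picture by resolving where the emissions must be located to reproduce the site-by-site observations.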

Microwave Sounding Unit (MSU) radiometer observations in Ch 2 (53.74 GHz), made in the nadir direction from sequential, sun-synchronous, polar-orbiting NOAA morning satellites (NOAA 6, 10 and 12, which have about 7am/7pm orbital geometry) and afternoon satellites (NOAA 7, 9, 11 and 14, which have about 2am/2pm orbital geometry), are analyzed in this study to derive the global temperature trend from 1980 to 1998. In order to remove the discontinuities between the data of the successive satellites and to obtain a continuous time series, we first used the shortest possible time record of each satellite. In this way we obtain a preliminary estimate of the global temperature trend of 0.21 K/decade. However, this estimate is affected by systematic time-dependent errors. One such error is the instrument calibration error, which can be inferred whenever overlapping measurements are made by two satellites over an extended period of time. From the available successive satellite data we have taken the longest possible time record of each satellite to form the time series during the period 1980 to 1998 and to infer this error. We find that correcting for it decreases the global temperature trend by about 0.07 K/decade. In addition, there are systematic time-dependent errors in the data introduced by the drift in the satellite orbital geometry: one arises from the diurnal cycle in temperature, and another is the drift-related change in the calibration of the MSU. In order to analyze the nature of these drift-related errors, the multi-satellite Ch 2 data set is partitioned into am and pm subsets to create two independent time series. The error can be assessed in the am and pm data of Ch 2 over land and can be eliminated. Observations made in the MSU Ch 1 (50.3 GHz) support this approach. The error is obvious only in the difference between the pm and am observations of Ch 2 over the ocean. We have followed two different paths to assess the impact of the errors on the global temperature trend. In one path the
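The inter-satellite calibration step, inferring an offset from overlapping measurements and removing it before concatenating records, can be sketched generically (toy anomaly series, not the MSU data):

```python
# Merge two overlapping satellite records by removing the mean
# inter-satellite offset estimated over their overlap period
# (a generic sketch of the bias-adjustment step, with toy anomalies).
def merge_records(rec_a, rec_b):
    """rec_a, rec_b: dicts mapping time index -> temperature anomaly (K)."""
    overlap = sorted(set(rec_a) & set(rec_b))
    offset = sum(rec_b[t] - rec_a[t] for t in overlap) / len(overlap)
    merged = dict(rec_a)
    for t, v in rec_b.items():
        merged.setdefault(t, v - offset)  # keep rec_a where both exist
    return merged, offset

a = {0: 0.00, 1: 0.02, 2: 0.05, 3: 0.04}
b = {2: 0.15, 3: 0.14, 4: 0.18, 5: 0.20}  # runs ~0.10 K warm vs. a
merged, offset = merge_records(a, b)
print(round(offset, 2), round(merged[5], 2))
```

Without this adjustment the artificial step between records would leak into the fitted trend, which is why the short-overlap and long-overlap choices in the abstract change the result by about 0.07 K/decade.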

Global warming of 2 °C above preindustrial levels has been considered to be the threshold that should not be exceeded by the global mean temperature to avoid dangerous interference with the climate system. However, this global mean target has different implications for different regions owing to the globally nonuniform characteristics of climate change. Permafrost is sensitive to climate change; moreover, it is widely distributed in high-latitude and high-altitude regions where the greatest warming is predicted. Permafrost is expected to be severely affected by even the 2 °C global warming, which, in turn, affects other systems such as water resources, ecosystems, and infrastructure. Using air and soil temperature data from ten Coupled Model Intercomparison Project Phase 5 (CMIP5) models combined with observations of frozen ground, we investigated the permafrost thaw and associated ground settlement under 2 °C global warming. Results show that the climate models produced an ensemble mean permafrost area of 14.01 × 10⁶ km², which compares reasonably well with the area of 13.89 × 10⁶ km² (north of 45°N) in the observations. The models predict that the soil temperature at 6 m depth will increase by 2.34–2.67 °C on area average relative to 1990-2000, and the increase intensifies with increasing latitude. The active layer thickness will also increase by 0.42–0.45 m but, unlike soil temperature, the increase weakens with increasing latitude due to the distinctly cooler permafrost at higher latitudes. The permafrost extent will retreat markedly northward and decrease by 24–26%, and the ground settlement owing to permafrost thaw is estimated at 3.8–15 cm on area average. Possible uncertainties in this study may be mostly attributed to the less accurate ground ice content data and the coarse horizontal resolution of the models.

Full Text Available As formaldehyde (HCHO) is a high-yield product in the oxidation of most volatile organic compounds (VOCs) emitted by fires, vegetation, and anthropogenic activities, satellite observations of HCHO are well-suited to inform us on the spatial and temporal variability of the underlying VOC sources. The long record of space-based HCHO column observations from the Ozone Monitoring Instrument (OMI) is used to infer emission flux estimates from pyrogenic and biogenic VOCs on the global scale over 2005–2013. This is realized through the method of source inverse modeling, which consists of the optimization of emissions in a chemistry-transport model (CTM) in order to minimize the discrepancy between the observed and modeled HCHO columns. The top-down fluxes are derived in the global CTM IMAGESv2 by an iterative minimization algorithm based on the full adjoint of IMAGESv2, starting from a priori emission estimates provided by the newly released GFED4s (Global Fire Emission Database, version 4s) inventory for fires, and by the MEGAN-MOHYCAN inventory for isoprene emissions. The top-down fluxes are compared to two independent inventories each for fire (GFAS and FINNv1.5) and isoprene emissions (MEGAN-MACC and GUESS-ES). The inversion indicates a moderate decrease (ca. 20 %) in the average annual global fire and isoprene emissions, from 2028 Tg C in the a priori to 1653 Tg C for burned biomass, and from 343 to 272 Tg for isoprene fluxes. Those estimates are acknowledged to depend on the accuracy of the formaldehyde data, as well as on the assumed fire emission factors and the oxidation mechanisms leading to HCHO production. Strongly decreased top-down fire fluxes (30–50 %) are inferred in the peak fire season in Africa and during years with strong a priori fluxes associated with forest fires in Amazonia (in 2005, 2007, and 2010), bushfires in Australia (in 2006 and 2011), and peat burning in Indonesia (in 2006 and 2009), whereas
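The inversion idea, adjusting emissions to minimize the observed-minus-modeled column mismatch, can be caricatured in one parameter with a linear forward model and gradient descent. The adjoint-based algorithm in IMAGESv2 does the same thing for millions of grid-cell fluxes; everything below is synthetic.

```python
# Caricature of source inverse modeling: find the emission scale f that
# minimizes sum (obs - f * column_per_unit_emission)^2 by gradient
# descent.  The forward model is linear and the data are synthetic.
per_unit = [1.0, 2.0, 1.5, 0.5]  # modeled HCHO column per unit emission
obs = [0.8, 1.6, 1.2, 0.4]       # "observed" columns (true scale: 0.8)

f = 1.0                          # a priori scale factor
lr = 0.05
for _ in range(200):
    grad = sum(-2 * m * (o - f * m) for m, o in zip(per_unit, obs))
    f -= lr * grad

print(round(f, 3))  # converges toward 0.8, a 20% decrease from the prior
```

The recovered 20% downward scaling mirrors, in miniature, the ca. 20 % global decrease the real inversion finds relative to the a priori inventories.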

Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security. Spatial information on irrigation is highly important for policy and decision makers, who are facing the transition towards more efficient, sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data - both at a spatial resolution of 30 arcsec - incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia, particularly in China and India. The additional areas are mainly identified within already-known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to the size and distribution of irrigated areas, caused by spatial resolution, the considered time period, and the input data and assumptions made.
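One branch of such a decision tree can be caricatured as a threshold rule: a pixel whose multi-temporal NDVI shows a strong growing-season signal despite low rainfed agricultural suitability is a candidate irrigated area. The rule and both thresholds are illustrative, not the study's calibrated tree.

```python
# Toy decision rule: flag a pixel as irrigated when its NDVI seasonal
# amplitude is high while rainfed agricultural suitability is low
# (thresholds illustrative, not the study's calibrated values).
def is_irrigated(ndvi_series, rainfed_suitability,
                 amp_thresh=0.3, suit_thresh=0.4):
    amplitude = max(ndvi_series) - min(ndvi_series)
    return amplitude > amp_thresh and rainfed_suitability < suit_thresh

print(is_irrigated([0.2, 0.6, 0.7, 0.3], 0.2))   # strong green-up, dry land
print(is_irrigated([0.3, 0.4, 0.35, 0.3], 0.2))  # weak green-up
```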

The measured data of global solar radiation on a horizontal surface, as well as the number of sunshine hours, mean daily ambient temperature, maximum and minimum ambient temperatures, relative humidity and amount of cloud cover, for Jeddah (latitude 21 deg. 42'37''N, longitude 39 deg. 11'12''E), Saudi Arabia, for the period 1996-2006 are analyzed. The data are divided into two sets. Sub-data set 1 (1996-2004) is employed to develop empirical correlations between the monthly average of the daily global solar radiation fraction (H/H₀) and various meteorological parameters. The nonlinear Angstroem-type model developed by Sen and the trigonometric function model proposed by Bulut and Bueyuekalaca are also evaluated. New empirical constants for these two models have been obtained for Jeddah. Sub-data set 2 (2005, 2006) is then used to evaluate the derived correlations. Comparisons between measured and calculated values of H have been performed. It is indicated that the Sen model and the Bulut and Bueyuekalaca model satisfactorily describe the horizontal global solar radiation for Jeddah. All the proposed correlations are found to be able to predict the annual average of daily global solar radiation with excellent accuracy. Therefore, the long-term performance of solar energy devices can be estimated.
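The comparison between measured and calculated H relies on the usual validation statistics; a minimal sketch with toy values (MJ/m², not the Jeddah data):

```python
import math

# Standard validation statistics for measured vs. calculated monthly
# global radiation (toy values, MJ/m^2).
def mbe(meas, calc):   # mean bias error
    return sum(c - m for m, c in zip(meas, calc)) / len(meas)

def rmse(meas, calc):  # root mean square error
    return math.sqrt(sum((c - m) ** 2 for m, c in zip(meas, calc)) / len(meas))

def mape(meas, calc):  # mean absolute percent error
    return 100.0 * sum(abs(c - m) / m for m, c in zip(meas, calc)) / len(meas)

measured = [20.0, 22.0, 25.0, 24.0]
calculated = [19.5, 22.5, 24.0, 24.5]
print(round(mbe(measured, calculated), 3),
      round(rmse(measured, calculated), 3),
      round(mape(measured, calculated), 2))
```

MBE near zero with small RMSE and MAPE is what "satisfactorily describe" amounts to quantitatively.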

Nuclear power's global expansion is projected to continue in the coming decades - albeit at a slowing pace - amid challenges including low fossil fuel prices, a sluggish world economy and the legacy of Japan's Fukushima Daiichi accident. Each year, the IAEA publishes projections of the world's nuclear power generating capacity in Energy, Electricity and Nuclear Power Estimates for the Period up to 2050, now in its 35th edition. The latest projections point to slower growth in nuclear power, in keeping with the trend since the 2011 Fukushima Daiichi accident. The world's nuclear power generating capacity is projected to expand by 2.4 percent by 2030, according to the low projections, compared with 7.7 percent estimated in 2014. In the high case, generating capacity is estimated to grow by 68 percent by 2030, versus 88 percent forecast last year. Uncertainty related to energy policy, license renewals, shutdowns and future constructions accounts for the wide range. The estimates also factor in the likely future retirement of many of the world's 438 nuclear reactors currently in operation, more than half of which are over 30 years old. Despite the need to replace scores of retiring reactors, nuclear power is still set to maintain - and possibly increase - its role in the world's low-carbon energy mix. It is important to understand that these projections, while carefully derived, are not predictions. The estimates should be viewed as very general growth trends, whose validity must be constantly subjected to critical review. (author)

Full Text Available This work presents some methods to create local maps and to estimate the position of a mobile robot, using the global appearance of omnidirectional images. We use a robot that carries an omnidirectional vision system on it. Every omnidirectional image acquired by the robot is described with only one global appearance descriptor, based on the Radon transform. In the work presented in this paper, two different possibilities have been considered. In the first one, we assume the existence of a previously built map composed of omnidirectional images that have been captured from previously known positions. The purpose in this case consists of estimating the position of the map nearest to the current position of the robot, making use of the visual information acquired by the robot from its current (unknown) position. In the second one, we assume that we have a model of the environment composed of omnidirectional images, but with no information about the locations where the images were acquired. The purpose in this case consists of building a local map and estimating the position of the robot within this map. Both methods are tested with different databases (including virtual and real images), taking into consideration changes in the position of different objects in the environment, different lighting conditions and occlusions. The results show the effectiveness and the robustness of both methods.
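The first mode, finding the nearest map position from the current image, reduces to nearest-neighbor retrieval over global-appearance descriptors. A minimal sketch using plain Euclidean distance over short toy descriptors (the actual system describes each image with a Radon-transform descriptor and a distance suited to it):

```python
# Nearest-node localization by global-appearance descriptors: compare
# the current image's descriptor with every map descriptor and return
# the closest node (Euclidean distance on toy vectors; the real system
# uses Radon-transform descriptors).
def nearest_node(current, map_descriptors):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(map_descriptors,
               key=lambda node: dist(current, map_descriptors[node]))

map_desc = {"A": [0.1, 0.9, 0.3], "B": [0.8, 0.2, 0.5], "C": [0.4, 0.4, 0.4]}
print(nearest_node([0.75, 0.25, 0.5], map_desc))  # closest map node
```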

Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg are served as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the higher accuracies are achieved by models (5), using Tmax-Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m², 2.0716 MJ/m², and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.

Full Text Available Fungal diseases kill more than 1.5 million people and affect over a billion, yet they remain a neglected topic for public health authorities even though most deaths from fungal diseases are avoidable. Serious fungal infections occur as a consequence of other health problems including asthma, AIDS, cancer, organ transplantation and corticosteroid therapies. Early accurate diagnosis allows prompt antifungal therapy; however, this is often delayed or unavailable, leading to death, serious chronic illness or blindness. Recent global estimates have found 3,000,000 cases of chronic pulmonary aspergillosis, ~223,100 cases of cryptococcal meningitis complicating HIV/AIDS, ~700,000 cases of invasive candidiasis, ~500,000 cases of Pneumocystis jirovecii pneumonia, ~250,000 cases of invasive aspergillosis, ~100,000 cases of disseminated histoplasmosis, over 10,000,000 cases of fungal asthma and ~1,000,000 cases of fungal keratitis occurring annually. Since 2013, the Leading International Fungal Education (LIFE) portal has facilitated the estimation of the burden of serious fungal infections country by country for over 5.7 billion people (>80% of the world’s population). These studies have shown differences in the global burden between countries, within regions of the same country and between at-risk populations. Here we interrogate the accuracy of these fungal infection burden estimates in the 43 published papers within the LIFE initiative.

Full Text Available In this work we present a topological map-building and localization system for mobile robots based on the global appearance of visual information. We include a comparison and analysis of global-appearance techniques applied to wide-angle scenes in retrieval tasks. Next, we define a multiscale analysis, which permits improving the association between images and extracting topological distances. Then, a topological map-building algorithm is proposed. At first, the algorithm has information only about some isolated positions of the navigation area, in the form of nodes. Each node is composed of a collection of images that covers the complete field of view from a certain position. The algorithm solves the node retrieval and estimates their spatial arrangement. With these aims, it uses the visual information captured along some routes that cover the navigation area. As a result, the algorithm builds a graph that reflects the distribution and adjacency relations between nodes (the map). After the map building, we also propose a route path estimation system. This algorithm takes advantage of the multiscale analysis. The accuracy of the pose estimation is not limited to the node locations but extends to intermediate positions between them. The algorithms have been tested using two different databases captured in real indoor environments under dynamic conditions.

Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake of 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high death tolls, casualties and numbers of survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished, with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths, with a further 11,000 people with serious or moderate injuries, and 100,000 people were left homeless in this mountainous region of China. In such events relief efforts can benefit significantly from the availability of rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for the different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect on casualties of the very different types of buildings (by climatic zone, urban or rural location, culture, income level, etc.). The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.
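The semi-empirical chain, building stock by class, damage rate per class at a given intensity, then fatality rate given damage, can be sketched as follows; every rate below is hypothetical, not a CEQID-fitted parameter.

```python
# Semi-empirical casualty sketch: fatalities = sum over building classes
# of (count in stock) x (damage rate at the given intensity)
# x (fatality rate given that damage) x (occupants per building).
# All rates and counts here are illustrative placeholders.
building_stock = {"adobe": 1000, "rc_frame": 500, "timber": 800}
damage_rate = {"adobe": 0.30, "rc_frame": 0.05, "timber": 0.02}
fatality_rate = {"adobe": 0.10, "rc_frame": 0.08, "timber": 0.01}

def expected_fatalities(stock, dmg, fat, occupants_per_building=4.0):
    return sum(stock[c] * dmg[c] * fat[c] * occupants_per_building
               for c in stock)

print(round(expected_fatalities(building_stock, damage_rate,
                                fatality_rate), 1))
```

The class-by-class structure is what lets the model transfer between regions: the building-stock mix changes with climate, urbanization and income, while the per-class rates are reused.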

Full Text Available The Fire INventory from NCAR version 1.0 (FINNv1) provides daily, 1 km resolution, global estimates of the trace gas and particle emissions from open burning of biomass, which includes wildfire, agricultural fires, and prescribed burning, and does not include biofuel use and trash burning. Emission factors used in the calculations have been updated with recent data, particularly for the non-methane organic compounds (NMOC). The resulting global annual NMOC emission estimates are as much as a factor of 5 greater than some prior estimates. Chemical speciation profiles, necessary to allocate the total NMOC emission estimates to lumped species for use by chemical transport models, are provided for three widely used chemical mechanisms: SAPRC99, GEOS-CHEM, and MOZART-4. Using these profiles, FINNv1 also provides global estimates of key organic compounds, including formaldehyde and methanol. Uncertainties in the emission estimates arise from several of the method steps. The use of fire hot spots, assumed area burned, land cover maps, biomass consumption estimates, and emission factors all introduce error into the model estimates. The uncertainty in the FINNv1 emission estimates is about a factor of two; however, the global estimates agree reasonably well with other global inventories of biomass burning emissions for CO, CO2, and other species with less variable emission factors. FINNv1 emission estimates have been developed specifically for modeling atmospheric chemistry and air quality in a consistent framework at scales from local to global. The product is unique because of its high temporal and spatial resolution, global coverage, and the number of species estimated. FINNv1 can be used for both hindcast and forecast or near-real-time model applications, and the results are being critically evaluated with models and observations whenever possible.
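Fire emission inventories of this kind generally follow the bottom-up form E = A × B × EF: area burned times biomass consumed per unit area times an emission factor per species. A minimal sketch with illustrative numbers (not FINNv1 values):

```python
# Bottom-up fire emission sketch, E = A x B x EF:
# area burned (km^2) x biomass consumed (kg dry matter per km^2)
# x emission factor (g species per kg dry matter), converted to kg.
# The numbers below are illustrative, not FINNv1 values.
def fire_emission_kg(area_km2, biomass_kg_per_km2, ef_g_per_kg):
    return area_km2 * biomass_kg_per_km2 * ef_g_per_kg / 1000.0

# One hypothetical 1 km^2 fire pixel, 3.0e5 kg/km^2 consumed, CO EF 65 g/kg
print(fire_emission_kg(1.0, 3.0e5, 65.0))  # kg CO from this pixel
```

Each input in this product corresponds to an error source listed in the abstract: hot-spot detection and assumed area burned feed A, land cover maps and consumption estimates feed B, and the updated emission factors feed EF, which is why the combined uncertainty is around a factor of two.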

Full Text Available This paper proposes a segmentation-based global optimization method for depth estimation. Firstly, to obtain accurate matching costs, the original local stereo matching approach based on a self-adapting matching window is integrated with two matching cost optimization strategies aimed at handling both borders and occlusion regions. Secondly, we employ a comprehensive smoothness term to satisfy the diverse smoothness requirements of real scenes. Thirdly, a selective segmentation term is used to enforce the plane trend constraints selectively on the corresponding segments, to further improve the accuracy of the depth results at the object level. Experiments on the Middlebury image pairs show that the proposed global optimization approach is competitive with other state-of-the-art matching approaches.

Summary: Accurate estimation of reference evapotranspiration is important for irrigation scheduling, water resources management and planning, and other agricultural water management issues. In the present paper, the capabilities of generalized neuro-fuzzy (GNF) models were evaluated for estimating reference evapotranspiration using two separate sets of weather data from humid and non-humid regions of Spain and Iran. In this way, the data from some weather stations in the Basque Country and the Valencia region (Spain) were used for training the neuro-fuzzy models (in humid and non-humid regions, respectively), and subsequently the data from these regions were pooled to evaluate the generalization capability of a general neuro-fuzzy model across humid and non-humid regions. The developed models were tested at stations in Iran located in humid and non-humid regions. The obtained results showed the capability of the generalized neuro-fuzzy model for estimating reference evapotranspiration in different climatic zones. Global GNF models calibrated using both non-humid and humid data were found to successfully estimate ET0 in both non-humid and humid regions of Iran (the lowest MAE values are about 0.23 mm for non-humid Iranian regions and 0.12 mm for humid regions). Non-humid GNF models calibrated using non-humid data performed much better than the humid GNF models calibrated using humid data in the non-humid region, while the humid GNF model gave better estimates in the humid region.

Full Text Available The World Health Organization initiative to eliminate mother-to-child transmission of syphilis aims for ≥ 90% of pregnant women to be tested for syphilis and ≥ 90% to receive treatment by 2015. We calculated global and regional estimates of syphilis in pregnancy and associated adverse outcomes for 2008, as well as antenatal care (ANC) coverage for women with syphilis. Estimates were based upon a health service delivery model. National syphilis seropositivity data from 97 of 193 countries and ANC coverage from 147 countries were obtained from World Health Organization databases. Proportions of adverse outcomes and effectiveness of screening and treatment were from published literature. Regional estimates of ANC syphilis testing and treatment were examined through sensitivity analysis. In 2008, approximately 1.36 million (range: 1.16 to 1.56 million) pregnant women globally were estimated to have probable active syphilis; of these, 80% had attended ANC. Globally, 520,905 (best case: 425,847; worst case: 615,963) adverse outcomes were estimated to be caused by maternal syphilis, including approximately 212,327 (174,938; 249,716) stillbirths (>28 wk) or early fetal deaths (22 to 28 wk), 91,764 (76,141; 107,397) neonatal deaths, 65,267 (56,929; 73,605) preterm or low birth weight infants, and 151,547 (117,848; 185,245) infected newborns. Approximately 66% of adverse outcomes occurred in ANC attendees who were not tested or were not treated for syphilis. In 2008, based on the middle case scenario, clinical services likely averted 26% of all adverse outcomes. Limitations include missing syphilis seropositivity data for many countries in Europe, the Mediterranean, and North America, and the use of estimates for the proportion of syphilis that was "probable active," and for testing and treatment coverage. Syphilis continues to affect large numbers of pregnant women, causing substantial perinatal morbidity and mortality that could be prevented by early testing and treatment.
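A health service delivery model of this kind can be sketched as a coverage cascade: of the infected pregnancies, only those reaching ANC and actually being tested and treated have their risk reduced. The 1.36 million infected pregnancies and 80% ANC attendance below come from the abstract; the testing/treatment coverage, untreated outcome rate and treatment effectiveness are purely illustrative.

```python
# Cascade sketch of the service-delivery model: adverse outcomes come
# from women never tested or treated, plus a small residual among
# treated women.  Coverage and risk values below are illustrative.
def adverse_outcomes(n_infected, anc_coverage, test_treat_coverage,
                     outcome_rate_untreated, treatment_effectiveness=0.97):
    treated = n_infected * anc_coverage * test_treat_coverage
    untreated = n_infected - treated
    return (untreated * outcome_rate_untreated
            + treated * outcome_rate_untreated * (1 - treatment_effectiveness))

# 1.36 million infected pregnancies, 80% ANC attendance (from the
# abstract); 50% tested and treated, 52% untreated risk (illustrative).
print(round(adverse_outcomes(1_360_000, 0.80, 0.50, 0.52)))
```

Varying the coverage terms is exactly the sensitivity analysis the abstract describes, and it reproduces the qualitative finding that most adverse outcomes occur among ANC attendees who were not tested or treated.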

Background The World Health Organization initiative to eliminate mother-to-child transmission of syphilis aims for ≥90% of pregnant women to be tested for syphilis and ≥90% to receive treatment by 2015. We calculated global and regional estimates of syphilis in pregnancy and associated adverse outcomes for 2008, as well as antenatal care (ANC) coverage for women with syphilis. Methods and Findings Estimates were based upon a health service delivery model. National syphilis seropositivity data from 97 of 193 countries and ANC coverage from 147 countries were obtained from World Health Organization databases. Proportions of adverse outcomes and effectiveness of screening and treatment were from published literature. Regional estimates of ANC syphilis testing and treatment were examined through sensitivity analysis. In 2008, approximately 1.36 million (range: 1.16 to 1.56 million) pregnant women globally were estimated to have probable active syphilis; of these, 80% had attended ANC. Globally, 520,905 (best case: 425,847; worst case: 615,963) adverse outcomes were estimated to be caused by maternal syphilis, including approximately 212,327 (174,938; 249,716) stillbirths (>28 wk) or early fetal deaths (22 to 28 wk), 91,764 (76,141; 107,397) neonatal deaths, 65,267 (56,929; 73,605) preterm or low birth weight infants, and 151,547 (117,848; 185,245) infected newborns. Approximately 66% of adverse outcomes occurred in ANC attendees who were not tested or were not treated for syphilis. In 2008, based on the middle case scenario, clinical services likely averted 26% of all adverse outcomes. Limitations include missing syphilis seropositivity data for many countries in Europe, the Mediterranean, and North America, and use of estimates for the proportion of syphilis that was “probable active,” and for testing and treatment coverage. Conclusions Syphilis continues to affect large numbers of pregnant women, causing substantial perinatal morbidity and mortality that could be prevented by early testing and treatment.

Accurate and real-time precipitation estimation, which is essential to understanding the global hydrological cycle, is a challenging task for current and future spaceborne measurements. Recently, the Global Precipitation Measurement (GPM) satellites were launched as a next-generation rainfall mission for observing global precipitation characteristics. The purpose of the GPM is to enhance the spatiotemporal resolution of global precipitation. The main objective of the present study is to assess the rainfall products from the GPM, especially the Integrated Multi-satellitE Retrievals for GPM (IMERG) data, by comparing them with ground-based observations. Multitemporal-scale evaluations of rainfall involving subdaily, diurnal, monthly, and seasonal scales were performed over the Indian subcontinent. The comparison shows that IMERG performed better than the Tropical Rainfall Measuring Mission (TRMM)-3B42, although both rainfall products underestimated the observed rainfall compared to the ground-based measurements. The analyses also reveal that the TRMM-3B42 and IMERG data sets are able to represent the large-scale monsoon rainfall spatial features but have region-specific biases. IMERG shows significant improvement in low rainfall estimates compared to the TRMM-3B42 for selected regions. In the spatial distribution, IMERG shows higher rain rates compared to the TRMM-3B42, due to its enhanced spatial and temporal resolutions. Apart from this, the characteristics of raindrop size distribution (DSD) obtained from the GPM mission dual-frequency precipitation radar are assessed over a complex mountain terrain site in the Western Ghats, India, using the DSD measured by a Joss-Waldvogel disdrometer.

Over the last 5 decades, monitoring systems have been developed to detect changes in the accumulation of carbon (C) in the atmosphere and ocean; however, our ability to detect changes in the behavior of the global C cycle is still hindered by measurement and estimate errors. Here we present a rigorous and flexible framework for assessing the temporal and spatial components of estimate errors and their impact on uncertainty in net C uptake by the biosphere. We present a novel approach for incorporating temporally correlated random error into the error structure of emission estimates. Based on this approach, we conclude that the 2σ uncertainties of the atmospheric growth rate have decreased from 1.2 Pg C yr⁻¹ in the 1960s to 0.3 Pg C yr⁻¹ in the 2000s due to an expansion of the atmospheric observation network. The 2σ uncertainties in fossil fuel emissions have increased from 0.3 Pg C yr⁻¹ in the 1960s to almost 1.0 Pg C yr⁻¹ during the 2000s due to differences in national reporting errors and differences in energy inventories. Lastly, while land use emissions have remained fairly constant, their errors still remain high and thus their contribution to global C uptake uncertainty is not trivial. Currently, the absolute errors in fossil fuel emissions rival the total emissions from land use, highlighting the extent to which fossil fuels dominate the global C budget. Because errors in the atmospheric growth rate have decreased faster than errors in total emissions have increased, a ~20% reduction in the overall uncertainty of net global C uptake has occurred. Given all the major sources of error in the global C budget that we could identify, we are 93% confident that terrestrial C uptake has increased and 97% confident that ocean C uptake has increased over the last 5 decades. Thus, it is clear that arguably one of the most vital ecosystem services currently provided by the biosphere is the continued removal of approximately half of atmospheric CO2 emissions from the atmosphere.
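Combining the independent 2σ error terms above in quadrature can be sketched as follows. The growth-rate (0.3 Pg C yr⁻¹) and fossil fuel (1.0 Pg C yr⁻¹) values for the 2000s come from the abstract; the land-use error term is a placeholder assumption for illustration only, and the function name is ours.

```python
import math

def combined_2sigma(*sigmas):
    """Combine independent 2-sigma error terms in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

# 2000s values from the abstract; the land-use error (0.5 Pg C/yr)
# is an illustrative placeholder, not a figure from the study.
growth_err, fossil_err, land_use_err = 0.3, 1.0, 0.5
total = combined_2sigma(growth_err, fossil_err, land_use_err)
```

Because the terms add in quadrature, the largest term (here fossil fuel emissions) dominates the combined uncertainty, which is why shrinking the atmospheric growth-rate error alone still reduced the overall budget uncertainty by only ~20%.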

A methodology has been developed to determine the impacts of the ISO 50001 Energy Management System (EnMS) at a region or country level. The impacts of ISO 50001 EnMS include energy, CO2 emissions, and cost savings. This internationally recognized and transparent methodology has been embodied in a user-friendly Microsoft Excel® based tool called the ISO 50001 Impact Estimator Tool (IET 50001). However, the quality of the tool inputs is critical to obtaining accurate and defensible results. This report documents the data sources used and assumptions made to calculate the global impact of ISO 50001 EnMS.

In this paper, we consider two linear plate models, namely the Reissner–Mindlin system (R–M) and the Kirchhoff–Love equation (K–L), which come from linear elasticity. We prove global Carleman inequalities for both models with boundary observations and under a suitable hypothesis on the parameters. We use these estimates to study the inverse problem of recovering a spatially dependent potential from knowledge of Neumann boundary data. We obtain L²-Lipschitz stability for K–L and H¹-Lipschitz stability for R–M under the assumption that the potentials are equal at the boundary.

The short lifetime and heterogeneous distribution of Black Carbon (BC) in the atmosphere lead to complex impacts on radiative forcing, climate, and health, and complicate analysis of its atmospheric processing and emissions. Two recent papers have estimated the global and regional emissions of BC using advanced statistical and computational methods. One used a Kalman Filter, including data from AERONET, NOAA, and other ground-based sources, to estimate global emissions of 17.8±5.6 Tg BC/year (with the increase attributable to East Asia, South Asia, Southeast Asia, and Eastern Europe, all regions that have had rapid urban, industrial, and economic expansion). The second additionally used remotely sensed measurements from MISR and a variance-maximizing technique, uniquely quantifying fire and urban sources in Southeast Asia, as well as their large year-to-year variability over the past 12 years, leading to increases from 10% to 150%. These new emissions products, when run through our state-of-the-art modelling system of chemistry, physics, transport, removal, radiation, and climate, match 140 ground stations and satellites better in both an absolute and a temporal sense. New work now further includes trace species measurements from OMI, which are used with the variance-maximizing technique to constrain the types of emissions sources. Furthermore, land-use change and fire estimation products from MODIS are also included, which provide other constraints on the temporal and spatial nature of the variations of intermittent sources like fires or new permanent sources like expanded urbanization. This talk will introduce a new, top-down constrained, weekly varying BC emissions dataset, show that it produces a better fit with observations, and draw conclusions about the sources and impacts of urbanization on the one hand and fires on the other. Results specific to Southeast and East Asia will demonstrate inter- and intra-annual variations, such as the function of

Motivated by the question of whether recent indications of decadal climate variability and a possible "climate shift" may have affected the global water balance, we examine evaporation minus precipitation (E-P) variability integrated over the global oceans and global land from three points of view: remotely sensed retrievals and objective analyses over the oceans, reanalysis vertically integrated moisture convergence (MFC) over land, and land surface models forced with observation-based precipitation, radiation and near-surface meteorology. Because monthly variations in area-averaged atmospheric moisture storage are small and the global integral of moisture convergence must approach zero, area-integrated E-P over ocean should essentially equal precipitation minus evapotranspiration (P-ET) over land (after adjusting for ocean and land areas). Our analysis reveals considerable uncertainty in the decadal variations of ocean evaporation when integrated to global scales. This is due to differences among datasets in 10 m wind speed and near-surface atmospheric specific humidity (2 m qa) used in bulk aerodynamic retrievals. Precipitation variations, all relying substantially on passive microwave retrievals over ocean, still have uncertainties in decadal variability, but not to the degree present with ocean evaporation estimates. Reanalysis MFC and P-ET over land from several observationally forced diagnostic and land surface models agree best on interannual variations. However, upward MFC (i.e., P-ET) reanalysis trends are likely related in part to observing system changes affecting atmospheric assimilation models. While some evidence for a low-frequency E-P maximum near 2000 is found, consistent with a recent apparent pause in sea-surface temperature (SST) rise, uncertainties in the datasets used here remain significant. Prospects for further reducing uncertainties are discussed. The results are interpreted in the context of recent climate variability (Pacific Decadal Oscillation).
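The area-adjustment step in the closure constraint above (ocean-integrated E-P must balance land-integrated P-ET) reduces to a simple area-weighting, sketched below. The surface areas are approximate global values and the 0.3 mm/day flux is illustrative, not a figure from the study.

```python
# Approximate ocean and land surface areas in 1e6 km^2.
A_OCEAN, A_LAND = 361.0, 149.0

def land_equivalent(e_minus_p_ocean):
    """Convert an ocean-average E-P flux (mm/day) into the land-average
    P-ET flux (mm/day) that must balance it, assuming negligible change
    in area-averaged atmospheric moisture storage."""
    return e_minus_p_ocean * A_OCEAN / A_LAND

# An ocean net loss of 0.3 mm/day implies a land-average moisture
# surplus roughly 2.4x larger, because the ocean area is that much bigger.
surplus = land_equivalent(0.3)
```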

This report contains the fiscal year budget justifications to Congress. The budget estimates for salaries and expenses for fiscal year 1984 to 1985 provide for obligations of $466,800,000, to be funded in total by a new appropriation.

Full Text Available Satellite-based rainfall estimates over land have great potential for a wide range of applications, but their validation is challenging due to the scarcity of ground-based observations of rainfall in many areas of the planet. Recent studies have suggested the use of triple collocation (TC) to characterize uncertainties associated with rainfall estimates by using three collocated rainfall products. However, TC requires the simultaneous availability of three products with mutually uncorrelated errors, a requirement which is difficult to satisfy with current global precipitation data sets. In this study, a recently developed method for rainfall estimation from soil moisture observations, SM2RAIN, is demonstrated to facilitate the accurate application of TC within triplets containing two state-of-the-art satellite rainfall estimates and a reanalysis product. The validity of different TC assumptions is indirectly tested via a high-quality ground rainfall product over the contiguous United States (CONUS), showing that SM2RAIN can provide a truly independent source of rainfall accumulation information which uniquely satisfies the assumptions underlying TC. On this basis, TC is applied with SM2RAIN on a global scale in an optimal configuration to calculate, for the first time, reliable global correlations (vs. an unknown truth) of the aforementioned products without using a ground benchmark data set. The analysis is carried out during the period 2007–2012 using daily rainfall accumulation products obtained at 1° × 1° spatial resolution. Results convey the relatively high performance of the satellite rainfall estimates in eastern North and South America, southern Africa, southern and eastern Asia, eastern Australia, and southern Europe, as well as complementary performances between the reanalysis product and SM2RAIN, with the first performing reasonably well in the Northern Hemisphere and the second providing very good performance in the Southern Hemisphere.
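A minimal sketch of the classical TC estimator may help illustrate the technique: given three collocated products with mutually uncorrelated, additive errors (and biases already removed), the error variances follow directly from the sample cross-covariances. The synthetic data below are illustrative, not from the study.

```python
import numpy as np

def tc_error_std(x, y, z):
    """Classical triple collocation: error standard deviations of three
    collocated products, assuming additive, mutually uncorrelated errors
    that are also uncorrelated with the true signal."""
    C = np.cov(np.vstack([x, y, z]))
    var_x = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    var_y = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    var_z = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return tuple(np.sqrt(max(v, 0.0)) for v in (var_x, var_y, var_z))

# Synthetic "truth" plus independent noise; the recovered error standard
# deviations should approach the generating values 0.5, 0.8, and 1.2.
rng = np.random.default_rng(0)
t = rng.gamma(2.0, 2.0, 100_000)   # stand-in skewed rainfall-like signal
sx, sy, sz = tc_error_std(t + rng.normal(0, 0.5, t.size),
                          t + rng.normal(0, 0.8, t.size),
                          t + rng.normal(0, 1.2, t.size))
```

Note that only error variances, not the correlations against an unknown truth reported in the study, come out of this basic form; the correlation-based extension follows from the same covariance algebra.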

In this study, an artificial neural network (ANN) model was used to estimate monthly average global solar radiation on a horizontal surface for 5 selected locations in the Mediterranean region for a period of 18 years (1993-2010). Meteorological and geographical data were taken from the Turkish State Meteorological Service. The ANN architecture designed is a feed-forward back-propagation model with one hidden layer containing 21 neurons with the hyperbolic tangent sigmoid as the transfer function and one output layer utilizing a linear transfer function (purelin). The training algorithm used in the ANN model was the Levenberg-Marquardt back-propagation algorithm (trainlm). Results obtained from the ANN model were compared with measured meteorological values by using statistical methods. A correlation coefficient of 97.97% (~98%) was obtained with a root mean square error (RMSE) of 0.852 MJ/m², a mean square error (MSE) of 0.725 MJ/m², a mean absolute bias error (MABE) of 10.659 MJ/m², and a mean absolute percentage error (MAPE) of 4.8%. Results show good agreement between the estimated and measured values of global solar radiation. We suggest that the developed ANN model can be used to predict solar radiation for other locations and conditions.
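The agreement statistics quoted above (RMSE, bias error, MAPE, correlation) can be computed with a small helper like the following sketch; the function name and structure are ours, not from the paper.

```python
import numpy as np

def validation_stats(measured, estimated):
    """Common agreement statistics for validating radiation estimates
    against measurements (same units for both inputs)."""
    m = np.asarray(measured, dtype=float)
    e = np.asarray(estimated, dtype=float)
    rmse = np.sqrt(np.mean((e - m) ** 2))          # root mean square error
    mbe = np.mean(e - m)                           # mean bias error
    mape = 100.0 * np.mean(np.abs((e - m) / m))    # mean absolute % error
    r = np.corrcoef(m, e)[0, 1]                    # correlation coefficient
    return {"RMSE": rmse, "MBE": mbe, "MAPE": mape, "r": r}
```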

Full Text Available ABSTRACT CONTEXT AND OBJECTIVE: Noncommunicable diseases (NCDs) are the leading health problem globally and generate high numbers of premature deaths and loss of quality of life. The aim here was to describe the major groups of causes of death due to NCDs and the ranking of the leading causes of premature death between 1990 and 2015, according to the Global Burden of Disease (GBD) 2015 study estimates for Brazil. DESIGN AND SETTING: Cross-sectional study covering Brazil and its 27 federal states. METHODS: This was a descriptive study on rates of mortality due to NCDs, with corrections for garbage codes and underreporting of deaths. RESULTS: This study shows the epidemiological transition in Brazil between 1990 and 2015, with increasing proportional mortality due to NCDs, followed by violence, and decreasing mortality due to communicable, maternal and neonatal causes within the global burden of diseases. NCDs had the highest mortality rates over the whole period, but with reductions in cardiovascular diseases, chronic respiratory diseases and cancer. Diabetes increased over this period. NCDs were the leading causes of premature death (30 to 69 years): ischemic heart diseases and cerebrovascular diseases, followed by interpersonal violence, traffic injuries and HIV/AIDS. CONCLUSION: The decline in mortality due to NCDs confirms that improvements in disease control have been achieved in Brazil. Nonetheless, the high mortality due to violence is a warning sign. By maintaining the current decline in NCDs, Brazil should meet the target of a 25% reduction proposed by the World Health Organization by 2025.

Extreme hydrological events cause the greatest impacts of any natural hazard globally, affecting a wide range of sectors including, most prominently, agriculture, food security and water availability and quality, but also energy production, forestry, health, transportation and fisheries. Understanding how floods and droughts intersect, and how they have changed in the past, provides the basis for understanding current risk and how it may change in the future. This requires an understanding of the mechanisms associated with events and therefore their predictability, attribution of long-term changes in risk, and quantification of projections of changes in the future. Of key importance are long-term records of relevant variables so that risk can be quantified more accurately, given the growing acknowledgement that risk is not stationary under long-term climate variability and climate change. To address this, we develop a catalogue of drought and flood events based on land surface and hydrodynamic modeling, forced by a hybrid meteorological dataset that draws from the continuity and coverage of reanalysis and satellite datasets, merged with global gauge databases. The meteorological dataset is corrected for temporal inhomogeneities, spurious trends and variable inter-dependencies to ensure long-term consistency, as well as realistic representation of short-term variability and extremes. The VIC land surface model is run for the past 100 years at 0.25-degree resolution for global land areas. The VIC runoff is then used to drive the CaMa-Flood hydrodynamic model to obtain information on flood inundation risk. The model outputs are compared to satellite-based estimates of flood and drought conditions and the observational flood record. The data are analyzed in terms of the spatio-temporal characteristics of large-scale flood and drought events with a particular focus on characterizing the long-term variability in risk. Significant changes in risk occur on multi-decadal timescales.

A dynamic linear compartment model of the global iodine cycle has been developed for the purpose of estimating long-term doses and dose commitments to the world population from releases of ¹²⁹I to the environment. The environmental compartments assumed in the model comprise the atmosphere, hydrosphere, lithosphere, and terrestrial biosphere. The global transport of iodine is described by means of time-invariant fractional transfer rates between the environmental compartments. The fractional transfer rates for ¹²⁹I are determined primarily from available data on compartment inventories and fluxes for naturally occurring stable iodine and from data on the global hydrologic cycle. The dose to the world population is estimated from the calculated compartment inventories of ¹²⁹I, the known compartment inventories of stable iodine, a pathway analysis of the intake of iodine by a reference individual, dose conversion factors for inhalation and ingestion, and an estimate of the world population. For an assumed constant population of 12.21 billion beyond the year 2075, the estimated population dose commitment is 2 × 10⁵ man-rem/Ci. The sensitivity of the calculated doses to variations in some of the parameters in the model for the global iodine cycle is investigated. A computer code written to calculate global compartment inventories, dose rates and population doses is described and documented.
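A linear compartment model with time-invariant fractional transfer rates is a first-order system dx/dt = K·x + s, which can be sketched with a simple explicit integrator. The two-box example and rate constants below are purely illustrative assumptions, not the report's actual compartments or values.

```python
import numpy as np

def step(x, K, source, dt):
    """One forward-Euler step of dx/dt = K @ x + source for a linear
    compartment model with time-invariant fractional transfer rates."""
    return x + dt * (K @ x + source)

# Hypothetical closed 2-box system (atmosphere, hydrosphere); rates in 1/yr.
k_ah, k_ha = 0.5, 0.1                  # atmosphere->hydro, hydro->atmosphere
K = np.array([[-k_ah,  k_ha],
              [ k_ah, -k_ha]])
x = np.array([1.0, 0.0])               # initial inventory, all in box 0
for _ in range(10_000):                # integrate 100 yr at dt = 0.01 yr
    x = step(x, K, np.zeros(2), 0.01)
# A closed system conserves total inventory and relaxes toward the steady
# state set by the rate ratio: x -> [k_ha, k_ah] / (k_ah + k_ha)
```

Because the columns of K sum to zero, every step conserves the total inventory exactly, which is the discrete analogue of mass balance in the compartment model.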

The question of estimating the upper limit of ‖B‖₂, which is a key step in some recently reported global robust stability criteria for delayed neural networks, is revisited (B denotes the delayed connection weight matrix). Recently, Cao, Huang, and Qu have given an estimate of the upper limit of ‖B‖₂. In the present paper, an alternative estimate of the upper limit of ‖B‖₂ is highlighted. It is shown that the alternative estimate may yield some new global robust stability results.
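The abstract does not reproduce either estimate, but a standard upper bound of this kind is ‖B‖₂ ≤ √(‖B‖₁·‖B‖∞), which is cheap to evaluate compared to the exact spectral norm. The sketch below (our illustration, not necessarily the paper's estimate) checks the bound against the largest singular value for a sample matrix.

```python
import numpy as np

def two_norm_bound(B):
    """Standard upper estimate for the spectral norm:
    ||B||_2 <= sqrt(||B||_1 * ||B||_inf)."""
    B = np.asarray(B, dtype=float)
    return np.sqrt(np.linalg.norm(B, 1) * np.linalg.norm(B, np.inf))

B = np.array([[1.0, -2.0],
              [0.5,  3.0]])
exact = np.linalg.norm(B, 2)   # largest singular value
bound = two_norm_bound(B)      # always >= exact
```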

The impact of a potential global temperature rise of 2 °C on tourism is examined within the framework of the IMPACT2C FP7 project. The period of the specific increase was defined according to the global mean temperature projections from two GCMs, BCM and HadCM3Q3. Simulations from two RCMs, driven by the aforementioned GCMs in the frame of the ENSEMBLES FP6 project under the A1B emission scenario, were used to estimate the Tourism Climatic Index (TCI), which is a measure of climate favorability for outdoor leisure and recreational activities. Climate favorability related to summer tourism is expected to increase in most European countries moving from south to north. In contrast, countries that traditionally attract "sun and sand" tourists, like Italy, Spain, Greece, France, Portugal and Cyprus, are projected to become uncomfortably hot during the months of the peak summer season. Both of the examined models provide consistent information about the direction of change; however, SMHI shows a greater change in future TCI. The TCI between 1960 and 2000 was associated with bednights data to reveal the correlation of the empirical index with a real tourism indicator. The resulting correlation function was then applied to the 2 °C period, estimating the effect of the specific temperature rise on future tourism activity expressed in terms of projected bednights.

Global mean sea surface temperature (T¯) is a variable of primary interest in studies of climate variability and change. The temporal evolution of T¯ can be influenced by surface heat fluxes (F¯) and by diffusion (D¯) and advection (A¯) processes internal to the ocean, but quantifying the contribution of these different factors from data alone is prone to substantial uncertainties. Here we derive a closed T¯ budget for the period 1993-2015 based on a global ocean state estimate, which is an exact solution of a general circulation model constrained to most extant ocean observations through advanced optimization methods. The estimated average temperature of the top (10-m thick) level in the model, taken to represent T¯, shows relatively small variability at most time scales compared to F¯, D¯, or A¯, reflecting the tendency for largely balancing effects from all the latter terms. The seasonal cycle in T¯ is mostly determined by small imbalances between F¯ and D¯, with negligible contributions from A¯. While D¯ seems to simply damp F¯ at the annual period, a different dynamical role for D¯ at semiannual period is suggested by it being larger than F¯. At periods longer than annual, A¯ contributes importantly to T¯ variability, pointing to the direct influence of the variable ocean circulation on T¯ and mean surface climate.

With their origins in numerical weather prediction and climate modeling, land surface models aim to accurately partition the surface energy balance. An overlooked challenge in these schemes is the role of model parameter uncertainty, particularly at unmonitored sites. This study provides global parameter estimates for the Noah land surface model using 85 eddy covariance sites in the global FLUXNET network. The at-site parameters are first calibrated using a Latin Hypercube-based ensemble of the most sensitive parameters, determined by the Sobol method, to be the minimum stomatal resistance (rs,min), the Zilitinkevich empirical constant (Czil), and the bare soil evaporation exponent (fxexp). Calibration leads to an increase in the mean Kling-Gupta Efficiency performance metric from 0.54 to 0.71. These calibrated parameter sets are then related to local environmental characteristics using the Extra-Trees machine learning algorithm. The fitted Extra-Trees model is used to map the optimal parameter sets over the globe at a 5 km spatial resolution. The leave-one-out cross validation of the mapped parameters using the Noah land surface model suggests that there is the potential to skillfully relate calibrated model parameter sets to local environmental characteristics. The results demonstrate the potential to use FLUXNET to tune the parameterizations of surface fluxes in land surface models and to provide improved parameter estimates over the globe.
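The Kling-Gupta Efficiency used as the calibration metric above decomposes model skill into correlation, variability, and bias components; a sketch of the 2009 formulation (a perfect simulation scores 1) follows.

```python
import numpy as np

def kge(obs, sim):
    """Kling-Gupta Efficiency (2009 form):
    KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    r = np.corrcoef(obs, sim)[0, 1]        # linear correlation
    alpha = np.std(sim) / np.std(obs)      # variability ratio
    beta = np.mean(sim) / np.mean(obs)     # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

Because the three terms are penalized jointly, a simulation can only reach the reported post-calibration mean of 0.71 by doing reasonably well on timing, amplitude, and bias at once, which is why KGE is preferred over correlation alone for flux calibration.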

Visual acuity, a forced-choice psychophysical measure of visual spatial resolution, is the sine qua non of clinical visual impairment testing in ophthalmology and optometry patients with visual system disorders ranging from refractive error to retinal, optic nerve, or central visual system pathology. Visual acuity measures are standardized against a norm, but it is well known that visual acuity depends on a variety of stimulus parameters, including contrast and exposure duration. This paper asks if it is possible to estimate a single global visual state measure from visual acuity measures as a function of stimulus parameters that can represent the patient's overall visual health state with a single variable. Psychophysical theory (at the sensory level) and psychometric theory (at the decision level) are merged to identify the conditions that must be satisfied to derive a global visual state measure from parameterised visual acuity measures. A global visual state measurement model is developed and tested with forced-choice visual acuity measures from 116 subjects with no visual impairments and 560 subjects with uncorrected refractive error. The results are in agreement with the expectations of the model.

We propose a novel approach to quantify gross primary productivity (GPP) and evapotranspiration (ET) at the global scale (5 km resolution with an 8-day interval). The MODIS-based, process-oriented approach couples photosynthesis, evaporation, two-leaf energy balance and nitrogen, which differs from previous satellite-based approaches. We couple information from MODIS with flux towers to assess the drivers and parameters of GPP and ET. Incoming shortwave radiation components (direct and diffuse PAR, NIR) under all-sky conditions are modeled using a Monte Carlo based atmospheric radiative transfer model. The MODIS Level 2 Atmospheric products are gridded and overlaid with MODIS Land products to produce spatially compatible forcing variables. GPP is modeled using a two-leaf model (sunlit and shaded leaves), and the maximum carboxylation rate is estimated using albedo-nitrogen-leaf trait relations. The GPP is used to calculate canopy conductance via the Ball-Berry model. We then apply the Penman-Monteith equation to calculate evapotranspiration. The process-oriented approach allows us to investigate the main drivers of GPP and ET at the global scale. Finally, we explore the spatial and temporal variability of GPP and ET at the global scale.
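The coupling step from GPP to conductance uses the Ball-Berry relation, which in its common leaf-level form is gs = g0 + m·A·rh/cs. The sketch below uses illustrative default parameters (the slope m and intercept g0 are textbook-style assumptions, not values from the study).

```python
def ball_berry(A, rh, cs, m=9.0, g0=0.01):
    """Ball-Berry stomatal conductance (mol m^-2 s^-1):
    gs = g0 + m * A * rh / cs, where A is net assimilation
    (umol m^-2 s^-1), rh is fractional relative humidity at the leaf
    surface, and cs is the leaf-surface CO2 mole fraction (umol mol^-1).
    The defaults for slope m and intercept g0 are illustrative only."""
    return g0 + m * A * rh / cs
```

The resulting conductance feeds the Penman-Monteith equation as the surface (canopy) conductance term, which is how the approach ties ET to modeled GPP.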

I will present results of a new inverse technique that infers small-scale turbulent diffusivities and mesoscale eddy diffusivities from an ocean climatology of salinity (S) and temperature (T) in combination with surface freshwater and heat fluxes. First, the ocean circulation is represented in (S,T) coordinates by the diathermohaline streamfunction. Framing the ocean circulation in (S,T) coordinates isolates the component of the circulation that is directly related to water-mass transformation. Because water-mass transformation is directly related to fluxes of salt and heat, this framework allows for the formulation of an inverse method in which the diathermohaline streamfunction is balanced with known air-sea forcing and unknown mixing. When applying this inverse method to observations, we obtain observationally based estimates for both the streamfunction and the mixing. The results reveal new information about the component of the global ocean circulation due to water-mass transformation and its relation to surface freshwater and heat fluxes and small-scale and mesoscale mixing. The results provide global constraints on spatially varying patterns of diffusivities, in order to obtain a realistic overturning circulation. We find that mesoscale isopycnal mixing is much smaller than expected. These results are important for our understanding of the relation between global ocean circulation and mixing and may lead to improved parameterisations in numerical ocean models.

Policy Relevant Background (PRB) ozone, as defined by the US Environmental Protection Agency (EPA), refers to ozone concentrations that would occur in the absence of all North American anthropogenic emissions. PRB enters into the calculation of health risk benefits, and as the US ozone standard approaches background levels, PRB is increasingly important in determining the feasibility and cost of compliance. As PRB is a hypothetical construct, modeling is a necessary tool. Since 2006 EPA has relied on global modeling to establish PRB for their regulatory analyses. Recent assessments with higher resolution global models exhibit improved agreement with remote observations and modest upward shifts in PRB estimates. This paper shifts the paradigm to a regional model (CAMx) run at 12 km resolution, for which North American boundary conditions were provided by a low-resolution version of the GEOS-Chem global model. We conducted a comprehensive model inter-comparison, from which we elucidate differences in predictive performance against ozone observations and differences in temporal and spatial background variability over the US. In general, CAMx performed better in replicating observations at remote monitoring sites, and performance remained better at higher concentrations. While spring and summer mean PRB predicted by GEOS-Chem ranged 20-45 ppb, CAMx predicted PRB ranged 25-50 ppb and reached well over 60 ppb in the west due to event-oriented phenomena such as stratospheric intrusion and wildfires. CAMx showed a higher correlation between modeled PRB and total observed ozone, which is significant for health risk assessments. A case study during April 2006 suggests that stratospheric exchange of ozone is underestimated in both models on an event basis. We conclude that wildfires, lightning NOx and stratospheric intrusions contribute a significant level of uncertainty in estimating PRB, and that PRB will require careful consideration in the ozone standard setting process.

Background: HIV/AIDS is one of the greatest global public health concerns today due to its high incidence, prevalence and mortality rates. The aim of this research was to investigate and estimate global HIV/AIDS mortality, prevalence and incidence rates, and explore their associations with the Human Development Index. Methods: The global age-standardized rates of mortality, prevalence and incidence of HIV/AIDS were obtained from UNAIDS for different countries in 2015. The human developm...

Global and diffuse solar radiation intensities are, in general, measured on horizontal surfaces, whereas stationary solar conversion systems (both flat-plate solar collectors and solar photovoltaics) are mounted on inclined surfaces to maximize the amount of solar radiation incident on the collector surface. Consequently, the solar radiation incident on a tilted surface has to be determined by converting solar radiation measured on a horizontal surface to the tilted surface of interest. This study evaluates the performance of 14 models transposing 10-minute, hourly and daily diffuse solar irradiation from horizontal to inclined surfaces. Solar radiation data from 8 months (April to November 2011), which include diverse atmospheric conditions and solar altitudes, measured on the roof of the radiation tower of the Royal Meteorological Institute of Belgium in Uccle (longitude 4.35°, latitude 50.79°), were used for validation purposes. The individual model performance is assessed by an inter-comparison between the calculated and measured global solar radiation on a south-oriented surface tilted at 50.79°, using statistical methods. The relative performance of the different models under different sky conditions has been studied. Comparison of the statistical errors between the different radiation models as a function of the clearness index shows that some models perform better under one type of sky condition. Combining different models acting under different sky conditions can lead to a reduction of the statistical error between measured and estimated global solar radiation. As the models described in this paper have been developed for hourly data inputs, statistical error indexes are minimal for hourly data and increase for 10-minute and one-day frequency data.
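A common baseline among such transposition models is the Liu-Jordan isotropic sky model, whose three terms (beam, isotropic sky diffuse, ground reflection) are sketched below. The albedo default is an illustrative assumption, and the abstract does not identify which of the 14 models this is.

```python
import math

def isotropic_tilt(G_h, D_h, R_b, beta_deg, albedo=0.2):
    """Liu-Jordan isotropic transposition of horizontal irradiance to a
    plane tilted at beta_deg. G_h: global horizontal, D_h: diffuse
    horizontal (same units); R_b: beam conversion factor
    cos(incidence angle) / cos(solar zenith angle)."""
    beta = math.radians(beta_deg)
    beam = (G_h - D_h) * R_b                         # direct component
    sky = D_h * (1.0 + math.cos(beta)) / 2.0         # isotropic sky diffuse
    ground = G_h * albedo * (1.0 - math.cos(beta)) / 2.0  # ground-reflected
    return beam + sky + ground
```

At zero tilt the sky view factor is 1 and the ground term vanishes, so the model reduces to the horizontal global irradiance, a useful sanity check when validating against measured data.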

In developing countries like Pakistan, measurements of global solar radiation and its components are not available for all locations, so different models that use the locations' climatological parameters are required to estimate global solar radiation. Long-period solar radiation data are available for only five locations in Pakistan (Karachi, Quetta, Lahore, Multan and Peshawar). These locations almost encompass the different geographical features of Pakistan. For this reason, in this study the mean monthly global solar radiation has been estimated using the empirical models of Angstrom, FAO, Glover-McCulloch, and Sangeeta & Tiwari, chosen for their diversity of approach and use of climatic and geographical parameters. Empirical constants for these models have been estimated and the results obtained by these models have been tested statistically. The results show encouraging agreement between estimated and measured values. The outcome of these empirical models will assist researchers working on solar energy estimation for locations with similar conditions

In developing countries like India, global solar radiation (GSR) is measured at very few locations owing to the non-availability of radiation-measuring instruments. To overcome the inadequacy of GSR measurements, scientists have developed many empirical models to estimate location-wise GSR. In the present study, three simple forms of the Angstrom equation [Angstrom-Prescott (A-P), Ogelman, and Bahel] were used to estimate GSR at six geographically and climatologically different locations across India, with the objective of finding a set of common constants usable for the whole country. Results showed that GSR values varied from 9.86 to 24.85 MJ m-2 day-1 across stations. It was also observed that the A-P model showed smaller errors than the Ogelman and Bahel models. All the models estimated GSR well, as the 1:1 line between measured and estimated values showed Nash-Sutcliffe efficiency (NSE) values ≥ 0.81 for all locations. Measured GSR data pooled over the six selected locations were analyzed to obtain a new set of constants for the A-P equation applicable throughout the country. The set of constants (a = 0.29 and b = 0.40) was named "One India One Constant (OIOC)," and the model was named "MOIOC." Furthermore, the developed constants were validated statistically for another six locations in India and produced close estimates. High R2 values (≥ 76%) along with low mean bias errors (MBE) ranging from -0.64 to 0.05 MJ m-2 day-1 revealed that the new constants are able to predict GSR with a small percentage of error.
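The Angstrom-Prescott relation the study builds on is simple enough to state directly: H = H0 (a + b n/N), where H0 is extraterrestrial radiation, n measured bright-sunshine hours and N day length. A sketch using the pooled OIOC constants reported above (the H0 and sunshine-hour values in the example are illustrative, not from the study):

```python
def gsr_angstrom_prescott(h0, n_actual, n_max, a=0.29, b=0.40):
    """Angstrom-Prescott estimate of daily global solar radiation.

    H = H0 * (a + b * n/N), with H0 the extraterrestrial radiation
    (MJ m-2 day-1), n the measured bright-sunshine hours and N the day
    length.  a = 0.29, b = 0.40 are the pooled "OIOC" constants reported
    in the study; site-specific constants may differ.
    """
    return h0 * (a + b * n_actual / n_max)
```

With the OIOC constants and a fully sunny day (n = N), the predicted GSR is 0.69 H0, which for a typical H0 of 35 MJ m-2 day-1 lands near the upper end of the 9.86-24.85 MJ m-2 day-1 range reported for the Indian stations.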

The objective of the current study is to determine what factors have been associated with the global adoption of mandatory child restraint laws (ChRLs) since 1975. To determine these factors, Weibull models were analyzed. The analysis considered 170 countries, and the time at risk corresponded to 5,146 observations for the period 1957-2013. The dependent variable was time to first adoption of a ChRL. Independent variables representing global factors were the World Health Organization (WHO) and World Bank (WB) road safety global campaign; the Geneva Convention on Road Traffic; and the United Nations (UN) 1958 Vehicle Agreement. Independent variables representing regional factors were the creation of the European Transport Safety Council and being a Commonwealth country. Independent variables representing national factors were population; gross domestic product (GDP) per capita; political violence; the existence of road safety nongovernmental organizations (NGOs); and the existence of road safety agencies. Urbanization served as a control variable. To examine regional dynamics, Weibull models for Africa, Asia, Europe, North America, Latin America, the Caribbean, and the Commonwealth were also estimated. Empirical estimates from the full Weibull models suggest that 2 global factors and 2 national factors are significantly associated with the adoption of this measure. Among the global factors explaining adoption is the WHO and WB road safety global campaign implemented after 2004, and the diffusion of this policy was global. Regional analysis showed that the UN's Convention on Road Traffic was significant in Asia, the creation of the European Transport Safety Council was significant in Europe and North America, and the global campaign was significant in Africa. In Commonwealth and European and North American countries, the existence of road safety agencies was also positively associated with ChRL adoption. Results of the world models suggest that
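A Weibull duration model has closed-form hazard and survival functions, which is what makes this time-to-adoption analysis tractable. A minimal proportional-hazards sketch; the single-covariate form and all parameter values are illustrative, not the study's estimated specification:

```python
import math

def weibull_hazard(t, shape, scale, beta=0.0, x=0.0):
    """Weibull proportional-hazards rate of ChRL adoption at time t.

    Baseline hazard h0(t) = (shape/scale) * (t/scale)**(shape - 1); a
    single covariate x (e.g. log GDP per capita) shifts it
    multiplicatively via exp(beta * x).  Names and the one-covariate
    form are illustrative only.
    """
    h0 = (shape / scale) * (t / scale) ** (shape - 1)
    return h0 * math.exp(beta * x)

def weibull_survival(t, shape, scale):
    """Probability that a country has not yet adopted a ChRL by time t."""
    return math.exp(-((t / scale) ** shape))
```

With shape > 1 the adoption hazard rises over time (consistent with diffusion of a policy), while shape = 1 reduces to a constant-rate exponential model.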

In several important biomes, including croplands and tropical forests, many small fires occur with sizes well below the detection limit of the current generation of burned area products derived from moderate-resolution spectroradiometers. These fires likely have important effects on greenhouse gas and aerosol emissions and regional air quality. Here we developed an approach for combining 1 km thermal anomalies (active fires; MOD14A2) and 500 m burned area observations (MCD64A1) to estimate the prevalence of these fires and their likely contribution to burned area and carbon emissions. We first estimated active fires within and outside of 500 m burn scars in 0.5-degree grid cells during 2001-2010, for which MCD64A1 burned area observations were available. For these two sets of active fires we then examined mean fire radiative power (FRP) and changes in enhanced vegetation index (EVI) derived from 16-day intervals immediately before and after each active fire observation. To estimate the burned area associated with sub-500 m fires, we first applied burned-area-to-active-fire ratios derived solely from within burned area perimeters to active fires outside of burn perimeters. In a second step, we further modified our sub-500 m burned area estimates using EVI changes from active fires outside and within burned areas (after subtracting EVI changes derived from control regions). We found that in northern and southern African savanna regions and in Central and South American dry forest regions, the number of active fires outside of MCD64A1 burned areas increased considerably towards the end of the fire season. EVI changes for active fires outside of burn perimeters were, on average, considerably smaller than EVI changes associated with active fires inside burn scars, providing evidence for burn scars substantially smaller than the 25 ha area of a single 500 m pixel. FRP estimates were also lower for active fires outside of burn perimeters. In our
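The two-step scaling described above can be sketched in a few lines; all variable names and the numbers in the example are illustrative, not values from the study:

```python
def small_fire_burned_area(n_af_outside, ba_inside, n_af_inside,
                           devi_outside, devi_inside):
    """Scale per-active-fire burned area to fires outside burn perimeters.

    Step 1 applies the burned-area-to-active-fire ratio observed inside
    MCD64A1 perimeters; step 2 damps it by the ratio of mean EVI change
    outside vs. inside perimeters, since smaller EVI drops suggest burn
    scars below the 500 m pixel size.
    """
    ba_per_fire = ba_inside / n_af_inside     # e.g. km2 per active fire
    evi_scaling = devi_outside / devi_inside  # typically < 1 for small fires
    return n_af_outside * ba_per_fire * evi_scaling
```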

Assessing the mortality impact of the 2009 influenza A H1N1 virus (H1N1pdm09) is essential for optimizing public health responses to future pandemics. The World Health Organization reported 18,631 laboratory-confirmed pandemic deaths, but the total pandemic mortality burden was substantially higher. We estimated the 2009 pandemic mortality burden through statistical modeling of mortality data from multiple countries. We obtained weekly virology and underlying cause-of-death mortality time series for 2005-2009 for 20 countries covering ∼35% of the world population. We applied a multivariate linear regression model to estimate pandemic respiratory mortality in each collaborating country. We then used these results plus ten country indicators in a multiple imputation model to project the mortality burden in all world countries. Between 123,000 and 203,000 pandemic respiratory deaths were estimated globally for the last 9 months of 2009. The majority (62%-85%) were attributed to persons under 65 years of age. We observed a striking regional heterogeneity, with almost 20-fold higher mortality in some countries in the Americas than in Europe. The model attributed 148,000-249,000 respiratory deaths to influenza in an average pre-pandemic season, with only 19% in persons <65 years. Limitations include a lack of representation of low-income countries among the single-country estimates and an inability to study subsequent pandemic waves (2010-2012). We estimate that 2009 global pandemic respiratory mortality was ∼10-fold higher than the World Health Organization's laboratory-confirmed mortality count. Although the pandemic mortality estimate was similar in magnitude to that of seasonal influenza, a marked shift toward mortality among persons <65 years of age occurred, so that many more life-years were lost. The burden varied greatly among countries, corroborating early reports of far greater pandemic severity in the Americas than in Australia, New Zealand, and Europe. A

The interaction of the solar wind with a planetary magnetic field causes electrical currents that modify the magnetic field distribution around the planet. We present an approach to estimating the planetary magnetic field from in situ spacecraft data using a magnetohydrodynamic (MHD) simulation approach. The method is developed with respect to the upcoming BepiColombo mission to planet Mercury, aimed at determining the planet's magnetic field and its interior electrical conductivity distribution. In contrast to the widely used empirical models, global MHD simulations allow the calculation of the strongly time-dependent interaction process of the solar wind with the planet. As a first approach, we use a simple MHD simulation code that includes time-dependent solar wind and magnetic field parameters. The planetary parameters are estimated by minimizing the misfit between spacecraft data and simulation results with a gradient-based optimization. As the calculation of gradients with respect to many parameters is usually very time-consuming, we investigate the application of an adjoint MHD model. This adjoint MHD model is generated by an automatic differentiation tool to compute the gradients efficiently. The computational cost of determining the gradient with an adjoint approach is nearly independent of the number of parameters. Our method is validated by application to THEMIS (Time History of Events and Macroscale Interactions during Substorms) magnetosheath data to estimate Earth's dipole moment.

Most model estimates of environmental contamination include some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty in a model for evaluating the ingestion of radionuclides caused by long-term global low-level radioactive contamination was analyzed using various uncertainty analysis methods: percentile estimates, robustness analysis and fuzzy estimates. The model is composed mainly of five sub-models, each of which includes its own uncertainty, which we also analyzed. The major findings obtained in this study include: the possibility of discrepancy between the value predicted by model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; and the reliability of the model depends decisively on the following environmental factors: the direct foliar absorption coefficient, the transfer factor of radionuclides from the stratosphere down to the troposphere, the residual rate after food processing and cooking, and the transfer factor and sedimentation of radionuclides in the ocean. (author)

Open-fire biomass burning and domestic biofuel burning (e.g., cooking, heating, and charcoal making) algorithms have been incorporated into a terrestrial ecosystem model to estimate CO2 and key reactive GHG (CO, NOx, and NMHC) emissions for the year 2000. The emissions are calculated over the globe at a 0.5° × 0.5° spatial resolution using tree density imagery and two separate sets of data each for global area burned and land clearing for croplands, along with biofuel consumption rate data. The estimated global annual total dry matter (DM) burned due to open-fire biomass burning ranges between 5221 and 7346 Tg DM/yr, whereas the resultant emission ranges are 6564-9093 Tg CO2/yr, 438-568 Tg CO/yr, 11-16 Tg NOx/yr (as NO), and 29-40 Tg NMHCs/yr. The results indicate that land-use change for cropland is one of the major sources of biomass burning, amounting to 25-27% (CO2), 25-28% (CO), 20-23% (NO), and 28-30% (NMHCs) of the total open-fire biomass burning emissions of these gases. Estimated DM burned associated with domestic biofuel burning is 3,114 Tg DM/yr, and the resultant emissions are 4825 Tg CO2/yr, 243 Tg CO/yr, 3 Tg NOx/yr, and 23 Tg NMHCs/yr. Total emissions from biomass burning are highest in tropical regions (Asia, America, and Africa), where we identify important contributions from primary forest cutting for croplands and from domestic biofuel burning.
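Species emissions in inventories like this are typically the product of dry matter burned and a species-specific emission factor. A unit-keeping sketch; the g-per-kg-DM convention assumed here follows common emission-factor compilations, and the example values are illustrative, not the study's:

```python
def emissions_from_dm(dm_burned_tg, emission_factor_g_per_kg):
    """Trace-gas emission (Tg/yr) from dry matter burned (Tg DM/yr).

    Emission = DM * EF, with EF expressed in g of species per kg of dry
    matter (the usual compilation convention); since 1 g/kg = 1e-3 Tg
    per Tg, a factor of 1e-3 converts to Tg of species.
    """
    return dm_burned_tg * emission_factor_g_per_kg * 1e-3
```

For instance, ~6000 Tg DM/yr with a CO2 emission factor near 1500 g/kg gives ~9000 Tg CO2/yr, the same order as the 6564-9093 Tg CO2/yr range reported above.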

Foodborne diseases are globally important, resulting in considerable morbidity and mortality. Parasitic diseases often result in high burdens of disease in low- and middle-income countries and are frequently transmitted to humans via contaminated food. This study presents the first estimates of the global and regional human disease burden of 10 helminth diseases and toxoplasmosis that may be attributed to contaminated food. Data were abstracted from 16 systematic reviews or similar studies published between 2010 and 2015; from 5 disease databases accessed in 2015; and from 79 reports, 73 of which have been published since 2000, 4 published between 1995 and 2000, and 2 published in 1986 and 1981. These included reports from national surveillance systems, journal articles, and national estimates of foodborne diseases. These data were used to estimate the number of infections, sequelae, deaths, and Disability Adjusted Life Years (DALYs), by age and region, for 2010. These parasitic diseases resulted in 48.4 million cases (95% Uncertainty Interval [UI] 43.4-79.0 million) and 59,724 (95% UI 48,017-83,616) deaths annually, resulting in 8.78 million (95% UI 7.62-12.51 million) DALYs. We estimated that 48% (95% UI 38%-56%) of cases of these parasitic diseases were foodborne, resulting in 76% (95% UI 65%-81%) of the DALYs attributable to these diseases. Overall, foodborne parasitic disease, excluding enteric protozoa, caused an estimated 23.2 million (95% UI 18.2-38.1 million) cases and 45,927 (95% UI 34,763-59,933) deaths annually, resulting in an estimated 6.64 million (95% UI 5.61-8.41 million) DALYs. Foodborne Ascaris infection (12.3 million cases, 95% UI 8.29-22.0 million) and foodborne toxoplasmosis (10.3 million cases, 95% UI 7.40-14.9 million) were the most common foodborne parasitic diseases. Human cysticercosis with 2.78 million DALYs (95% UI 2.14-3.61 million), foodborne trematodosis with 2.02 million DALYs (95% UI 1.65-2.48 million) and foodborne

yielded sea-level-rise estimates between 1.06 and 1.75 mm yr-1, with a regional average of 1.29 mm yr-1, when corrected for glacial isostatic adjustment (GIA) using model data. These estimates are consistent...

Diarrhoeal diseases are major contributors to the global burden of disease, particularly in children. However, comprehensive estimates of the incidence and mortality due to specific aetiologies of diarrhoeal diseases are not available. The objective of this study is to provide estimates of the gl...

An estimate of the global-scale joule heating rates in the thermosphere is made based on derived global equivalent overhead electric current systems in the dynamo region during geomagnetically quiet and disturbed periods. The equivalent total electric field distribution is calculated from Ohm's law. The global-scale joule heating rates are calculated for various monthly average periods in 1965. The calculated joule heating rates maximize at high latitudes in the early evening and postmidnight sectors. During geomagnetically quiet times the daytime joule heating rates are considerably lower than heating by solar EUV radiation. However, during geomagnetically disturbed periods the estimated joule heating rates increase by an order of magnitude and can locally exceed the solar EUV heating rates. The results show that joule heating is an important and at times the dominant energy source at high latitudes. However, the global mean joule heating rates calculated near solar minimum are generally small compared to the global mean solar EUV heating rates. (auth)

Our aim was to estimate the population of emperor penguins (Aptenodytes forsteri) using a single synoptic survey. We examined the whole continental coastline of Antarctica using a combination of medium-resolution and Very High Resolution (VHR) satellite imagery to identify emperor penguin colony locations. Where colonies were identified, VHR imagery was obtained for the 2009 breeding season. The remotely sensed images were then analysed using a supervised classification method to separate penguins from snow, shadow and guano. Actual counts of penguins from eleven ground-truthing sites were used to convert these classified areas into numbers of penguins using a robust regression algorithm. We found four new colonies and confirmed the location of three previously suspected sites, giving a total of 46 emperor penguin breeding colonies. We estimated the breeding population of emperor penguins at each colony during 2009 and provide a population estimate of ~238,000 breeding pairs (compared with the last previously published count of 135,000-175,000 pairs). Based on published values of the relationship between breeders and non-breeders, this translates to a total population of ~595,000 adult birds. There is a growing consensus in the literature that global and regional emperor penguin populations will be affected by changing climate, a driver thought to be critical to their future survival. However, a complete understanding is severely limited by the lack of detailed knowledge about much of their ecology, and importantly by a poor understanding of their total breeding population. To address the second of these issues, our work now provides a comprehensive estimate of the total breeding population that can be used in future population models and will provide a baseline for long-term research.
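The paper's robust regression algorithm is not specified in the abstract; as one plausible stand-in, a Huber-type iteratively reweighted least-squares fit of counts against classified area (through the origin) down-weights outlying ground-truth sites. A sketch assuming numpy; all names and data are illustrative:

```python
import numpy as np

def huber_slope(area, counts, delta=1.0, iters=50):
    """Robust slope for converting classified area to penguin counts.

    Fits counts ≈ slope * area through the origin by iteratively
    reweighted least squares, down-weighting sites whose residuals
    exceed `delta` robust-scale units (scale from the MAD).  A stand-in
    for the unspecified robust regression in the paper.
    """
    area = np.asarray(area, float)
    counts = np.asarray(counts, float)
    slope = counts.sum() / area.sum()                # initial estimate
    for _ in range(iters):
        r = counts - slope * area
        s = np.median(np.abs(r)) / 0.6745 or 1.0     # robust scale (MAD)
        w = np.clip(delta * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
        slope = (w * area * counts).sum() / (w * area * area).sum()
    return slope
```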

A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

A year-class curve is a plot of log cpue (catch per unit effort) against age for a single year class of a species (in contrast to the better-known catch curve, fitted to multiple year classes at one time). When linear, the intercept and slope estimate the log cpue at age 0 and the average rate of total mortality.
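The fit behind a year-class curve is a one-line least-squares problem on log-transformed cpue; a sketch assuming numpy, with illustrative variable names:

```python
import numpy as np

def year_class_curve(ages, cpue):
    """Fit log(cpue) = a + b * age for a single year class.

    The intercept a estimates log cpue at age 0, and the slope b
    estimates -Z, the average instantaneous rate of total mortality.
    """
    b, a = np.polyfit(ages, np.log(cpue), 1)  # polyfit returns slope first
    return a, b
```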

Daily evapotranspiration (ET) is modeled globally for the period 2000-2013 based on the Penman-Monteith equation, with radiation and vapor pressures derived using remotely sensed Land Surface Temperature (LST) from the MODerate resolution Imaging Spectroradiometer (MODIS) on the Aqua and Terra satellites. The ET for a given land area is based on four surface conditions: wet/dry and vegetated/non-vegetated. For each, the ET resistance terms are based on land cover, leaf area index (LAI) and literature values. The vegetated/non-vegetated fractions of the land surface are estimated using land cover, LAI, a simplified version of the Beer-Lambert law describing light transmission through vegetation, and newly derived light extinction coefficients for each MODIS land cover type. The wet/dry fractions of the land surface are nonlinear functions of LST-derived humidity calibrated using in-situ ET measurements. Results are compared to in-situ measurements (the averages of the root mean squared errors and mean absolute errors for 39 sites are 0.81 mm day−1 and 0.59 mm day−1, respectively) and to the MODIS ET product, MOD16 (mean bias during 2001-2013 is −0.2 mm day−1). Although the mean global difference between MOD16 and the ET estimates is only 0.2 mm day−1, locally temperature-derived vapor pressures are the likely contributor to differences, especially in energy- and water-limited regions. The intended application of the presented model is simulating ET based on long-term climate forecasts (e.g., using only minimum, maximum and mean daily or monthly temperatures).
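The simplified Beer-Lambert step can be written directly: the non-vegetated fraction is the canopy gap fraction exp(-k·LAI). A sketch in which the extinction coefficient is a placeholder, not one of the paper's land-cover-specific values:

```python
import math

def vegetated_fraction(lai, k=0.5):
    """Fraction of ground covered by vegetation, from LAI.

    Simplified Beer-Lambert law: the gap (non-vegetated) fraction is the
    light transmitted through the canopy, exp(-k * LAI), where k is a
    land-cover-specific light extinction coefficient (0.5 here is a
    placeholder assumption, not a value from the paper).
    """
    return 1.0 - math.exp(-k * lai)
```

The fraction is 0 for bare ground (LAI = 0) and saturates toward 1 for dense canopies, which is the behavior the wet/dry-by-vegetated/non-vegetated partition above relies on.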

Previous studies have shown that the Hidden Local Symmetry (HLS) model, supplied with appropriate symmetry-breaking mechanisms, provides an effective Lagrangian (BHLS) which encompasses a large number of processes within a unified framework; a global fit procedure allows for a simultaneous description of e+e− annihilation into the six final states π+π−, π0γ, ηγ, π+π−π0, K+K− and KLKS, and includes the dipion spectrum in τ decay and some additional light-meson decay partial widths. The contribution of these annihilation channels, over the range of validity of the HLS model (up to 1.05 GeV), to the theoretical muon anomalous magnetic moment aμ(th) is found to be much improved compared to its partner derived from integrating the measured spectra directly. However, most spectra for the process e+e− → π+π− are subject to overall scale uncertainties which dominate the other error sources, and one may suspect some bias in the dipion contribution to aμ(th). An iterated fit algorithm, shown by a Monte Carlo study to lead to unbiased results, is defined and applied successfully to the e+e− → π+π− data samples from CMD2, SND, KLOE (including the latest sample) and BaBar. The iterated fit solution is shown to be further improved and leads to a value of aμ that differs from the experimental value aμ(exp) above the 4σ level. The contribution of the π+π− intermediate state up to 1.05 GeV to aμ derived from the iterated fit carries an uncertainty about 3 times smaller than the corresponding usual estimate. Global fit techniques are therefore shown to work and to lead to improved, unbiased results. The main issue raised in this study, and the kind of solution proposed, may be of concern for other data-driven methods when the data samples are dominated by global normalization uncertainties.

Estimates of extinction risk for Amazonian plant and animal species are rare and not often incorporated into land-use policy and conservation planning. We overlay spatial distribution models with historical and projected deforestation to show that at least 36% and up to 57% of all Amazonian tree species are likely to qualify as globally threatened under International Union for Conservation of Nature (IUCN) Red List criteria. If confirmed, these results would increase the number of threatened plant species on Earth by 22%. We show that the trends observed in Amazonia apply to trees throughout the tropics, and we predict that most of the world’s >40,000 tropical tree species now qualify as globally threatened. A gap analysis suggests that existing Amazonian protected areas and indigenous territories will protect viable populations of most threatened species if these areas suffer no further degradation, highlighting the key roles that protected areas, indigenous peoples, and improved governance can play in preventing large-scale extinctions in the tropics in this century. PMID:26702442

Carbon dioxide evasion (FCO2) from lakes and reservoirs is established as an important component of the global carbon (C) cycle, a fact reflected by the inclusion of these waterbodies in the most recent IPCC assessment report. In this study we developed a statistical model, driven by environmental geodata, to predict CO2 partial pressure (pCO2) in boreal lakes, and to create the first high-resolution map (0.5°) of boreal (50°-70°) lake pCO2. The resulting map of pCO2 was combined with lake area (lakes >0.01 km2) from the recently developed GLOWABO database (Verpoorter et al., 2014) and estimates of the gas transfer velocity k to produce the first high-resolution map of boreal lake FCO2. Before training our model, the geodata, as well as approximately 27,000 samples of 'open water' (excluding periods of ice cover) pCO2 from the boreal region, were gridded at 0.5° resolution and log-transformed where necessary. A multilinear regression was used to derive a prediction equation for log10 pCO2 as a function of log10 lake area, net primary productivity (NPP), precipitation, wind speed and soil pH (r2 = 0.66), and then applied in ArcGIS to build the map of pCO2. After validation, the map of boreal lake pCO2 was used to derive a map of boreal lake FCO2. For the boreal region we estimate an average, lake-area-weighted pCO2 of 930 μatm and an FCO2 of 170 (121-243) Tg C yr-1. Our estimate of FCO2 will soon be updated with the incorporation of the smallest lakes (<0.01 km2). Despite the current exclusion of the smallest lakes, our estimate is higher than the highest previous estimate of approximately 110 Tg C yr-1 (Aufdenkampe et al., 2011). Moreover, our empirical approach driven by environmental geodata can be used as the basis for estimating future FCO2 from boreal lakes and their sensitivity to climate change.
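The prediction equation is a standard multilinear regression; a generic sketch of fitting and applying it with numpy. The predictor names follow the abstract, but no coefficients from the study are reproduced here:

```python
import numpy as np

def fit_log_pco2(predictors, log_pco2):
    """Least-squares coefficients for log10(pCO2) on gridded geodata.

    predictors: (n_cells, n_vars) array of e.g. log10 lake area, NPP,
    precipitation, wind speed and soil pH; an intercept column is
    prepended.  Returns the coefficient vector (intercept first).
    """
    X = np.column_stack([np.ones(len(predictors)), predictors])
    coef, *_ = np.linalg.lstsq(X, log_pco2, rcond=None)
    return coef

def predict_log_pco2(coef, predictors):
    """Apply fitted coefficients to new grid cells."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ coef
```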

The year 2013-2014 has been designated the Global Year Against Orofacial Pain by the International Association for the Study of Pain. Accordingly, a multidisciplinary Canadian and international group of clinical, research and knowledge-transfer experts attended a workshop in Montreal, Quebec. The workshop had two aims: to identify new pathways for innovative diagnosis and management of chronic orofacial pain states; and to identify opportunities for further collaborative orofacial pain resear...

Background Timely assessment of the burden of HIV/AIDS is essential for policy setting and programme evaluation. In this report from the Global Burden of Disease Study 2015 (GBD 2015), we provide national estimates of levels and trends of HIV/AIDS incidence, prevalence, coverage of antiretroviral

Through the Global Partnership the UK continues to make a significant contribution to improving national and global security. Over the past year the UK has continued to implement a wide range of projects across the breadth of its Global Partnership Programme. As well as ensuring the Programme is robust and capable of dealing with new challenges, the UK has cooperated with other donor countries to help them progress projects associated with submarine dismantling, scientist redirection, enhancing nuclear security and Chemical Weapons destruction. The Global Partnership, although only five years old, has already achieved a great deal. Some 23 states, plus the European Union, are now working more closely together under the Global Partnership, and collectively they have enhanced global, regional and national security by reducing the availability of Weapons of Mass Destruction (WMD) materials and expertise to both states of concern and terrorists. Considerable progress has already been made in, for example: - Improving the security of fissile materials, dangerous biological agents and chemical weapons stocks; - Reducing the number of sites containing radioactive materials; - Working towards closure of reactors still producing weapon-grade plutonium; - Improving nuclear safety to reduce the risks of further Chernobyl-style accidents; - Constructing facilities for destroying chemical weapons stocks, and starting actual destruction; - Providing sustainable employment for former WMD scientists to reduce the risk that their expertise will be misused by states or terrorists. By contributing to many of these activities, the UK has helped to make the world safer. This paper reports on the UK's practical and sustainable contribution to the Global Partnership and identifies a number of challenges that remain if it is to have a wider impact on reducing the threats from WMD material. (authors)

The aims of this study were to estimate all-cause and cause-specific mortality and years of life lost, investigated via disability-adjusted life-years (DALYs), due to colorectal cancer attributable to physical inactivity in Brazil and in its states, and to analyze the temporal trend of these estimates over 25 years (1990-2015) compared with global estimates and according to the socioeconomic status of the states of Brazil. Databases from the Global Burden of Disease Study (GBD) for Brazil, the Brazilian states and the world were used. The total number and the age-standardized rates of deaths and DALYs for colorectal cancer attributable to physical inactivity were estimated for the years 1990 and 2015. We used the Socioeconomic Development Index (SDI). Physical inactivity was responsible for a substantial number of deaths (1990: 1,302; 2015: 119,351) and DALYs (1990: 31,121; 2015: 87,116) due to colorectal cancer in Brazil. From 1990 to 2015, mortality and DALYs due to colorectal cancer attributable to physical inactivity increased in Brazil (0.6% and 0.6%, respectively) and decreased around the world (-0.8% and -1.1%, respectively). The Brazilian states with better socioeconomic indicators had higher rates of mortality and morbidity from colorectal cancer due to physical inactivity. Over 25 years, the Brazilian population showed more worrisome results than the world as a whole. Actions to combat physical inactivity and greater cancer screening and treatment are urgent in the Brazilian states.

Halogenated chemical substances are used in a broad array of applications, and new chemical substances are continually being developed and introduced into commerce. While recent research has considerably increased our understanding of the global warming potentials (GWPs) of multiple individual chemical substances, this research inevitably lags behind the development of new chemical substances. There are currently over 200 substances known to have high GWP. Schemes to estimate radiative efficiency (RE) based on computational chemistry are useful where no measured IR spectrum is available. This study assesses the reliability of RE values calculated using computational chemistry techniques for 235 chemical substances against the best available values. Computed vibrational frequency data are used to estimate RE values using several Pinnock-type models, and reasonable agreement with reported values is found. Significant improvement is obtained through scaling of both vibrational frequencies and intensities. The effect of varying the computational method and basis set used to calculate the frequency data is discussed. It is found that the vibrational intensities have a strong dependence on basis set and are largely responsible for differences in computed RE values.
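A Pinnock-type estimate reduces to a weighted sum over vibrational bands: each computed band intensity is multiplied by the instantaneous radiative forcing per unit absorption at that band's frequency. A schematic sketch in which that forcing curve is passed in as a callable, since the tabulated Pinnock et al. function is not reproduced here:

```python
def radiative_efficiency(frequencies, intensities, forcing_per_band):
    """Pinnock-type radiative-efficiency estimate from a computed IR spectrum.

    frequencies: band centers (cm-1) from a vibrational frequency
    calculation; intensities: the corresponding band intensities;
    forcing_per_band: callable F(nu) giving radiative forcing per unit
    absorption at frequency nu.  Real applications use the tabulated
    Pinnock et al. curve; any F supplied here is the caller's assumption.
    """
    return sum(a * forcing_per_band(nu)
               for nu, a in zip(frequencies, intensities))
```

Because the estimate is linear in the intensities, the frequency- and intensity-scaling step the study describes simply rescales each term before the sum.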

In this paper a comparison between three models for predicting the total solar flux falling on a horizontal surface is presented. The Capderou, Perrin & Brichambaut and Hottel models are used to estimate global solar radiation; the models are identified and evaluated in the MATLAB environment. The recorded data were obtained from a small weather station installed at the LAGE laboratory of Ouargla University, Algeria. Solar radiation data were recorded on four sample days, the 15th day of each month (March, April, May and October). The Root Mean Square Error (RMSE), Correlation Coefficient (CC) and Mean Absolute Percentage Error (MAPE) were also calculated to test the reliability of the proposed models, and comparisons between the measured and calculated values were made. The results obtained in this study show that the Perrin & Brichambaut and Capderou models are more effective for estimating the total solar intensity on a horizontal surface under clear sky over Ouargla city (latitude 31° 95' N, longitude 5° 24' E, altitude 0.141 km above Mean Sea Level); these models are derived from meteorological parameters, geographical location and the number of days since the first of January. The Perrin & Brichambaut and Capderou models give the best agreement, with CCs of 0.985-0.999 and 0.932-0.995 respectively, while Hottel's model gives a CC of 0.617-0.942.
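The three error statistics used above for model checking are standard; a self-contained sketch of how they are computed from paired measured/estimated series:

```python
import math

def error_metrics(measured, estimated):
    """RMSE, Pearson correlation coefficient (CC) and MAPE (%)."""
    n = len(measured)
    rmse = math.sqrt(sum((m - e) ** 2
                         for m, e in zip(measured, estimated)) / n)
    mm = sum(measured) / n
    me = sum(estimated) / n
    cov = sum((m - mm) * (e - me) for m, e in zip(measured, estimated))
    sm = math.sqrt(sum((m - mm) ** 2 for m in measured))
    se = math.sqrt(sum((e - me) ** 2 for e in estimated))
    cc = cov / (sm * se)
    mape = 100.0 / n * sum(abs((m - e) / m)
                           for m, e in zip(measured, estimated))
    return rmse, cc, mape
```

Note that MAPE divides by the measured value, so near-zero irradiance samples (dawn/dusk) should be filtered before applying it.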

Over the last two decades, simple radiation models have been of interest for estimating daily solar radiation in arid and semi-arid deserts such as those in Iran, where solar observation sites are sparse. In Iran, most of the models used so far have been validated only for a few specific locations, based on short-term solar observations. In this work, three different radiation models (Sabbagh, Paltridge, Daneshyar) are revised to predict the climatology of monthly average daily solar radiation on horizontal surfaces in various cities in the central arid deserts of Iran. The modifications consist of including altitude, the monthly total number of dusty days, and the seasonal variation of the Sun-Earth distance. A new height-dependent formula is proposed based on MBE, MABE, MPE and RMSE statistical analysis. It is shown that the revised Sabbagh method is a good estimator of global solar radiation in arid and semi-arid deserts, with an average error of less than 2%, a more accurate prediction than those of previous studies. The data required by the suggested method are usually available at most meteorological sites. For locations where some of the input data are not reported, an alternative approach is presented. (author)

Global motion estimation (GME) is a key technology in unmanned aerial vehicle remote sensing (UAVRS). However, when a UAV's motion and behavior change significantly or the image information is not rich, traditional image-based methods for GME often perform poorly. Introducing bottom metadata can improve precision under large-scale motion conditions and reduce the dependence on unreliable image information. GME is divided into coarse and residual GME through coordinate transformation, based on the study hypotheses. In coarse GME, an auxiliary image is built to convert image matching from a wide-baseline condition to a narrow-baseline one. In residual GME, a novel information-and-contrast feature detection algorithm is proposed for big-block matching, to maximize the use of reliable image information and ensure that the contents of interest are well estimated. Additionally, an image motion monitor is designed to select the appropriate processing strategy by monitoring the motion scales of translation, rotation, and zoom. A medium-altitude UAV is employed to collect three types of large-scale motion datasets, and peak signal-to-noise ratio (PSNR) and motion scale are computed. The results are encouraging and applicable to other medium- or high-altitude UAVs with a similar system structure.

Accurate determination of atmospheric methane surface fluxes is an important and challenging problem in global biogeochemical cycles. We use inverse modeling to estimate annual, seasonal, and interannual CH4 fluxes between 1996 and 2001. The fluxes include 7 time-varying seasonal (3 wetland, rice, and 3 biomass burning) and 3 steady aseasonal (animals/waste, coal, and gas) global processes. To simulate atmospheric methane, we use the 3-D chemical transport model MATCH driven by NCEP reanalyzed observed winds at a resolution of T42 ( ˜2.8° x 2.8° ) in the horizontal and 28 levels (1000 - 3 mb) in the vertical. By combining existing datasets of individual processes, we construct a reference emissions field that represents our prior guess of the total CH4 surface flux. For the methane sink, we use a prescribed, annually-repeating OH field scaled to fit methyl chloroform observations. MATCH is used to produce both the reference run from the reference emissions, and the time-dependent sensitivities that relate individual emission processes to observations. The observational data include CH4 time-series from ˜15 high-frequency (in-situ) and ˜50 low-frequency (flask) observing sites. Most of the high-frequency data, at a time resolution of 40-60 minutes, have not previously been used in global scale inversions. In the inversion, the high-frequency data generally have greater weight than the weekly flask data because they better define the observational monthly means. The Kalman Filter is used as the optimal inversion technique to solve for emissions between 1996-2001. At each step in the inversion, new monthly observations are utilized and new emissions estimates are produced. The optimized emissions represent deviations from the reference emissions that lead to a better fit to the observations. The seasonal processes are optimized for each month, and contain the methane seasonality and interannual variability. The aseasonal processes, which are less variable, are
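The core of the inversion described above is the standard Kalman-filter measurement update, which nudges the prior (reference) emissions toward the new monthly observations through the model sensitivities. The sketch below shows that update in its generic form; the state layout, matrices, and dimensions are purely illustrative, not the actual MATCH sensitivities or observational covariances.

```python
import numpy as np

# Generic Kalman-filter measurement update, of the kind used to adjust prior
# emissions toward observations. x_prior holds emission-process deviations,
# H maps emissions to simulated concentrations (model sensitivities),
# R is the observational error covariance. All values are illustrative.
def kf_update(x_prior, P_prior, y_obs, H, R):
    """Return the posterior state estimate and covariance."""
    S = H @ P_prior @ H.T + R                  # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_post = x_prior + K @ (y_obs - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post
```

At each monthly step, the posterior from the previous step becomes the prior for the next, which is how the optimized emissions accumulate the seasonality and interannual variability the abstract describes.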

Previous studies have shown that the Hidden Local Symmetry (HLS) model, supplied with appropriate symmetry breaking mechanisms, provides an effective Lagrangian (Broken Hidden Local Symmetry, BHLS) which encompasses a large number of processes within a unified framework. Based on it, a global fit procedure allows for a simultaneous description of the e{sup +}e{sup -} annihilation into six final states—π{sup +}π{sup -}, π{sup 0}γ, ηγ, π{sup +}π{sup -}π{sup 0}, K{sup +}K{sup -}, K{sub L}K{sub S}—and includes the dipion spectrum in the τ decay and some more light meson decay partial widths. The contribution to the muon anomalous magnetic moment a{sub μ}{sup th} of these annihilation channels over the range of validity of the HLS model (up to 1.05 GeV) is found much improved in comparison to the standard approach of integrating the measured spectra directly. However, because most spectra for the annihilation process e{sup +}e{sup -}→π{sup +}π{sup -} undergo overall scale uncertainties which dominate the other sources, one may suspect some bias in the dipion contribution to a{sub μ}{sup th}, which could question the reliability of the global fit method. However, an iterated global fit algorithm, shown to lead to unbiased results by a Monte Carlo study, is defined and applied successfully to the e{sup +}e{sup -}→π{sup +}π{sup -} data samples from CMD2, SND, KLOE, BaBar, and BESSIII. The iterated fit solution is shown to further improve the prediction for a{sub μ}, which we find to deviate from its experimental value above the 4σ level. The contribution to a{sub μ} of the π{sup +}π{sup -} intermediate state up to 1.05 GeV has an uncertainty about 3 times smaller than the corresponding usual estimate. Therefore, global fit techniques are shown to work and lead to improved unbiased results.

Modern codes for the numerical solution of Initial Value Problems (IVPs) in ODEs are based on adaptive methods that, for a user-supplied tolerance δ, attempt to advance the integration by selecting the size of each step so that some measure of the local error is ≈ δ. Although this policy does not ensure that the global errors stay under the prescribed tolerance, after the early studies of Stetter [Considerations concerning a theory for ODE-solvers, in: R. Burlisch, R.D. Grigorieff, J. Schröder (Eds.), Numerical Treatment of Differential Equations, Proceedings of Oberwolfach, 1976, Lecture Notes in Mathematics, vol. 631, Springer, Berlin, 1978, pp. 188-200; Tolerance proportionality in ODE codes, in: R. März (Ed.), Proceedings of the Second Conference on Numerical Treatment of Ordinary Differential Equations, Humboldt University, Berlin, 1980, pp. 109-123] and the extensions of Higham [Global error versus tolerance for explicit Runge-Kutta methods, IMA J. Numer. Anal. 11 (1991) 457-480; The tolerance proportionality of adaptive ODE solvers, J. Comput. Appl. Math. 45 (1993) 227-236; The reliability of standard local error control algorithms for initial value ordinary differential equations, in: Proceedings: The Quality of Numerical Software: Assessment and Enhancement, IFIP Series, Springer, Berlin, 1997], it has been proved that in many existing explicit Runge-Kutta codes the global errors behave asymptotically as some rational power of δ. This step-size policy, for a given IVP, determines at each grid point tn a new step-size hn+1 = h(tn; δ) so that h(t; δ) is a continuous function of t. In this paper, the tolerance proportionality property is studied under a discontinuous step-size policy that does not allow the step size to change if the step-size ratio between two consecutive steps is close to unity. This theory is applied to obtain global error estimates in a few problems that have been solved with

Highlights: → New semi-empirical models for predicting clear-sky irradiance were developed. → The proposed models compare favorably with other empirical models. → Performance of the proposed models is comparable with that of widely used physical models. → The proposed models have the advantage of simplicity over the physical models. -- Abstract: This paper presents semi-empirical models for estimating global and direct normal solar irradiances under clear-sky conditions in the tropics. The models are based on a one-year period of clear-sky global and direct normal irradiance data collected at three solar radiation monitoring stations in Thailand: Chiang Mai (18.78°N, 98.98°E) in the North of the country, Nakhon Pathom (13.82°N, 100.04°E) in the Centre, and Songkhla (7.20°N, 100.60°E) in the South. The models describe global and direct normal irradiances as functions of the Angstrom turbidity coefficient, the Angstrom wavelength exponent, precipitable water, and total column ozone. The Angstrom turbidity coefficient, wavelength exponent, and precipitable water were obtained from AERONET sunphotometers, and column ozone was retrieved from the OMI/AURA satellite. Model validation was accomplished using data from these three stations for periods not included in the model formulation. The models were also validated against an independent data set collected at Ubon Ratchathani (15.25°N, 104.87°E) in the Northeast. The global and direct normal irradiances calculated from the models and those obtained from measurements are in good agreement, with a root mean square difference (RMSD) of 7.5% for both. The performance of the models compared favorably with that of other empirical models. Additionally, the accuracy of the irradiances predicted by the proposed model is comparable with that obtained from some

Over the past years, mobile robots have proliferated in both domestic and industrial environments to solve tasks such as cleaning, assistance, or material transportation. One of their advantages is the ability to operate over wide areas without the need to modify the existing infrastructure. Thanks to the sensors they may be equipped with and their processing systems, mobile robots constitute a versatile alternative for a wide range of applications. When designing the control system of a mobile robot that must carry out a task autonomously in an unknown environment, the robot is expected to take decisions about its localization in the environment and about the trajectory it has to follow to reach the target points. More precisely, the robot has to find a reasonably good solution to two crucial problems: building a model of the environment, and estimating its position within this model. In this work, we propose a framework to solve these problems using only visual information. The mobile robot is equipped with a catadioptric vision sensor that provides omnidirectional images of the environment. First, the robot traverses the trajectories to be included in the model and uses the captured visual information to build the model. After that, the robot is able to estimate its position and orientation with respect to the trajectory. Among the possible approaches to these problems, global appearance techniques are used in this work; they have emerged recently as a robust and efficient alternative to landmark extraction techniques. A global description method based on the Radon Transform is used to design the mapping and localization algorithms, and a set of images captured by a mobile robot in a real environment, under realistic operating conditions, is used to test the performance of these algorithms.

Summary Background: Timely assessment of the burden of HIV/AIDS is essential for policy setting and programme evaluation. In this report from the Global Burden of Disease Study 2015 (GBD 2015), we provide national estimates of levels and trends of HIV/AIDS incidence, prevalence, coverage ... and sex on initial CD4 distribution at infection, CD4 progression rates (probability of progression from higher to lower CD4 cell-count category), on and off antiretroviral therapy (ART) mortality, and mortality from all other causes. Our estimation strategy links the GBD 2015 assessment of all-cause mortality and estimation of incidence and prevalence so that, for each draw from the uncertainty distribution, all assumptions used in each step are internally consistent. We estimated incidence, prevalence, and death with GBD versions of the Estimation and Projection Package (EPP) and Spectrum software ...

The IRIS network has accumulated full-disk helioseismological data since July 1989, i.e. a complete 11-year solar cycle. Since the last paper publishing a frequency list [A&A 317 (1997) L71], the network has not only acquired new data but has also developed new co-operative programs with compatible instruments [Abstr. SOHO 6/GONG 98 Workshop (1998) 51], so that merging IRIS files with these co-operative program data sets has made it possible to improve the overall duty cycle. This paper presents new estimates of low-degree p-mode frequencies obtained from this IRIS++ data bank covering the period 1989-1996, as well as the variation of their main parameters over the full range of magnetic activity, from before the last maximum to the very minimum. A preliminary estimation of the peak profile asymmetries is also included.

There is no singular globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, that the subjects who perform it differ, and so do its objectives. The global is an invisible invasion of materials and immediate effects.

To provide information for greenhouse gas reduction policies, the California Air Resources Board (CARB) inventories annual emissions of high-global-warming-potential (GWP) fluorinated gases, the fastest-growing sector of greenhouse gas (GHG) emissions globally. Baseline 2008 F-gas emissions estimates for selected chlorofluorocarbons (CFC-12), hydrochlorofluorocarbons (HCFC-22), and hydrofluorocarbons (HFC-134a) made with an inventory-based methodology were compared with emissions estimates made by ambient-based measurements. Significant discrepancies were found: the inventory-based methodology resulted in a systematic 42% underestimation of CFC-12 emissions from older refrigeration equipment and older vehicles, and a systematic 114% overestimation of emissions for HFC-134a, a refrigerant substitute for phased-out CFCs. Initial inventory-based estimates for all F-gas emissions had assumed that equipment is no longer in service once it reaches its average lifetime of use. Revised emission estimates using improved models for equipment age at end-of-life, inventories, and leak rates specific to California resulted in F-gas emissions estimates in closer agreement with the ambient-based measurements: the discrepancies between inventory-based estimates and ambient-based measurements were reduced from -42% to -6% for CFC-12, and from +114% to +9% for HFC-134a.

This article reviews the present indicators, trends, and recent solutions and strategies to tackle major global and country problems in safety and health at work. The article is based on the Yant Award Lecture of the American Industrial Hygiene Association (AIHA) at its 2013 Congress. We reviewed employment figures, mortality rates, occupational burden of disease and injuries, reported accidents, surveys on self-reported occupational illnesses and injuries, attributable fractions, national economic cost estimates of work-related injuries and ill health, and the most recent information on the problems from published papers, documents, and electronic data sources of international and regional organizations, in particular the International Labor Organization (ILO), World Health Organization (WHO), and European Union (EU), institutions, agencies, and public websites. We identified and analyzed successful solutions, programs, and strategies to reduce the work-related negative outcomes at various levels. Work-related illnesses that have a long latency period and are linked to ageing are clearly on the increase, while the number of occupational injuries has gone down in industrialized countries thanks to both better prevention and structural changes. We have estimated that globally there are 2.3 million deaths annually for reasons attributed to work. The biggest component is linked to work-related diseases, 2.0 million, and 0.3 million linked to occupational injuries. However, the division of these two factors varies depending on the level of development. In industrialized countries the share of deaths caused by occupational injuries and work-related communicable diseases is very low while non-communicable diseases are the overwhelming causes in those countries. Economic costs of work-related injury and illness vary between 1.8 and 6.0% of GDP in country estimates, the average being 4% according to the ILO. Singapore's economic costs were estimated to be equivalent to 3

Long-term projections of energy consumption, supply, and prices heavily influence decisions regarding long-lived energy infrastructure. Predicting the evolution of these quantities over multiple years to decades is a difficult task. Here, we estimate year-on-year volatility and unpredictability over multi-decade time frames for many quantities in the US energy system using historical projections. We determine the distribution over time of the most extreme projection errors (unpredictability) from 1985 to 2014, and the largest year-over-year changes (volatility) in the quantities themselves from 1949 to 2014. Our results show that both volatility and unpredictability have increased in the past decade, compared to the three and two decades before it. These findings may be useful for energy decision-makers to consider as they invest in and regulate long-lived energy infrastructure in a deeply uncertain world.
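The two diagnostics can be computed directly from annual series. In this sketch the function names and the relative-change definitions are our assumption; the study's exact error metrics may differ.

```python
def max_yoy_change(series):
    """Volatility proxy: largest absolute year-over-year fractional change."""
    return max(abs(b - a) / abs(a) for a, b in zip(series, series[1:]))

def worst_projection_error(projected, realized):
    """Unpredictability proxy: most extreme relative projection error."""
    return max(abs(p - r) / abs(r) for p, r in zip(projected, realized))
```

Volatility is a property of the historical series alone, while unpredictability compares a past projection against what actually happened, which is why the two are computed over different periods in the abstract.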

Accurate estimation of solar radiation is a major concern in renewable energy applications. Over the past few years, many machine learning paradigms have been proposed to improve estimation performance, mostly based on artificial neural networks, fuzzy logic, support vector machines, and adaptive neuro-fuzzy inference systems. The aim of this work is the prediction of the daily global solar radiation received on a horizontal surface through the Gaussian process regression (GPR) methodology. A case study of the Ghardaïa region (Algeria) has been used to validate the methodology. Several input combinations were tested; it was found that the GPR model based on sunshine duration, minimum air temperature, and relative humidity gives the best results in terms of mean absolute bias error (MBE), root mean square error (RMSE), relative root mean square error (rRMSE), and correlation coefficient (r). The obtained values of these indicators are 0.67 MJ/m², 1.15 MJ/m², 5.2%, and 98.42%, respectively.
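The core of GPR is closed-form: condition a Gaussian prior (here with an RBF kernel) on the training pairs and read off the posterior mean. The numpy-only sketch below illustrates this; the kernel hyperparameters are illustrative, not those fitted for Ghardaïa.

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, sigma_f=1.0, noise=1e-3):
    """Posterior mean of GP regression with an RBF (squared-exponential) kernel."""
    def k(A, B):
        # pairwise squared distances between rows of A and B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sigma_f ** 2 * np.exp(-0.5 * d2 / length ** 2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    return k(X_test, X_train) @ np.linalg.solve(K, y_train)
```

With inputs such as [sunshine hours, minimum temperature, relative humidity] per row and daily radiation as the target, a prediction at a training point recovers the observed value up to the noise term, which is the defining interpolation property of GPR.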

The phenomenon of wildfires has become a global environmental problem that demands estimates of their CO2 emissions, as wildfires have increasingly degraded air quality. Using available information on documented wildfires and a data set of satellite-detected hot spots, total yearly CO2 emissions in Mexico were estimated for the period 1999-2010. A map of the main vegetation groups was used to calculate total areas for every vegetation type, and the yearly number of hot spots per vegetation type was calculated. Estimates of the CO2 emitted in a wildfire were then obtained from parameters such as forest fuel load, vegetation type, burning efficiency, and mean burned area. The number of wildfires and the total affected area showed annual variability. The yearly mean area affected by a single wildfire varied between 0.2 and 0.3 km². The total affected area during the period 1999 to 2010 was 86,800 km², which corresponds to 4.3% of the Mexican territory. Total CO2 emissions were approximately 112 Tg. The most affected vegetation types were forest and rainforest.
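The bookkeeping implied by the listed parameters is burned area × fuel load × burning efficiency × emission factor. The sketch below shows that arithmetic; the function name, default emission factor, and the numbers in the usage note are illustrative assumptions, not the study's per-vegetation-type values.

```python
# Back-of-the-envelope wildfire CO2 estimate from the parameters named in the
# abstract. emission_factor is kg CO2 released per kg of dry matter burned
# (the 1.83 default is a commonly cited order of magnitude, used here only
# as a placeholder).
def co2_emissions_tg(burned_area_km2, fuel_load_kg_m2,
                     burning_efficiency, emission_factor=1.83):
    """Return CO2 emissions in teragrams (Tg)."""
    area_m2 = burned_area_km2 * 1e6
    kg_co2 = area_m2 * fuel_load_kg_m2 * burning_efficiency * emission_factor
    return kg_co2 / 1e9   # kg -> Tg
```

Summing such per-fire (or per-vegetation-type) terms over all detected hot spots yields a total of the kind reported above.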

Climate extremes will increase the frequency and severity of natural disasters worldwide. Climate-related natural disasters were anticipated to affect 375 million people in 2015, more than 50% above the yearly average of the previous decade. To inform surgical assistance preparedness, we estimated the number of surgical procedures needed. The numbers of people affected by climate-related disasters from 2004 to 2014 were obtained from the Centre for Research on the Epidemiology of Disasters database. Using 5,000 procedures per 100,000 persons as the minimum, baseline estimates were calculated. A linear regression was performed between the number of surgical procedures performed annually and the estimated number of surgical procedures required for climate-related natural disasters. Approximately 140 million people were affected by climate-related natural disasters annually, requiring 7.0 million surgical procedures. The greatest need for surgical care was in the People's Republic of China, India, and the Philippines. The linear regression demonstrated a poor relationship between national surgical capacity and the estimated need for surgical care resulting from natural disasters, but the countries with the least surgical capacity will have the greatest need for surgical care for persons affected by climate-related natural disasters. As climate extremes increase the frequency and severity of natural disasters, millions will need surgical care beyond baseline needs. Countries with insufficient surgical capacity will have the most need for surgical care for persons affected by climate-related natural disasters. Estimates of surgical need are particularly important for the countries least equipped to meet surgical care demands, given critical human and physical resource deficiencies.
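The baseline arithmetic above is a straight rate calculation: at the stated minimum of 5,000 procedures per 100,000 persons, 140 million people affected annually imply 7.0 million procedures. A one-line sketch (function name is ours):

```python
def surgical_need(affected_population, rate_per_100k=5000):
    """Minimum surgical procedures implied by a per-100,000-persons rate."""
    return affected_population * rate_per_100k / 100_000
```

The same function applied per country gives the need estimates that the regression compares against national surgical capacity.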

In global stereo vision, balancing matching efficiency against computing accuracy seems impossible because the two contradict each other, and with a long baseline this contradiction becomes more prominent. To address this problem, this paper proposes a novel idea to improve both the efficiency and accuracy of global stereo matching for a long baseline. Reference images located between the long-baseline image pairs are first chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is derived, accounting for the quantization error, so that the disparity search range under the long baseline can be reduced under the guidance of the short baseline, gaining matching efficiency. This idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on the short-baseline estimation, by which the disparity map under the long baseline is calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined in experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and is thus suitable for long-baseline stereo matching.

Aquaculture has grown rapidly over the last three decades, expanding at an average annual growth rate of 5.8% (2005-2014), down from the 8.8% achieved between 1980 and 2010. The sector now produces 44% of total food fish production. Increasing demand and consumption from a growing global population are driving further expansion of both inland and marine aquaculture (i.e., mariculture, including marine species farmed on land). However, the growth of mariculture depends on the availability of suitable farming areas for new facilities, particularly for open farming practices that rely on natural oceanic environmental parameters such as temperature, oxygen, and chlorophyll. In this study, we estimated the marine areas within the exclusive economic zones of all countries that are suitable for potential open-ocean mariculture activities. To this end, we quantified the environmental niche and inferred the global habitat suitability index (HSI) of the 102 most-farmed marine species using four species distribution models. The average weighted HSI across the four models suggests that 72,000,000 km² of ocean is environmentally suitable to farm one or more species. About 92% of the predicted area (66,000,000 km²) is environmentally suitable for farming finfish, 43% (31,000,000 km²) for molluscs, and 54% (39,000,000 km²) for crustaceans. These predictions do not consider technological feasibility, which can limit crustacean farming in open waters. Suitable mariculture areas along the Atlantic coasts of South America and West Africa appear to be the most under-utilized for farming. Our results suggest that factors other than environmental considerations, such as the lack of socio-economic and technological capacity, as well as aqua-feed supply, are currently limiting the potential for mariculture expansion in many areas.

Background: Foodborne diseases (FBD) comprise a large part of the global mortality burden, yet the true extent of their impact remains unknown. The present study uses multiple regression and is the first attempt to use nonhealth variables to predict potentially FBD mortality at the country level. Methods: Vital registration (VR) data were used to build a multiple regression model incorporating nonhealth variables in addition to traditionally used health indicators. This model was subsequently used to predict FBD mortality rates for all countries of the World Health Organization classifications AmrA, AmrB, EurA, and EurB. Results: Statistical modeling strongly supported the inclusion of nonhealth variables in a multiple regression model as predictors of potentially FBD mortality. Six variables were included in the final model: percent irrigated land, average calorie supply from animal products, meat production in metric tons, adult literacy rate, adult HIV/AIDS prevalence, and percent of deaths under age 5 caused by diarrheal disease. Interestingly, nonhealth variables were not only more robust predictors of mortality than health variables but also remained significant when additional health variables were added to the analysis. Mortality rate predictions from our model ranged from 0.26 deaths per 100,000 (Netherlands) to 15.65 deaths per 100,000 (Honduras). Reported mortality rates of potentially FBD from VR data lie within the 95% prediction interval for the majority of countries (37/39) where comparison was possible. Conclusions: Nonhealth variables appear to be strong predictors of potentially FBD mortality at the country level and may be a powerful tool in the effort to estimate the global mortality burden of FBD. Disclaimer: The views expressed in this document are solely those of the authors and do not represent the views of the World Health Organization.
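A country-level multiple regression of this kind reduces to fitting ordinary least squares on a matrix of predictor columns and applying the coefficients to new countries. The sketch below shows the mechanics with numpy; the function name and toy data are ours, not the study's model or coefficients.

```python
import numpy as np

# Hypothetical reconstruction of the prediction step: fit OLS with an
# intercept on predictor columns (e.g. percent irrigated land, literacy
# rate, ...) and predict mortality rates for countries not in the fit.
def fit_predict(X, y, X_new):
    """OLS via numpy.linalg.lstsq; returns predictions for X_new rows."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.column_stack([np.ones(len(X_new)), X_new]) @ coef
```

In practice the fitted residual variance also yields the 95% prediction interval against which the reported VR mortality rates were checked.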

Having completed its fourth year of full operation, East Rand Gold and Uranium (ERGO) has established itself as a successful low-cost gold producer. Its operation recovering gold, uranium, and sulphuric acid from old slimes dams beat its production estimates from 1981 through the end of March 1982. Overall, ERGO has settled down well since its first years of production.

Data on the current burden of adenocarcinoma (ADC) and histology-specific human papillomavirus (HPV) type distribution are relevant to predict the future impact of prophylactic HPV vaccines. We estimate the proportion of ADC in invasive cervical cancer (ICC), the global number of cases of cervical ADC in 2015, the effect of cervical screening on ADC, the number of ADC cases attributable to high-risk HPV types -16, -18, -45, -31 and -33, and the potential impact of HPV vaccination, using a variety of data sources including GLOBOCAN 2008, Cancer Incidence in Five Continents (CI5) Volume IX, cervical screening data from the World Health Organization/Institut Català d'Oncologia Information Centre on HPV and cervical cancer, and published literature. ADC represents 9.4% of all ICC, although its contribution varies greatly by country and region. The global crude incidence rate of cervical ADC in 2015 is estimated at 1.6 cases per 100,000 women, and the projected worldwide incidence of ADC in 2015 is 56,805 new cases. Current detection rates for HPV DNA in cervical ADC tend to range around 80–85%; the lower HPV detection rates in cervical ADC versus squamous cell carcinoma may be due to technical artefacts or to misdiagnosis of endometrial carcinoma as cervical ADC. Published data indicate that the five most common HPV types found in cervical ADC are HPV-16 (41.6%), -18 (38.7%), -45 (7.0%), -31 (2.2%) and -33 (2.1%), together comprising 92% of all HPV positive cases. Future projections using 2015 data, assuming 100% vaccine coverage and a true HPV causal relation of 100%, suggest that vaccines providing protection against HPV-16/18 may theoretically prevent 79% of new HPV-related ADC cases (44,702 cases annually) and vaccines additionally providing cross-protection against HPV-31/33/45 may prevent 89% of new HPV-related ADC cases (50,769 cases annually). It is predicted that the currently available HPV vaccines will be highly effective in preventing HPV-related cervical

Previous studies have shown that the Hidden Local Symmetry (HLS) Model, supplied with appropriate symmetry breaking mechanisms, provides an Effective Lagrangian (BHLS) which encompasses a large number of processes within a unified framework; a global fit procedure allows for a simultaneous description of the e{sup +}e{sup -} annihilation into the 6 final states - π{sup +}π{sup -}, π{sup 0}γ, ηγ, π{sup +}π{sup -}π{sup 0}, K{sup +}K{sup -}, K{sub L}K{sub S} - and includes the dipion spectrum in the τ decay and some more light meson decay partial widths. The contribution to the muon anomalous magnetic moment a{sup th}{sub μ} of these annihilation channels over the range of validity of the HLS model (up to 1.05 GeV) is found to be much improved compared to its partner derived from integrating the measured spectra directly. However, most spectra for the process e{sup +}e{sup -} → π{sup +}π{sup -} undergo overall scale uncertainties which dominate the other sources, and one may suspect some bias in the dipion contribution to a{sup th}{sub μ}. To address this, an iterated fit algorithm, shown to lead to unbiased results by a Monte Carlo study, is defined and applied successfully to the e{sup +}e{sup -} → π{sup +}π{sup -} data samples from CMD2, SND, KLOE (including the latest sample) and BaBar. The iterated fit solution is shown to be further improved and leads to a value for a{sub μ} differing from its experimental counterpart a{sup exp}{sub μ} by more than 4σ. The contribution of the π{sup +}π{sup -} intermediate state up to 1.05 GeV to a{sub μ} derived from the iterated fit benefits from an uncertainty about 3 times smaller than the corresponding usual estimate. Therefore, global fit techniques are shown to work and lead to improved unbiased results. The main issue raised in this study and the kind of solution proposed may be of concern for other data driven methods when the data samples are dominated by global normalization uncertainties.
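The normalization bias that the iterated fit addresses can be reproduced in a two-measurement toy (my own illustration of the generic effect, not the BHLS machinery): when a shared fractional normalization uncertainty is built into the covariance from the *measured* values, the least-squares average falls below both measurements; rebuilding the normalization block from the current fitted value removes the bias. All numbers below are invented for the demonstration.

```python
import numpy as np

def naive_fit(y, stat_err, f):
    """Least-squares average with the fully correlated fractional
    normalization uncertainty f built from the measured values y --
    the construction that biases the result low."""
    V = np.diag(stat_err**2) + f**2 * np.outer(y, y)
    w = np.linalg.solve(V, np.ones_like(y))
    return w @ y / w.sum()

def iterated_fit(y, stat_err, f, n_iter=10):
    """Rebuild the normalization block of the covariance from the current
    fitted value at each iteration; this restores an unbiased average."""
    mu = y.mean()
    for _ in range(n_iter):
        s = np.full_like(y, mu)
        V = np.diag(stat_err**2) + f**2 * np.outer(s, s)
        w = np.linalg.solve(V, np.ones_like(y))
        mu = w @ y / w.sum()
    return mu

# two toy measurements of the same quantity with a common 20% scale error
y = np.array([1.5, 1.0])
stat = np.array([0.15, 0.10])
mu_naive = naive_fit(y, stat, f=0.2)   # falls below both measurements
mu_iter = iterated_fit(y, stat, f=0.2) # stays between them
```

With a constant fitted value in the normalization term, the iterated average collapses to the plain inverse-variance mean, which is the unbiased answer for this toy.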

Global Positioning System (GPS) measurements spanning approximately 3 years have been used to determine velocities for 7 sites on the Australian, Pacific and Antarctic plates. The site velocities agree with both plate model predictions and other space geodetic techniques. We find no evidence for internal deformation of the interior of the Australian plate. Wellington, New Zealand, located in the Australian-Pacific plate boundary zone, moves 20 +/- 5 mm/yr west-southwest relative to the Australian plate. Its velocity lies midway between the predicted velocities of the two plates. Relative Euler vectors for the Australia-Antarctica and Pacific-Antarctica plates agree within one standard deviation with the NUVEL-1A predictions.

Background: There is increasing recognition of stroke as an important contributor to childhood morbidity and mortality. Current estimates of global childhood stroke burden and its temporal trends are sparse. Accurate and up-to-date estimates of childhood stroke burden are important for planning

The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group (IZiNCG). The estimated global prevalence of inadequate zinc intake varied between 12% and 66%, depending on which methodological assumptions were applied. However, the country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01). A "best-estimate" model, comprising zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health problem in different countries can be drawn based on the country
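The cut-point logic behind such prevalence estimates can be sketched in a few lines: the fraction of the population whose usual absorbable-zinc intake falls below the mean requirement, under a distributional assumption. The intake, CV, and requirement values below are illustrative placeholders, not figures from the study.

```python
import math

def prevalence_inadequate(mean_intake_mg, cv, ear_mg):
    """EAR cut-point estimate: fraction of a population whose usual
    intake falls below the estimated average requirement (EAR),
    assuming intakes are normally distributed with the given
    coefficient of variation."""
    sd = cv * mean_intake_mg
    z = (ear_mg - mean_intake_mg) / sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical country: mean absorbable zinc 2.4 mg/d, 25% CV,
# mean physiological requirement 1.86 mg/d
p = prevalence_inadequate(2.4, 0.25, 1.86)   # roughly 18% inadequate
```

Varying the requirement or the composition-derived mean intake shifts this fraction substantially, which is exactly the sensitivity the abstract reports (12% to 66% across assumption sets).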

OBJECTIVE — The goal of this study was to estimate the prevalence of diabetes and the number of people of all ages with diabetes for years 2000 and 2030. RESEARCH DESIGN AND METHODS — Data on diabetes prevalence by age and sex from a limited number of countries were extrapolated to all 191 World... The most important demographic change to diabetes prevalence across the world appears to be the increase in the proportion of people ≥65 years of age. CONCLUSIONS — These findings indicate that the “diabetes epidemic” will continue even if levels of obesity remain constant. Given the increasing prevalence of obesity, it is likely that these figures provide an underestimate of future diabetes prevalence.

Depressive disorders were a leading cause of burden in the Global Burden of Disease (GBD) 1990 and 2000 studies. Here, we analyze the burden of depressive disorders in GBD 2010 and present severity proportions, burden by country, region, age, sex, and year, as well as burden of depressive disorders as a risk factor for suicide and ischemic heart disease. Burden was calculated for major depressive disorder (MDD) and dysthymia. A systematic review of epidemiological data was conducted. The data were pooled using a Bayesian meta-regression. Disability weights from population survey data quantified the severity of health loss from depressive disorders. These weights were used to calculate years lived with disability (YLDs) and disability-adjusted life years (DALYs). Separate DALYs were estimated for suicide and ischemic heart disease attributable to depressive disorders. Depressive disorders were the second leading cause of YLDs in 2010. MDD accounted for 8.2% (5.9%-10.8%) of global YLDs and dysthymia for 1.4% (0.9%-2.0%). Depressive disorders were a leading cause of DALYs even though no mortality was attributed to them as the underlying cause. MDD accounted for 2.5% (1.9%-3.2%) of global DALYs and dysthymia for 0.5% (0.3%-0.6%). There was more regional variation in burden for MDD than for dysthymia, with higher estimates in females and adults of working age. Whilst burden increased by 37.5% between 1990 and 2010, this was due to population growth and ageing. MDD explained 16 million suicide DALYs and almost 4 million ischemic heart disease DALYs. This attributable burden would increase the overall burden of depressive disorders from 3.0% (2.2%-3.8%) to 3.8% (3.0%-4.7%) of global DALYs. GBD 2010 identified depressive disorders as a leading cause of burden. MDD was also a contributor to burden allocated to suicide and ischemic heart disease. These findings emphasize the importance of including depressive disorders as a public-health priority and implementing

Over the last fifteen years, artificial neural networks (ANNs) have been shown to be advantageous for the solution of many hydrological modelling problems. The use of ANNs for flood magnitude estimation in ungauged catchments, however, is a relatively new and under-researched area. In this paper, ANNs are used to estimate the magnitude of the 100-year flood event (Q100) for a number of ungauged catchments. The data used in this study were provided by the Centre for Ecology and Hydrology's Flood Estimation Handbook (FEH), which contains information on catchments across the UK. Sixteen catchment descriptors for 719 catchments were used to train an ANN, with the data split into training, validation and test sets. The goodness-of-fit statistics on the test data set indicated good model performance, with an r-squared value of 0.8 and a coefficient of efficiency of 79 percent. Data for twelve ungauged catchments were then put through the trained ANN to produce estimates of Q100. Two other accepted methodologies were also employed: the FEH statistical method and the FSR (Flood Studies Report) design storm technique, both of which are used to produce flood frequency estimates. The advantage of developing an ANN model is that it provides a third figure to aid a hydrologist in making an accurate estimate. For six of the twelve catchments, there was a relatively low spread between estimates. In these instances, an estimate of Q100 could be made with a fair degree of certainty. Of the remaining six catchments, three had areas greater than 1000 km², which means the FSR design storm estimate cannot be used. Armed with the ANN model and the FEH statistical method, the hydrologist still has two possible estimates to consider. For these three catchments, the estimates were also fairly similar, providing additional confidence to the estimation. In summary, the findings of this study have shown that an accurate estimation of Q100 can be made using the catchment descriptors of
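A minimal sketch of the regression setup, with synthetic data standing in for the FEH material (the real descriptors and flood statistics are not reproduced here, and the network architecture is an assumption, not the one used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic stand-in for FEH-style descriptors (area, rainfall, soils, ...)
# and a log-Q100 target, mimicking the study's 16 descriptors x 719 catchments
n, d, h = 719, 16, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.tanh(0.3 * X @ w_true) + 0.05 * rng.normal(size=n)

# one-hidden-layer tanh network trained by full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h);      b2 = 0.0
lr, losses = 0.05, []
for _ in range(500):
    A = np.tanh(X @ W1 + b1)               # hidden activations
    err = A @ W2 + b2 - y                  # prediction residuals
    losses.append(float(np.mean(err**2)))
    dA = np.outer(err, W2) * (1.0 - A**2)  # backprop through tanh
    W2 -= lr * A.T @ err / n;  b2 -= lr * err.mean()
    W1 -= lr * X.T @ dA / n;   b1 -= lr * dA.mean(axis=0)

r2 = 1.0 - losses[-1] / np.var(y)          # goodness of fit on training data
```

For real use one would of course hold out validation and test splits, as the study did; the sketch only shows the descriptor-to-flood-quantile mapping being learned.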

Using information from the Supplementary Schedules of the 1950 National Census and from the JNIH-ABCC Life Span Study, cumulative person-years at risk in 1950 to 1960 were estimated by age ATB, sex, distance from hypocenter, radiation dose and symptoms for A-bomb survivors resident in Hiroshima and Nagasaki cities. The number of person-years at risk in 1951 to 1958 was estimated by applying the survivorship in each age group of the Adult Health Study sample during the period 1951 to 1958 to the number of survivors in 1950. To determine the number of person-years at risk from 1959 to 1960, the average yearly loss was evaluated for each exposure group for the period 1955 to 1958 in Hiroshima and for 1953 to 1958 in Nagasaki which was then applied to 1959 and 1960, respectively. The estimate of person-years among the nonexposed groups for this period was obtained from the above estimates, the total population of both cities, and the number of persons born after the A-bombing. Estimates by other associated factors were obtained by the same procedure. 20 references, 25 tables.

To investigate the association between spouse weekly working hours (SWWH) and the estimated 10-year risk of cardiovascular disease (CVD). This cross-sectional study was based on data obtained from the Korean National Health and Nutrition Examination Survey 2007-2012. Data from 16,917 participants (8,330 husbands, 8,587 wives) were used for this analysis. The participants' clinical data were collected to estimate the 10-year risk of CVD, as well as weekly working hours. Multiple logistic regression was conducted to investigate the association between SWWH and the estimated 10-year risk of CVD. We also performed a stratified analysis according to each participant's and their spouse's employment status. Compared to those whose spouses worked ≤30 hours per week, the estimated 10-year risk of CVD was significantly higher as SWWH increased among those whose spouses worked >30 hours per week. After adjusting for covariates, the odds ratio for high CVD risk was found to increase as SWWH increased, up to 2.52 among husbands and 2.43 among wives. We also found that the association between SWWH and the estimated 10-year risk of CVD varied according to employment status. Analysis of each component included in the CVD appraisal model showed that SWWH had a close relationship with diabetes in men and smoking habits in women. A spouse's long working hours are associated with an individual's future risk of CVD, especially among husbands.

Human use of biomass has become a major component of the global biogeochemical cycles of carbon and nitrogen. The use of land for biomass production (e.g. cropland) is among the most important pressures on biodiversity. At the same time, biomass is indispensable for humans as food, animal feed, raw material and energy source. In order to support research into these complex issues, we here present a comprehensive assessment of global socioeconomic biomass harvest, use and trade for the year 2000. We developed country-level livestock balances and a consistent set of factors to estimate flows of used biomass not covered by international statistics (e.g. grazed biomass, crop residues) and indirect flows (i.e. biomass destroyed during harvest but not used). We found that current global terrestrial biomass appropriation amounted to 18.7 billion tonnes dry matter per year (Pg/yr), or 16% of global terrestrial NPP, of which 6.6 Pg/yr were indirect flows. Only 12% of the economically used plant biomass (12.1 Pg/yr) directly served as human food, while 58% was used as feed for livestock, 20% as raw material and 10% as fuelwood. There are considerable regional variations in biomass supply and use. Distinguishing 11 world regions, we found that extraction of used biomass ranged from 0.3 to 2.8 t/ha/yr, and per-capita values varied between 1.2 and 11.7 t/cap/yr (dry matter). Aggregate global biomass trade amounted to 7.5% of all extracted biomass. An analysis of these regional patterns revealed that the level of biomass use per capita is determined by historically evolved patterns of land use and population density rather than by affluence or economic development status. Regions with low population density have the highest level of per-capita biomass use, while high-density regions have the lowest. Livestock, consuming 30-75% of all harvested biomass, is another important factor explaining regional variations in biomass use. Global biomass demand is expected to grow during the next decades

The most applied CFC refrigerants and their HFC alternatives; values of ODP (Ozone Depletion Potential) and GWP (Global Warming Potential) of the most used refrigerants; natural working fluids and their properties; the Montreal Protocol and Kyoto Protocol, and the illogical relations between them concerning the HFC fluids; confusion and polemics at the international level about the application of HFCs, which, under the Kyoto Protocol, are liable to reduction. Introduction of the TEWI (Total Equivalent Warming Impact) concept as a method for estimating the overall influence of refrigerating and air conditioning systems on the greenhouse effect: the direct emission (refrigerant leakage into the atmosphere) and the indirect emission resulting from electrical energy consumption. A demonstration of the TEWI concept on a concrete example in several variants; a discussion of the application of the TEWI concept; the importance of the energy efficiency of refrigerating systems (indirect CO 2 emission); one of the main measures: prevention of refrigerant leakage (direct CO 2 emission). The need for permanent education and training of the people who work on refrigerating and air conditioning systems; the necessity of constituting an expert body in the country, preparing a strategy to meet obligations arising from new changes to the Kyoto Protocol and news on the world market; introduction of country regulations, and certification of the companies and people involved in refrigeration and air conditioning. (Author)
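A TEWI calculation in its commonly cited form sums the direct effect (refrigerant leaked over the service life plus the charge lost at end of life) and the indirect effect (CO2 emitted generating the electricity consumed). The sketch below uses this generic form with entirely hypothetical input numbers, not the variants demonstrated in the paper:

```python
def tewi(gwp, charge_kg, leak_rate, recovery, years,
         annual_energy_kwh, grid_co2_per_kwh):
    """Total Equivalent Warming Impact in kg CO2-equivalent:
    direct   = GWP x (annual leakage over the life + end-of-life loss)
    indirect = electricity use x grid carbon intensity over the life."""
    direct = gwp * (leak_rate * charge_kg * years        # in-service leakage
                    + charge_kg * (1.0 - recovery))      # end-of-life loss
    indirect = annual_energy_kwh * grid_co2_per_kwh * years
    return direct + indirect

# hypothetical R-134a unit: GWP 1430, 10 kg charge, 5%/yr leakage,
# 70% recovered at end of life, 15-yr life, 20 MWh/yr, 0.5 kg CO2/kWh
total = tewi(1430, 10, 0.05, 0.70, 15, 20000, 0.5)
```

In this example the indirect (energy) term is roughly ten times the direct term, which is the abstract's point about why the energy efficiency of the system matters so much.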

How often do people visit the world's protected areas (PAs)? Despite PAs covering one-eighth of the land and being a major focus of nature-based recreation and tourism, we don't know. To address this, we compiled a globally representative database of visits to PAs and built region-specific models predicting visit rates from PA size, local population size, remoteness, natural attractiveness, and national income. Applying these models to all but the very smallest of the world's terrestrial PAs suggests that together they receive roughly 8 billion (8 × 10⁹) visits per year, of which more than 80% are in Europe and North America. Linking our region-specific visit estimates to valuation studies indicates that these visits generate approximately US $600 billion/y in direct in-country expenditure and US $250 billion/y in consumer surplus. These figures dwarf current, typically inadequate spending on conserving PAs. Thus, even without considering the many other ecosystem services that PAs provide to people, our findings underscore calls for greatly increased investment in their conservation.

Evaporation from the world's oceans constitutes the largest component of the global water balance. It is important not only as the ultimate source of moisture that is tied to the radiative processes determining Earth's energy balance but also to freshwater availability over land, governing habitability of the planet. Here we focus on variability of ocean evaporation on scales from interannual to decadal by appealing to three sources of data: the new MERRA-2 (Modern-Era Retrospective analysis for Research and Applications, version 2); climate models run with historical sea-surface temperatures, ice and atmospheric constituents (so-called AMIP experiments); and state-of-the-art satellite retrievals from the Seaflux and HOAPS (Hamburg Ocean-Atmosphere Parameters and Fluxes from Satellite) projects. Each of these sources has distinct advantages as well as drawbacks. MERRA-2, like other reanalyses, synthesizes evaporation estimates consistent with observationally constrained physical and dynamical models, but data stream discontinuities are a major problem for interpreting multi-decadal records. The climate models used in data assimilation can also be run with lesser constraints, such as with SSTs and sea-ice (i.e. AMIPs), or with additional, minimal observations of surface pressure and marine observations that have longer and less fragmentary observational records. We use the new ERA-20C reanalysis produced by ECMWF embodying the latter methodology. Still, the model physics biases in climate models and the lack of a predicted surface energy balance are of concern. Satellite retrievals and comparisons to ship-based measurements offer the most observationally based estimates, but sensor inter-calibration, algorithm retrieval assumptions, and short records are dominant issues. Our strategy depends on maximizing the advantages of these combined records. The primary diagnostic tool used here is an analysis of bulk aerodynamic computations produced by these sources and uses a first
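The bulk aerodynamic computation referred to is, in its simplest form, the standard bulk formula for the latent heat flux, LE = ρ · Lv · C_E · U · (q_s − q_a). The coefficients and state values below are typical textbook magnitudes, not those of MERRA-2, Seaflux, or HOAPS:

```python
def latent_heat_flux(rho_air, c_e, wind_10m, q_sea, q_air, lv=2.5e6):
    """Bulk aerodynamic latent heat flux in W/m^2:
    rho_air  -- air density (kg/m^3)
    c_e      -- dimensionless moisture exchange coefficient
    wind_10m -- 10-m wind speed (m/s)
    q_sea    -- saturation specific humidity at the sea surface (kg/kg)
    q_air    -- near-surface specific humidity (kg/kg)
    Evaporation rate follows by dividing by the latent heat lv."""
    return rho_air * lv * c_e * wind_10m * (q_sea - q_air)

# typical trade-wind conditions: C_E ~ 1.2e-3, U = 7 m/s, q_s - q_a ~ 5 g/kg
le = latent_heat_flux(1.2, 1.2e-3, 7.0, 0.020, 0.015)   # ~126 W/m^2
```

Comparing how each data source populates the inputs of this one formula (winds, humidities, exchange coefficients) is what allows the sources of evaporation variability to be attributed.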

The field of globalization highlights an interdependence among nations, implied by the more harmonious understanding that grows out of their daily interaction, which fosters peace and a more streamlined, effective global economy. For globalization to function, the developing countries, which can be helped by the developed ones, must be involved. The international community can contribute to establishing the development environment of the gl...

Natural modes of variability on many timescales influence aerosol particle distributions and cloud properties such that isolating statistically significant differences in cloud radiative forcing due to anthropogenic aerosol perturbations (indirect effects) typically requires integrating over long simulations. For state-of-the-art global climate models (GCMs), especially those in which embedded cloud-resolving models replace conventional statistical parameterizations (i.e. the multi-scale modeling framework, MMF), the required long integrations can be prohibitively expensive. Here an alternative approach is explored, which implements Newtonian relaxation (nudging) to constrain simulations with both pre-industrial and present-day aerosol emissions toward identical meteorological conditions, thus reducing differences in natural variability and dampening feedback responses in order to isolate radiative forcing. Ten-year GCM simulations with nudging provide a more stable estimate of the global-annual mean aerosol indirect radiative forcing than do conventional free-running simulations. The estimates have mean values and 95% confidence intervals of -1.54 ± 0.02 W/m² and -1.63 ± 0.17 W/m² for nudged and free-running simulations, respectively. Nudging also substantially increases the fraction of the world’s area in which a statistically significant aerosol indirect effect can be detected (68% and 25% of the Earth's surface for nudged and free-running simulations, respectively). One-year MMF simulations with and without nudging provide global-annual mean aerosol indirect radiative forcing estimates of -0.80 W/m² and -0.56 W/m², respectively. The one-year nudged results compare well with previous estimates from three-year free-running simulations (-0.77 W/m²), which showed the aerosol-cloud relationship to be in better agreement with observations and high-resolution models than in the results obtained with conventional parameterizations.
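The variance-suppression idea behind nudging can be illustrated with a toy damped model (this is my own illustration, not the GCM): two simulations driven by independent stochastic "weather" noise drift apart when free-running, but relaxing both toward the same reference state keeps their difference small, which is what makes a forced signal easier to detect.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps = 5000

def run_pair(tau=None):
    """Integrate two copies of a toy damped model (unit time step),
    each with independent stochastic forcing standing in for internal
    variability. With tau set, both copies are nudged toward the same
    reference state (here 0) with relaxation rate 1/tau. Returns the
    RMS of the run-to-run difference."""
    xa = xb = 0.0
    lam = 0.1 + (1.0 / tau if tau else 0.0)  # damping + nudging rate
    diffs = []
    for _ in range(n_steps):
        fa, fb = rng.normal(0.0, 1.0, 2)     # independent "weather" noise
        xa += fa - lam * xa
        xb += fb - lam * xb
        diffs.append(xa - xb)
    return float(np.sqrt(np.mean(np.square(diffs))))

rms_free = run_pair()            # free-running pair drifts apart
rms_nudged = run_pair(tau=1.0)   # nudged pair stays close
```

In the GCM setting the pre-industrial and present-day runs play the roles of the two copies; constraining both toward the same meteorology shrinks the noise floor against which the aerosol forcing is measured.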

The devastating 1918 influenza pandemic left a lasting impact on influenza experts and the public, and the importance of global influenza surveillance was soon recognized. The WHO Global Influenza Surveillance Network (GISN) was founded in 1952 and renamed the Global Influenza Surveillance and Response System in 2011 upon the adoption by the World Health Assembly of the Pandemic Influenza Preparedness Framework for the Sharing of Influenza Viruses and Access to Vaccines and Other Benefits ("PIP Framework"). The importance of influenza surveillance had been recognized and promoted by experts in the years leading up to the establishment of WHO. In the 65 years of its existence, the Network has grown to comprise 143 National Influenza Centers recognized by WHO, 6 WHO Collaborating Centers, 4 Essential Regulatory Laboratories, and 13 H5 Reference Laboratories. The Network has proven its excellence throughout these 65 years, providing detailed information on circulating seasonal influenza viruses, as well as immediate response to the influenza pandemics in 1957, 1968, and 2009, and to threats caused by animal influenza viruses and by zoonotic transmission of coronaviruses. For its central role in global public health, the Network has been highly recognized by its many partners and by international bodies. Several generations of world-renowned influenza scientists have brought the Network to where it is now, and they will take it forward into the future, as influenza will remain a pre-eminent threat to humans and to animals. This article is protected by copyright. All rights reserved.

Ground-based in situ measurements of 1,1-difluoroethane (HFC-152a, CH3CHF2), which is regulated under the Kyoto Protocol, are reported under the auspices of the AGAGE (Advanced Global Atmospheric Gases Experiment) and SOGE (System of Observation of halogenated Greenhouse gases in Europe) programs. Observations of HFC-152a at five locations (four European and one Australian) over a 10-year period were recorded. The annual average growth rate of HFC-152a in the midlatitude Northern Hemisphere has risen from 0.11 ppt/yr to 0.6 ppt/yr from 1994 to 2004. The Southern Hemisphere annual average growth rate has risen from 0.09 ppt/yr to 0.4 ppt/yr from 1998 to 2004. The 2004 average mixing ratio for HFC-152a was 5.0 ppt and 1.8 ppt in the Northern and Southern hemispheres, respectively. The annual cycle observed for this species in both hemispheres is approximately consistent with measured annual cycles at the same locations in other gases which are destroyed by OH. Yearly global emissions of HFC-152a from 1994 to 2004 are derived using the global mean HFC-152a observations and a 12-box 2-D model. The global emission of HFC-152a has risen from 7 kt/yr to 28 kt/yr from 1995 to 2004. On the basis of observations of above-baseline elevations in the HFC-152a record and a consumption model, regional emission estimates for Europe and Australia are calculated, indicating accelerating emissions from Europe since 2000. The overall European emission in 2004 ranges from 1.5 to 4.0 kt/yr, 5-15% of global emissions of 1,1-difluoroethane, while the Australian contribution is negligible at 5-10 tonnes/yr, <0.05% of global emissions.
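A zeroth-order check on such emission estimates can be made with a one-box mass balance, E = N_air · M · (dC/dt + C/τ), instead of the paper's 12-box 2-D model. The lifetime below (~1.4 yr, set by OH loss) is an assumption on my part, and the mixing ratio is a crude average of the two hemispheric values in the abstract:

```python
# one-box mass balance: emission = burden growth + loss to OH
N_AIR = 5.15e18 / 0.02897   # total moles of air (atm. mass / molar mass)
M_HFC152A = 66.05e-3        # kg/mol for CH3CHF2
TAU = 1.4                   # assumed atmospheric lifetime in years

def global_emission_kt(mixing_ratio_ppt, growth_ppt_per_yr, tau=TAU):
    """Global emission (kt/yr) that balances the observed burden growth
    plus the first-order loss of the current burden."""
    burden_mol = N_AIR * mixing_ratio_ppt * 1e-12
    growth_mol = N_AIR * growth_ppt_per_yr * 1e-12
    return (growth_mol + burden_mol / tau) * M_HFC152A / 1e6  # kg -> kt

# roughly 2004 conditions: ~3.4 ppt global mean, growing ~0.5 ppt/yr
e = global_emission_kt(3.4, 0.5)
```

The result lands in the low tens of kt/yr, the same ballpark as the paper's 28 kt/yr for 2004; the 2-D model refines this by resolving interhemispheric transport.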

Solar global radiation is a function of solar altitude, site altitude, albedo, atmospheric transparency and cloudiness, whereas solar global radiation on a clear day is defined such that it is a function of all the above-mentioned parameters except cloudiness. Consequently, analysis of the relative magnitudes of solar global radiation and solar global radiation on a clear day provides a platform for studying the influence of cloudiness on solar global radiation. The Iqbal filter for determining the day type has been utilized to calculate the monthly average clear day solar global radiation at three sites in the Negev region of Israel. An inter-comparison between four models for estimating clear sky solar global radiation at the three sites was made. The relative accuracy of the four models was determined by comparing the monthly average daily clear sky solar global radiation to that determined using the Iqbal filter. The analysis was performed on databases consisting of measurements made during the time interval of January 1991 to December 2004. The monthly average daily clear sky solar global radiation determined by the Berlynd model was found to give the best agreement with that determined using the Iqbal filter. The Berlynd model was then utilized to calculate a daily clear day index, K_c, which is defined as the ratio of the daily solar global radiation to the daily clear day solar global radiation. It is suggested that this index be used as an indication of the degree of cloudiness. Linear regression analysis was performed on the individual monthly databases for each site to determine the correlation between the daily clear day index and the daily clearness index, K_T.
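The two indices are simple daily ratios that differ only in the reference used: the clear-day index normalizes by a clear-sky model value (here the Berlynd model in the study), the clearness index by the extraterrestrial irradiation. The daily values below are hypothetical:

```python
def clear_day_index(h_daily, h_clear):
    """K_c = measured daily global irradiation / modeled clear-day value.
    Values near 1 indicate a clear day; lower values, more cloudiness."""
    return h_daily / h_clear

def clearness_index(h_daily, h_extraterrestrial):
    """K_T = measured daily global irradiation / extraterrestrial value
    on a horizontal surface for the same day."""
    return h_daily / h_extraterrestrial

# hypothetical day: 18 MJ/m^2 measured, 24 MJ/m^2 clear-day model value,
# 35 MJ/m^2 extraterrestrial irradiation
kc = clear_day_index(18.0, 24.0)   # 0.75 -> partly cloudy
kt = clearness_index(18.0, 35.0)
```

Because K_c removes the site- and season-dependent clear-sky atmosphere from the denominator, it isolates cloudiness more directly than K_T, which is the motivation for regressing one against the other.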

Schumann resonances (SR) are resonant electromagnetic oscillations in the extremely low frequency band (ELF, 3 Hz - 3 kHz), which arise in the Earth-ionosphere cavity due to lightning activity in planetary range. The time records in the ELF band consist of background signals and ELF transients/Q-bursts superimposed on the background, exceeding it by a factor of 5 - 10. The former are produced by the common worldwide thunderstorm activity (100 - 150 events per second), the latter originate from individual intense distant lightning discharges (100 - 120 powerful strokes per hour). A Q-burst is produced by a combination of direct and antipodal pulses, and the decisive factor for its shape is the source-to-observer distance. Diurnal/seasonal variations of global thunderstorm activity can be deduced from spectral amplitudes of SR modes. Here we focus on diurnal/seasonal variations of the number of ELF transients, on the assumption that they offer another way of estimating lightning activity. To search for transients, our own code was applied to the SR vertical electric component measured in October 2004 - October 2008 at the Astronomical and Geophysical Observatory of FMPI CU, Slovakia. Criteria for the identification of a burst are chosen on the basis of the transient amplitudes and their morphological features. Monthly mean daily variations in the number of transients showed that the African focus dominates at 14 - 16 h UT and is more active than the Asian source, which dominates at 5 - 8 h UT depending on winter or summer months. The American source showed a surprisingly weak response. Meteorological observations in South America aiming to determine lightning hotspots on the Earth indicate that the flash rate in this region is greatest during nocturnal 0 h - 3 h local standard time. This may be interpreted as the Asian and South American sources contributing together at the same UT. Cumulative spectral amplitude of the first three SR modes compared with number of ELF-transients in

...cycles for different product categories may be lagged (type II lag) because changes in economic and other factors will result in demands for different products. Identifying lagged life cycle structures is of major importance in global marketing of food products. The problems in arriving at such estimates...

In this report, the largest Lyapunov characteristic exponent of a high-dimensional atmospheric global circulation model of intermediate complexity has been estimated numerically. A sensitivity analysis has been carried out by varying the equator-to-pole temperature difference, the space resolution and the values of some parameters employed by the model. Chaotic and non-chaotic regimes of circulation have been found.

Estimated rates and efficiency of ocean carbon export flux are sensitive to differences in the depth horizons used to define export, which often vary across methodological approaches. We evaluate sinking particulate organic carbon (POC) flux rates and efficiency (e-ratios) in a global earth system model, using a range of commonly used depth horizons: the seasonal mixed layer depth, the particle compensation depth, the base of the euphotic zone, a fixed depth horizon of 100 m, and the maximum annual mixed layer depth. Within this single dynamically consistent model framework, global POC flux rates vary by 30% and global e-ratios by 21% across different depth horizon choices. Zonal variability in POC flux and e-ratio also depends on the export depth horizon due to pronounced influence of deep winter mixing in subpolar regions. Efforts to reconcile conflicting estimates of export need to account for these systematic discrepancies created by differing depth horizon choices.
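The sensitivity to the depth-horizon choice can be illustrated with an idealized power-law flux profile (a "Martin curve"); the study's earth system model does not literally use this profile, and the depths and normalization below are hypothetical:

```python
def martin_flux(z_m, flux_100, b=0.858):
    """Idealized power-law POC flux profile normalized at 100 m:
    F(z) = F(100 m) * (z / 100)^(-b), with the canonical attenuation
    exponent b = 0.858. Flux units follow flux_100 (e.g. mol C/m^2/yr)."""
    return flux_100 * (z_m / 100.0) ** (-b)

# the same water column evaluated at three common export depth horizons
f_euphotic = martin_flux(60.0, flux_100=2.0)   # base of euphotic zone
f_100m     = martin_flux(100.0, flux_100=2.0)  # fixed 100 m horizon
f_maxmld   = martin_flux(250.0, flux_100=2.0)  # max annual mixed layer depth
```

Because flux attenuates rapidly with depth, the shallow and deep horizons here differ by more than a factor of three for the identical column, which is why export and e-ratio estimates must state their depth-horizon convention.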

...and our ability to manage and control it effectively. However, in children 5 years and younger, the clinical symptoms of asthma are variable and non-specific. Furthermore, neither airflow limitation nor airway inflammation, the main pathologic hallmarks of the condition, can be assessed routinely in this age group. For this reason, to aid in the diagnosis of asthma in young children, a symptoms-only descriptive approach that includes the definition of various wheezing phenotypes has been recommended. In 1993, the Global Initiative for Asthma (GINA) was implemented to develop a network of individuals, organizations, and public health officials to disseminate information about the care of patients with asthma while at the same time assuring a mechanism to incorporate the results of scientific investigations into asthma care. Since then, GINA has developed and regularly revised a Global Strategy for Asthma...

As a dominant part of terrestrial ecosystems, forest ecosystems play an important role in absorbing atmospheric CO2 and mitigating global climate change. From the aspects of zonal climate and geographical distribution, the present carbon stocks and carbon sequestration capacity of forest ecosystems were comprehensively examined based on a review of the latest literature. The influences of land use change on forest carbon sequestration were analyzed, and factors leading to uncertainty in the carbon sequestration assessment of forest ecosystems were also discussed. It was estimated that the current forest carbon stock is in the range of 652 to 927 Pg C and the carbon sequestration capacity is approximately 4.02 Pg C·a⁻¹. In terms of zonal climate, the carbon stock and carbon sequestration capacity of tropical forest are the maximum, about 471 Pg C and 1.02-1.3 Pg C·a⁻¹ respectively; the carbon stock of boreal forest is about 272 Pg C, while its carbon sequestration capacity is the minimum, approximately 0.5 Pg C·a⁻¹; for temperate forest, the carbon stock is the smallest, around 113 to 159 Pg C, and its carbon sequestration capacity is 0.8 Pg C·a⁻¹. From the aspect of geographical distribution, the carbon stock of forest ecosystems in South America is the largest (187.7-290 Pg C), followed by Europe (162.6 Pg C), North America (106.7 Pg C), Africa (98.2 Pg C), Asia (74.5 Pg C), and Oceania (21.7 Pg C). In addition, the carbon sequestration capacity of regional forest ecosystems is summed up as follows: Tropical South American forest is the maximum (1276 Tg C·a⁻¹), followed by Tropical Africa (753 Tg C·a⁻¹), North America (248 Tg C·a⁻¹) and Europe (239 Tg C·a⁻¹), with East Asia (98.8-136.5 Tg C·a⁻¹) the minimum. To further reduce the uncertainty in the estimations of the carbon stock and carbon sequestration capacity of forest ecosystems, comprehensive application of long-term observation, inventories

We estimate methane fluxes across Alaska over a multi-year period using observations from a three-year aircraft campaign, the Carbon Arctic Reservoirs Vulnerability Experiment (CARVE). Existing estimates of methane from Alaska and other Arctic regions disagree in both magnitude and distribution, and before the CARVE campaign, atmospheric observations in the region were sparse. We combine these observations with an atmospheric particle trajectory model and a geostatistical inversion to estimate surface fluxes at the model grid scale. We first use this framework to estimate the spatial distribution of methane fluxes across the state. We find the largest fluxes in the south-east and North Slope regions of Alaska. This distribution is consistent with several estimates of wetland extent but contrasts with the distribution in most existing flux models. These flux models concentrate methane in warmer or more southerly regions of Alaska compared to the estimate presented here. This result suggests a discrepancy in how existing bottom-up models translate wetland area into methane fluxes across the state. We next use the inversion framework to explore inter-annual variability in regional-scale methane fluxes for 2012-2014. We examine the extent to which this variability correlates with weather or other environmental conditions. These results indicate the possible sensitivity of wetland fluxes to near-term variability in climate.

MERRA products were used to force an established ocean biogeochemical model to estimate surface carbon inventories and fluxes in the global oceans. The results were compared to public archives of in situ carbon data and estimates. The model exhibited skill for ocean dissolved inorganic carbon (DIC), partial pressure of ocean CO2 (pCO2), and air-sea fluxes (FCO2). The MERRA-forced model produced global mean differences of 0.02% (approximately 0.3 µmol kg(-1)) for DIC, -0.3% (about -1.2 µatm; model lower) for pCO2, and -2.3% (-0.003 mol C m(-2) y(-1)) for FCO2 compared to in situ estimates. Basin-scale distributions were significantly correlated with observations for all three variables (r=0.97, 0.76, and 0.73, P<0.05, respectively, for DIC, pCO2, and FCO2). All major oceanographic basins were represented as sources to the atmosphere or sinks in agreement with in situ estimates. However, there were substantial basin-scale and local departures.

We estimate a firm-year measure of accounting conservatism, examine its empirical properties as a metric, and illustrate applications by testing new hypotheses that shed further light on the nature and effects of conservatism. The results are consistent with the measure, C_Score, capturing variation in conservatism and also predicting asymmetric earnings timeliness at horizons of up to three years ahead. Cross-sectional hypothesis tests suggest firms with longer investment cycles, higher idio...
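The firm-year measure builds on asymmetric earnings timeliness: under conservative accounting, earnings reflect bad news (negative returns) faster than good news. A minimal sketch of the underlying Basu-style piecewise-linear regression on synthetic data is below; the data-generating coefficients, sample size, and noise level are invented for illustration, and this is not the C_Score estimation procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
ret = rng.normal(0, 0.2, n)            # annual stock returns (news proxy)
neg = (ret < 0).astype(float)          # indicator for bad news
# Conservative accounting: earnings respond more strongly to bad news than good news
earn = 0.01 + 0.05 * ret + 0.30 * neg * ret + rng.normal(0, 0.02, n)

# Basu-style regression: earn = b0 + b1*neg + b2*ret + b3*neg*ret
X = np.column_stack([np.ones(n), neg, ret, neg * ret])
beta, *_ = np.linalg.lstsq(X, earn, rcond=None)
print(f"good-news timeliness b2 = {beta[2]:.3f}, "
      f"incremental bad-news timeliness b3 = {beta[3]:.3f}")
```

A positive b3 is the asymmetric-timeliness signature the measure is validated against.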

Background Schistosomiasis is a water-based disease that is believed to affect over 200 million people with an estimated 97% of the infections concentrated in Africa. However, these statistics are largely based on population re-adjusted data originally published by Utroska and colleagues more than 20 years ago. Hence, these estimates are outdated due to large-scale preventive chemotherapy programs, improved sanitation, water resources development and management, among other reasons. For planning, coordination, and evaluation of control activities, it is essential to possess reliable schistosomiasis prevalence maps. Methodology We analyzed survey data compiled on a newly established open-access global neglected tropical diseases database (i) to create smooth empirical prevalence maps for Schistosoma mansoni and S. haematobium for individuals aged ≤20 years in West Africa, including Cameroon, and (ii) to derive country-specific prevalence estimates. We used Bayesian geostatistical models based on environmental predictors to take into account potential clustering due to common spatially structured exposures. Prediction at unobserved locations was facilitated by joint kriging. Principal Findings Our models revealed that 50.8 million individuals aged ≤20 years in West Africa are infected with either S. mansoni, or S. haematobium, or both species concurrently. The country prevalence estimates ranged between 0.5% (The Gambia) and 37.1% (Liberia) for S. mansoni, and between 17.6% (The Gambia) and 51.6% (Sierra Leone) for S. haematobium. We observed that the combined prevalence for both schistosome species is two-fold lower in Gambia than previously reported, while we found an almost two-fold higher estimate for Liberia (58.3%) than reported before (30.0%). Our predictions are likely to overestimate overall country prevalence, since modeling was based on children and adolescents up to the age of 20 years who are at highest risk of infection. Conclusion/Significance We

The knowledge of the solar irradiation in a certain place is fundamental for the suitable location of solar systems, both thermal and photovoltaic. On the local scale, the topography is the most important modulating factor of the solar irradiation on the surface. In this work the global daily irradiation is estimated concerning various sky conditions, in zones of complex topography. In order to estimate the global daily irradiation we use a methodology based on a Digital Terrain Model (DTM), on one hand making use of pyranometer measurements and on the other hand utilizing satellite images. We underline that DTM application employing pyranometer measurements produces better results than estimation using satellite images, though accuracy of the same order is obtained in both cases for Root Mean Square Error (RMSE) and Mean Bias Error (MBE).

GEM is a public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) to build an independent standard for modeling and communicating earthquake risk worldwide. GEM is aimed at providing authoritative, open information about seismic risk and decision tools to support mitigation. GEM will also raise risk awareness and help post-disaster economic development, with the ultimate goal of reducing the toll of future earthquakes. GEM will provide a unified set of seismic hazard, risk, and loss modeling tools based on a common global IT infrastructure and consensus standards. These tools, systems, and standards will be developed in partnership with organizations around the world, with coordination by the GEM Secretariat and its Secretary General. GEM partners will develop a variety of global components, including a unified earthquake catalog, fault database, and ground motion prediction equations. To ensure broad representation and community acceptance, GEM will include local knowledge in all modeling activities, incorporate existing detailed models where possible, and independently test all resulting tools and models. When completed in five years, GEM will have a versatile, openly accessible modeling environment that can be updated as necessary, and will provide the global standard for seismic hazard, risk, and loss models to government ministers, scientists and engineers, financial institutions, and the public worldwide. GEM is now underway with key support provided by private sponsors (Munich Reinsurance Company, Zurich Financial Services, AIR Worldwide Corporation, and Willis Group Holdings); countries including Belgium, Germany, Italy, Singapore, Switzerland, and Turkey; and groups such as the European Commission. The GEM Secretariat has been selected by the OECD and will be hosted at the Eucentre at the University of Pavia in Italy; the Secretariat is now formalizing the creation of the GEM Foundation. Some of GEM's global

Chronic obstructive pulmonary disease (COPD) and asthma are common diseases with a heterogeneous distribution worldwide. Here, we present methods and disease and risk estimates for COPD and asthma from the Global Burden of Diseases, Injuries, and Risk Factors (GBD) 2015 study. The GBD study provides annual updates on estimates of deaths, prevalence, and disability-adjusted life years (DALYs), a summary measure of fatal and non-fatal disease outcomes, for over 300 diseases and injuries, for 188 countries from 1990 to the most recent year. We estimated numbers of deaths due to COPD and asthma using the GBD Cause of Death Ensemble modelling (CODEm) tool. First, we analysed data from vital registration and verbal autopsy for the aggregate category of all chronic respiratory diseases. Subsequently, models were run for asthma and COPD relying on covariates to predict rates in countries that have incomplete or no vital registration data. Disease estimates for COPD and asthma were based on systematic reviews of published papers, unpublished reports, surveys, and health service encounter data from the USA. We used the Global Initiative for Chronic Obstructive Lung Disease spirometry-based definition as the reference for COPD and a reported diagnosis of asthma with current wheeze as the definition of asthma. We used a Bayesian meta-regression tool, DisMod-MR 2.1, to derive estimates of prevalence and incidence. We estimated population-attributable fractions for risk factors for COPD and asthma from exposure data, relative risks, and a theoretical minimum exposure level. Results were stratified by Socio-demographic Index (SDI), a composite measure of income per capita, mean years of education over the age of 15 years, and total fertility rate. In 2015, 3·2 million people (95% uncertainty interval [UI] 3·1 million to 3·3 million) died from COPD worldwide, an increase of 11·6% (95% UI 5·3 to 19·8) compared with 1990. There was a decrease in age-standardised death rate of

Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion's share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation from the last 10 years. To implement this survey, we have proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we have chosen specialised journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals. To analyse each proposal, the following four questions have been set: Which intelligent technique is used? How have data been collected? How are the results validated? And which construction cost estimation factors have been used? From the results of this survey, two main contributions have been produced. The first contribution is identifying the research gap in this area, which has not been fully covered by previous proposals for construction cost estimation. The second contribution of this survey is proposing and highlighting future directions for forthcoming proposals, aimed ultimately at finding the optimal approach to construction cost estimation. Moreover, we consider the second part of our methodology as one of our contributions in this paper, as it has been proposed as a standard benchmark for construction cost estimation proposals.

Highlights: ► The Bristow–Campbell model was calibrated and validated over the Tibetan Plateau. ► A simple method is developed to rasterise daily global solar radiation and obtain gridded information. ► The daily global solar radiation spatial distribution over the Tibetan Plateau was estimated. - Abstract: Daily global solar radiation is fundamental to most ecological and biophysical processes because it plays a key role in the local and global energy budget. However, gridded information about the spatial distribution of solar radiation is limited. This study aims to parameterise the Bristow–Campbell model for daily global solar radiation estimation in the Tibetan Plateau and to propose a method to rasterise daily global solar radiation. Observed daily solar radiation and diurnal temperature data from eleven stations over the Tibetan Plateau during 1971–2010 were used to calibrate and validate the Bristow–Campbell radiation model. The extraterrestrial radiation and clear-sky atmospheric transmittance were calculated on a Geographic Information System (GIS) platform. Results show that the Bristow–Campbell model performs well after adjusting the parameters: the average Pearson's correlation coefficient (r), root mean square error (RMSE), ratio of the RMSE to the standard deviation of measured data (RSR), and Nash–Sutcliffe efficiency (NSE) across the 11 stations are 0.85, 2.81 MJ m-2 day-1, 0.3, and 0.77, respectively. Gridded maximum and minimum average temperature data were obtained using the Parameter-elevation Regressions on Independent Slopes Model (PRISM) and validated against Chinese Ecosystem Research Network (CERN) station data. The spatial daily global solar radiation distribution pattern was estimated and analysed by combining the solar radiation model (Bristow–Campbell) and the meteorological interpolation model (PRISM). Based on the overall results, it can be concluded that a calibrated Bristow–Campbell model performs well
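The Bristow–Campbell model estimates daily global solar radiation by scaling extraterrestrial radiation with a transmittance derived from the diurnal temperature range. A minimal sketch is below; the A, B, C coefficients are generic illustrative defaults, not the values calibrated for the Tibetan Plateau stations, and the FAO-56 formulation is used for extraterrestrial radiation.

```python
import math

def extraterrestrial_radiation(lat_deg: float, doy: int) -> float:
    """Daily extraterrestrial radiation Ra (MJ m-2 day-1), FAO-56 formulation."""
    gsc = 0.0820  # solar constant, MJ m-2 min-1
    lat = math.radians(lat_deg)
    dr = 1 + 0.033 * math.cos(2 * math.pi * doy / 365)       # inverse relative Earth-Sun distance
    decl = 0.409 * math.sin(2 * math.pi * doy / 365 - 1.39)  # solar declination (rad)
    ws = math.acos(-math.tan(lat) * math.tan(decl))          # sunset hour angle (rad)
    return (24 * 60 / math.pi) * gsc * dr * (
        ws * math.sin(lat) * math.sin(decl)
        + math.cos(lat) * math.cos(decl) * math.sin(ws))

def bristow_campbell(tmax: float, tmin: float, ra: float,
                     A: float = 0.75, B: float = 0.01, C: float = 2.0) -> float:
    """Daily global solar radiation (MJ m-2 day-1) from the diurnal temperature range.
    A, B, C are site-specific empirical coefficients (illustrative defaults here)."""
    dT = tmax - tmin
    transmittance = A * (1 - math.exp(-B * dT ** C))
    return transmittance * ra

ra = extraterrestrial_radiation(lat_deg=30.0, doy=172)  # plateau-like latitude, summer solstice
rs = bristow_campbell(tmax=22.0, tmin=8.0, ra=ra)
print(f"Ra = {ra:.1f} MJ m-2 d-1, estimated Rs = {rs:.1f} MJ m-2 d-1")
```

In the study, A, B, C would be fitted per station against the 1971–2010 observations rather than taken as these defaults.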

Highlights: • Transferability of SVM for estimation of solar radiation is investigated. • Radiation at an estimation site can be well estimated by an SVM developed at a source site. • A strategy for selecting a suitable source site is presented. • SVM accuracy is affected by the distance and temperature difference between the two sites. • RMSE of SVM shows a logarithmic or linear relationship with the altitude of the source site. - Abstract: Exploring novel methods for estimating global solar radiation from air temperature has been a focus of many studies. This paper evaluates the transferability of support vector machines (SVM) for estimation of solar radiation in the subtropical zone of China. Results suggest that solar radiation at one site (the estimation site) can be well estimated by an SVM model developed at another site (the source site). The accuracy of estimation is affected by the distance and temperature difference between the two sites, and by the altitude of the source site. Higher correlations between the RMSE of the SVM and distance and temperature differences are observed in the northeastern region, increasing the reliability and confidence of SVM models developed at nearby stations, while lower correlations are observed in the southwestern plateau region. When the altitude of the estimation site is lower than 1200 m, RMSE shows a logarithmic relationship with the altitude of source sites whose altitude is lower than that of the estimation site. Otherwise, RMSE shows a linear relationship with the altitude of source sites whose altitude is higher than 200 m but lower than that of the estimation site. This result suggests that solar radiation can also be estimated using an SVM model developed at a site with a similar but lower altitude. Based on these results, a strategy that takes into account climatic conditions, topography, distance, and altitude for selecting a suitable source site is presented. The findings can guide and ease the appropriate choice of
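The transferability idea, training at one site and applying at another, can be sketched with support vector regression on synthetic temperature-radiation data. Everything here (the data-generating relation, the site bias, the SVR hyperparameters) is invented for illustration and is not the study's data or configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

def synthetic_site(n, bias=0.0):
    """Synthetic daily records: diurnal temperature range -> solar radiation.
    Purely illustrative; not the stations or data used in the study."""
    dtr = rng.uniform(2, 18, n)  # diurnal temperature range (deg C)
    rad = 30 * (1 - np.exp(-0.02 * dtr ** 2)) + bias + rng.normal(0, 1, n)
    return dtr.reshape(-1, 1), rad

X_src, y_src = synthetic_site(300)            # "source" site with observations
X_est, y_est = synthetic_site(100, bias=0.5)  # "estimation" site, slightly different climate

# Train at the source site, evaluate transfer error at the estimation site
model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X_src, y_src)
rmse = mean_squared_error(y_est, model.predict(X_est)) ** 0.5
print(f"transfer RMSE: {rmse:.2f} MJ m-2 d-1")
```

The paper's contribution is precisely how this transfer RMSE grows with distance, temperature difference, and altitude gaps between the two sites.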

The Global Ice Sheet Mapping Orbiter (GISMO) mission was developed to address scientific needs to understand the polar ice subsurface structure. This NASA Instrument Incubator Program project is a collaboration between Ohio State University, the University of Kansas, Vexcel Corporation and NASA. The GISMO design utilizes an interferometric SAR (InSAR) strategy in which ice sheet reflected signals received by a dual-antenna system are used to produce an interference pattern. The resulting interferogram can be used to filter out surface clutter so as to reveal the signals scattered from the base of the ice sheet. These signals are further processed to produce 3D-images representing basal topography of the ice sheet. In the past three years, the GISMO airborne field campaigns that have been conducted provide a set of useful data for studying geophysical properties of the Greenland ice sheet. While topography information can be obtained using interferometric SAR processing techniques, ice sheet roughness statistics can also be derived by a relatively simple procedure that involves analyzing power levels and the shape of the radar impulse response waveforms. An electromagnetic scattering model describing GISMO impulse responses has previously been proposed and validated. This model suggested that rms-heights and correlation lengths of the upper surface profile can be determined from the peak power and the decay rate of the pulse return waveform, respectively. This presentation will demonstrate a procedure for estimating the roughness of ice surfaces by fitting the GISMO impulse response model to retrieved waveforms from selected GISMO flights. Furthermore, an extension of this procedure to estimate the scattering coefficient of the glacier bed will be addressed as well. Planned future applications involving the classification of glacier bed conditions based on the derived scattering coefficients will also be described.

Assessment of the ability of climate policies to produce desired improvements in public health through co-benefits of air pollution reduction can consume resources in both time and research funds. These resources increase significantly as the spatial resolution of models increases. In addition, the level of spatial detail available in macroeconomic models at the heart of climate policy assessments is much lower than that available in traditional human health risk modeling. It is therefore important to determine whether increasing spatial resolution considerably affects risk-based decisions; which kinds of decisions might be affected; and under what conditions they will be affected. Human health risk co-benefits from carbon emissions reductions that bring about concurrent reductions in particulate matter (PM10) emissions are therefore examined here at four levels of spatial resolution (Uniform Nation, Uniform Region, Uniform County/City, Health Risk Assessment) in a case study of Taiwan as one of the geographic regions of a global macroeconomic model, with results that are representative of small, industrialized nations within that global model. A metric of human health risk mortality (YOLL, years of life lost in life expectancy) is compared under assessments ranging from a "uniform simulation", in which there is no spatial resolution of changes in ambient air concentration under a policy, to a "highly spatially resolved simulation" (called here Health Risk Assessment). PM10 is chosen in this study as the indicator of air pollution for which risks are assessed due to its significance as a co-benefit of carbon emissions reductions within climate mitigation policy. For the policy examined, the four estimates of mortality in the entirety of Taiwan are 747 YOLL, 834 YOLL, 984 YOLL and 916 YOLL under Uniform Taiwan, Uniform Region, Uniform County and Health Risk Assessment respectively; or differences of 18%, 9%, 7% if the HRA methodology is taken as the baseline. While
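The quoted spread across resolutions follows directly from the four YOLL totals; taking the Health Risk Assessment estimate as the baseline, the 18%, 9%, and 7% differences can be reproduced:

```python
# Mortality estimates (YOLL, years of life lost) at four levels of spatial resolution
estimates = {
    "Uniform Taiwan": 747,
    "Uniform Region": 834,
    "Uniform County": 984,
    "Health Risk Assessment": 916,  # most spatially resolved; taken as the baseline
}
baseline = estimates["Health Risk Assessment"]
for name, yoll in estimates.items():
    if name != "Health Risk Assessment":
        diff = abs(yoll - baseline) / baseline * 100
        print(f"{name}: {yoll} YOLL ({diff:.0f}% difference from HRA)")
```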

The most exciting initiative in recent polar studies was the International Polar Year (IPY) in 2007-2008. The IPY witnessed a growing community of seismologists who made considerable efforts to acquire high-quality data in polar regions. It also provided an excellent opportunity to make significant advances in seismic instrumentation of the polar regions in pursuit of scientific targets involving global issues. Taking these aspects into account, we have organized and published a special issue of Polar Science on recent advances in polar seismology and cryoseismology as fruitful achievements of the IPY.

The world gas expansion had already shown its limits in 2012 when apparent gas demand had only increased by 2.3%, down from an average growth of 2.8% per year in the previous decade. In 2013, the growth in apparent gas demand slowed even more substantially to 0.8%, according to CEDIGAZ's first estimates. The growth of natural gas has been limited by several factors on both the demand and supply sides against a background of economic and geopolitical turmoil. On the demand side, natural gas still suffers in particular from severe competition with coal in the power generation sector. The singular case of the European gas market is quite instructive. However, natural gas continued to gain ground against fuel oil in most markets. Japan's power generation mix was nuclear free by the end of the year, due to the maintenance period on the two reactors still in operation, despite strong support from the government to restart some of the country's 50 reactors. However, the recourse to LNG imports to compensate for the nuclear shortfall was less apparent in 2013, as conservation measures by consumers in a context of high import prices reduced electricity consumption. Japan's gas demand is now limited by the capacity of both its LNG importing infrastructures and combined-cycle gas power plants. The future pace of restarts of nuclear reactors in Japan remains a matter of speculation. Japan's nuclear malaise has spilled over into neighbouring South Korea, where reactors have been shut by a safety certificate scandal and by other safety issues. These developments create further uncertainties on the LNG demand prospects in Northeast Asia. The global growth in natural gas has been increasingly constrained by supply and investment issues. On the supply side, the gas supply shortfall is generally due to the decline of mature and conventional fields, and an insufficient renewal of reserves. In most regions the reserves-to-replacement ratio has followed a

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible un...

The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects,

This article builds on the premise that human consumption of goods, food and transport is the ultimate driver of climate change. However, the nature of the climate change problem (well described as a tragedy of the commons) makes it difficult for individuals to recognise their personal duty to implement behavioural changes to reduce greenhouse gas emissions. Consequently, this article aims to analyse the climate change issue from a human-scale perspective, in which each of us has a clearly defined personal quota of CO2 emissions that limits our activity, and there is a finite time during which CO2 emissions must be eliminated to achieve the “well below 2°C” warming limit set by the Paris Agreement of 2015 (COP21). Thus, this work’s primary contribution is to connect an equal per capita fairness approach to a global carbon budget, linking personal levels with planetary levels. Here, we show that a personal quota of 5.0 tons of CO2 yr-1 p-1 is a representative value for both past and future emissions; for this level of constant per-capita emissions and without considering any mitigation, the global accumulated emissions compatible with the “well below 2°C” and 2°C targets will be exhausted by 2030 and 2050, respectively. These are reference years that provide an order of magnitude of the time that is left to reverse the global warming trend. More realistic scenarios that consider a smooth transition toward a zero-emission world show that the global accumulated emissions compatible with the “well below 2°C” and 2°C targets will be exhausted by 2040 and 2080, respectively. Implications of this paper include a return to personal responsibility following equity principles among individuals, and a definition of boundaries to the personal emissions of CO2. PMID:28628676
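The arithmetic behind the exhaustion years is simple: a constant per-capita quota times population gives annual global emissions, and dividing a remaining budget by that rate gives the horizon. In the sketch below, the population figure and the remaining-budget values are illustrative assumptions chosen to reproduce horizons of the same order as the paper's 2030/2050 results; they are not figures taken from the paper.

```python
def exhaustion_year(start_year: int, budget_gt: float,
                    per_capita_t: float = 5.0, population: float = 7.5e9) -> int:
    """Year when a remaining CO2 budget (Gt) is exhausted at constant per-capita emissions."""
    annual_gt = per_capita_t * population / 1e9  # global emissions, Gt CO2 per year
    return start_year + int(budget_gt / annual_gt)

# Illustrative remaining budgets (Gt CO2) chosen for this sketch -- not the paper's figures.
print(exhaustion_year(2017, 490))   # a "well below 2 C"-sized budget -> ~2030
print(exhaustion_year(2017, 1240))  # a 2 C-sized budget -> ~2050
```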

Emissions of biogenic volatile organic compounds (BVOC) are a chief uncertainty in calculating the burdens of important atmospheric compounds like tropospheric ozone or secondary organic aerosol, reflecting either imperfect chemical oxidation mechanisms or unreliable emission estimates, or both. To provide a starting point for a more systematic discussion, we review here global isoprene and monoterpene emission estimates to date. We note a surprisingly small variation in the predictions of the global isoprene emission rate, in stark contrast with our lack of process understanding and the small number of observations for model parameterisation and evaluation. Most of the models are based on similar emission algorithms, using fixed values for the emission capacity of various plant functional types. In some cases these values are very similar, but they differ substantially in other models. The similarities with regard to the global isoprene emission rate suggest that the dominant parameters driving the ultimate global estimate, and thus the dominant determinants of model sensitivity, are the specific emission algorithm and the isoprene emission capacity. But the models also differ broadly in their representation of net primary productivity, their method of biome coverage determination, and their climate data. Contrary to isoprene, monoterpene estimates show significantly larger model-to-model variation, although the variations in leaf algorithm, emission capacities, model upscaling, vegetation cover, and climatology used in terpene models are comparable to those used for isoprene. From our summary of published studies there appears to be no evidence that the terrestrial modelling community has been any more successful in "resolving unknowns" in the mechanisms that control global isoprene emissions than in those controlling global monoterpene emissions. Rather, the proliferation of common parameterization schemes within a large variety of model platforms
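The fixed-emission-capacity algorithms the review describes typically follow the Guenther et al. (1993) form, where a plant-functional-type capacity at standard conditions is modulated by light and temperature activity factors. A sketch with the commonly quoted G93 constants is below; the emission capacity value is a placeholder, and the constants should be checked against whichever model is actually being compared.

```python
import math

def guenther_isoprene(par: float, t_leaf: float, eps: float = 24.0) -> float:
    """Guenther et al. (1993)-style isoprene emission: E = eps * gamma_L * gamma_T.
    eps is the PFT-specific emission capacity at standard conditions
    (PAR = 1000 umol m-2 s-1, T = 303 K); 24 ug g-1 h-1 is a placeholder value."""
    R = 8.314  # J mol-1 K-1
    alpha, c_l1 = 0.0027, 1.066                        # light-response constants
    c_t1, c_t2, t_s, t_m = 95000.0, 230000.0, 303.0, 314.0  # temperature-response constants
    gamma_l = alpha * c_l1 * par / math.sqrt(1 + alpha ** 2 * par ** 2)
    gamma_t = math.exp(c_t1 * (t_leaf - t_s) / (R * t_s * t_leaf)) / (
        1 + math.exp(c_t2 * (t_leaf - t_m) / (R * t_s * t_leaf)))
    return eps * gamma_l * gamma_t

# At standard conditions both activity factors are close to 1, so E is close to eps
print(round(guenther_isoprene(par=1000.0, t_leaf=303.0), 2))
```

The review's point is that models sharing this algorithm family can still diverge through their choices of eps, vegetation cover, productivity, and climate forcing.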

Objectives: The German socio-demographic estimation scale was developed by Jahn et al. (1) to quickly predict premorbid global cognitive functioning in patients. So far, it has been validated in healthy adults and has shown a good correlation with the full and verbal IQ of the Wechsler Adult Intelligence Scale (WAIS) in this group. However, there are no data regarding its use as a bedside test in epilepsy patients. Methods: Forty native German-speaking adult patients with refractory epilepsy were included. They completed a neuropsychological assessment, including a nine-scale short form of the German version of the WAIS-III and the German socio-demographic estimation scale by Jahn et al. (1), during their presurgical diagnostic stay in our center. We calculated means, correlations, and the rate of concordance (range ±5 and ±7.5 IQ score points) between these two measures for the whole group, and for a subsample of 19 patients with a global cognitive functioning level within 1 SD of the mean (IQ score range 85–115) who had completed their formal education before epilepsy onset. Results: The German demographic estimation scale by Jahn et al. (1) showed a significant mean overestimation of the global cognitive functioning level of eight points in the epilepsy patient sample compared with the short-form WAIS-III score. The accuracy within a range of ±5 or ±7.5 IQ score points for each patient was similar to that of the healthy controls reported by Jahn et al. (1) in our subsample, but not in our whole sample. Conclusion: Our results show that the socio-demographic scale by Jahn et al. (1) is not sufficiently reliable as an estimation tool for global cognitive functioning in epilepsy patients. It can be used to estimate global cognitive functioning in a subset of patients with a normal global cognitive functioning level who completed their formal education before epilepsy onset, but it does not reliably predict global cognitive functioning in epilepsy patients

In this work, the current version of the satellite-based HELIOSAT method and ground-based linear Ångström–Prescott-type relations are used in combination. The first approach is based on a correlation between daily bright sunshine hours (s) and the cloud index (n). In the second approach, a new correlation is proposed between daily solar irradiation and daily data of s and n, based on a physical parameterization. The performance of the two proposed combined models is tested against conventional methods. We test the use of the obtained correlation coefficients for nearby locations. Our results show that the use of sunshine duration together with the cloud index is quite satisfactory for the estimation of daily horizontal global solar irradiation. We propose using the new approaches to estimate daily global irradiation when bright sunshine hours data are available for the location of interest, provided that some regression coefficients are determined using the data of a nearby station. In addition, if surface data for a close location do not exist, then it is recommended to use satellite models like HELIOSAT or the new approaches instead of the Ångström-type models. - Highlights: • Satellite imagery together with surface measurements in solar radiation estimation. • The new coupled and conventional models (satellite- and ground-based) are analyzed. • New models result in highly accurate estimation of daily global solar irradiation
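The ground-based leg of this combination is the Ångström–Prescott relation H/H0 = a + b·(s/S0), relating daily irradiation H to bright sunshine hours s, day length S0, and extraterrestrial irradiation H0. A minimal sketch follows; the default a and b are the widely quoted generic values, standing in for the locally regressed coefficients the paper calls for.

```python
def angstrom_prescott(s: float, s0: float, h0: float,
                      a: float = 0.25, b: float = 0.50) -> float:
    """Daily global irradiation H from bright sunshine hours s.
    H/H0 = a + b * (s / S0); a and b are site-specific regression coefficients
    (the commonly quoted generic defaults 0.25/0.50 are placeholders here)."""
    return (a + b * s / s0) * h0

# Example: 8 h of bright sunshine in a 12 h day, H0 = 35 MJ m-2
print(round(angstrom_prescott(s=8, s0=12, h0=35.0), 2))
```

In the paper's combined models, the cloud index n from HELIOSAT enters alongside s, which is what lets the regression be transferred to nearby stations.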

Full Text Available Introduction: Standing height is an important anthropometric parameter used to track longitudinal growth, to estimate body fatness and to calculate energy requirement. Measurement of height may be difficult in children who cannot stand. Aim: To establish a regression equation for estimation of height from arm span in children, and to check the comparative relevance of this equation against a fixed height-to-arm-span ratio (HAR) for estimation of height. Materials and Methods: A cross-sectional study was conducted with 6-11 year old school children (n=1465; boys=774, girls=691) in the state of Odisha, India. Height was measured by portable stadiometer and arm span was measured by fiberglass measuring tape to the nearest 0.1 cm. Pearson correlation and regression analysis were carried out between height and arm span data. p<0.05 (two-tailed) was considered statistically significant. Results: Mean height and arm span in boys (124.16±8.74 cm and 125.57±10.43 cm respectively) were significantly greater (p<0.001) than height and arm span in girls (121.18±10.37 cm and 121.50±11.68 cm respectively). Mean HAR was 0.9942±0.0279. The correlation between height and arm span was r = 0.94 (p<0.001) in boys and r = 0.96 (p<0.001) in girls. The overall correlation coefficient was r = 0.95 (p<0.001). The regression equation for estimation of height from arm span was established: Height (cm) = 0.8192 * arm span (cm) + 21.46. Conclusion: Height in children of 6-11 years showed a strong positive correlation with arm span. The regression equation established from this study can be used to estimate height from arm span. This estimation is more reliable than estimation of height from HAR.
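The two estimators compared in this abstract are simple enough to encode directly. A minimal sketch, using only the coefficients quoted above (regression slope 0.8192, intercept 21.46, mean HAR 0.9942); the function names are illustrative:

```python
def height_from_arm_span(arm_span_cm: float) -> float:
    """Estimate standing height (cm) from arm span (cm) using the
    study's regression: Height = 0.8192 * arm span + 21.46."""
    return 0.8192 * arm_span_cm + 21.46

def height_from_har(arm_span_cm: float, har: float = 0.9942) -> float:
    """Alternative estimate from the fixed mean height-to-arm-span ratio."""
    return har * arm_span_cm

# For the mean male arm span (125.57 cm) the two estimates differ by ~0.5 cm.
print(round(height_from_arm_span(125.57), 2))  # → 124.33
print(round(height_from_har(125.57), 2))       # → 124.84
```

For arm spans near the sample mean the two methods nearly agree; the study's point is that the regression tracks the data better across the full range than the fixed ratio does.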

The estimation of parameter values for mathematical models of biological systems is an optimization problem that is particularly challenging due to the nonlinearities involved. One major difficulty is the existence of multiple local minima in which standard optimization methods may become trapped during the search. Deterministic global optimization methods overcome this limitation, ensuring convergence to the global optimum within a desired tolerance. Global optimization techniques are usually classified into stochastic and deterministic. The former typically lead to lower CPU times but offer no guarantee of convergence to the global minimum in a finite number of iterations. In contrast, deterministic methods provide solutions of a given quality (i.e., optimality gap), but tend to incur large computational burdens. This work presents a deterministic outer approximation-based algorithm for the global optimization of dynamic problems arising in the parameter estimation of models of biological systems. Our approach, which offers a theoretical guarantee of convergence to the global minimum, is based on reformulating the set of ordinary differential equations into an equivalent set of algebraic equations through the use of orthogonal collocation methods, giving rise to a nonconvex nonlinear programming (NLP) problem. This nonconvex NLP is decomposed into two hierarchical levels: a master mixed-integer linear programming (MILP) problem that provides a rigorous lower bound on the optimal solution, and a reduced-space slave NLP that yields an upper bound. The algorithm iterates between these two levels until a termination criterion is satisfied. The capabilities of our approach were tested on two benchmark problems, in which the performance of our algorithm was compared with that of the commercial global optimization package BARON. The proposed strategy produced near-optimal solutions (i.e., within a desired tolerance) in a fraction of the CPU time required by BARON.

Schistosomiasis is a water-based disease that is believed to affect over 200 million people with an estimated 97% of the infections concentrated in Africa. However, these statistics are largely based on population re-adjusted data originally published by Utroska and colleagues more than 20 years...... ago. Hence, these estimates are outdated due to large-scale preventive chemotherapy programs, improved sanitation, water resources development and management, among other reasons. For planning, coordination, and evaluation of control activities, it is essential to possess reliable schistosomiasis...

Full Text Available Ionosphere research using Global Navigation Satellite System (GNSS) techniques is a hot topic, given their unprecedentedly high temporal and spatial sampling rate. We introduce a new GNSS Ionosphere Monitoring and Analysis Software (GIMAS) in order to model global ionosphere vertical total electron content (VTEC) maps and to estimate the GPS and GLObalnaya NAvigatsionnaya Sputnikovaya Sistema (GLONASS) satellite and receiver differential code biases (DCBs). The GIMAS-based Global Ionosphere Map (GIM) products during low (day of year 202 to 231, in 2008) and high (day of year 050 to 079, in 2014) solar activity periods were investigated and assessed. The results showed that the biases of the GIMAS-based VTEC maps relative to the International GNSS Service (IGS) Ionosphere Associate Analysis Centers (IAACs) VTEC maps ranged from −3.0 to 1.0 TECU (TEC unit; 1 TECU = 1 × 10^16 electrons/m2). The standard deviations (STDs) ranged from 0.7 to 1.9 TECU in 2008, and from 2.0 to 8.0 TECU in 2014. The STDs at low latitudes were significantly larger than those at middle and high latitudes, as a result of the ionospheric latitudinal gradients. When compared with the Jason-2 VTEC measurements, the GIMAS-based VTEC maps showed a negative systematic bias of about −1.8 TECU in 2008, and a positive systematic bias of about +2.2 TECU in 2014. The STDs were about 2.0 TECU in 2008, and ranged from 2.2 to 8.5 TECU in 2014. Furthermore, the aforementioned characteristics were strongly related to the conditions of the ionosphere variation and the geographic latitude. The GPS and GLONASS satellite and receiver P1-P2 DCBs were compared with the IAACs DCBs. The root mean squares (RMSs) were 0.16–0.20 ns in 2008 and 0.13–0.25 ns in 2014 for the GPS satellites and 0.26–0.31 ns in 2014 for the GLONASS satellites. The RMSs of receiver DCBs were 0.21–0.42 ns in 2008 and 0.33–1.47 ns in 2014 for GPS and 0.67–0.96 ns in 2014 for GLONASS. The monthly

To investigate the published papers in ophthalmology over the past ten years and explore the development of the field, data were retrieved from Science Citation Index Expanded and downloaded online in November 2017; all papers with publication years 2007-2016 were analyzed. The papers were categorized based on the Web of Science category and the journals based on the Journal Citation Report category. The number of ophthalmology papers increased from 7450 to 9089 between 2007 and 2016, an average increase of 2.2% annually. The USA accounts for one third of the total and two thirds of the highly cited papers. In Asia, China, Japan and South Korea were in the top 10 by number of ophthalmology papers. The UK, Germany, Japan and Australia also had great impact on global ophthalmology. The hot spots included endothelial growth factor, optical coherence tomography and open-angle glaucoma. The USA is in the leading position in global ophthalmology. Some Asian countries play an important role in the development of ophthalmology, but their impact needs to be improved.

The International Heliophysical Year (IHY) in 2007 & 2008 will celebrate the 50th anniversary of the International Geophysical Year (IGY) and, following its tradition of international research collaboration, will focus on the cross-disciplinary studies of universal processes in the heliosphere. The main goal of IHY Education and Outreach Program is to create more global access to exemplary resources in space and earth science education and public outreach. By taking advantage of the IHY organization with representatives in every nation and in the partnership with the United Nations Basic Space Science Initiative (UNBSSI), we aim to promote new international partnerships. Our goal is to assist in increasing the visibility and accessibility of exemplary programs and in the identification of formal or informal educational products that would be beneficial to improve the space and earth science knowledge in a given country; leaving a legacy of enhanced global access to resources and of world-wide connectivity between those engaged in education and public outreach efforts that are related to IHY science. Here we describe how to participate in the IHY Education and Outreach Program and the benefits in doing so. Emphasis will be given to the role played by developing countries; not only in selecting useful resources and helping in their translation and adaptation, but also in providing different approaches and techniques in teaching.

Drylands are among the regions most sensitive to climate and environmental changes and human-induced perturbations. The most widely accepted definition of drylands is that the Surface Wetness Index (SWI), the ratio of annual precipitation to potential evapotranspiration (PET), is below 0.65. PET is commonly estimated using the Thornthwaite (PET_Th) and Penman-Monteith (PET_PM) equations. The present study compared spatiotemporal characteristics of global drylands based on the SWI with PET_Th and PET_PM. Results showed vast differences between PET_Th and PET_PM; however, the SWI derived from the two kinds of PET showed broadly similar characteristics in the interdecadal variability of global and continental drylands, except in North America, with high correlation coefficients ranging from 0.58 to 0.89. It was found that, during 1901-2014, global hyper-arid and semi-arid regions expanded, arid and dry sub-humid regions contracted, and drylands underwent interdecadal fluctuation. This was because precipitation variations made the major contributions, whereas PET changes contributed to a much lesser degree. However, distinct differences in the interdecadal variability of semi-arid and dry sub-humid regions were found. This indicated that the influence of PET changes was comparable to that of precipitation variations in the global dry-wet transition zone. Additionally, the contribution of PET changes to the variations in global and continental drylands gradually strengthened with global warming, and the Thornthwaite method was found to be increasingly less applicable under climate change.
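A per-cell classifier based on the SWI can be sketched directly. Note that the inner class thresholds (0.05, 0.20, 0.50) are the conventional UNEP aridity cut-offs and are an assumption here; the abstract itself fixes only the 0.65 dryland boundary.

```python
def classify_dryland(precip_mm: float, pet_mm: float) -> str:
    """Classify a location by the Surface Wetness Index SWI = P / PET.
    Sub-class thresholds follow the common UNEP aridity classes (an
    assumption; the abstract only defines drylands as SWI < 0.65)."""
    swi = precip_mm / pet_mm
    if swi < 0.05:
        return "hyper-arid"
    elif swi < 0.20:
        return "arid"
    elif swi < 0.50:
        return "semi-arid"
    elif swi < 0.65:
        return "dry sub-humid"
    return "humid"

print(classify_dryland(200.0, 1600.0))  # SWI = 0.125 → arid
```

Running the same classifier with PET_Th versus PET_PM estimates for the same cell is exactly how the dryland extent differences discussed above arise.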

Globalization is often referred to as external to education - a state of affair facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways...... of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish Pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is "flat". While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between "old" countries and "new". As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

Introduction: The aims of this study were to estimate all-cause and cause-specific mortality and years of life lost, investigated by disability-adjusted life-years (DALYs), due to colorectal cancer attributable to physical inactivity in Brazil and in its states; and to analyze the temporal trend of these estimates over 25 years (1990–2015) compared with global estimates and according to the socioeconomic status of the states of Brazil. Methods: Databases from the Global Burden of Disease Study (GBD) for Brazil, the Brazilian states and global information were used. The total number and the age-standardized rates of deaths and DALYs for colorectal cancer attributable to physical inactivity were estimated for the years 1990 and 2015. We used the Socioeconomic Development Index (SDI). Results: Physical inactivity was responsible for a substantial number of deaths (1990: 1,302; 2015: 119,351) and DALYs (1990: 31,121; 2015: 87,116) due to colorectal cancer in Brazil. From 1990 to 2015, mortality and DALYs due to colorectal cancer attributable to physical inactivity increased in Brazil (0.6% and 0.6%, respectively) and decreased around the world (-0.8% and -1.1%, respectively). The Brazilian states with better socioeconomic indicators had higher rates of mortality and morbidity from colorectal cancer due to physical inactivity (p<0.05). Conclusions: Over 25 years, the Brazilian population showed more worrisome results than the world as a whole. Actions to combat physical inactivity and greater cancer screening and treatment are urgent in the Brazilian states. PMID:29390002

The growing health risks associated with greenhouse gas emissions highlight the need for new energy policies that emphasize efficiency and low-carbon energy intensity. We assessed the relationships among electricity use, coal consumption, and health outcomes. Using time-series data sets from 41 countries with varying development trajectories between 1965 and 2005, we developed an autoregressive model of life expectancy (LE) and infant mortality (IM) based on electricity consumption, coal consumption, and previous year's LE or IM. Prediction of health impacts from the Greenhouse Gas and Air Pollution Interactions and Synergies (GAINS) integrated air pollution emissions health impact model for coal-fired power plants was compared with the time-series model results. The time-series model predicted that increased electricity consumption was associated with reduced IM for countries that started with relatively high IM (> 100/1,000 live births) and low LE (electricity consumption regardless of IM and LE in 1965. Increasing coal consumption was associated with increased IM and reduced LE after accounting for electricity consumption. These results are consistent with results based on the GAINS model and previously published estimates of disease burdens attributable to energy-related environmental factors, including indoor and outdoor air pollution and water and sanitation. Increased electricity consumption in countries with IM consumption has significant detrimental health impacts.

We use observations of time variable gravity from GRACE to estimate mass changes for the Antarctic and Greenland Ice Sheets, the Glaciers and Ice Caps (GIC) and land water storage for the time period 2002-2015 and evaluate their total contribution to sea level. We calculate regional sea level changes from these present day mass fluxes using an improved scaling factor for the GRACE data that accounts for the spatial and temporal variability of the observed signal. We calculate a separate scaling factor for the annual and the long-term components of the GRACE signal. To estimate the contribution of the GIC, we use a least square mascon approach and we re-analyze recent inventories to optimize the distribution of mascons and recover the GRACE signal more accurately. We find that overall, Greenland controls 43% of the global trend in eustatic sea level rise, 16% for Antarctica and 29% for the GIC. The contribution from the GIC is dominated by the mass loss of the Canadian Arctic Archipelago, followed by Alaska, Patagonia and the High Mountains of Asia. We report a marked increase in mass loss for the Canadian Arctic Archipelago. In Greenland, following the 2012 high summer melt, years 2013 and 2014 have slowed down the increase in mass loss, but our results will be updated with summer 2015 observations at the meeting. In Antarctica, the mass loss is still on the rise with increased contributions from the Amundsen Sea sector and surprisingly from the Wilkes Land sector of East Antarctica, including Victoria Land. Conversely, the Queen Maud Land sector experienced a large snowfall in 2009-2013 and has now returned to near-zero mass gain since 2013. We compare sea level changes from these GRACE derived mass fluxes after including the atmospheric and ocean loading signal with sea level change from satellite radar altimetry (AVISO) corrected for the steric signal of the ocean using Argo measurements, and find an excellent agreement in amplitude, phase and trend in these estimates.

U.S. Environmental Protection Agency — The set of commercially available chemical substances in commerce that may have significant global warming potential (GWP) is not well defined. Although there are...

Full Text Available Methyl chloride (CH3Cl) is a chlorine-containing trace gas in the atmosphere contributing significantly to stratospheric ozone depletion. Large uncertainties currently exist in estimates of its source and sink magnitudes and their temporal and spatial variations. GEIA inventories and other bottom-up emission estimates are used to construct a priori maps of the surface fluxes of CH3Cl. The Model of Atmospheric Transport and Chemistry (MATCH), driven by NCEP interannually varying meteorological data, is then used to simulate CH3Cl mole fractions and quantify the time series of sensitivities of the mole fractions at each measurement site to the surface fluxes of various regional and global sources and sinks. We then implement the Kalman filter (with the unit pulse response method) to estimate the surface fluxes on regional/global scales with monthly resolution from January 2000 to December 2004. High-frequency observations from the AGAGE, SOGE, NIES, and NOAA/ESRL HATS in situ networks and low-frequency observations from the NOAA/ESRL HATS flask network are used to constrain the source and sink magnitudes. The inversion results indicate global total emissions around 4100 ± 470 Gg yr−1, with very large emissions of 2200 ± 390 Gg yr−1 from tropical plants, which turn out to be the largest single source in the CH3Cl budget. Relative to their a priori annual estimates, the inversion increases global annual fungal and tropical emissions, and reduces the global oceanic source. The inversion implies greater seasonal and interannual oscillations of the natural sources and sink of CH3Cl compared to the a priori. The inversion also reflects the strong effects of the 2002/2003 globally widespread heat waves and droughts on global emissions from tropical plants, biomass burning and salt marshes, and on the soil sink.
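The flux inversion described above boils down to repeated Kalman measurement updates, where the state vector holds regional surface fluxes and the transport-model sensitivities act as the observation operator. The toy update below is a generic sketch with invented numbers, not the paper's actual MATCH/AGAGE configuration.

```python
import numpy as np

def kalman_update(x_prior, P_prior, y, H, R):
    """One Kalman measurement update. Here x holds regional surface
    fluxes, H the modeled sensitivities of station mole fractions to
    those fluxes, and R the observation error covariance."""
    S = H @ P_prior @ H.T + R              # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post

# Toy setup: two regional sources observed at three stations.
x0 = np.array([1000.0, 500.0])        # a priori fluxes (Gg/yr)
P0 = np.diag([400.0**2, 200.0**2])    # a priori flux uncertainty
H = np.array([[0.8, 0.2],
              [0.5, 0.5],
              [0.1, 0.9]])            # sensitivities (invented)
y = H @ np.array([1200.0, 450.0])     # synthetic "observations"
R = np.eye(3)                         # small observation error
x1, P1 = kalman_update(x0, P0, y, H, R)
print(np.round(x1))                   # recovers ~[1200, 450]
```

With loose priors and precise observations, the update pulls the a priori fluxes onto the values implied by the measurements, which is the mechanism by which the inversion revises the tropical-plant and oceanic source estimates.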

Background: During the early postnatal period, the impact of nutrition on DNA methylation has not been well studied in humans. The aim was to quantify the relationship between one-carbon metabolism nutrient intake during the first three years of life and global DNA methylation levels at four years. Design: Childhood dietary intake was assessed using infant feeding questionnaires, food frequency questionnaires, 4-day weighed food records and 24-h food records. The dietary records were used to estimate the intake of methionine, folate, vitamins B2, B6 and B12 and choline. The accumulative nutrient intake specific rank from three months to three years of age was used for analysis. Global DNA methylation (%5-methyl cytosines (%5-mC)) was measured in buccal cells at four years of age, using an enzyme-linked immunosorbent assay (ELISA) commercial kit. Linear regression models were used to quantify the statistical relationships. Results: Data were collected from 73 children recruited from the Women and their Children's Health (WATCH) study. No association was found between one-carbon metabolism nutrient intake and global DNA methylation levels (P > 0.05). Global DNA methylation levels in males were significantly higher than in females (median %5-mC: 1.82 vs. 1.03, males and females respectively; P < 0.05). Conclusion: No association was found between the intake of one-carbon metabolism nutrients during the early postnatal period and global DNA methylation levels at age four years. Higher global DNA methylation levels in males warrant further investigation. PMID:29495543

In Japan, electric power from nuclear energy accounts for about 20% of total power at present, and radiation is utilized extensively in fields such as industry, agriculture and medicine. The expenditures (budgets) estimated for fiscal year 1985 are about 343.8 billion yen, plus a contract authorization limit of about 146.7 billion yen. In connection with the expenditure estimates (of which a breakdown is given in tables), the nuclear energy research and development plans for fiscal year 1985 are presented: strengthening of nuclear energy safety, promotion of nuclear power generation, establishment of the nuclear fuel cycle, development of advanced power reactors, research on nuclear fusion, promotion of radiation utilization, strengthening of the research and development infrastructure, promotion of international cooperation, etc. (Mori, K.)

Perfluorinated alkylate substances (PFASs) are highly persistent and may cause immunotoxic effects. PFAS-associated attenuated antibody responses to childhood vaccines may be affected by PFAS exposures during infancy, where breastfeeding adds to PFAS exposures. Of 490 members of a Faroese birth...... cohort, 275 and 349 participated in clinical examinations and provided blood samples at ages 18 months and 5 years. PFAS concentrations were measured at birth and at the clinical examinations. Using information on duration of breastfeeding, serum-PFAS concentration profiles during infancy were estimated......, with decreases by up to about 20% for each two-fold higher exposure, while associations for serum concentrations at ages 18 months and 5 years were weaker. Modeling of serum-PFAS concentration showed levels for age 18 months that were similar to those measured. Concentrations estimated for ages 3 and 6 months...

Ocean FEST (Families Exploring Science Together) engages elementary school students and their parents and teachers in hands-on science. Through this evening program, we educate participants about ocean and earth science issues that are relevant to their local communities. In the process, we hope to inspire more underrepresented students, including Native Hawaiians, Pacific Islanders and girls, to pursue careers in the ocean and earth sciences. Hawaii and the Pacific Islands will be disproportionately affected by the impacts of global climate change, including rising sea levels, coastal erosion, coral reef degradation and ocean acidification. It is therefore critically important to train ocean and earth scientists within these communities. This two-hour program explores ocean properties and timely environmental topics through six hands-on science activities. Activities are designed so students can see how globally important issues (e.g., climate change and ocean acidification) have local effects (e.g., sea level rise, coastal erosion, coral bleaching) which are particularly relevant to island communities. The Ocean FEST program ends with a career component, drawing parallels between the program activities and the activities done by "real scientists" in their jobs. The take-home message is that we are all scientists, we do science every day, and we can choose to do this as a career. Ocean FEST has just completed its pilot year. During the 2009-2010 academic year, we conducted 20 events, including 16 formal events held at elementary schools and 4 informal outreach events. Evaluation data were collected at all formal events. Formative feedback from adult participants (parents, teachers, administrators and volunteers) was solicited through written questionnaires. Students were invited to respond to a survey of five questions both before and after the program to see if there were any changes in content knowledge and career attitudes. In our presentation, we will present our

Full Text Available Typhoid and paratyphoid fever remain important causes of morbidity worldwide. Accurate disease burden estimates are needed to guide policy decisions and prevention and control strategies.

Sunlit and shaded leaf separation proposed by Norman (1982) is an effective way to upscale from leaf to canopy in modeling vegetation photosynthesis. The Boreal Ecosystem Productivity Simulator (BEPS) makes use of this methodology, and has been shown to be reliable in modeling the gross primary productivity (GPP) derived from CO2 flux and tree ring measurements. In this study, we use BEPS to investigate the effect of canopy architecture on the global distribution of GPP. For this purpose, we use not only leaf area index (LAI) but also the first ever global map of the foliage clumping index derived from the multiangle satellite sensor POLDER at 6 km resolution. The clumping index, which characterizes the degree of the deviation of 3-dimensional leaf spatial distributions from the random case, is used to separate sunlit and shaded LAI values for a given LAI. Our model results show that global GPP in 2003 was 132 ± 22 Pg C. Relative to this baseline case, our results also show: (1) global GPP is overestimated by 12% when accurate LAI is available but clumping is ignored, and (2) global GPP is underestimated by 9% when the effective LAI is available and clumping is ignored. The clumping effects in both cases are statistically significant (p < 0.001). The effective LAI is often derived from remote sensing by inverting the measured canopy gap fraction to LAI without considering the clumping. Global GPP would therefore be generally underestimated when remotely sensed LAI (actually effective LAI by our definition) is used. This is due to the underestimation of the shaded LAI and therefore the contribution of shaded leaves to GPP. We found that shaded leaves contribute 50%, 38%, 37%, 39%, 26%, 29% and 21% to the total GPP for broadleaf evergreen forest, broadleaf deciduous forest, evergreen conifer forest, deciduous conifer forest, shrub, C4 vegetation, and other vegetation, respectively. The global average of this ratio is 35%.
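The clumping-driven sunlit/shaded separation can be written compactly. The formulation below is one common two-leaf form, LAI_sun = 2 cos(θ) (1 − exp(−0.5 Ω LAI / cos(θ))), with Ω the clumping index and θ the solar zenith angle; the abstract does not spell out BEPS's exact equations, so treat this as an illustrative assumption rather than the model's implementation.

```python
import math

def sunlit_shaded_lai(lai: float, clumping: float, sza_deg: float):
    """Separate total LAI into sunlit and shaded components using a
    common two-leaf formulation (an assumption; BEPS's exact equations
    are not given in the abstract)."""
    mu = math.cos(math.radians(sza_deg))
    lai_sun = 2.0 * mu * (1.0 - math.exp(-0.5 * clumping * lai / mu))
    return lai_sun, lai - lai_sun

# Clumped canopy (omega = 0.7) at a 30-degree solar zenith angle.
sun, shade = sunlit_shaded_lai(lai=4.0, clumping=0.7, sza_deg=30.0)
print(round(sun, 2), round(shade, 2))  # → 1.39 2.61
```

Setting the clumping index to 1 recovers the random-foliage case with more sunlit and less shaded LAI, which is exactly why ignoring clumping shifts the modeled shaded-leaf contribution to GPP.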

Biomass burning constitutes a major contribution to global emissions of carbon dioxide, carbon monoxide, methane, greenhouse gases and aerosols. Furthermore, biomass burning has an impact on health, transport, the environment and land use. Vegetation fires are certainly not recent phenomena and the impacts are not always negative. However, evidence suggests that fires are becoming more frequent and there is a large increase in the number of fires being set by humans for a variety of reasons. Knowledge of the interactions and feedbacks between biomass burning, climate and carbon cycling is needed to help the prediction of climate change scenarios. To obtain this knowledge, the scientific community requires, in the first instance, information on the spatial and temporal distribution of biomass burning at the global scale. This paper presents an inventory of burned areas at monthly time periods for the year 2000 at a resolution of 1 kilometer (km) and is available to the scientific community at no cost. The burned area products have been derived from a single source of satellite-derived images, the SPOT VEGETATION S1 1 km product, using algorithms developed and calibrated at regional scales by a network of partners. In this paper, estimates of burned area, number of burn scars and average size of the burn scar are described for each month of the year 2000. The information is reported at the country level. This paper makes a significant contribution to understanding the effect of biomass burning on atmospheric chemistry and the storage and cycling of carbon by constraining one of the main parameters used in the calculation of gas emissions.

Full Text Available Background: Collaborations for global surgery face many challenges in achieving fair and safe patient care and building sustainable capacity. The 2004 terrorist attack on a school in Beslan in North Ossetia in the Russian North Caucasus left many victims with complex otologic barotrauma. In response, we implemented a global surgery partnership between the Vladikavkaz Children's Hospital, international surgical teams, the North Ossetian Health Ministry, and civil society organizations. This study's aim was to describe the implementation and 5-year results of capacity building for complex surgery in a postconflict, mid-income setting. Design: We conducted an observational study at the Children's Hospital in Vladikavkaz in the autonomous Republic of North Ossetia-Alania, part of the Russian Federation. We assessed the outcomes of 15 initial patients who received otologic surgeries for complex barotrauma resulting from the Beslan terrorist attack and for other indications, and report the incidence of intra- and postoperative complications. Results: Patients were treated for trauma related to terrorism (53%) and for indications not related to violence (47%). None of the patients developed peri- or postoperative complications. Three patients (two of them victims of terrorism) who underwent repair of tympanic perforations presented with re-perforations. Four junior and senior surgeons were trained on-site and in Germany to perform and teach similar procedures autonomously. Conclusions: In mid-income, postconflict settings, complex surgery can be safely implemented and achieve patient outcomes comparable to global standards. Capacity building can build on existing resources, such as operating room management, nursing, and anesthesia services. In postconflict environments, substantial surgical burden is not directly attributable to conflict-related injury and disease, but to health systems weakened by conflicts. Extending training and safe surgical care to include

The global gravity wave (GW) potential energy (PE) per unit mass is derived from SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) temperature profiles over the past 14 years (2002-2015). Since the SABER data cover more than one solar cycle, multivariate linear regression is applied to calculate the trend (i.e., the linear trend from 2002 to 2015) of global GW PE and the responses of global GW PE to solar activity, to QBO (quasi-biennial oscillation) and to ENSO (El Niño-Southern Oscillation). We find a significant positive trend of GW PE at around 50°N during July from 2002 to 2015, in agreement with ground-based radar observations at a similar latitude but from 1990 to 2010. Both the monthly and the deseasonalized trends of GW PE are significant near 50°S. Specifically, the deseasonalized trend of GW PE has a positive peak of 12-15% per decade at 40°S-50°S and below 60 km, which suggests that eddy diffusion is increasing in some places. A significant positive trend of GW PE near 50°S could be due to the strengthening of the polar stratospheric jets, as documented from Modern Era Retrospective-analysis for Research and Applications wind data. The response of GW PE to solar activity is negative in the lower and middle latitudes. The response of GW PE to QBO (as indicated by 30 hPa zonal winds over the equator) is negative in the tropical upper stratosphere and extends to higher latitudes at higher altitudes. The response of GW PE to ENSO (as indicated by the Multivariate ENSO Index) is positive in the tropical upper stratosphere.
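The multivariate regression described above can be sketched as follows. This is a minimal illustration on synthetic monthly data for 2002-2015: the solar, QBO, and ENSO proxy series below are random stand-ins, not the real F10.7, 30 hPa wind, or Multivariate ENSO Index series, and the "true" coefficients are arbitrary.

```python
import numpy as np

# Synthetic 14-year monthly series; proxies are random stand-ins.
rng = np.random.default_rng(0)
n_months = 14 * 12
t = np.arange(n_months) / 12.0          # time in years since 2002
solar = rng.standard_normal(n_months)   # stand-in solar-activity index
qbo = rng.standard_normal(n_months)     # stand-in 30 hPa equatorial wind
enso = rng.standard_normal(n_months)    # stand-in Multivariate ENSO Index

# Arbitrary "true" values: intercept, linear trend, three responses.
true = np.array([10.0, 0.12, -0.3, -0.1, 0.2])
X = np.column_stack([np.ones(n_months), t, solar, qbo, enso])
pe = X @ true + 0.05 * rng.standard_normal(n_months)  # synthetic GW PE

# Ordinary least squares recovers the trend and the responses jointly.
coef, *_ = np.linalg.lstsq(X, pe, rcond=None)
trend_pct_per_decade = 100 * 10 * coef[1] / coef[0]
print(coef.round(2), round(trend_pct_per_decade, 1))
```

Fitting all regressors jointly, rather than one at a time, is what lets the trend be separated from the solar-cycle, QBO, and ENSO signals.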

The solar radiation received at the surface of the earth, apart from its relevance to several daily human activities, plays an important role in the growth and development of plants. The aim of the current work was to develop and gauge an estimation model for the evaluation of the global solar radiation flux density as a function of the solar energy potential at the soil surface. Radiometric data were collected at Ponta Grossa, PR, Brazil (latitude 25°13' S, longitude 50°03' W, altitude 880 m). Estimated values of solar energy potential obtained as a function of only one measurement taken at solar noon were compared with those measured by a Robitzsch bimetallic actinograph, for days that presented insolation ratios higher than 0.85. This data set was submitted to a simple linear regression analysis, and a good fit between observed and calculated values was obtained. For the estimation of the coefficients a and b of Angström's equation, the method based on the solar energy potential at the soil surface was used for the site under study. The methodology was efficient for assessing the coefficients, aiming at the determination of the global solar radiation flux density, with speed and simplicity; it was also found that the criterion for the estimation of the solar energy potential is equivalent to that of the classical Angström methodology. Knowledge of the available solar energy potential and global solar radiation flux density is of great importance for the estimation of the maximum atmospheric evaporative demand and of water consumption by irrigated crops, and also for building solar engineering equipment, such as driers, heaters, solar ovens, refrigerators, etc.
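The Angström coefficients mentioned above are obtained by a simple linear regression of the clearness ratio H/H0 on the relative sunshine duration n/N. The sketch below uses synthetic data, with the classical a = 0.25, b = 0.50 serving only as illustrative "true" values, not the coefficients fitted in the study.

```python
import numpy as np

# Angström's equation:  H/H0 = a + b * (n/N)
# Synthetic observations around illustrative coefficients.
rng = np.random.default_rng(1)
n_over_N = rng.uniform(0.2, 1.0, 60)       # relative sunshine duration
a_true, b_true = 0.25, 0.50                # illustrative values only
H_over_H0 = a_true + b_true * n_over_N + 0.01 * rng.standard_normal(60)

# Degree-1 polynomial fit returns (slope, intercept) = (b, a).
b, a = np.polyfit(n_over_N, H_over_H0, 1)
print(f"a = {a:.3f}, b = {b:.3f}")
```

With a and b calibrated for a site, daily global radiation can then be estimated from sunshine duration alone.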

How were cities distributed globally in the past? How many people lived in these cities? How did cities influence their local and regional environments? In order to understand the current era of urbanization, we must understand long-term historical urbanization trends and patterns. However, to date there is no comprehensive record of spatially explicit, historic, city-level population data at the global scale. Here, we developed the first spatially explicit dataset of urban settlements from 3700 BC to AD 2000, by digitizing, transcribing, and geocoding historical, archaeological, and census-based urban population data previously published in tabular form by Chandler and Modelski. The dataset creation process also required data cleaning and harmonization procedures to make the data internally consistent. Additionally, we created a reliability ranking for each geocoded location to assess the geographic uncertainty of each data point. The dataset provides the first spatially explicit archive of the location and size of urban populations over the last 6,000 years and can contribute to an improved understanding of contemporary and historical urbanization trends.

Global observations of aerosol properties from space are critical for understanding climate change and air quality applications. The Ozone Monitoring Instrument (OMI) onboard the EOS-Aura satellite provides information on aerosol optical properties by making use of the large sensitivity to aerosol absorption and dark surface albedo in the UV spectral region. These unique features enable us to retrieve both aerosol extinction optical depth (AOD) and single scattering albedo (SSA) successfully from radiance measurements at 354 and 388 nm by the OMI near-UV aerosol algorithm (OMAERUV). Recent improvements to algorithms in conjunction with the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) and Atmospheric Infrared Sounder (AIRS) carbon monoxide data also significantly reduce uncertainties due to aerosol layer heights and types in retrieved products. We present validation results of OMI AOD against space- and time-collocated Aerosol Robotic Network (AERONET) measured AOD values over multiple stations representing major aerosol episodes and regimes. We also compare the OMI SSA against the inversion made by AERONET as well as an independent network of ground-based radiometers called SKYNET in Japan, China, South-East Asia, India, and Europe. The outcome of the evaluation analysis indicates that in spite of the "row anomaly" problem, affecting the sensor since mid-2007, the long-term aerosol record shows remarkable sensor stability. The OMAERUV 10-year global aerosol record is publicly available at the NASA data service center web site (http://disc.sci.gsfc.nasa.gov/Aura/data-holdings/OMI/omaeruv_v003.shtml).

towing tank experiments using a container ship scale model. The estimates for both regular and irregular waves confirm the results. Finally, the estimator is applied to full-scale data gathered from a container ship operating in the Atlantic Ocean during a storm. Again the theoretical results...

Previous research has examined heuristics--simplified decision-making rules-of-thumb--for geospatial reasoning. This study examined, at two locations, the influence of beliefs about local coastline orientation on estimated directions to local and distant places; estimates were made either immediately or after a fifteen-second delay. This study goes beyond…

In the modern era of globalization, economic activities expand with the passage of time. This expansion may increase demand for energy in both developing and developed countries. Therefore, this study assesses the impact of financial development on energy consumption, incorporating the role of globalization, in the Next-11 countries. A group of panel estimation techniques is used to analyze the panel data and time series data for the period 1990-2014. The empirical results of the study suggest that financial development stimulates energy consumption. Globalization also increases demand for energy, although the single-country analysis suggests that the effect of globalization on energy demand is heterogeneous among N-11 countries. Furthermore, the feedback hypothesis is confirmed between financial development and energy consumption, and bidirectional causality is found between economic growth and energy consumption. The findings call for the attention of policymakers in emerging countries to develop strategies that reduce the consequences of energy consumption, by controlling resource transfers through globalization to the host country and by adopting energy conservation policies.

First, let us describe the context. Approximately 80% of our students in Medellin major in engineering. In the National University system, students can freely choose twenty percent of their credits. Their decisions are made for various reasons; as far as we know, the amount of work and the expected grades are factors, besides interest in the topics. Statistics show an even distribution among complementary professional, cross-disciplinary, and general interest courses. Plan B took its name from Lester Brown's book, which was the original inspiration and text. The program expanded with more in-depth treatment of climate and climate change science, and of the water and energy crises, because they are close to my research area. But we consider other global change issues as well, including recycling, loss of biodiversity, food crises, the economics of climate change, and demographic and social issues. We developed a textbook whose title would translate as "Where is the Globe heading?", a reference to a common saying at Christmas time about the candle balloons popular at that time of year, which children and teenagers try to catch. The expression reflects the need for predictions and calls for action, but also acknowledges that chance is a factor to consider. I believe it summarizes the content of the course well. The class meets in a large auditorium with 250 seats. We moved from the usual room size of about 50 because of the large demand during registration. This forced us to adjust the methodology, but our evaluation is that such a large audience is worthwhile. Students' feedback at the end of the semester confirms this, with very good ratings and general comments. Besides frank diagnoses of the problems based on data and science, we always make an effort to present solutions. For instance, there is ample consideration of renewable energy technologies. Globalization is also a theme of the course; there are local actions but

to population centres, electrical transmission grids, terrain types, and protected land areas are important parts of the resource assessment downstream of the generation of wind climate statistics. Related to these issues of integration are the temporal characteristics and spatial correlation of the wind resources. These aspects will also be addressed by the Global Wind Atlas. The Global Wind Atlas, through a transparent methodology, will provide a unified, high resolution, and public domain dataset of wind energy resources for the whole world. The wind atlas data will be the most appropriate wind resource...

Annual resolution reconstructions of alpine temperatures are rare, particularly for the Southern Hemisphere, and no snow cover reconstructions exist. These records are essential to place the impact of anthropogenic global warming in context against major natural climate events such as the Roman Warm Period (RWP), Medieval Climate Anomaly (MCA) and Little Ice Age (LIA). Here we show, for a marginal alpine region of Australia, using a carbon isotope speleothem reconstruction, that warming over the past five decades has experienced a magnitude of temperature change and snow cover decline equivalent to the RWP and MCA. The current rate of warming is unmatched for the past 2000 years and seasonal snow cover is at a minimum. On scales of several decades, mean maximum temperatures have undergone considerable change, of approximately ±0.8 °C, highlighting local-scale susceptibility to rapid temperature change, evidence of which is often masked in regional- to hemisphere-scale temperature reconstructions.

Dental age estimation (AE) tests are routinely done on living and deceased persons. There is anecdotal evidence suggesting an increase in age estimations due to the refugee crisis. Our aim is to determine the reasons and methods for performing dental AE tests in both living and deceased individuals. Global trends in AE over the past 10 years were also investigated. A database of all forensic laboratories was obtained and an electronic questionnaire was sent to all of them. The questionnaire was self-developed and included questions on the reasons for performing AE tests, the preferred methods used in living and deceased individuals, and the people/organizations who requested these AE tests. The number of tests performed annually varied between 0 and 500 and the majority were on asylum seekers, refugees, and for adoption cases. Most units used multiple techniques to determine the age among the living, but seldom used more than three techniques for the deceased. The majority of tests were requested by coroners and the legal fraternity. There has been an increase in the number of dental AEs carried out and this has been mostly due to asylum seekers and refugees. The most common techniques for the living were variations of Demirjian's technique while country specific techniques were used for the deceased.

Education and training in human rights has been set as a priority by the United Nations. Health and human rights are closely related. Training professionals from various backgrounds in human rights might ultimately contribute to improve the health of individuals and communities. We present the 5 years' experience with a 3-week residential Global Health and Human Rights Course developed at the University of Geneva and implemented with the support/participation of international organizations (IOs) and non-governmental organizations active in the health and human rights sector. Over the years, roughly 150 students from 43 nationalities, with many different educational backgrounds, attended the course. The male/female ratio was 1/5. The adopted educational approach was multifold and comprised lectures from academics and experts with field experience, group work, individual case studies, journal clubs, and site visits. Evaluation data show that site visits at IOs were highly appreciated as well as networking opportunities among students, with academics and experts with field experience. The variety of topics discussed was, at times, "too much"; yet, it allowed students to measure the extent of the challenges the field is facing. The adopted active learning approach facilitated the exchange of experiences among students and allowed them to get acquainted with different cultural sensitivities. The Global Health and Human Rights Summer-School of the University of Geneva allowed its participants, coming from all over the world, to identify challenges of the interlinked fields of health and human rights, reflect upon their underlying causes, and imagine possible solutions. Sharing our experience will hopefully help passionate educators around the world to develop similar programs.

Hydrochlorofluorocarbons (HCFCs) are ozone depleting substances and potent greenhouse gases that are controlled under the Montreal Protocol. However, the majority of the 274 HCFCs included in Annex C of the protocol do not have reported global warming potentials (GWPs) which are used to guide the phaseout of HCFCs and the future phase down of hydrofluorocarbons (HFCs). In this study, GWPs for all C1-C3 HCFCs included in Annex C are reported based on estimated atmospheric lifetimes and theoretical methods used to calculate infrared absorption spectra. Atmospheric lifetimes were estimated from a structure activity relationship (SAR) for OH radical reactivity and estimated O(1D) reactivity and UV photolysis loss processes. The C1-C3 HCFCs display a wide range of lifetimes (0.3 to 62 years) and GWPs (5 to 5330, 100-year time horizon) dependent on their molecular structure and the H-atom content of the individual HCFC. The results from this study provide estimated policy-relevant GWP metrics for the HCFCs included in the Montreal Protocol in the absence of experimentally derived metrics.
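The GWP metric described above is conventionally computed as the ratio of the absolute global warming potential (AGWP) of the gas to that of CO2 over the same horizon, with the non-CO2 AGWP integrated for single-exponential atmospheric decay. The sketch below is a hedged illustration of that standard formula, not the study's method; the HCFC-22 lifetime and radiative efficiency and the AR5 CO2 AGWP value are assumed illustrative inputs.

```python
import math

M_AIR = 28.97            # g/mol, mean molar mass of dry air
M_ATM = 5.135e18         # kg, total mass of the atmosphere
AGWP_CO2_100 = 9.17e-14  # W m-2 yr kg-1, 100-yr CO2 AGWP (AR5 value)

def gwp(lifetime_yr, re_w_m2_ppb, molar_mass_g_mol, horizon_yr=100.0):
    """GWP over `horizon_yr` for a gas with exponential atmospheric decay."""
    # Convert radiative efficiency from per-ppb to per-kg of emitted gas.
    re_per_kg = re_w_m2_ppb * (M_AIR / molar_mass_g_mol) * 1e9 / M_ATM
    # AGWP = integral of re * exp(-t/tau) from 0 to H.
    agwp = re_per_kg * lifetime_yr * (1.0 - math.exp(-horizon_yr / lifetime_yr))
    return agwp / AGWP_CO2_100

# HCFC-22: lifetime ~11.9 yr, RE ~0.21 W m-2 ppb-1, M = 86.47 g/mol
print(round(gwp(11.9, 0.21, 86.47)))
```

The short lifetimes of many HCFCs make the (1 - exp(-H/tau)) term close to 1 at a 100-year horizon, so their GWPs are controlled mainly by radiative efficiency and lifetime.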

Information on the global risk factors of child mortality is crucial to guide global efforts to improve survival. Corruption has previously been shown to significantly impact child mortality; however, no recent quantification of its current impact is available. The impact of corruption was assessed through crude Pearson's correlation and univariate and multivariate linear models coupling national under-five mortality rates in 2008 to the national "perceived level of corruption" (Corruption Perception Index, CPI) and a large set of adjustment variables measured during the same period. The final multivariable model (adjusted R(2) = 0.89) included as significant variables the percentage of people with improved sanitation and the Corruption Perception Index (p-values not legible in the source). A one-point decrease in the CPI (i.e., increased perceived corruption) was associated with an increase in the log of the national under-five mortality rate of 0.0644. According to this result, it could be roughly hypothesized that more than 140,000 annual child deaths could be indirectly attributed to corruption. The global response to child mortality must involve a necessary increase in funds available to develop water and sanitation access and to purchase new methods for prevention, management, and treatment of the major diseases drawing the global pattern of child deaths. However, without regard to the anti-corruption mechanisms needed to ensure their proper use, it will also provide further opportunity for corruption. Policies and interventions supported by governments and donors must integrate initiatives that recognise how they are inter-related.
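The reported coefficient of 0.0644 on the log mortality rate can be translated into a multiplicative effect on the rate itself. The sketch below assumes the model uses the natural logarithm (the source does not say which base), so the percentage is illustrative of the interpretation, not an authoritative restatement of the result.

```python
import math

# A log-linear coefficient beta means each one-point CPI change multiplies
# the mortality rate by exp(beta), assuming a natural-log model.
beta = 0.0644
pct_change = (math.exp(beta) - 1) * 100
print(f"~{pct_change:.1f}% higher under-five mortality per one-point CPI drop")
```

For small coefficients, exp(beta) - 1 is close to beta, which is why log-linear coefficients are often read directly as approximate percentage effects.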

Illness and death from diseases caused by contaminated food are a constant threat to public health and a significant impediment to socio-economic development worldwide. To measure the global and regional burden of foodborne disease (FBD), the World Health Organization (WHO) established the Foodborne

Future global copper demand is expected to keep rising due to copper's indispensable role in modern technologies. Unfortunately, increasing copper extraction and decreasing ore grades intensify energy use and generate higher environmental impact. A potential solution would be reaching a circular

Water storage is an important way to cope with temporal variation in water supply and demand. The storage capacity and the lifetime of water storage reservoirs can be significantly reduced by the inflow of sediments. A global, spatially explicit assessment of reservoir storage loss in conjunction with

The increasing absolute number of paediatric CT scans raises concerns about safety and efficacy and about the effects of repeated diagnostic ionising radiation. To demonstrate a method to evaluate the lifetime attributable risk of cancer incidence/mortality due to a single low-dose helical chest CT in a two-year patient cohort. A two-year cohort of 522 paediatric helical chest CT scans acquired using a dedicated low-dose protocol was analysed retrospectively. Patient-specific estimations of radiation doses were modelled using three different mathematical phantoms. Per-organ attributable cancer risk was then estimated using epidemiological models. Additional comparison was provided for naturally occurring risks. Total lifetime attributable risk of cancer incidence remains low for all age and sex categories, being highest in female neonates (0.34%). Summation of all cancer sites analysed raised the relative lifetime attributable risk of organ cancer incidence up to 3.6% in female neonates and 2.1% in male neonates. Using dedicated scan protocols, total lifetime attributable risk of cancer incidence and mortality for chest CT is estimated low for paediatric chest CT, being highest for female neonates. (orig.)

Despite substantial interest in urban agriculture, little is known about the aggregate benefits conferred by natural capital for growing food in cities. Here we perform a scenario-based analysis to quantify ecosystem services from adoption of urban agriculture at varying intensity. To drive the scenarios, we created global-scale estimates of vacant land, rooftop and building surface area, at one kilometer resolution, from remotely sensed and modeled geospatial data. We used national scale agricultural reports, climate and other geospatial data at global scale to estimate agricultural production and economic returns, storm-water avoidance, energy savings from avoided heating and cooling costs, and ecosystem services provided by nitrogen sequestration, pollination and biocontrol of pests. The results indicate that vacant lands, followed by rooftops, represent the largest opportunities for natural capital put to agricultural use in urban areas. Ecosystem services from putting such spaces to productive use are dominated by agricultural returns, but energy savings conferred by insulative characteristics of growth substrate also provide economic incentives. Storm water avoidance was estimated to be substantial, but no economic value was estimated. Relatively low economic returns were estimated from the other ecosystem services examined. In aggregate, approximately $10-100 billion in economic incentives, before costs, were estimated. The results showed that relatively developed, high-income countries stand the most to gain from urban agricultural adoption due to the unique combination of climate, crop mixture and crop prices. While the results indicate that urban agriculture is not a panacea for urban food security issues, there is potential to simultaneously ameliorate multiple issues around food, energy and water in urbanized areas.

Empirical and theoretical models of sub-seafloor organic matter transformation, degradation and methanogenesis require estimates of initial seafloor total organic carbon (TOC). This subsurface methane, under the appropriate geophysical and geochemical conditions may manifest as methane hydrate deposits. Despite the importance of seafloor TOC, actual observations of TOC in the world's oceans are sparse and large regions of the seafloor yet remain unmeasured. To provide an estimate in areas where observations are limited or non-existent, we have implemented interpolation techniques that rely on existing data sets. Recent geospatial analyses have provided accurate accounts of global geophysical and geochemical properties (e.g. crustal heat flow, seafloor biomass, porosity) through machine learning interpolation techniques. These techniques find correlations between the desired quantity (in this case TOC) and other quantities (predictors, e.g. bathymetry, distance from coast, etc.) that are more widely known. Predictions (with uncertainties) of seafloor TOC in regions lacking direct observations are made based on the correlations. Global distribution of seafloor TOC at 1 x 1 arc-degree resolution was estimated from a dataset of seafloor TOC compiled by Seiter et al. [2004] and a non-parametric (i.e. data-driven) machine learning algorithm, specifically k-nearest neighbors (KNN). Built-in predictor selection and a ten-fold validation technique generated statistically optimal estimates of seafloor TOC and uncertainties. In addition, inexperience was estimated. Inexperience is effectively the distance in parameter space to the single nearest neighbor, and it indicates geographic locations where future data collection would most benefit prediction accuracy. These improved geospatial estimates of TOC in data deficient areas will provide new constraints on methane production and subsequent methane hydrate accumulation.
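The k-nearest-neighbors interpolation described above can be sketched in a few lines: predictions at unmeasured locations are averages of the k closest observations in predictor space. The sketch below is a minimal stand-in, not the study's implementation; the two predictors and the synthetic "TOC" signal are hypothetical.

```python
import numpy as np

# Synthetic training set: 500 sites with two scaled predictors
# (e.g. hypothetical water depth and distance from coast).
rng = np.random.default_rng(2)
X_obs = rng.uniform(0, 1, (500, 2))
toc_obs = 1.0 + X_obs[:, 0] - 0.5 * X_obs[:, 1]   # synthetic "TOC" signal

def knn_predict(X_train, y_train, X_query, k=5):
    """Average of the k nearest training targets (Euclidean distance)."""
    d = np.linalg.norm(X_train[None, :, :] - X_query[:, None, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

X_query = np.array([[0.5, 0.5]])
pred = knn_predict(X_obs, toc_obs, X_query)
print(pred)   # expect ~1.25 for this synthetic signal
```

The "inexperience" measure mentioned above corresponds to the distance to the single nearest neighbor: where that distance is large, the prediction rests on little nearby evidence.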

Assessing the mortality impact of the 2009 influenza A H1N1 virus (H1N1pdm09) is essential for optimizing public health responses to future pandemics. The World Health Organization reported 18,631 laboratory-confirmed pandemic deaths, but the total pandemic mortality burden was substantially higher. We estimated the 2009 pandemic mortality burden through statistical modeling of mortality data from multiple countries. We obtained weekly virology and underlying cause-of-death mortality time series for 2005-2009 for 20 countries covering ∼35% of the world population. We applied a multivariate linear regression model to estimate pandemic respiratory mortality in each collaborating country. We then used these results plus ten country indicators in a multiple imputation model to project the mortality burden in all world countries. Between 123,000 and 203,000 pandemic respiratory deaths were estimated globally for the last 9 mo of 2009. The majority (62%-85%) were attributed to persons under 65 y of age. We observed a striking regional heterogeneity, with almost 20-fold higher mortality in some countries in the Americas than in Europe. The model attributed 148,000-249,000 respiratory deaths to influenza in an average pre-pandemic season, with only 19% in persons under 65 y of age. Limitations include under-representation of low-income countries among the single-country estimates and an inability to study subsequent pandemic waves (2010-2012). We estimate that 2009 global pandemic respiratory mortality was ∼10-fold higher than the World Health Organization's laboratory-confirmed mortality count. Although the pandemic mortality estimate was similar in magnitude to that of seasonal influenza, a marked shift toward mortality among persons under 65 y of age was observed, with far higher burden in the Americas than in Europe. A collaborative network to collect and analyze mortality and hospitalization surveillance data is needed to rapidly establish the severity of future pandemics.

Since 2008, the World Health Organization (WHO) has coordinated the Global Rotavirus Surveillance Network, a network of sentinel surveillance hospitals and laboratories that report clinical features and rotavirus testing data for children aged under 5 years to ministries of health (MoHs) and WHO; sites had to meet reporting and testing inclusion criteria for data analysis. Of the 37 countries with sites meeting inclusion criteria, 13 (35%) had introduced rotavirus vaccine nationwide. All 79 sites included in the analysis were meeting the 2008 network objectives of documenting presence of disease and describing disease epidemiology, and all countries were using the rotavirus surveillance data for vaccine introduction decisions, disease burden estimates, and advocacy; countries were in the process of assessing the use of this surveillance platform for other vaccine-preventable diseases. However, the review also indicated that the network would benefit from enhanced management, standardized data formats, linkage of clinical data with laboratory data, and additional resources to support network functions. In November 2013, WHO's Strategic Advisory Group of Experts on Immunization (SAGE) endorsed the findings and recommendations made by the review team and noted potential opportunities for using the network as a platform for other vaccine-preventable disease surveillance. WHO will work to implement the recommendations to improve the network's functions and to provide higher quality surveillance data for use in decisions related to vaccine introduction and vaccination program sustainability.

Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field.
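The h-index mentioned above has a simple definition: the largest h such that h publications each have at least h citations. The sketch below illustrates the computation; the citation counts are made up for the example.

```python
def h_index(citations):
    """Largest h such that h items have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank     # the top `rank` items all have >= rank citations
        else:
            break
    return h

# Made-up citation counts for five publications.
print(h_index([10, 8, 5, 4, 3]))   # → 4
```

Applied per country, as in the study, the metric rewards a body of work that is both sizable and consistently cited, rather than a few highly cited outliers.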

The year 2013–2014 has been designated the Global Year Against Orofacial Pain by the International Association for the Study of Pain. Accordingly, a multidisciplinary Canadian and international group of clinical, research and knowledge-transfer experts attended a workshop in Montreal, Quebec. The workshop had two aims: to identify new pathways for innovative diagnosis and management of chronic orofacial pain states; and to identify opportunities for further collaborative orofacial pain research and education in Canada. Three topics related to chronic orofacial pain were explored: biomarkers and pain signatures for chronic orofacial pain; misuse of analgesic and opioid pain medications for managing chronic orofacial pain; and complementary alternative medicine, topical agents and the role of stress in chronic orofacial pain. It was determined that further research is needed to: identify biomarkers of chronic orofacial post-traumatic neuropathic pain, with a focus on psychosocial, physiological and chemical-genetic factors; validate the short- and long-term safety (i.e., no harm to health, and avoidance of misuse and addiction) of opioid use for two distinct conditions (acute and chronic orofacial pain, respectively); and promote the use of topical medications as an alternative treatment in dentistry, and further document the benefits and safety of complementary and alternative medicine, including stress management, in dentistry. It was proposed that burning mouth syndrome, a painful condition that is not uncommon and affects mainly postmenopausal women, should receive particular attention.