Electron Beam Melting (EBM) is an increasingly used Additive Manufacturing (AM) technique employed by many industrial sectors, including the medical device and aerospace industries. In-process EBM monitoring for quality assurance purposes has been a popular research area. Electronic imaging has recently been investigated as one of the in-process EBM data collection methods, alongside thermal/optical imaging techniques. Although certain capabilities of an electronic imaging system have been investigated, experiments are yet to be carried out to benchmark one of the most important features of any imaging system: spatial resolution. This article addresses this knowledge gap by: (1) proposing an indicator for the estimation of spatial resolution which includes the Backscattered Electron (BSE) information depth, (2) estimating the achievable spatial resolution when electronic imaging is carried out inside an Arcam A1 EBM machine, and (3) presenting an experimental method for conducting edge resolution evaluation with the EBM machine. Analyses of the experimental results indicated that the spatial resolution was of the order of 0.3 mm to 0.4 mm when electronic imaging was carried out at room temperature. It is believed that by disseminating an analysis and experimental method to estimate and quantify spatial resolution, this study has contributed to the ongoing quality assessment research in the field of in-process monitoring of the EBM process.
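The abstract does not reproduce the authors' exact resolution indicator, but one common way to quantify edge resolution is the 25-75% rise distance of an edge-spread profile. The sketch below illustrates that calculation on a synthetic profile standing in for a BSE line scan across a sharp edge; the function name, pixel size and data are illustrative assumptions, not the paper's method.

```python
# Hedged sketch: estimate edge resolution from a 1-D edge-spread profile
# by measuring the 25%-75% rise distance (a common, simple criterion).
# The profile below is synthetic; real data would come from a BSE line scan.

def edge_resolution(profile, pixel_mm):
    """Return the 25-75% edge rise distance in mm for a monotonic edge profile."""
    lo, hi = min(profile), max(profile)
    t25 = lo + 0.25 * (hi - lo)
    t75 = lo + 0.75 * (hi - lo)

    def crossing(threshold):
        # linearly interpolate the first crossing of the threshold
        for i in range(1, len(profile)):
            a, b = profile[i - 1], profile[i]
            if (a - threshold) * (b - threshold) <= 0 and a != b:
                return (i - 1) + (threshold - a) / (b - a)
        raise ValueError("threshold not crossed")

    return (crossing(t75) - crossing(t25)) * pixel_mm

# Synthetic edge: 0.1 mm pixels, smooth ramp between two flat regions
profile = [0, 0, 0, 5, 20, 50, 80, 95, 100, 100, 100]
print(round(edge_resolution(profile, 0.1), 3))
```

A sharper edge response (shorter rise distance) corresponds to finer spatial resolution; in practice the profile would be averaged over many scan lines before this measurement.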

Electron Beam Melting (EBM) is an increasingly used Additive Manufacturing (AM) technique employed by many industrial sectors, including the medical device and aerospace industries. The application of this technology is, however, challenged by the lack of a process monitoring and control system that underpins process repeatability and part quality reproducibility. An electronic imaging system prototype has been developed to serve as an EBM monitoring technique, the capabilities of which have been verified at room temperature and at 320±10°C. Nevertheless, in order to fully assess the applicability of this technique, the image quality needs to be investigated at a range of elevated temperatures to fully understand the influence of thermal noise due to heat. In this paper, electronic imaging pilot trials at elevated temperatures above room temperature were carried out. The image quality measure Q of the digital electron images was evaluated, and the influence of temperature was investigated. In this study, raw electronic images generated at higher temperatures had greater Q values, i.e. better global image quality. It has been demonstrated that, over the temperature range studied, the influence of temperature on electronic image quality did not adversely affect the visual clarity of image features. It is envisaged that the prototype has significant potential to contribute to in-process EBM monitoring in many manufacturing sectors.

The 2030 Agenda for Sustainable Development and its SDGs are high on the agenda for most countries of the world. In its publication of the SDGs, the UN has provided the goals and target descriptions that, if implemented at a country level, would lead towards a sustainable future. The IAEG (Inter-Agency Expert Group on the SDGs) was tasked with disseminating indicators and methods to countries that can be used to gather data describing the global progress towards sustainability. However, the 2030 Agenda leaves it to countries to adopt the targets, with each government setting its own national targets guided by the global level of ambition but taking into account national circumstances. At present, guidance on how to go about this is scant, but it is clear that the responsibility for implementation lies with countries and that it is actions at a country level that will determine the success of the SDGs. SDG reporting by countries takes two forms: 1) global reporting using prescribed indicator methods and data; 2) National Voluntary Reviews, in which a country reports on its own progress in more detail but is also able to present data that are more appropriate for the country. For the latter, countries need to be able to adapt the global indicators to fit national priorities and context; thus the global description of an indicator could be reduced to describe only what is relevant to the country. Countries may also, for the National Voluntary Review, use indicators that are unique to the country but nevertheless contribute to measurement of progress towards the global SDG target. Importantly, for those indicators that relate to the security of natural resources (e.g. water), there are no prescribed numerical targets, standards or benchmarks. Rather, countries will need to set their own benchmarks or standards against which performance can be evaluated.
This paper presents a procedure that would enable a country to describe national targets with associated benchmarks that are appropriate for the country. The procedure concentrates on those SDG targets that are natural resource-security focussed, e.g. extent of water-related ecosystems (6.6) and desertification (15.3), because the selection of indicator methods and benchmarks is based on the location of natural resources, their use and present state, and how they fit into national strategies.

The development of new target stations for radioisotope production based on a dedicated 70 MeV commercial cyclotron is described. Currently known as the South African Isotope Facility (SAIF), this initiative will free the existing separated-sector cyclotron (SSC) at iThemba LABS (near Cape Town) to mainly pursue research activities in nuclear physics and radiobiology. It is foreseen that the completed SAIF facility will realize a three-fold increase in radioisotope production capacity compared to the current programme based on the SSC.

We propose that the high-energy Cosmic Ray particles around the spectral turn-down commonly called the 'knee', and up to the upturn commonly called the 'ankle', mostly come from Blue Supergiant star explosions. At the upturn, i.e. the 'ankle', Cosmic Rays probably switch to another source class, most likely extragalactic sources. To show this we recently compiled a set of Radio Supernova data listing the magnetic field, shock speed and radius scale (Biermann et al. 2018). Using particle acceleration theory at shocks, those numbers can be transformed into characteristic 'knee' and 'ankle' energies. Without adjusting any free parameters, both of these observed energies are directly indicated by the supernova data. We now proceed to the next step in the argument, and use the Supernova Remnant data of the starburst galaxy M82. Assuming that these are Blue Supergiant star explosions, the shock will race to their outer edge with a magnetic field that follows $B(r) \, r \, \sim \, const$. We argue that the shock runs through the entire magnetic plasma wind region at full speed all the way out to the wind-shell, which is of order parsec scale. The speed is observed to be $\sim \, 0.1 \, c$ at a radius of about $10^{16} \, {\rm cm}$ in the plasma wind. This demonstrates how Blue Supergiant star explosions can provide the Cosmic Ray particles across the 'knee' and up to the 'ankle' energy range. Data from the CREAM (Cosmic Ray Energetics and Mass) mission will test this cosmic ray concept, which is reasonably well grounded in two independent radio supernova data sets. The next step in developing our understanding is to obtain accurate Cosmic Ray data near the 'knee', and to use unstable isotopes of Cosmic Ray nuclei at high energy to probe the "piston" driving the explosion. We plan to combine these data with the physics of the nascent black hole that is probably forming in each of these stars to learn more.
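The transformation from shock parameters to characteristic energies is not spelled out in the abstract; a standard scaling of this kind (a Hillas-type bound for shock acceleration, stated here as background rather than the authors' exact derivation) is:

```latex
% Maximum energy attainable for a nucleus of charge Z at a shock of
% velocity \beta_{sh} c, magnetic field B and radius r (Hillas-type bound):
E_{\max} \,\lesssim\, Z \, e \, \beta_{sh} \, B(r) \, r
% With the wind scaling B(r)\,r \,\simeq\, \mathrm{const}, the bound is
% independent of radius, so the shock can maintain the same characteristic
% energy all the way out to the wind shell.
```

This is why the observed $B(r) \, r \, \sim \, const$ profile matters for the argument: the attainable energy does not degrade as the shock moves outward through the wind.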

Adenine nucleotide (AN) second messengers such as 3’,5’-cyclic adenosine monophosphate (cAMP) are central elements of intracellular signaling, but many details of the underlying processes remain elusive. Like all nucleotides, cyclic nucleotide monophosphates (cNMPs) are net-negatively charged at physiological pH, which limits their applicability in cell-based settings. Thus, many cellular assays rely on sophisticated techniques like microinjection or electroporation. This setup is not feasible for medium- to high-throughput formats, and the mechanical stress that cells are exposed to raises the probability of interfering artefacts or false positives. Here, we present a short and flexible chemical route yielding membrane-permeable, bio-reversibly masked cNMPs, for which we employed the octanoyloxybenzyl (OB) group. We further show hydrolysis studies on chemical stability and enzymatic activation, and present results of real-time assays in which we used cAMP and Ca2+ live-cell imaging to demonstrate the high permeability and prompt intracellular conversion of selected masked cNMPs. Consequently, our novel OB-masked cNMPs constitute valuable precursor tools for non-invasive studies of intracellular signaling.

Accurate virus detection, strain discrimination, and source attribution of contaminated food items remain a persistent challenge because of the high mutation rates anticipated in foodborne RNA viruses, such as Hepatitis A virus (HAV). This has led to predictions of the existence of more than one sequence variant between hosts (inter-host) or within an individual host (intra-host). However, there have been no reports of intra-host variants from a single infected individual, and little is known about the accuracy of single nucleotide variation (SNV) calling with various methods. In this study, the presence and identity of viral SNVs, either between HAV clinical specimens or among a series of samples derived from HAV clone 1-infected FRhK4 cells, were determined following analyses of nucleotide sequences generated using next-generation sequencing (NGS) and pyrosequencing methods. The results demonstrate the co-existence of inter- and intra-host variants in both the clinical specimens and the cultured samples. The discovery and confirmation of multiple viral RNAs in an infected individual depends on strain discrimination at the SNV level, and is critical for successful outbreak traceback and source attribution investigations. The detection of SNVs in a time series of HAV-infected FRhK4 cells improved our understanding of the mutation dynamics, which are probably determined by different selective pressures. Additionally, it demonstrated that NGS could provide a valuable investigative approach to SNV detection and identification for other RNA viruses.

Risk-averse areas such as the medical, aerospace and energy sectors have been somewhat slow towards accepting and applying Additive Manufacturing (AM) in many of their value chains. This is partly because there are still significant uncertainties concerning the quality of AM builds. This paper introduces a machine learning algorithm for the automatic detection of faults in AM products. The approach is semi-supervised in that, during training, it is able to use data from both builds where the resulting components were certified and builds where the quality of the resulting components is unknown. This makes the approach cost efficient, particularly in scenarios where part certification is costly and time consuming. The study specifically analyses Selective Laser Melting (SLM) builds. Key features are extracted from large sets of photodiode data, obtained during the building of 49 tensile test bars. Ultimate tensile strength (UTS) tests were then used to categorise each bar as 'faulty' or 'acceptable'. A fully supervised approach identified faulty specimens with a 77% success rate, while the semi-supervised approach was able to consistently achieve similar results, despite being trained on a fraction of the available certification data. The results show that semi-supervised learning is a promising approach for the automatic certification of AM builds that can be implemented at a fraction of the cost currently required.
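The paper's exact semi-supervised algorithm is not described in the abstract; self-training is one common way to exploit unlabelled builds, sketched below with a toy nearest-centroid classifier. All names, features and thresholds are illustrative assumptions, with scalar features standing in for photodiode summaries.

```python
# Hedged sketch of the semi-supervised idea: fit on certified (labelled)
# builds, pseudo-label unlabelled builds where the prediction is confident,
# then refit. The real study's algorithm and features may differ.

def centroid_fit(X, y):
    """Return class centroids for a 1-D nearest-centroid classifier."""
    cents = {}
    for label in set(y):
        vals = [x for x, lab in zip(X, y) if lab == label]
        cents[label] = sum(vals) / len(vals)
    return cents

def centroid_predict(cents, x):
    return min(cents, key=lambda lab: abs(x - cents[lab]))

def self_train(X_lab, y_lab, X_unlab, margin=1.0):
    """Pseudo-label unlabelled points whose gap between the two nearest
    centroids exceeds `margin` (i.e. confident predictions), then refit."""
    cents = centroid_fit(X_lab, y_lab)
    X, y = list(X_lab), list(y_lab)
    for x in X_unlab:
        dists = sorted(abs(x - c) for c in cents.values())
        if dists[1] - dists[0] > margin:  # confident: adopt the pseudo-label
            X.append(x)
            y.append(centroid_predict(cents, x))
    return centroid_fit(X, y)

# 'ok' bars cluster near 10, 'faulty' near 2 (illustrative UTS-like proxy);
# the ambiguous point 6.0 is left unlabelled rather than guessed.
cents = self_train([9.5, 10.5, 2.0], ["ok", "ok", "faulty"],
                   [9.8, 10.2, 1.8, 2.2, 6.0])
print(centroid_predict(cents, 9.0), centroid_predict(cents, 2.5))
```

The design point mirrors the abstract's claim: the unlabelled builds sharpen the centroids without requiring any additional costly certification.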

Universities, like cities, have embraced novel technologies and data-based solutions to improve their campuses, with ‘smart’ becoming a welcomed concept. Campuses are in many ways small-scale cities: they increasingly seek to address similar challenges and to deliver improved experiences to their users. How can data be used in making this vision a reality? What can we learn from smart campuses that can be scaled up to smart cities? A short research study was conducted over a three-month period at a public university in the United Kingdom, employing stakeholder interviews and user surveys, with the aim of gaining insight into these questions. Based on the study, the authors suggest that making data publicly available could bring many benefits to different groups of stakeholders and campus users. These benefits come with risks and challenges, such as data privacy and protection and infrastructure hurdles. However, if these challenges can be overcome, open data could contribute significantly to improving campuses and user experiences, and potentially set an example for smart cities.

In this study, full-scale pilot testing was performed with side-by-side operation of a conventional enhanced biological phosphorus removal (EBPR) process and a side-stream EBPR (S2EBPR) process. A comparison of the performance, activities and population dynamics of key functionally relevant populations between the two configurations was carried out. The results demonstrated that, with the same influent wastewater characteristics, the S2EBPR configuration showed more effective and stable orthophosphate (PO4-P) removal performance (up to 94%, with average effluent concentration down to 0.1 mg P/L) than the conventional EBPR, especially when the mixers in the side-stream reactor were operated intermittently. Mass balance analysis illustrated that both denitrification and EBPR performance were enhanced in the S2EBPR configuration by diverting primary effluent to the anoxic zone and producing additional carbon (~40%) via fermentation in the side-stream reactor. Microbial characterization showed no significant difference in the relative abundances of the putative polyphosphate-accumulating organisms (PAOs) Ca. Accumulibacter (~5.9%) and Tetrasphaera (~16%) between the two configurations. However, a lower relative abundance of known glycogen-accumulating organisms (GAOs) was observed in the S2EBPR configuration (1.1%) than in the conventional one (2.7%). P release and uptake batch tests showed relatively higher PAO activity and an increased degree of dependence on the glycolysis pathway over the TCA cycle in the S2EBPR configuration. An adequate anaerobic solids retention time (SRT) and conditions that generate continuous, slow feeding/production of volatile fatty acids (VFAs) with a higher proportion of propionate in the side-stream reactor of the S2EBPR process likely provide a competitive advantage for PAOs over GAOs.

Photoplethysmography (PPG) is a simple-to-perform vascular optical measurement technique that can detect changes in blood volume in the microvascular tissue bed. Beat-to-beat analysis of the PPG waveform enables the study of the variability of pulse features such as amplitude and pulse arrival time (PAT), and, when quantified in the time and frequency domains, has considerable potential to shed light on perfusion changes associated with peripheral arterial disease (PAD). In this pilot study, innovative multi-site bilateral finger and toe PPG recordings from 43 healthy control subjects and 31 PAD subjects were compared (recordings each at least 5 minutes, collected in a warm temperature-controlled room). Beat-to-beat normalized amplitude and PAT variability were then quantified in the time domain using SD and IQR measures, and in the frequency domain bilaterally using Magnitude Squared Coherence (MSC). Time-domain analysis demonstrated significantly reduced normalized amplitude variability (healthy control 0.0384 (IQR 0.0217-0.0744) vs PAD 0.0160 (0.0080-0.0338), p<0.001) and significantly increased PAT variability (healthy control 0.0063 (0.0052-0.0086) vs PAD 0.0093 (0.0078-0.0144), p<0.001) in PAD. Frequency-domain analysis demonstrated significantly lower MSC values across a range of frequency bands for PAD patients. These changes suggest a loss of right-to-left body side coherence and cardiovascular control in PAD. This study has also demonstrated the feasibility of using these measurement and analysis methods in studies investigating multi-site PPG variability for a wide range of cardiac and vascular patient groups.
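The time-domain measures named above (SD and IQR of beat-to-beat pulse features) are straightforward to compute; a minimal sketch follows, using Python's standard library and illustrative amplitude values rather than patient data.

```python
# Hedged sketch: time-domain variability measures of the kind described,
# computed over a series of beat-to-beat normalised pulse amplitudes.
# The amplitude values are illustrative, not study data.
import statistics

def iqr(values):
    """Interquartile range via statistics.quantiles (n=4 gives quartiles)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    return q3 - q1

amplitudes = [0.95, 1.02, 0.88, 1.10, 0.97, 1.05, 0.91, 1.00]
print(round(statistics.stdev(amplitudes), 4), round(iqr(amplitudes), 4))
```

In a real analysis these statistics would be computed per subject over the full recording, after beat detection and amplitude normalisation; IQR is often preferred alongside SD because it is robust to ectopic-beat outliers.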

3D bioprinting holds great promise in the field of regenerative medicine, as it can create complex structures in a layer-by-layer manner using cell-laden bioinks, making it possible to imitate native tissues. Current bioinks rarely offer both the high printability and the biocompatibility required in this respect; hence, the development of bioinks that combine both properties is needed. In our previous study, a furfuryl-gelatin-based bioink, crosslinkable by visible light, was used to create mouse mesenchymal stem cell-laden structures with high fidelity. In this study, lattice mesh geometries were printed and tested against the properties of a traditional rectangular sheet in a comparative study. After 3D printing and crosslinking, both structures were analysed for swelling and rheological properties, and their porosity was estimated using scanning electron microscopy. Results showed that the lattice structure was relatively more porous yet sturdy, and exhibited a lower degradation rate than the rectangular sheet. Further, the lattice allowed encapsulation of a greater number of cells and enabled them to proliferate to a greater extent than the rectangular sheet, which initially retained fewer cells. Collectively, these results affirm that the lattice is a superior scaffold design for tissue engineering applications.

This brief review highlights some current issues in Galactic stellar nucleosynthesis, along with some recent laboratory studies by the Wisconsin atomic physics group that have direct application to stellar spectroscopy, in order to advance our understanding of the chemical evolution of our Galaxy. The relevant publication history of the lab studies is summarized, and investigations into the abundances of neutron-capture and iron-peak elements in low-metallicity stars are described. Finally, new initiatives in near-infrared spectroscopy are briefly explored.

Immobility during hospitalization is widely recognized as a contributor to deconditioning, functional loss, and increased need for institutional post-acute care. Several studies have demonstrated that inpatient walking programs can mitigate some of these negative outcomes, yet hospital mobility programs are not widely available in U.S. hospitals. STRIDE is a supervised walking program for hospitalized older adults that fills this important gap in clinical care. Herein we describe how STRIDE works and how it is being disseminated to other hospitals using the Replicating Effective Programs (REP) framework. Guided by REP, we define the core components of the program and the areas where the program can be tailored to better fit the needs and local conditions of its new context (hospital). We describe key adaptations made by four hospitals that have implemented the STRIDE program and discuss lessons learned for successful implementation of hospital mobility programs.

Reliable evapotranspiration (ET) estimation is a key factor for water resources planning, attaining sustainable water resources use, irrigation water management, and water regulation. During the past few decades, researchers have developed a variety of remote sensing techniques to estimate ET. The Earth Engine Evapotranspiration Flux (EEFlux) application uses Landsat imagery archives on the Google Earth Engine platform to calculate daily evapotranspiration at the local field scale (30 m). Automatically calibrated for each Landsat image, the EEFlux application is based on the widely vetted Mapping Evapotranspiration at high Resolution with Internalized Calibration (METRIC) model and produces ET estimation maps for any Landsat 5, 7 or 8 scene in a matter of seconds. In this research we evaluate the consistency and accuracy of the EEFlux products that are produced when standard US and global assets are used. Processed METRIC products for 58 scenes distributed around the western and central United States were used as the baseline for comparison. The goal of this paper is to compare the results from EEFlux with standard METRIC applications to illustrate the utility of the EEFlux products as they currently stand. Given that EEFlux is derived from METRIC, differences are expected to occur due to differing calibration methods (automatic versus manual) and differing input datasets. The products compared include the fraction of reference ET (ETrF), actual ET (ETa), and the surface energy balance components net radiation (Rn), ground heat flux (G) and sensible heat flux (H), as well as surface temperature (Ts), albedo and NDVI. The product comparisons show that the intermediate products Ts, albedo and NDVI, as well as Rn, have similar values and behavior for both EEFlux and METRIC. Larger differences were found for H and G.
Despite the more significant differences in H and G, results show that EEFlux is able to calculate ETrF and ETa values comparable to those obtained by trained expert METRIC users for agricultural areas. For non-agricultural areas such as semi-arid rangeland and forests, the automated EEFlux calibration algorithm needs to be improved in order to reproduce ETrF and ETa similar to the manually calibrated METRIC products.
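The relationship between the compared products can be sketched with the surface energy balance bookkeeping that METRIC-style models perform per pixel: latent heat flux is the residual LE = Rn - G - H, which is converted to an ET fraction (ETrF) and scaled by daily reference ET. The code below is a simplified illustration with assumed example values; the real models involve much more (anchor-pixel calibration, radiometric corrections, time integration).

```python
# Hedged sketch of the per-pixel energy balance residual used by
# METRIC-style models (fluxes in W/m^2); values are illustrative only.

LATENT_HEAT = 2.45e6  # J/kg, approximate latent heat of vaporisation

def latent_heat_flux(rn, g, h):
    """Residual of the surface energy balance: LE = Rn - G - H."""
    return rn - g - h

def et_fraction(le_inst, etr_inst_mm_hr):
    """Fraction of reference ET (ETrF) at overpass time: instantaneous LE
    (W/m^2) converted to mm/hr of water equivalent, divided by reference ET."""
    et_inst = le_inst / LATENT_HEAT * 3600  # kg/m^2/hr == mm/hr
    return et_inst / etr_inst_mm_hr

# Example pixel: Rn=600, G=80, H=150 W/m^2; reference ET of 0.8 mm/hr
le = latent_heat_flux(600, 80, 150)   # 370 W/m^2
etrf = et_fraction(le, 0.8)
eta_daily = etrf * 7.5                # ETrF times an assumed daily ETr (mm/day)
print(round(etrf, 3), round(eta_daily, 2))
```

This residual structure explains the paper's finding: errors in H and G propagate directly into LE and hence ETrF/ETa, which is why the calibration of H is the sensitive step.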

Indigenous Australians have been particularly affected by type 2 diabetes mellitus owing to their genetic susceptibility and a range of environmental risk factors. Recent genetic studies link predisposition to some diseases, including diabetes, to archaic humans such as Neanderthals and Denisovans, suggesting the persistence of ancient alleles in the genomes of modern humans. In this review we discuss the evolutionary role of the negative genetic selection associated with an adopted Western lifestyle, as well as DNA variants influencing predisposition to obesity and diabetes in the Australian Indigenous population. We review the contribution of ancient genes/pathways to modern human phenotypes, including the Neanderthal haplotype-tagging SNPs in the NTRK2 gene, which may continue to play a role in obesity in Indigenous Australians.

Forest canopy structure (CS) controls many ecosystem functions and is highly variable across landscapes, but the magnitude and scale of this variation are not well understood. We used a portable canopy lidar system to characterize variation in five categories of CS along N = 3 transects (140–800 m long) at each of six forested landscapes within the eastern USA. The cumulative coefficient of variation was calculated for subsegments of each transect to determine the point of stability for individual CS metrics. We then quantified the scale at which CS is autocorrelated using Moran’s I in an incremental autocorrelation analysis. All CS metrics reached stable values within 300 m but varied substantially within and among forested landscapes. A stable point of 300 m for CS metrics corresponds with the spatial extent over which many ecosystem functions are measured and modeled. Additionally, CS metrics were spatially autocorrelated at 40 to 88 m, suggesting that patch-scale disturbance or environmental factors drive these patterns. Our study shows CS is heterogeneous across temperate forest landscapes at the scale of tens of meters, requiring a resolution of this size for upscaling CS with remote sensing to large spatial scales.
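The "point of stability" idea is simple to illustrate: compute the coefficient of variation (CV) over a growing subsegment of the transect and report the distance at which it stops changing appreciably. The sketch below uses an assumed relative-change tolerance and synthetic canopy-height values; the paper's exact stability criterion may differ.

```python
# Hedged sketch of the cumulative-CV point-of-stability calculation.
# Tolerance and data are illustrative, not the study's values.
import statistics

def point_of_stability(values, step_m, tol=0.08):
    """Return the transect distance (m) at which the cumulative CV of a
    canopy metric changes by less than `tol` (relative) between steps."""
    prev_cv = None
    for i in range(2, len(values) + 1):
        seg = values[:i]
        cv = statistics.stdev(seg) / statistics.mean(seg)
        if prev_cv is not None and prev_cv > 0:
            if abs(cv - prev_cv) / prev_cv < tol:
                return i * step_m
        prev_cv = cv
    return None  # never stabilised within the transect

# Synthetic canopy heights sampled every 10 m: variable early, settling later
heights = [10, 20, 12, 18, 14, 16, 15, 15, 15, 15]
print(point_of_stability(heights, step_m=10, tol=0.08))
```

On real transect data this curve is typically noisier, so the stability decision is usually made over a window of steps rather than a single consecutive pair.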

Decision tables have been used for many years in data processing and business applications to represent complex rule sets. Several computer languages have been developed based on rule systems, and decision tables are easily programmed in several current languages. Land management and river-reservoir models simulate complex land management operations and reservoir management in highly regulated river systems. Decision tables are a precise yet compact way to model the rule sets and corresponding actions found in these models. In this study, we discuss the suitability of decision tables for simulating management in the river-basin-scale Soil and Water Assessment Tool (SWAT+) model. Decision tables were developed to simulate automated irrigation and reservoir releases. A simple auto-irrigation application of decision tables was developed using plant water stress as a condition for irrigating corn in Texas. The sensitivity of soil moisture and corn yields to the water stress trigger and irrigation application amounts was shown. In addition, Grapevine Reservoir near Dallas, Texas was used to illustrate the use of decision tables to simulate reservoir releases. The releases were conditioned on reservoir volumes and flood season. The release rules, as implemented by the decision table, realistically simulated flood releases, as evidenced by a daily NSE (Nash-Sutcliffe Efficiency) of 0.52 and a percent bias of -1.1%. Using decision tables to simulate management in land, river and reservoir models was shown to have several advantages over current approaches, including: 1) mature technology with considerable literature and applications; 2) the ability to accurately represent complex, real-world decision making; 3) code that is efficient, modular and easy to maintain; and 4) tables that are easy to maintain, support, and modify.
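The SWAT+ decision-table file format itself is not shown in the abstract; as an illustration of the general idea, the following Python sketch encodes hypothetical reservoir-release rules conditioned on storage volume and flood season, with the first matching row determining the action. The rule names and thresholds are invented for illustration, not taken from the Grapevine Reservoir study.

```python
# Hedged sketch of a decision table in the spirit of the reservoir example:
# each row pairs a set of condition predicates with an action; the first row
# whose conditions all hold fires. Thresholds and actions are hypothetical.

RULES = [
    ({"volume_frac": lambda v: v > 0.9, "flood_season": lambda f: f},
     "large_release"),
    ({"volume_frac": lambda v: v > 0.9, "flood_season": lambda f: not f},
     "moderate_release"),
    ({"volume_frac": lambda v: v > 0.6, "flood_season": lambda f: f},
     "moderate_release"),
    ({}, "baseline_release"),  # default row: no conditions, always matches
]

def decide(state):
    """Return the action of the first rule whose conditions all hold."""
    for conditions, action in RULES:
        if all(pred(state[key]) for key, pred in conditions.items()):
            return action
    raise ValueError("no rule matched")

print(decide({"volume_frac": 0.95, "flood_season": True}))   # large_release
print(decide({"volume_frac": 0.7, "flood_season": False}))   # baseline_release
```

This structure reflects the advantages listed in the abstract: the rules live in a compact table that can be inspected, extended and maintained separately from the dispatch code.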

This article responds to an identified and significant gap in the existing scholarly canon by considering the extent to which cinematic representations construct heroin users—the ‘junkie’—as a criminalised ‘Other’, conferring legitimacy on the notion that such users are criminogenic and deviant. Positioned within the disciplinary bounds of cultural criminology, this article focuses on five films: Sid and Nancy (1986); The Basketball Diaries (1995); Trainspotting (1996); Requiem for a Dream (2000); and T2 Trainspotting (2017). Drawing together Hall’s (1997) theories of representation and Hjelm’s (2014) theories of social constructionism, the findings from a narrative analysis of each of the films—individually and comparatively—explore the following themes: junkies as criminogenic; as a dangerous underclass; as embodying decay and depravity; and, in relation to female junkies, as junkie whores. In doing so, this article elucidates new thinking and ideas about cinematic representations of junkies and how these shape and influence social norms and mores.

Scrub typhus threatens one billion people in the Asia-Pacific area, and cases have emerged outside this region. It is caused by infection with any of the multitude of strains of the bacterium Orientia tsutsugamushi. A vaccine that affords heterologous protection and a commercially available molecular diagnostic assay are both lacking. Herein, we determined that the nucleotide and translated amino acid sequences of outer membrane protein A (OmpA) are highly conserved among 51 O. tsutsugamushi isolates. Molecular modeling revealed the predicted tertiary structure of O. tsutsugamushi OmpA to be very similar to that of the phylogenetically related pathogen Anaplasma phagocytophilum, including the location of a helix that contains residues functionally essential for A. phagocytophilum infection. PCR primers were developed that amplified ompA DNA from all O. tsutsugamushi strains but not from negative control bacteria. Using these primers in quantitative real-time PCR enabled sensitive detection and quantitation of O. tsutsugamushi ompA DNA from organs of mice that had been experimentally infected with the Karp or Gilliam strains. The high degree of OmpA conservation among O. tsutsugamushi strains evidences its potential to serve as a molecular diagnostic target and justifies its consideration as a candidate for developing a broadly protective scrub typhus vaccine.

The effects of buckwheat intake on cardiovascular disease (CVD) risk have not been systematically investigated. The aim of the present study was to comprehensively summarise studies in humans and animals evaluating the impact of buckwheat consumption on CVD risk markers and to conduct a meta-analysis of relevant data. Thirteen randomised controlled human studies, two cross-sectional human studies and twenty-one animal studies were identified. Using random-effects models, the weighted mean differences of post-intervention concentrations of blood glucose, total cholesterol and triglycerides were significantly decreased following buckwheat intervention compared with controls [differences in blood glucose: -0.85 mmol/L (95% CI: -1.31, -0.39), total cholesterol: -0.50 mmol/L (95% CI: -0.80, -0.20) and triglycerides: -0.25 mmol/L (95% CI: -0.49, -0.02)]. Responses of a similar magnitude were seen in the two cross-sectional studies. For the animal studies, nineteen of twenty-one studies showed a significant reduction in total cholesterol of between 12 and 54%, and fourteen of twenty studies showed a significant reduction in triglycerides of between 2 and 74%. All exhibited high unexplained heterogeneity. There was inconsistency in HDL-cholesterol outcomes in both human and animal studies. It remains unclear whether increased buckwheat intake significantly benefits other markers of CVD risk, such as weight, blood pressure, insulin and LDL-cholesterol, and the underlying mechanisms responsible for any effects remain unclear.
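The pooled estimates above come from a random-effects model; DerSimonian-Laird is one standard way to fit such a model, sketched below. The study effects and variances in the example are illustrative placeholders, not the review's actual data, and the review may have used a different estimator.

```python
# Hedged sketch of random-effects pooling of mean differences
# (DerSimonian-Laird estimator). Input values are illustrative only.
import math

def dersimonian_laird(effects, variances):
    """Pooled effect and standard error under a DL random-effects model."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # re-weight with tau^2 added to each within-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se

# Illustrative blood-glucose mean differences (mmol/L) and their variances
effects = [-1.1, -0.6, -0.9, -0.4]
variances = [0.04, 0.09, 0.05, 0.12]
pooled, se = dersimonian_laird(effects, variances)
print(round(pooled, 2), round(pooled - 1.96 * se, 2), round(pooled + 1.96 * se, 2))
```

The tau² term is what distinguishes the random-effects pooling from a fixed-effect analysis: when between-study heterogeneity is high, as reported in the abstract, it widens the confidence interval around the pooled difference.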

Arctic feedbacks will accelerate climate change and could jeopardise mitigation efforts. The permafrost carbon feedback releases carbon to the atmosphere from thawing permafrost and the sea ice albedo feedback increases solar absorption in the Arctic Ocean. A constant positive albedo feedback and zero permafrost feedback have been used in nearly all climate policy studies to date, while observations and models show that the permafrost feedback is significant and that both feedbacks are nonlinear. Using novel dynamic emulators in the integrated assessment model PAGE-ICE, we investigate nonlinear interactions of the two feedbacks with the climate and economy under a range of climate scenarios consistent with the Paris Agreement. The permafrost feedback interacts with the land and ocean carbon uptake processes, and the albedo feedback evolves through a sequence of nonlinear transitions associated with the loss of Arctic sea ice in different months of the year. The US’s withdrawal from the current national pledges could increase the total discounted economic impact of the two Arctic feedbacks until 2300 by $25 trillion, reaching nearly $120 trillion, while meeting the 1.5 °C and 2 °C targets will reduce the impact by an order of magnitude.

In the British setting, the phrase ‘doing god’ has become increasingly common as a way of referring to an emerging trend whereby religion has acquired an increasingly prominent role in political spaces and discourses. This was particularly evident while David Cameron was Prime Minister and leader of the Conservative Party. While religion has historically not had a prominent place in British political life, this changed under the former Prime Minister David Cameron. Here, the findings from critically analysing a series of Cameron’s public pronouncements about religion—and Christianity in particular—are set out to try to better understand his own adherence to Christianity (the personal), how this intersected with his politics and role as Prime Minister (the political), and, more importantly, how this shaped his views about Britain being a Christian country (the national). Contextualised within the embryonic scholarly literature relating to the phenomenon of ‘doing god’ in the contemporary British setting, this article concludes by considering alternative and analogous frames through which greater elucidation of the true motivations of his pronouncements might be achieved.

The crack geometry and associated strain field around Berkovich and Vickers indents on silicon have been studied by X-ray diffraction imaging and micro-Raman spectroscopy scanning. The techniques are complementary, the Raman data coming from within a few micrometers of the indentation, whereas the X-ray image probes the strain field at a distance of typically tens of micrometers. For example, Raman data provides an explanation for the central contrast feature in the X-ray images of an indent. Strain relaxation from breakout and high temperature annealing are examined and it is demonstrated that millimeter length cracks, similar to those produced by mechanical damage from misaligned handling tools, can be generated in a controlled fashion by indentation within 75 micrometers of the bevel edge of 200 mm diameter wafers.

Environmental policy involving citizen science (CS) is of growing interest. In support of this open data stream, validation and quality assessment of CS data, and their appropriate use for evidence-based policy making, require a flexible and easily adaptable data curation process that ensures transparency. Addressing these needs, this paper describes an approach to automatic quality assurance proposed by the Citizen OBservatory WEB (COBWEB) FP7 project. The approach is based on a workflow composition that combines different quality controls, each belonging to one of seven categories or ‘pillars’. Each pillar focuses on a specific dimension of the reasoning algorithms used for CS data qualification. The pillars attribute values to a range of quality elements belonging to three complementary quality models. Additional data from various sources, such as Earth Observation (EO) data, are often included among the inputs of quality controls within the pillars. However, qualified CS data can also contribute to the validation of EO data, so the two validation questions can be considered ‘two sides of the same coin’. Based on a CS study of the invasive species Fallopia japonica (Japanese knotweed), the paper discusses the flexibility and usefulness of qualifying CS data, both when EO data are used for validation within the quality assurance process and when validating an EO data product that describes the risk of occurrence of the plant. Both validation paths are found to be improved by quality assurance of the CS data. Addressing the reliability of CS open data, the paper also describes issues and limitations of quality assurance for validation that stem from the quality of secondary data used within the automatic workflow, such as error propagation, paving the route to improvements in the approach.
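
The pillar-based workflow composition described in this abstract can be sketched in a few lines: each pillar is a quality control that attributes values to quality elements, and a workflow applies several pillars to one observation. All names and thresholds below are illustrative assumptions, not the COBWEB API.

```python
# Minimal sketch of a pillar-based quality-assurance workflow, assuming
# hypothetical pillar names and quality elements (not the COBWEB API).

def location_pillar(obs):
    # e.g. check that the reported position falls inside the survey area
    in_area = 50.0 <= obs["lat"] <= 56.0      # hypothetical bounds for Wales
    return {"positional_accuracy": 1.0 if in_area else 0.0}

def attribute_pillar(obs):
    # e.g. check that the species attribute matches the target taxon
    match = obs["species"] == "Fallopia japonica"
    return {"thematic_accuracy": 1.0 if match else 0.0}

def run_workflow(obs, pillars):
    """Apply each pillar in turn and merge the quality elements it attributes."""
    quality = {}
    for pillar in pillars:
        quality.update(pillar(obs))
    return quality

report = {"lat": 52.4, "species": "Fallopia japonica"}
quality = run_workflow(report, [location_pillar, attribute_pillar])
```

In this sketch each pillar is independent, so controls can be added, removed, or reordered per campaign, which is the flexibility the workflow-composition approach is aiming for.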

The creation of the National Muslim Women’s Advisory Group (NMWAG) in 2008 by Britain’s New Labour Government was part of a strategy which sought to engage different levels of Muslim communities beneath an overarching focus on reducing ‘Islamic extremism’. To do so, however, the Government acknowledged that it would need to support Muslim women in overcoming some of the constraints it believed were placed on them in contemporary Britain. Deeming theology and religious interpretation to be one of those constraints, the Government saw empowering Muslim women to ‘influence and challenge’ religious and theological discourses as a priority. This article therefore offers a case study of a project, commissioned by the Government in collaboration with the NMWAG, that sought to empower Muslim women to ‘influence and challenge’ theological interpretations. Having gained unprecedented access to the NMWAG, its activities and its engagement with Government, this article presents previously unpublished findings from that project and focuses on two key themes: Muslim women, their identity and position; and theology, leadership and the participation of women. Having explored these in detail, the article concludes by critically reflecting on the way in which the Government engaged and interacted with Muslim women, the role and relative success of the NMWAG and, most importantly, the extent to which the NMWAG was able to ‘influence and challenge’ interpretations of Islamic theology.

Soon after the Conservative-led Coalition government came to power in 2010, Baroness Sayeeda Warsi announced that Islamophobia had passed the ‘dinner-table test’ in contemporary Britain. As a result, the need to address Islamophobia was identified as a priority for the Coalition. This article critically analyses how the Coalition sought to achieve this and the extent to which it was successful. Focusing on the period 2010-15, the article first frames what is meant by Islamophobia before briefly setting out how it had been responded to by previous British governments. As regards the Coalition, a threefold approach is adopted that considers the All-Party Parliamentary Group on Islamophobia, the Cross-Government Working Group on Anti-Muslim Hatred and the political discourses used by the Coalition about Muslims and Islam more generally. Concluding that the Coalition failed to meet the high expectations set by Warsi’s speech, the article considers why this might have been so.

The choice associated with words is a fundamental property of natural languages. It lies at the heart of quantitative linguistics, computational linguistics, and the language sciences more generally. Information theory gives us tools to measure precisely the average amount of choice associated with words: the word entropy. Here we use three parallel corpora, encompassing ca. 450 million words in 1916 texts and 1259 languages, to tackle some of the major conceptual and practical problems of word entropy estimation: dependence on text size, register, style and estimation method, as well as the non-independence of words in co-text. We present three main results: (1) a text size of 50K tokens is sufficient for word entropies to stabilize throughout the text; (2) across languages of the world, word entropies display a unimodal distribution that is skewed to the right, suggesting a trade-off between the learnability and expressivity of words; and (3) there is a strong linear relationship between unigram entropies and entropy rates, suggesting that they are inherently linked. We discuss the implications of these results for studying the diversity and evolution of languages from an information-theoretic point of view.
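
The unigram word entropy the abstract refers to is, in its simplest plug-in form, H = -Σ p(w) log₂ p(w) over word-type frequencies. A minimal sketch of that estimator, with a toy text as a stand-in for the corpora used in the paper:

```python
# Plug-in (maximum-likelihood) estimate of unigram word entropy, the
# simplest of the estimation methods the abstract alludes to.  Note the
# paper's point that stable estimates need large samples (~50K tokens);
# on toy texts like this one the plug-in estimator is biased downward.
import math
from collections import Counter

def unigram_entropy(tokens):
    """H = -sum(p * log2(p)) over the relative frequencies of word types."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

tokens = "the cat sat on the mat".split()   # "the" occurs twice, rest once
h = unigram_entropy(tokens)                  # ≈ 2.2516 bits per word
```

The entropy rate mentioned in result (3) additionally conditions on co-text, so it requires longer-range models (e.g. n-grams or compression-based estimators) rather than this unigram count.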

The development of low-cost, open-source Remotely Operated Vehicle (ROV) systems has provided almost unrestricted access for researchers looking to monitor the marine environment in ever greater resolution. Sampling microbial communities from the marine environment, however, still usually relies on Niskin-bottle sampling (ROV- or CTD-based), a method which introduces inaccuracy and variability that are incompatible with metatranscriptomic analysis. Here, we describe a versatile, easily replicated platform which achieves in situ mRNA preservation, via the addition of RNAlater to filtered microbial cells, to enhance ROV or CTD functionality.

Recent analyses of cosmic microwave background surveys have revealed hints that there may be a non-trivial running of the running of the spectral index. If future experiments were to confirm these hints, it would prove a powerful discriminator of inflationary models, ruling out simple single field models. We discuss how isocurvature perturbations in multi-field models can be invoked to generate large runnings in a non-standard hierarchy, and find that a minimal model capable of practically realising this would be a two-field model with a non-canonical kinetic structure. We also consider alternative scenarios such as variable speed of light models and canonical quantum gravity effects and their implications for runnings of the spectral index.
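
For readers unfamiliar with the jargon, the "running of the running" can be made concrete via the standard parameterisation of the primordial scalar power spectrum (the usual Planck-style convention, stated here as background rather than taken from the article), expanded in ln k around a pivot scale k₍\*₎:

```latex
\ln \mathcal{P}_{\zeta}(k) = \ln A_s + (n_s - 1)\ln\frac{k}{k_*}
  + \frac{\alpha_s}{2}\ln^2\frac{k}{k_*}
  + \frac{\beta_s}{6}\ln^3\frac{k}{k_*},
\qquad
\alpha_s \equiv \frac{\mathrm{d}n_s}{\mathrm{d}\ln k},
\quad
\beta_s \equiv \frac{\mathrm{d}\alpha_s}{\mathrm{d}\ln k}
```

Here α_s is the running and β_s the running of the running. In single-field slow-roll inflation β_s is generically higher order in the slow-roll parameters, and hence tiny, which is why a confirmed large β_s would rule out the simplest models, as the abstract notes.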

Adaptation to climate change is being addressed in many domains, which means that there are multiple perspectives on adaptation, often with differing visions resulting in disconnected responses and outcomes. Combining singular perspectives into coherent, combined perspectives that include multiple needs and visions can deepen the understanding of various aspects of adaptation and provide more effective responses. Compared with adaptations derived from a single perspective, such combinations can increase the range and variety of adaptation measures available for implementation and help avoid maladaptation. The objective of this paper is to present and demonstrate a framework for structuring local adaptation responses using inputs from multiple perspectives. The adaptation responses are framed by: (i) contextualizing climate change adaptation needs; (ii) analyzing drivers of change; (iii) characterizing measures of adaptation; and (iv) establishing links between the measures, with particular emphasis on taking account of multiple perspectives. The framework is demonstrated with reference to the management of flood risks in a case study of Can Tho, Vietnam. The results show that multiple-perspective framing of adaptation responses enhances the understanding of various aspects of adaptation measures, thereby leading to flexible implementation practices.

In this data deposit, I describe a dataset that is the result of content mining 167,318 published articles for statistical test results. This content mining extracted 688,112 results from 50,845 articles. To provide a comprehensive set of data, the statistical results are supplemented with metadata from the articles they originate from. The dataset is provided as a comma-separated values (CSV) file in long format. For each of the 688,112 results, 20 variables are included, of which seven are article metadata and 13 pertain to the individual statistical results (e.g., reported and recalculated p-values). A five-pronged approach was taken to generate the dataset: (i) collect journal lists, (ii) spider journal pages for articles, (iii) download articles, (iv) add article metadata, and (v) mine articles for statistical results.
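
The "reported and recalculated p-value" pair of variables implies a consistency check: recompute the p-value from the extracted test statistic and compare it with the value the authors reported. A minimal sketch of that check for one test type, using only the standard library (the actual dataset covers more test families, e.g. t, F, chi-square, and r; the variable names here are illustrative):

```python
# Illustrative recalculation of a reported p-value from its test
# statistic, for a two-tailed z test.  Column/variable names are
# hypothetical, not those of the deposited CSV.
import math

def p_from_z(z):
    """Two-tailed p-value for a standard-normal test statistic,
    using the identity p = erfc(|z| / sqrt(2))."""
    return math.erfc(abs(z) / math.sqrt(2))

reported_p = 0.05
recalculated_p = p_from_z(1.96)                 # ~0.05 for z = 1.96
inconsistent = round(recalculated_p, 2) != reported_p
```

Scaled up over all 688,112 extracted results, this kind of recomputation is what makes a long-format CSV of individual test statistics useful for studying reporting errors.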
