This article contributes to an emerging scholarly debate over the support displayed by key Azhari ʿulamaʾ for the 3 July 2013 coup in Egypt and the subsequent massacres of anticoup protesters. I focus on the Islamic legal justifications articulated by the former grand mufti of Egypt ʿAli Jumʿa, which academics have contextualized primarily in relation to quietist precedents from late medieval Islamic political thought or his Sufi background. By contrast, I consider Jumʿa's justifications as representative of a nationalist discourse that has its historical origins in the protonationalism of Rifaʿa al-Tahtawi (d. 1873). My argument has wider implications for our conceptualization of the contemporary Islamic tradition. If, as scholars have argued, the Islamic tradition is a framework for inquiry rather than a set of doctrines, then in the 19th century a concern for the nation and its future became a key part of that framework. I contend that these additions came to redefine the worldview and politics of the ʿulamaʾ in terms of national progress and its horizon of expectations.

Four rates of aminopyralid (30, 60, 90, and 120 g ae ha−1 [0.4, 0.9, 1.3, and 1.8 oz ae acre−1]) were compared for their ability to reduce abundance of nonnative dicot species and favor native species in an invaded Cascade Mountain meadow near Trout Lake, WA. Treatments were applied in two replicated studies (June 2009 and 2010), and foliar cover and species richness were monitored for two years. First-year control of nonnative dicots from application of 30 g ae ha−1 of aminopyralid (69%) was greater than that of native dicots (29%), whereas significant control of both species groups occurred at the higher rates. By the second year after treatment, absolute differences in cover between treated and non-treated plots averaged −17% and −21% for native and nonnative dicots, respectively, and +1% and +27% for native and nonnative monocots, respectively. First-year control of Canada thistle and oxeye daisy was greater after treatment in 2009 (88% and 90%, respectively) than after treatment in 2010 (56% and 55%, respectively), probably because lower spring temperatures in 2010 limited vegetation development and plant susceptibility to aminopyralid. Cover of Kentucky bluegrass and sheep fescue averaged 20% and 6% greater, respectively, in treated plots than in non-treated plots. Application of 30 g ae ha−1 of aminopyralid had no detectable effect on second-year richness of native and nonnative species relative to non-treated plots; however, higher rates caused 24% to 43% reductions in richness of each species group. Research results suggest that application of aminopyralid at 30 g ae ha−1 has the potential to reduce abundance of nonnative dicot species in similar meadow communities of the Pacific Northwest with little or no negative impact on abundance and richness of native species. As a potential strategy to limit the subsequent spread of Kentucky bluegrass, a grass herbicide, such as fluazifop or sethoxydim, could be added to the treatment.
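The bracketed rate conversions can be checked with standard unit factors (28.3495 g per oz, 0.404686 ha per acre); a minimal sketch of the conversion, assuming those factors, yields values close to the listed ones after rounding:

```python
# Convert an application rate from g ae/ha to oz ae/acre
# using standard unit-conversion factors.
G_PER_OZ = 28.3495
HA_PER_ACRE = 0.404686

def g_ha_to_oz_acre(rate_g_ha: float) -> float:
    """g ae ha^-1 -> oz ae acre^-1."""
    return rate_g_ha / G_PER_OZ * HA_PER_ACRE

for rate in (30, 60, 90, 120):
    print(rate, "g ae/ha ->", round(g_ha_to_oz_acre(rate), 2), "oz ae/acre")
```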

Active surveillance for MRSA colonization was performed in both ICUs. In June 2005, a chlorhexidine bathing protocol was implemented in the surgical ICU. Changes in S. aureus transmission and infection rate before and after implementation were analyzed using time-series methodology.

Results.

The intervention unit had a 20.68% decrease in MRSA acquisition after institution of the bathing protocol (12.64 cases per 1,000 patient-days at risk before the intervention vs 10.03 cases per 1,000 patient-days at risk after the intervention; β, −2.62 [95% confidence interval (CI), −5.19 to −0.04]; P = .046). There was no significant change in MRSA acquisition in the control ICU during the study period (10.97 cases per 1,000 patient-days at risk before June 2005 vs 11.33 cases per 1,000 patient-days at risk after June 2005; β, −11.10 [95% CI, −37.40 to 15.19]; P = .40). There was a 20.77% decrease in all S. aureus (including MRSA) acquisition in the intervention ICU from 2002 through 2007 (19.73 cases per 1,000 patient-days at risk before the intervention to 15.63 cases per 1,000 patient-days at risk after the intervention [95% CI, −7.25 to −0.95]; P = .012). The incidence of ICU-acquired MRSA infections decreased by 41.37% in the intervention ICU (1.96 infections per 1,000 patient-days at risk before the intervention vs 1.15 infections per 1,000 patient-days at risk after the intervention; P = .001).
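The percentage decreases above follow directly from the before/after rates; a minimal sketch of the arithmetic (small differences from the reported figures reflect rounding of the published rates):

```python
def pct_decrease(before: float, after: float) -> float:
    """Percent decrease from a before-rate to an after-rate."""
    return (before - after) / before * 100

# MRSA acquisition, intervention ICU (cases per 1,000 patient-days at risk)
print(round(pct_decrease(12.64, 10.03), 1))  # ~20.6; reported as 20.68%
# All S. aureus (including MRSA) acquisition, intervention ICU
print(round(pct_decrease(19.73, 15.63), 1))  # ~20.8; reported as 20.77%
# ICU-acquired MRSA infections, intervention ICU
print(round(pct_decrease(1.96, 1.15), 1))    # ~41.3; reported as 41.37%
```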

Conclusions.

Institution of daily chlorhexidine bathing in an ICU resulted in a decrease in the transmission of S. aureus, including MRSA. These data support the use of routine daily chlorhexidine baths to decrease rates of S. aureus transmission and infection.

Staphylococcus aureus is an important cause of infection in intensive care unit (ICU) patients. Colonization with methicillin-resistant S. aureus (MRSA) is a risk factor for subsequent S. aureus infection. However, MRSA-colonized patients may have more comorbidities than methicillin-susceptible S. aureus (MSSA)-colonized or noncolonized patients and therefore may be more susceptible to infection on that basis.

Objective.

To determine whether MRSA-colonized patients who are admitted to medical and surgical ICUs are more likely to develop any S. aureus infection in the ICU, compared with patients colonized with MSSA or not colonized with S. aureus, independent of predisposing patient risk factors.

Design.

Prospective cohort study.

Setting.

A 24-bed surgical ICU and a 19-bed medical ICU of a 1,252-bed academic hospital.

Patients.

A total of 9,523 patients for whom nasal swab samples were cultured for S. aureus at ICU admission during the period from December 2002 through August 2007.

Methods.

Patients in the ICU for more than 48 hours were examined for an ICU-acquired S. aureus infection, defined as development of S. aureus infection more than 48 hours after ICU admission.

ICU patients colonized with S. aureus were at greater risk of developing a S. aureus infection in the ICU. Even after adjusting for patient-specific risk factors, MRSA-colonized patients were more likely to develop S. aureus infection, compared with MSSA-colonized or noncolonized patients.

Nutrigenomics is the study of how constituents of the diet interact with genes, and their products, to alter phenotype and, conversely, how genes and their products metabolise these constituents into nutrients, antinutrients, and bioactive compounds. Results from molecular and genetic epidemiological studies indicate that dietary imbalance can alter gene–nutrient interactions in ways that increase the risk of developing chronic disease. The interplay of human genetic variation and environmental factors will make identifying causative genes and nutrients a formidable, but not intractable, challenge. We provide specific recommendations for how to best meet this challenge and discuss the need for new methodologies and the use of comprehensive analyses of nutrient–genotype interactions involving large and diverse populations. The objective of the present paper is to stimulate discourse and collaboration among nutrigenomic researchers and stakeholders, a process that will lead to an increase in global health and wellness by reducing health disparities in developed and developing countries.

To determine the occurrence of co-colonization or co-infection with VRE and MRSA among medical patients requiring intensive care.

Design:

Prospective, single-center, observational study.

Setting:

A 19-bed medical ICU in an urban teaching hospital.

Patients:

Adult patients requiring at least 48 hours of intensive care and having at least one culture performed for microbiologic evaluation.

Results:

Eight hundred seventy-eight consecutive patients were evaluated. Of these patients, 402 (45.8%) did not have microbiologic evidence of colonization or infection with either VRE or MRSA; 355 (40.4%) were colonized or infected with VRE, 38 (4.3%) were colonized or infected with MRSA, and 83 (9.5%) had co-colonization or co-infection with VRE and MRSA. Multiple logistic regression analysis demonstrated that increasing age, hospitalization during the preceding 6 months, and admission to a long-term-care facility were independently associated with colonization or infection due to VRE and co-colonization or co-infection with VRE and MRSA. The distributions of positive culture sites for VRE (stool, 86.7%; blood, 6.5%; urine, 4.8%; soft tissue or wound, 2.0%) and for MRSA (respiratory secretions, 34.1%; blood, 32.6%; urine, 17.1%; soft tissue or wound, 16.2%) were statistically different (P < .001).
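The four cohort categories reported above are mutually exclusive and sum to the full cohort of 878 patients; a minimal sketch checking the counts and the reported percentages:

```python
# Counts by VRE/MRSA status among 878 consecutive patients (from the text).
counts = {
    "neither": 402,
    "VRE only": 355,
    "MRSA only": 38,
    "VRE and MRSA": 83,
}

total = sum(counts.values())
print(total)  # 878

# Percentages of the cohort, one decimal place
for label, n in counts.items():
    print(f"{label}: {100 * n / total:.1f}%")
```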

Conclusions:

Co-colonization or co-infection with VRE and MRSA is common among medical patients requiring intensive care. The recent emergence of vancomycin-resistant Staphylococcus aureus and the presence of a patient population co-colonized or co-infected with VRE and MRSA support the need for aggressive infection control measures in the ICU.

To determine the epidemiology of colonization with vancomycin-resistant Enterococcus (VRE) among intensive care unit (ICU) patients.

Design:

Ten-month prospective cohort study.

Setting:

A 19-bed medical ICU of a 1,440-bed teaching hospital.

Methods:

Patients admitted to the ICU had rectal swab cultures for VRE on admission and weekly thereafter. VRE-positive patients were cared for using contact precautions. Clinical data, including microbiology reports, were collected prospectively during the ICU stay.

Results:

Of 519 patients who had admission stool cultures, 127 (25%) had cultures that were positive for VRE. Risk factors for VRE colonization identified by multiple logistic regression analysis were hospital stay greater than 3 days prior to ICU admission (adjusted odds ratio [AOR], 3.6; 95% confidence interval [CI95], 2.3 to 5.7), chronic dialysis (AOR, 2.4; CI95, 1.2 to 4.5), and having been admitted to the study hospital one to two times (AOR, 2.3; CI95, 1.4 to 3.8) or more than two times (AOR, 6.5; CI95, 3.7 to 11.6) within the past 12 months. Of the 352 VRE-negative patients who had one or more follow-up cultures, 74 (21%) became VRE positive during their ICU stay (27 cases per 1,000 patient-ICU days).
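The prevalence and acquisition proportions above can be reproduced from the raw counts; a minimal sketch (the computed admission figure is 24.5%, which the abstract reports rounded as 25%):

```python
# Counts from the text: 127 of 519 admission cultures positive for VRE;
# 74 of 352 initially VRE-negative patients acquired VRE in the ICU.
admission_prevalence = 100 * 127 / 519
acquisition_fraction = 100 * 74 / 352

print(f"admission prevalence: {admission_prevalence:.1f}%")     # ~24.5
print(f"acquired during ICU stay: {acquisition_fraction:.1f}%")  # ~21.0
```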

Conclusion:

The prevalence of VRE culture positivity on ICU admission was high and a sizable fraction of ICU patients became VRE positive during their ICU stay despite contact precautions for VRE-positive patients. This was likely due in large part to prior VRE exposures in the rest of the hospital where these control measures were not being used.

The dopaminergic system in the brain seems to play an important role in the regulation of sexual behaviour. The relationship between genes for the D1, D2 and D4 dopamine receptors and age at first sexual intercourse (AFSI) was examined in a sample of 414 non-Hispanic, European–American men and women. A significant association was observed between a DRD2 allele and AFSI and an even stronger association when the DRD2 allele was interacted with a DRD1 allele. A constrained regression model was constructed predicting AFSI using sex and a group of nine psychosocial variables as predictors. Adding the DRD2 and the DRD2-by-DRD1 predictors to this model increased the explained variance by 23 and 55%, respectively. Although these findings suggest a stronger association among males than among females, further research will be necessary to clarify this question, as well as to establish whether the observed association holds in other racial/ethnic groups.

Science is a method of knowing and not a set of established facts. Even though this is widely acknowledged in the literature of science education, it is not so in the perception of the general public. Therefore, it is little wonder that when a “scientific fact” changes due to the discovery of new evidence, students raised on the notion of fact-based science may become confused and begin to mistrust science. Involvement in actual scientific experience, however, can make a difference. Once students participate in scientific endeavors, they see science as an active process, something ongoing instead of something completed. Few experiences compare with the paleontological field trip as an educational tool for exemplifying the basic method of science. We believe the following reasons make frequent field trips worth incorporating into the standard science curriculum at all grade levels.

Suppose you are given one hour to tell someone about paleontology. The audience is naive but interested. They want to know something about paleontology but have not previously received, and may never again receive, formal instruction in the subject. What will you choose to talk about?

Cognitive style, creativity, and intelligence are not properly cognitive executive functions, but they are nevertheless important to consider for a full picture of the cognitive capabilities and functioning of children with visual impairments. In each area there is important evidence of individual differences.

Cognitive style

The term cognitive style refers to the fact that children differ in their approaches to cognitive tasks. Although there are different ways of defining cognitive style, one of the most useful is the global–articulated dimension. An articulated style refers to the ability to impose structure on an inherently unstructured situation, or to recognize structure that exists. In contrast, global style refers to the tendency to deal with events in a diffuse and unstructured manner. The dimension thus has to do with cognitive organization.

This aspect of cognitive functioning is of interest in children with visual impairments because of the role that vision is hypothesized to play in the developing cognitive abilities of sighted children. In studies of sighted children, Witkin and his colleagues have generally found a developmental progression from global to articulated cognitive style; that is, with increasing age children become more able to differentiate structure within a field and more able to impose structure when little exists (Witkin, Birnbaum, Lomonaco, Lehr, & Herman, 1968). Vision is thought to play a major role not only in the developing articulation of visual perception itself, but also as an aid to articulation of experience gained through other sensory modalities. Witkin et al. thus hypothesized that congenitally blind children would show individual consistencies of cognitive style, but that in general they would be relatively more global in their cognitive functioning than sighted children of comparable age.

In research with children, a fundamental concern is how abilities and characteristics change over time as age increases and development proceeds. Two basic research models have been used in this quest, the cross-sectional model and the longitudinal model. The cross-sectional model involves taking samples of children at the various ages required by the research design, evaluating each sample, and constructing the developmental picture by charting characteristics as a function of chronological age. In the longitudinal model, one sample of children is selected at the earliest age required, and then the same children are followed as they age and are tested at each subsequent age level. The developmental picture emerges as the children's results are accumulated over time.

Each of these models has significant advantages and disadvantages. The cross-sectional model has the important advantage that the time required to complete the research is relatively short: testing can be completed efficiently because the samples at the various ages are simultaneously available and can be tested in short order. However, the critical shortcoming of the cross-sectional model is that although it reveals the general nature of development, it does not tell us about its continuity within individuals. The longitudinal model reverses these features. It has the major advantage that it reveals the developmental continuity (or lack thereof) within individual children. However, it is inefficient in terms of time and expense and has the additional problem of attrition; for a variety of reasons not under the researcher's control, not all children in the initial sample will be available for evaluation at subsequent test times.

The vast majority of developmental research has been conducted using the cross-sectional model.

The newborn human infant does not exhibit behaviors that can be considered truly “social.” Gradually, the infant becomes responsive to specific people and develops emotional bonds to them. These are the beginnings of social and emotional relationships that will develop and mature throughout the individual's lifetime. As the infant develops, parents and others begin to impose the demands of “socialization”; that is, they expect the child to modify his or her behavior in response to their expectations and to assume increasing responsibility for self-care.

Of course, language is a significant medium of social interaction; we will examine its development as a tool of communication. The roots of language lie in infancy, though, and it is necessary to look at prelanguage communicative behaviors. Increasingly, the child faces requirements to monitor his or her own behavior without being continuously directed by others. At some point in the first few years, the child enters into social relationships that extend beyond the family, and whether this is in child care, preschool, or a school setting, additional demands are placed on the child's capacities for interaction and adaptation to the social situation.

As the child progressively adapts to the demands of socialization and independence, stable characteristics of self emerge. These are seen in the consistencies in behavioral and attitudinal traits that affect the nature and quality of the child's interaction with other people as well as his or her pattern of self-conceptualization and self-regard.

Coming to know about the physical world is a long process that begins early in infancy and is not completed for years. The child's contact with the physical world comes via the senses, and thus the first set of questions has to do with the quality of the infant's perceptual capabilities. Perception is not the end of the matter, though – the infant and child also have to acquire the skills to engage in motor and locomotor interaction with the environment. Throughout, the child gradually gains the ability to conceptualize the physical world through thought, rather than just acting upon it. A large and important part of this growing ability to conceptualize is the perception and understanding of the spatial aspects of the physical world. In Part I we address each of these major issues in separate chapters.