Sample records for 20th world computer

During August 17th-21st, 2014, the University of Alaska Anchorage, along with other local, state, and federal agencies throughout Alaska, will host the 20th International Epidemiological Association's (IEA) World Congress of Epidemiology (WCE 2014). The theme for this Congress is "Global Epidemiology in a Changing Environment: The Circumpolar Perspective." The changing environment includes the full range of environments that shape population health and health inequities, from the physical to the social and economic. Our circumpolar perspective on these environments includes views on how political systems, work, immigration, Indigenous status, and gender relations and sexuality affect the global world and the health of its people. Suggestions and insights from the 3rd North American Congress of Epidemiology (2011) and the first-ever joint regional workshop co-organized by the IEA North American Region and the IEA Latin American and Caribbean Region, held at the 19th IEA World Congress of Epidemiology (2011), have helped direct the focus for WCE 2014. Since the Arctic regions are feeling the effects of climate change first, we believe focusing on the emerging data on the health impacts of climate change throughout the world will be an important topic for this Congress. This will include a broad range of more traditional epidemiology areas such as infectious disease epidemiology, environmental epidemiology, health disparities, and surveillance and emergency preparedness. Addressing health inequities and promoting health equity is likewise a key concern of the Congress. This Congress will also host presentations on injury epidemiology, occupational health, infectious diseases, chronic diseases, maternal and child health, surveillance and field epidemiology, mental health, violence (from self-directed, e.g., suicide, to interpersonal to structural), psychoactive substance use (including tobacco), and measures of subjective health. Attention will be given to…

As in any historical endeavor, periodization is an attempt to manage change, and present it coherently, by noting points where key breaks in framework occur. In world history, periodization has come to convey, particularly, shifts in the pattern of interactions and contacts among many, though not always all, major societies. In this article, the…

List of illustrations; List of tables; Acknowledgements; Introduction; 1. From the physical world to the biological universe: Democritus to Lowell; 2. Life in the solar system; 3. Solar systems beyond; 4. Extraterrestrials in literature and the arts: the role of imagination; 5. The UFO controversy and the extraterrestrial hypothesis; 6. The origin and evolution of life in the extraterrestrial context; 7. SETI: the search for extraterrestrial intelligence; 8. The meaning of life; 9. Summary and conclusion: the biological universe; Select bibliographical essay; Index.

In this day and age, capturing the 'state of the art' of computing in a conference proceedings is increasingly hard. It is also quite common for submitted abstracts to refer to studies yet to be done, and the time span between abstract submission and the actual conference is often less than six months. By the time the proceedings appear in journal form, a similar period after the closing session, some of the work is over a year old, by which time new ideas will have been formed and the deployment of current ones progressed, at times beyond recognition. The preface is continued in the pdf.

This site provides data from the 20th Century Reanalysis Project, offering temperature, pressure, humidity, and wind estimates on a roughly 200 km grid over the whole earth from 1871 to 2010, every 6 hours, based on historical data. The ensemble mean and standard deviation for each value were calculated over a set of 56 simulations, and data for each of the 56 ensemble members are included here. The dataset consists of files in netCDF 4 format that are available for download from the National Energy Research Scientific Computing Center (NERSC). The goal of the 20th Century Reanalysis Project is to use a Kalman filter-based technique to produce a global tropospheric circulation dataset at four-times-daily resolution back to 1871. The only dataset available for the early 20th century consists of error-ridden hand-drawn analyses of the mean sea level pressure field over the Northern Hemisphere. Modern data assimilation systems have the potential to improve upon these maps, but prior to 1948, few digitized upper-air sounding observations are available for such a reanalysis. The global tropospheric circulation dataset will provide an important validation check on the climate models used to make 21st century climate projections. [copied from http://portal.nersc.gov/project/20C_Reanalysis/]
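
Below is a minimal sketch of how one such file might be read with Python's netCDF4 library; the file name and dimension layout are assumptions, since actual 20CR file conventions vary by variable and release.

```python
# Minimal sketch: read a hypothetical 20CR ensemble file and recompute the
# ensemble mean and spread. The file name and dimension order here are
# assumptions, not the project's actual conventions.
from netCDF4 import Dataset

with Dataset("prmsl_1900.nc") as nc:          # hypothetical file name
    prmsl = nc.variables["prmsl"][:]          # assumed (member, time, lat, lon)
    ens_mean = prmsl.mean(axis=0)             # average over the 56 members
    ens_sd = prmsl.std(axis=0)                # spread ~ analysis uncertainty

print(ens_mean.shape, ens_sd.shape)
```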

This booklet is concerned with the last half of the 19th and the beginning of the 20th century when a great surge of knowledge vital to atomic science took place, as illustrated by work by Faraday, Mendeleev, Roentgen, Becquerel and the Curies. Each succeeding discovery brought atomic science closer to the great breakthrough that marked the close…

A short oral history of the NNSA's Stockpile Stewardship Program, produced in association with the 20th anniversary of the program. It features Siegfried Hecker, Rose Gottemoeller, Victor Reis, Charles McMillan, Joan Rohlfing, Omar Hurricane, Roger Hagengruber, and John Taylor.

On May 22, 1980, a symposium was held at Brookhaven to celebrate the 20th birthday of the AGS, to recall its beginnings, and to review major discoveries that have been made with its beams. The talks at the symposium are recorded in this volume.

The use of the Cray 2 supercomputer, the fastest computer in the world, at ARC is detailed. The Cray 2 can perform 250 million calculations per second and has 10 times the memory of any other computer. Ames researchers are shown creating computer simulations of airflow around aircraft, water flow around a submarine, and fuel flow inside the Space Shuttle's engines. The video also details the Cray 2's use in calculating, for the first time, the airflow around the Shuttle and its external rockets during liftoff, and its use in the development of the National Aero Space Plane.

This article reports on the use of a virtual world ("Second Life") in computing education, and identifies the precursors of current virtual world systems. The article reviews the potential for virtual worlds as tools in computing education. It describes two areas where "Second Life" has been used in computing education: as a development…

This proceedings contains summaries of sessions on topics related to the use of computing across a wide range of disciplines and levels of education, including curriculum and instructional strategies, current and emerging technologies, social and ethical issues, library/media, technology implementation, exhibitors, teacher education and training,…

This paper relies on documentary analysis available from primary and secondary data to identify policies that were put in place to arrest the shift of the Tanzanian educational system during the last decade of the 20th century from "Education for Self Reliance" to "Education and Training Policy"; to pick one policy and develop a discussion about…

Describes an innovative, tool-based high school computer literacy course in which students use productivity software to learn what computers do in the real world rather than programming. Several student projects, teacher qualifications and responsibilities, and the role of peer teaching are discussed. (MBR)

This report of proceedings contains 28 papers that were presented at an international conference on school librarianship. The papers are: (1) "Multi-Ethnic Materials for Children and Young Adults in a Changing World" (Spencer G. Shaw); (2) "Eating Computers" (Dean Marney); (3) "Literature, Reading and the School Library Resource Centre in a…

From its very beginning, the 20th century represented the period of the main breakthrough for allergology as a clinical and scientific entity. The first years of this period were extraordinarily exciting because of the discovery of the anaphylactic reaction in 1902 and its clinical diagnosis as 'local anaphylaxis', 'serum sickness' (1903) or even as 'anaphylactic shock' (1907). The term 'allergy' was coined in 1906 and led to the recognition of allergic diseases as a pathogenetic entity. The first patient organization of hay fever sufferers was founded in Germany in 1900, the same year in which the very first report on immunotherapy was published in New York. In 1911 the era of actual immunotherapy started in London, becoming scientific with the first double-blind study in 1956, and still today being regarded as the backbone of allergology. In 1919 it was shown that allergy could be transferred by blood, in 1921 by serum (Prausnitz-Küstner test) and in 1966 the mystic 'reagins' were recognized as immunoglobulin (Ig) E. The development of the radioallergosorbent test for quantifying specific IgE antibody was a diagnostic landmark for allergists all over the world. The history of allergy diagnosis started with the introduction of a 'functional skin test', named the patch test in 1894. The scratch test was described in 1912 and the patch test in 1931. From 1908 the skin was tested by intracutaneous injections, and from 1930 by a 'puncture test' (a precursor of the prick test) which has been in worldwide use in modified variations since 1959. The rub test ('friction test') was added in 1961. Systematically applied provocation tests started with conjunctival provocation (1907), followed by nasal and bronchial provocation with allergens (1914 and 1925). PMID:24925382

Even in difficult economic times, colleges and universities continue to invest in residence hall construction projects as a way to attract new students and keep existing ones on campus. According to data from "American School & University"'s 20th annual Residence Hall Construction Report, the median new project completed in 2008 was less expensive…

It is the 20th anniversary of the release of Dale Parnell's landmark book "The Neglected Majority". In this book, Parnell pointed out that for too long America's educational system has focused on the highest and lowest achievers. He made the case that most of those students in the middle two high school quartiles neither prepare for nor aspire to…

One of the most prominent physicists of the 20th century, Lev Davidovich Landau, was at the same time a great universalist who made fundamental contributions in diverse areas of physics: quantum mechanics, solid state physics, theory of magnetism, phase transition theory, nuclear and particle physics, quantum electrodynamics (QED), low-temperature physics, fluid dynamics, atomic collision theory, theory of chemical reactions, and other disciplines.

The purpose of this article is to present Mexico's healthcare in the 20th century. This was a process based on Enlightenment rationalism, positivism, and neopositivism, in which knowledge and science steered healthcare away from charity and beneficence. The liberal legacy favoured government action, and the state managed to marshal a considerable amount of financial and human resources. PMID:12602086

The proceedings of the 20th Aerospace Mechanisms Symposium, hosted by the NASA Lewis Research Center, Cleveland, Ohio, on May 7-9, 1986, are documented herein. During the 3 days, 23 technical papers were presented by experts from the United States and Western Europe. A panel discussion by an international group of experts on future directions in mechanisms was also held; this discussion, however, is not documented herein. The technical topics addressed included deployable structures, electromagnetic devices, tribology, thermal/mechanical/hydraulic actuators, latching devices, positioning mechanisms, robotic manipulators, and computerized mechanism synthesis.

This article gives an outline of the history of antimatter from the concept first introduced in 1898 up to the present day and is intended to complement the article 'Antihydrogen on Tap' on page 229. It is hoped that it will provide enough historical background material, along with interesting snippets of information, for teachers to feel informed about the topic when in the classroom. Antimatter is the perfect example of 20th century science, incorporating quantum mechanics and relativity, and showing progression from a theoretical idea to mass production within the space of 100 years. The final section is about using the idea of antihydrogen in the classroom.

The 20th century has been one of the most intense and convulsive periods in the history of humanity. A century of paradoxes and contrasts, it began with optimism, witnessed the apocalypse of two world wars, and finished with unimaginable scientific progress that gave us a new civilization we cannot yet grasp. In this century, significant events happened that shaped our time and projected their results toward an immediate future. Some of these were providential in understanding human life, fighting against illness and prolonging life, and others were of undeniable social importance for humanity. Some knowledge was built on the work of others: philosophy was embedded in mathematics, as was science in philosophy, while politics and the economy exercised so decisive an influence on our way of feeling and living that culture and society were affected to the core. Within that century the biggest technological revolution of all time was also created, as transcendent as it was unimaginable, which put mankind on the road to the stars with the moon landing and in the process created the information society, whose signature symbol, the internet, emerged as a new demiurge. However, the 20th century, with all its misfortune and splendor, paradoxes and contrasts, creation and destruction, was the most transcendent in the whole of history, and it bequeaths to the future a promising horizon in the search for a renovated meaning of life and a yearning for peaceful coexistence for the whole of humanity. PMID:15754756

Among the highlights of the 20th century were the flights of spacecraft to other bodies of the Solar System. This paper briefly describes the missions attempted, their goals, and their fate. Information is presented in five tables on the missions launched, their goals, mission designations, dates, discoveries when successful, and what happened if they failed. More detailed explanations are given in the accompanying text. It is shown how this enterprise developed and evolved step by step from a politically driven competition to intense scientific investigation and international cooperation. Initially, only the USA and USSR sent missions to the Moon and planets; Europe and Japan joined later. The USSR carried out significant research in Solar System exploration until the end of the 1980s. The Russian Federation no longer supports robotic planetary exploration for economic reasons, and it remains to be seen whether the invaluable Russian experience in planetary space flight will be lost. Collaboration between Russian and other national space agencies may be a solution.

During the Second World War, scientists and engineers were involved as never before in all technical phases of the war effort, including intelligence, logistics, and large-scale automated computation. Much of this required teamwork, which led to the adoption of interdisciplinary perspectives and found expression after the war in new fields of enquiry such as cybernetics, biophysics, and artificial intelligence. While Europe was recovering from its devastation, the United States entered an unprecedented age of prosperity beginning in the 1940s and 50s. The political and budgetary environment was favorable for scientific research, and this was felt in Europe as well as in the U.S.A. I discuss some of these conditions and the figures associated with the work that became the foundation for advances throughout the second half of the 20th century, and conclude with a few observations on quantitative neuroscience and the problem of representation. PMID:23313751

The Cray XT system at ORNL is the world's most powerful computer, with several applications exceeding one-petaflops performance. This paper describes the architecture of Jaguar, with combined XT4 and XT5 nodes, along with an external Lustre file system and external login nodes. We also present some early results from Jaguar.

With the publishing of Sir Isaac Newton's Principia Mathematica in 1687, a scientific paradigm was established that clearly dominated society for two and a half centuries. Many historians of science have identified the Copenhagen interpretation of the quantum theory, formulated c.1927, as having completed a scientific revolution that ended the reign of classical Newtonian science. A rival claim to contemporary scientific revolution, however, has been put forward by Ilya Prigogine and the Brussels school of thermodynamics, based on Prigogine's work in non-equilibrium thermodynamics. Using the historical consensus model of scientific revolution first articulated by Thomas S. Kuhn in 1962, this analysis examines the extent to which the Copenhagen interpretation of the quantum theory and the work of Ilya Prigogine complete the conceptual, scientific paradigm-shift necessary for a scientific revolution. The resulting historical evidence shows that the Copenhagen interpretation did not complete a paradigm-shift; instead, it was a self-revelation by the scientific community which revealed the essence and fundamental limitations of Newtonian science. Evidence further indicates that the valid claim to scientific revolution in the 20th century lies with the contemporary work of Prigogine and the Brussels school. By abandoning the deterministic, mechanical world-view of the Newtonian paradigm and accepting a new reality of process and irreversible time, Prigogine and his associates have established the foundations for a revolutionary new scientific paradigm.

The nonprofit organization One Laptop per Child (OLPC) has a mission that is easy to articulate but very challenging to achieve: Provide laptop computers to the almost 2 billion children who live in parts of the world where poverty and the lack of a structured educational system deprive them of an adequate education. To further complicate the…

The American Physical Society, as part of its centennial celebration in March 1999, decided to develop a timeline wall chart on the history of 20th century physics. This resulted in eleven consecutive posters which, when mounted side by side, create a 23-foot mural. The timeline exhibits and describes the milestones of physics in images and words. It functions as a chronology, a work of art, a permanent open textbook, and a gigantic photo album covering a hundred years in the life of the community of physicists and the existence of the American Physical Society. Each of the eleven posters begins with a brief essay that places a major scientific achievement of the decade in its historical context. Large portraits of the essays' subjects, including youthful photographs of Marie Curie, Albert Einstein, and Richard Feynman among others, help put a face on science. Below the essays, a total of over 130 individual discoveries and inventions, explained in dated text boxes with accompanying images, form the backbone of the timeline. For ease of comprehension, this wealth of material is organized into five color-coded story lines that stretch horizontally across the hundred years of the 20th century. The five story lines are: Cosmic Scale, relating the story of astrophysics and cosmology; Human Scale, covering the physics of the more familiar distances from the global to the microscopic; Atomic Scale, focusing on the submicroscopic world of atoms, nuclei and quarks; Living World, chronicling the interaction of physics with biology and medicine; and Technology, tracing the applications of physics to everyday living. Woven into the bottom border of the timeline are period images of significant works of art, architecture, and technological artifacts such as telephones, automobiles, aircraft, computers, and appliances. The last poster, covering the years since 1995, differs from the others: its essay concerns the prospects for physics in the next century, and is illustrated…

The treatment of neurosurgical casualties suffered during the wars of the 20th century had a significant impact on the formation and early growth of neurosurgery as a specialty. This chapter explores how the evolution of military tactics and weaponry along with the circumstances surrounding the wars themselves profoundly influenced the field. From the crystallization of intracranial projectile wound management and the formal recognition of the specialty itself arising from World War I experiences to the radical progress made in the outcomes of spinal-cord-injured soldiers in World War II or the fact that the neurosurgical training courses commissioned for these wars proved to be the precursors to modern neurosurgical training programs, the impact of the 20th century wars on the development of the field of neurosurgery is considerable. PMID:27035828

Networks can play an important role for nurses in user-to-user communication because they can be used both within and outside the health care delivery system. The choices include an information exchange, which can be an effective strategy for sharing personal concerns, problems, and achievements about the computer; commercial data bases with their vast sources of information and research data; or local area networks, effective in an office or campus setting. All of these networks can put worlds of information and services just a few words or keyboard strokes away, because they offer, outside of your own computer, a whole new dimension of retrieval, storage, reference, and communication capabilities. These networks can significantly enhance computing potential by providing an overall expansion of information. PMID:3903669

The World Wide Web Initiative has provided a means of delivering hypertext- and multimedia-based information across the whole Internet, and many applications have been developed on such http servers. One important and novel development on the World Wide Web (WWW) has been computer vision and image processing related courseware facilities, and indeed image processing packages. These range from the provision of on-line lecture notes, exercises, and their solutions to more interactive packages suited primarily for teaching and demonstration. Within the WWW there are many pointers that highlight more research-based activities. This paper addresses the implementation of the computer vision and image processing packages, the advantages gained from using a hypertext-based system, and the practical experiences of using the packages in a class environment. The paper addresses how best to provide information in such a hypertext-based system and how interactive image processing packages can be developed. A suite of multimedia-based tools has been developed to facilitate such systems, and these are described in the paper. A brief survey of related sources of information on the World Wide Web is also presented.

This volume contains the Proceedings of the 20th Nordic Semiconductor Meeting (NSM20). The Meeting was held in Tampere on August 25 through August 27, 2003, hosted by the Optoelectronics Research Centre (ORC) of the Tampere University of Technology (TUT). NSM20 provided a truly international forum for the discussion of state-of-the-art semiconductor physics, technology, and industry in Scandinavia, and highlighted selected results achieved elsewhere in the world. While the earlier meetings (the first held in the 1960s, and since then every other year on a rotating basis in Denmark, Finland, Iceland, Norway, and Sweden) focused on silicon technologies, the Tampere Meeting was concerned more than ever with optoelectronics, which has become an unexpectedly strong field of research and industry in Northern Europe. An excellent array of keynote speakers provided the audience with the latest developments in all the main fields of the Meeting and, together with other speakers, fostered new ideas that have the potential for further advancement of these strategic sciences and technologies. There were over 100 registered participants, presenting a total of 100 scientific contributions. From these contributions, 62 manuscripts were accepted for publication in the Proceedings, representing all the key areas of the Meeting. There was the largest number of industrial sponsors of any Nordic Semiconductor Meeting, a remarkable thing in the current world economic cycle. In fact, the organization of NSM20 would not have been possible without the support from ORC, IEEE Finland Section, Institute of Physics of TUT, Chroma Technology Corp. (USA), Coherent Tutcore Oy (Finland), Europractice c/o YOLE Development (France), EV Group GmbH (Austria), Instrumentti Mattila Oy (Finland), FAB Support Ab (Sweden), Keithley Instruments Inc. (UK), Modulight Inc. (Finland), Nokia Oyj (Finland), Oxford Instruments GmbH (Germany), Oy SV Vacuumservice Ab (Finland), Scandinavian Airlines Systems…

This article describes the "The Presidential Timeline of the 20th Century," a newly unveiled website jointly created by the Learning Technology Center of The University of Texas at Austin and The National Archives' 12 presidential libraries. This web-based resource provides access to the continually growing store of digitized assets from the…

This paper records personal reminiscences of seven 20th century ophthalmologists who each in his own way metaphorically split the atom and, thereby, changed ophthalmology forever. In addition to their major contributions, they each shared some very desirable traits. They were gentlemen who were devoted to their families and their families to them. PMID:23768923

Sociologists consider inequality in educational attainment to be a major cause of inequality between people in their chances of occupying a more advantageous class position. However, there is dispute as to whether educational inequality according to social class background declined during the 20th century. What is not in doubt is the expansion of…

Dramatic world-wide changes occurred during the 20th century in both nutrient delivery and in-stream retention. In this paper, we use a combined nutrient-input, hydrology, in-stream nutrient retention model to quantitatively track the changes in the global freshwater N and P cycles over the 20th century. Global nutrient delivery almost doubled due to expanding agriculture and increasing wastewater discharge. Nutrient retention also increased by a factor of two as a result of the rapidly growing number of dams and reservoirs. This increase in nutrient retention could not balance the increase in nutrient delivery to rivers. River export to coastal seas increased during the 20th century from 19 to 37 Tg yr-1 of N and 2 to 4 Tg yr-1 of P. There are important differences in riverine N:P export ratios in various parts of the world resulting from the interplay of multiple processes and economic activities in different river basins. Increasing nutrient loading of freshwater systems is a threat to water quality. Furthermore, the global river export increase in the molar N:P ratio during recent decades may affect the ecology within both the river basins and the coastal system. This ratio change may be driven by the recent stagnation of P fertilizer use in most industrialized countries, in comparison to the ever increasing N fertilizer use.
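
The reported mass fluxes translate into molar N:P ratios via the atomic weights of N (~14 g/mol) and P (~31 g/mol); here is a small worked check using only the rounded totals quoted above (the paper's ratio trends rest on finer-grained decadal data).

```python
# Worked check: convert riverine mass fluxes (Tg/yr) to a molar N:P ratio.
def molar_np_ratio(n_tg_per_yr, p_tg_per_yr):
    return (n_tg_per_yr / 14.0) / (p_tg_per_yr / 31.0)  # mol N / mol P

print(round(molar_np_ratio(19, 2), 1))   # ~1900 totals: about 21.0
print(round(molar_np_ratio(37, 4), 1))   # ~2000 totals: about 20.5
```

With these rounded century-end totals the two ratios come out nearly equal; the ratio increase discussed in the record concerns recent decades and unrounded data.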

The increase of global mean temperature during the 20th century is, according to the Intergovernmental Panel on Climate Change (IPCC), very plausibly due to anthropogenic greenhouse gas emissions. In addition, climate model projections suggest that the global mean temperature will rise further during the 21st century. While the vast majority of scientists have endorsed the IPCC's conclusions, not a few individual scientists have expressed disagreement regarding the validity of climate model projections. In this study, the answer to a fundamental question is sought: how probable was the global warming of the 20th century, considering only recorded and reconstructed global mean temperature values, and assuming that the global mean temperature is a stationary stochastic process? In order to answer this question, a stationary stochastic model is set up that incorporates (a) the observed autocorrelation structure of the global mean temperature, (b) past observations of global mean temperature and (c) global, regional and site-specific reconstructions of global mean temperature changes during the last two millennia. Based on an intensive Monte Carlo simulation, the probability of a global mean temperature trend with equal or greater slope than the one observed in the 20th century is presented.
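
A minimal sketch of the Monte Carlo idea, simplifying the observed autocorrelation structure to an AR(1) process (an assumption; the study fits a richer structure to observations and reconstructions):

```python
# Simulate a stationary AR(1) process many times, fit a linear trend to each
# 100-year sample, and count how often the simulated slope reaches the
# observed 20th-century slope. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.9, 0.1        # assumed lag-1 autocorrelation, innovation sd
obs_slope = 0.007            # ~0.7 K per century, illustrative
years = np.arange(100)

n_sim, exceed = 10_000, 0
for _ in range(n_sim):
    x = np.zeros(100)
    for t in range(1, 100):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    if np.polyfit(years, x, 1)[0] >= obs_slope:
        exceed += 1

print("P(slope >= observed) ~", exceed / n_sim)
```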

One of the primary goals of paleo-sea level research is to assess the stability of ice sheets and glaciers in warming climates. In this context, the 20th century may be thought of as the most recent, recorded, and studied of all past episodes of warming. Over the past decade, a consensus has emerged in the literature that 20th century global mean sea level (GMSL), inferred from tide gauge records, rose at a mean rate of 1.6-1.9 mm/yr. This sea-level rise can be attributed to multiple sources, including thermal expansion of the oceans, ice sheet and glacier mass flux, and anthropogenic changes in land water storage. The Fifth Assessment Report of the IPCC summarized the estimated contributions of these sources over 1901-1990 and computed a total rate, using a bottom-up approach, of ~1.0 mm/yr, which falls significantly short of the rate inferred from tide gauge records. Using two independent probabilistic approaches that utilize models of glacial isostatic adjustment, ocean dynamics, and the sea-level fingerprints of rapid land-ice melt to analyze tide gauge records (Kalman smoothing and Gaussian process regression), we are able to close the 20th century sea-level budget and resolve the above enigma. Our revised estimate for the rate of GMSL rise during 1901-1990 is 1.1-1.3 mm/yr (90% credible interval). This value, which is ~20-30% less than previous estimates, suggests that the change in the GMSL rate from the 20th century to the last two decades (2.7 ± 0.4 mm/yr, consistent with past estimates) was greater than previous estimates. Moreover, since some forward projections of GMSL change into the next century are based in part on past estimates of GMSL change, our revised rate may impact projections of GMSL rise for the 21st century and beyond.
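
The budget argument is simple arithmetic: the summed component rates should match the tide-gauge rate. Below is a sketch with placeholder component values; only the totals are taken from the record above.

```python
# Bottom-up sea-level budget check. Component rates (mm/yr) are placeholder
# values chosen to sum to the ~1.0 mm/yr bottom-up total quoted above; they
# are not the AR5 numbers.
components = {
    "thermal expansion": 0.4,
    "glaciers and ice sheets": 0.6,
    "land water storage": 0.0,
}
bottom_up = sum(components.values())
tide_gauge_prior = (1.6 + 1.9) / 2      # earlier consensus range midpoint
tide_gauge_revised = (1.1 + 1.3) / 2    # revised credible interval midpoint

print(f"bottom-up {bottom_up:.1f} mm/yr; "
      f"prior tide-gauge {tide_gauge_prior:.2f} mm/yr; "
      f"revised {tide_gauge_revised:.2f} mm/yr")
```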

Because higher level cognitive processes generally involve the use of world knowledge, computational models of these processes require the implementation of a knowledge base. This article identifies and discusses 4 strategies for dealing with world knowledge in computational models: disregarding world knowledge, "ad hoc" selection, extraction from…

The 20th Power and Energy Society Annual Conference was held on August 18-20, 2009 at the Shibaura Institute of Technology. The total number of technical papers was 352, and there were 47 technical sessions (46 oral sessions and 1 poster session). An invited lecture, a panel discussion, technical exhibitions and two technical tours were organized. All events were very well attended, and the final enrollment reached 881 registrations. The conference was successfully closed thanks to the great contribution of all participants. The outline of the conference is reported in this article.

This sabbatical report surveys some computer software presently being developed, already in use, and/or available, and describes computer use in several Massachusetts colleges. A general introduction to computers, word processors, artificial intelligence, and computer assisted instruction is provided, as well as a discussion of what computers can…

In the first half of the 20th century G. W. Stewart was a physics faculty member at the University of Iowa (UI) with a distinguished record of research and teaching, especially in acoustics. Much of his research focused on the design and use of several types of acoustical filters. Some apparatus which he developed or utilized are still housed in the Department of Physics and Astronomy or are available in detailed diagrams. Demonstration apparatus (apparently homemade) from his era are still available for use. Carl E. Seashore, a renowned psychologist also at UI in the early 20th century, had interdisciplinary interests linking psychology, speech and hearing, music, and acoustics. He was responsible for obtaining a Henrici harmonic analyzer, a mechanical Fourier analyzer manufactured in Switzerland; a special grant from the state legislature during Depression conditions provided the funding. It resides in the Department of Speech Pathology and Audiology at UI. The Grinnell College Physics Historical Museum houses a set of 18 Helmholtz resonators and a Savart bell and resonator. Apparatus at Iowa State University, the University of Northern Iowa, and other Iowa institutions will also be described. Pictures and diagrams as well as some actual apparatus will be exhibited.

It's no secret that fewer and fewer women are entering computer science fields. Attracting high school girls to computer science is only part of the solution. Retaining them while they are in higher education or the workforce is also a challenge. To solve this, there is a need to show girls that computer science is a wide-open field that offers…

The Cosmic Ray Energetics and Mass (CREAM) balloon instrument, designed to detect and measure the composition and spectra of high energy galactic cosmic ray particles, had its maiden flight on December 16, 2004, and was afloat and taking data during the January 20th solar flare. The CREAM instrument consists of a finely segmented silicon charge detector, a timing charge detector, and several layers of scintillating fiber hodoscopes, as well as a calorimeter and transition radiation detectors to measure cosmic-ray energies above several hundred GeV. While the latter were not designed to be triggered by solar particles, signals were seen in the silicon charge detector and several layers of hodoscopes at the onset of the giant solar flare, indicating that solar flare particles were passing through the instrument. We will review our measurements and analysis of the data recorded during the solar flare.

We report here trends in the usage of “mood” words, that is, words carrying emotional content, in 20th century English language books, using the data set provided by Google that includes word frequencies in roughly 4% of all books published up to the year 2008. We find evidence for distinct historical periods of positive and negative moods, underlain by a general decrease in the use of emotion-related words through time. Finally, we show that, in books, American English has become decidedly more “emotional” than British English in the last half-century, as a part of a more general increase of the stylistic divergence between the two variants of English language. PMID:23527080
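
A toy sketch of how a yearly "mood index" might be formed from word counts, normalizing by a high-frequency reference word; the word lists, counts, and exact normalization used in the study are not reproduced here, so every name and number below is illustrative.

```python
# Toy mood index: yearly frequency of a mood-word list relative to a
# high-frequency reference word. All counts below are made up.
def mood_index(mood_counts, ref_counts):
    """Both arguments map year -> count; returns year -> relative frequency."""
    return {yr: mood_counts[yr] / ref_counts[yr] for yr in mood_counts}

joy_counts = {1950: 1200, 1960: 1100, 1970: 900}      # illustrative
the_counts = {1950: 9.0e6, 1960: 9.5e6, 1970: 9.8e6}  # illustrative
print(mood_index(joy_counts, the_counts))
```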

Infectious diseases have led to illness and death for many famous musicians, from the classical period to the rock 'n' roll era. By the 20th century, as public health improved and orchestral composers began living more settled lives, infections among American and European musicians became less prominent. By mid-century, however, seminal jazz musicians famously pursued lifestyles characterized by drug and alcohol abuse. Among the consequences of this risky lifestyle were tuberculosis, syphilis, and chronic viral hepatitis. More contemporary rock musicians have experienced an epidemic of hepatitis C infection and HIV/AIDS related to intravenous drug use and promiscuity. Musical innovation is thus often accompanied by diseases of neglect and overindulgence, particularly infectious illnesses, although risky behavior and associated infectious illnesses tend to decrease as the style matures. PMID:20660936

In the academic year 2001-2002, Pace University (New York) Computer Science and Information Systems (CSIS) students developed real-world Web and pervasive computing systems for actual customers. This paper describes the general use of team projects in CSIS at Pace University, the real-world projects from this academic year, the benefits of…

The impact of television on mass culture and of computer technology on research in the social sciences and humanities is discussed. Questions are raised concerning the quality of research based on information available from computers. It is suggested that "a super-abundance of information is no substitute for wisdom." (PP)

Discusses in detail the modem (short for modulator/demodulator), an electronic device which links one computer with another using telephone lines, and its importance to home computers. An extensive chart is included which gives purchasing information on low-speed and medium-speed modems. (JJD)

Pollen records in sediment cores from sites in the historic Everglades allowed us to document the natural variability of the ecosystem over the past 2,000 years and contrast it to 20th century changes in wetland plant communities. The natural system included extensive water-lily sloughs, sawgrass ridges, and scattered tree islands extending from Lake Okeechobee southward through Shark River Slough. Between ~1000 AD and 1200 AD, weedy species such as Amaranthus (water hemp) became more abundant, indicating decreased annual rainfall, shorter hydroperiods, and shallower water depths during this time. After ~1200 AD, vegetation returned to its pre-1000 AD composition. During the 20th century, two phases of hydrologic alteration occurred. Completed by 1930, the first phase included construction of the Hoover Dike, canals linking Lake Okeechobee to the Atlantic Ocean, and the Tamiami Trail. Reconstructions of plant communities indicate that these changes shortened hydroperiods and lowered water depths throughout the Everglades. The extent of water-lily slough communities decreased, and tree islands became larger in Shark River Slough. The second phase resulted from construction of canals and levees in the 1950s, creating three Water Conservation Areas. The response of plant communities to these changes varied widely depending on location in the Everglades. In Loxahatchee NWR, weedy and short-hydroperiod plant species became more abundant in marshes, and species composition of tree islands changed. In Water Conservation Area 2A, cattail replaced sawgrass in marshes with high nutrient influx; the ridge and slough structure of the marshes was replaced by more homogeneous sawgrass marshes; sustained high water levels for more than a decade resulted in loss of tree islands that had existed for more than 1,000 years. In Everglades National Park, the extent of slough vegetation decreased further. Near Florida Bay, the rate of mangrove intrusion into fresh-water marshes

The decommissioning of nuclear submarines, disposal of highly-enriched uranium and weapons-grade plutonium, and processing of high-level radioactive wastes represent the most challenging issues facing the cleanup of 20th century radiation legacy wastes and facilities. The US and Russia are the two primary countries dealing with these challenges, because most of the world's fissile inventory is being processed and stored at multiple industrial sites and nuclear weapons production facilities in these countries.

Tornado activity is associated with extreme convective weather which can cause extensive damage and, in some cases, the loss of life. The complex inland terrain of Greece, along with the Ionian Sea to the west and the Aegean Sea to the east, appears to be a favorable area for violent phenomena such as tornadoes, waterspouts and funnel clouds. In this study, the spatial and temporal variability of tornado activity in Greece for the period 1900-1999 is presented. The spatial distribution of tornadoes, waterspouts and funnel clouds reveals the vulnerability of specific geographical areas, such as western Greece and the south Aegean Sea. As far as the intra-annual variability is concerned, the maximum of tornado activity falls within the cold period of the year (October-March), while according to the daily distribution, tornadoes happen most frequently during the warm hours of the day. It is remarkable that in Greece, within the 20th century, tornado activity caused the loss of 4 lives, injuries to 40 people and extensive damage to buildings and cultivated land.

Tornado activity is associated with extreme convective weather which can cause extensive damage and, in some cases, the loss of life. The complex inland terrain of Greece, along with the Ionian Sea to the west and the Aegean Sea to the east, appears to be a favorable area for violent phenomena such as tornadoes, waterspouts and funnel clouds. In this study, the spatial and temporal variability of tornado activity in Greece for the period 1910-1999 is presented. The spatial distribution of tornadoes, waterspouts and funnel clouds reveals the vulnerability of specific geographical areas to tornado activity, such as western Greece and the southern Aegean Sea. As far as the intra-annual variability is concerned, the maximum of tornado activity falls within the cold period of the year (October-March), while according to the daily distribution, tornadoes happen most frequently during the warm hours of the day. Furthermore, especially for the cases after 1957, the prevailing synoptic conditions during the tornado activity, based on the analysis of the 500 hPa geopotential heights from the archives of the European Centre for Medium-Range Weather Forecasts (ECMWF), were examined in order to identify the weather patterns associated with tornado genesis and development. It is remarkable that in Greece, within the 20th century, tornado activity caused the loss of 4 lives, injuries to 40 people and extensive damage to buildings and cultivated land.

At the beginning of the 20th century, Barbados was described as the most unhealthy place in the British Empire; at the end of the century, it is considered amongst the healthiest of developing countries. At the start of the century the statistics were harsh; for example, there was an infant mortality rate of 400 per 1000 live births. It is now between 10 and 15 per 1000 live births. In the last two-thirds of the century, there was a series of ongoing revolutions in Education, Public Health and Hospital Services that affected the health status favourably. The revolution in education was enhanced by the provision of University education starting with Medicine at Mona, Jamaica. Training of doctors expanded to Barbados in 1967 and has been an essential ingredient in the medical care revolution of the last third of the century. In 1953, the first Public Health Centre was opened and Barbados can now boast the most modern public health and primary care facilities. However, modern lifestyles are associated with an epidemic of obesity, diabetes mellitus and hypertension. HIV/AIDS has emerged as a major problem. Health in the 21st century will need to look at lifestyles--the effects of the internal combustion engine, the availability of tools of violence, the lure of 'illegal drugs', personal relationships and gender as well as the driving forces behind the associated lifestyles. PMID:11824009

While the practice of Western medicine is known today to doctors of all ethnic and religious groups, its standards are subject to the availability of resources. The medical ethics guiding each doctor is influenced by his/her religious or cultural background or affiliation, and that is where diversity exists. Much has been written about Jewish and Christian medical ethics. Islamic medical ethics has never been discussed as an independent field of ethics, although several selected topics, especially those concerning sexuality, birth control and abortions, have been more discussed than others. Islamic medical ethics in the 20th century will be characterised on the basis of Egyptian fatawa (legal opinions) issued by famous Muslim scholars and several doctors. Some of the issues discussed by Islamic medical ethics are universal: abortions, organ transplants, artificial insemination, cosmetic surgery, doctor-patient relations, etc. Other issues are typically Islamic, such as impediments to fasting in Ramadan, diseases and physical conditions that cause infringement of the state of purity, medicines containing alcohol, etc. Muslims' attitudes to both types of ethical issues often prove that pragmatism prevails and the aim is to seek a compromise between Islamic heritage and the achievements of modern medicine, as long as basic Islamic dogma is not violated. PMID:2614792

The 20th Annual Prostate Cancer Foundation (PCF) Scientific Retreat was held from October 24 to 26, 2013, in National Harbor, Maryland. This event is held annually for the purpose of convening a diverse group of leading experimental and clinical researchers from academia, industry, and government to present and discuss critical and emerging topics relevant to prostate cancer (PCa) biology, and the diagnosis, prognosis, and treatment of PCa patients, with a focus on results that will lend to treatments for the most life-threatening stages of this disease. The themes that were highlighted at this year's event included: (i) mechanisms of PCa initiation and progression: cellular origins, neurons and neuroendocrine PCa, long non-coding RNAs, epigenetics, tumor cell metabolism, tumor-immune interactions, and novel molecular mechanisms; (ii) advancements in precision medicine strategies and predictive biomarkers of progression, survival, and drug sensitivities, including the analysis of circulating tumor cells and cell-free tumor DNA-new methods for liquid biopsies; (iii) new treatments including epigenomic therapy and immunotherapy, discovery of new treatment targets, and defining and targeting mechanisms of resistance to androgen-axis therapeutics; and (iv) new experimental and clinical epidemiology methods and techniques, including PCa population studies using patho-epidemiology. PMID:24719035

The first real breakthrough in the research of brain organization and thinking in the 20th century was made in neurophysiological investigations performed in direct contact with different sites of the brain, which became possible in diagnosis and treatment. The second breakthrough is happening at present, based on the opportunities provided by non-invasive techniques. The theory of the unique character of the brain system maintaining thinking, consisting of rigid and flexible elements, was created, as well as concepts on the reliability of the system, the error detector, and intrinsic protective mechanisms of the brain. In the clinic these data enabled us to help patients who had lost various functions due to stroke. In confirmation of the above theory, it was revealed that the same task could be solved in the brain by systems consisting of different elements, due to environmental changes or even the direction of attention. Data on the functional properties of every zone of the cortex and subcortex, as well as the cerebellum, are rapidly increasing in number. The first priority lies in neurophysiologically penetrating the physiological character and micromosaic of the activation sites seen in PET. The main aim of future brain research lies in the investigation of the fine physiological rearrangements which underlie thinking, i.e. deciphering its brain code. This is going to be the basis for the third, extremely valid breakthrough in the research on brain organization of thinking. PMID:10677649

Nutrition in the 20th century is examined with respect to changes in the American diet due to changes in the economy and evolution from an agrarian to an industrialized society. The American farm family diet from two regions of the United States during the 1930s is studied on the basis of overall availability of food commodities. A discussion of the diet staples and differences in farm family health is presented and related to nutritional deficiencies. Beginning in the 1920s through the early 1930s dietary deficiencies became a major focus of public health officials in the United States. Identification of the cause of these human nutritional deficiencies prompted significant research by government agencies such as the U.S. Department of Agriculture, Food and Drug Administration, and National Institutes of Health. Medical schools, universities, pharmaceutical corporations, and private institutions directed their resources into basic chemical research and clinical trials to assess the role of vitamins, minerals, proteins, lipids, carbohydrates, and nutrients for improving human health and nutrition. Chemists played an important role in the discovery of vitamins, minerals, and essential nutrients, validating the efficacy through tedious clinical trials. They developed synthetic vitamins affording food manufacturers and pharmaceutical companies the opportunity to capitalize upon fortifying foods for consumers. The American chemist was also responsible for the development of commodities to maximize crop yield through pesticides and fertilizers. PMID:19719130

The following calendar systems, introduced in Europe from the 18th to the 20th century and in use for a shorter or longer period by a larger or smaller community, are reviewed and discussed: the French Revolutionary Calendar, the Theosebic calendar invented by Theophilos Kairis, the Revolutionary Calendar of the Soviet Union (or 'Bolshevik calendar'), the Fascist calendar in Italy, and the calendar of the Metaxas dictatorship in Greece before World War II. The only one of them still in use, the New Rectified Julian calendar of the Orthodox Church, adopted following a proposal by Milutin Milanković at the Congress of Orthodox Churches in Constantinople in 1923, is also presented and discussed. Finally, the difficulties of introducing a new calendar are discussed.

This paper presents 5-yearly data on the height of young adult men in 15 Western European countries for birth cohorts from the middle of the 19th to the end of the 20th century. The results indicate that from the 1870s to the 1970s average height increased by around 11 cm, or more than 1 cm per decade. The main finding is that for the northern and middle European groups of countries the gains in height were most rapid in the period 1911-15 to 1951-55, a period that embraced two World Wars and the Great Depression but also witnessed advances in public health and hygiene. For the southern countries growth was fastest in the period 1951-55 to 1976-80. These findings suggest that advances in height were determined not only by income and living standards but also by a variety of other socioeconomic trends. PMID:20399715

The latitudinal position of the Northern Hemisphere jet stream (NHJ) modulates both long-term climate trends and the occurrence and frequency of extreme weather events. Precipitation anomalies in particular are associated with NHJ variability, and the resulting floods/droughts can have considerable societal and economic impacts. A better understanding of the NHJ's role in regional climate is therefore essential in assessing the natural and socio-economic impacts of projected future changes in NHJ features. We developed a new climatology of the 300 hPa NHJ based on its seasonally explicit latitudinal position. We used the 20th Century Reanalysis V2 (20CR) data at monthly resolution from 1930-2012 to define the latitudinal position of the NHJ as the latitude with the greatest 300 hPa scalar wind speed (m s-1). From these data, we identified four seasons with coherent NHJ patterns (January-February, April-May, July-August, and October-November) and detected longitudinal sectors (a total of 15 sectors across all seasons) where the seasonal jet shows strong spatial coherence. We examined the influence of seasonal NHJ position on the geographical distribution of precipitation and temperature patterns for all sectors. Furthermore, we compared NHJ positions to atmospheric circulation indices at inter-annual and multi-decadal time scales. We found a significant relationship between the NHJ position and the North Atlantic and Arctic oscillations for all seasons and across the majority of longitudinal sectors. In addition, our NHJ data set supports a connection with ocean-atmosphere interactions over the northern Pacific Ocean on various time scales: we found significant correlations between the North Pacific jet and the Pacific Decadal Oscillation for all seasons, and with the El Niño Southern Oscillation for the winter season. Our results emphasize the importance of the seasonal and spatial characteristics of the NHJ, as well as climate teleconnections, when considering regional
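
A minimal sketch of the stated diagnostic (the latitude of maximum 300 hPa scalar wind speed), assuming a gridded monthly-mean wind-speed array; the synthetic data below are only for demonstration.

```python
# Jet-position diagnostic: for each longitude, take the latitude at which
# the 300 hPa scalar wind speed peaks.
import numpy as np

def jet_latitude(speed, lats):
    """speed: (lat, lon) array of 300 hPa wind speed in m/s."""
    return lats[np.argmax(speed, axis=0)]   # jet latitude per longitude

# Synthetic demonstration: a zonally uniform jet centered near 40N.
lats = np.linspace(0, 90, 91)
lons = np.linspace(0, 357.5, 144)
speed = 30 * np.exp(-(((lats[:, None] - 40) / 10) ** 2)) + 0 * lons[None, :]
print(jet_latitude(speed, lats)[:5])        # ~40.0 for every longitude
```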

Not until the 19th century were theories on sleep based upon experimental findings in animals and humans. The so-called 'hypnotoxin theory' culminated when Legendre and Piéron successfully induced sleep in a dog by transmission of cerebrospinal fluid from a dog deprived of sleep. The main discussion concerning the origin of sleep has been the question of whether sleep is a passive or an active state. Similarities with coma, the positive Babinski sign and pathoanatomical findings in patients who died after encephalitis lethargica were the arguments for the 'deafferentation hypothesis'. Bremer's classical brainstem transections in cats confirmed this idea. Pavlov was the major representative of the idea that sleep was due to a general inhibition of the brain. Hess induced physiological sleep in cats by electrical stimulation of the diencephalon, proving the active nature of sleep. The introduction of the EEG in animals by Caton and in humans by Berger allowed for the first time the measurement of sleep depth without waking the sleeper. After the discovery of REM sleep periods by Aserinsky and Kleitman in 1953 and the demonstration of periodic sleep cycles by Dement and Kleitman, polysomnography with simultaneous whole-night recording of EEG, EMG, electrooculogram and other physiological parameters was established as the major diagnostic tool in sleep disorders. One of the most important questions about the function of sleep is still unresolved. NREM sleep is believed to have a restorative function, whereas REM sleep might be involved in learning processes. According to the dream interpretation of Sigmund Freud, the dream content represents endogenous wishes which cannot be expressed during wakefulness because of an internal 'censor'. A more recent theory by Hobson explains dreams by a very unspecific brainstem activity occurring during REM sleep which projects to the frontal brain and activates stored memory. The most important sleep disease of the 20th century is

…needed not only to serve minority populations but also to serve as mentors and role models for prospective and current students. The first African-American resident to graduate from the Bellevue Residency Program did indeed treat the underserved, as Dr. Vincent founded the Vincent Sanatorium, dedicated to treating African-American patients and training African-American nurses and doctors. Over the course of the 20th century, Bellevue Hospital has trained increasing numbers of African-American physicians. It is hoped that, like their predecessor, Dr. Vincent, they will provide care to underserved communities and to the community as a whole, as well as serve as role models for generations to come. PMID:15040520

Food security and the overall wellbeing of humankind are threatened by overexploitation of our freshwater resources. Water scarcity is not only a threat to people, but also to many of the planet's key ecosystems. Due to increasing population pressure, changing water consumption behaviour, and climate change, the threat is projected to become even worse in the future. Water can be physically scarce in two ways: population-driven water shortage occurs in areas where a large population has to depend on limited resources (indicated by m3/capita/yr), while demand-driven water stress is related to the excessive use of otherwise sufficient water resources (indicated by the demand/supply ratio). Although many studies have increased our understanding of current water scarcity and how this may increase in the future, the past development of water scarcity is much less well understood. To date, studies of past water resources have focused on either water shortage or water stress. We aim to calculate global water scarcity, both water stress and water shortage, for the period 1900-2005. We can thus provide, for the first time, continuous regional trends and local analyses of trajectories of water scarcity for the entire 20th century. By including both dimensions of water scarcity, we can better understand the reasons behind the scarcity. We found that in 1900, 13% of the population (0.22 billion people) lived in areas suffering some kind of water scarcity (<1700 m3/capita/yr or ratio >0.2), while by 2005 this percentage had increased to 57% (3.80 billion). The population suffering from both high water stress (ratio >0.4) and high water shortage (<1000 m3/capita/yr) has risen especially sharply, from 2% (29 million people) in 1900 up to 19% (1.2 billion people) in 2005. Geographically, the areas concerned are mainly North Africa, the Middle East, Pakistan, and parts of India and northern China. The region of
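
A minimal sketch of the two scarcity dimensions and thresholds quoted above; the function and the example values are illustrative, not from the study.

```python
# Classify a region by the two water-scarcity dimensions defined above:
# population-driven shortage (m3/capita/yr) and demand-driven stress
# (demand/supply ratio). Thresholds follow the text.
def water_scarcity(m3_per_capita_yr, demand_supply_ratio):
    shortage = m3_per_capita_yr < 1700        # some shortage
    high_shortage = m3_per_capita_yr < 1000   # high shortage
    stress = demand_supply_ratio > 0.2        # some stress
    high_stress = demand_supply_ratio > 0.4   # high stress
    return {
        "any_scarcity": shortage or stress,
        "high_stress_and_high_shortage": high_shortage and high_stress,
    }

# Illustrative values only
print(water_scarcity(900, 0.45))
# {'any_scarcity': True, 'high_stress_and_high_shortage': True}
```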

In this presentation, an overview is given of global dam-building activities in the 20th century. Political, economic and hydrological factors shaped the building of large dams. The development of the relations between these three factors and dam building over time is examined. One can argue whether or not history is simply "one damn thing after another", but the second half of the 20th century suggests that history is at least reflected by the construction of one dam after another. The financial crisis of the 1930s started the first construction wave of large hydropower dams in the United States. This wave continued into the Second World War. During the Cold War, the weapons race between the USA and USSR was accompanied by a parallel neck-and-neck race in dam construction. By the 1970s, dam construction in the USA tapered off, while that in the USSR continued until its political disintegration. In China, we see two spurts in dam development, the first coinciding with the disastrous Great Leap Forward and the second with the liberalization of the Chinese economy after the fall of the Berlin Wall. Economic and political events thus shaped to an important extent decisions surrounding the construction of large dams. Clearly, there are some hydrological prerequisites for the construction of dams. The six largest dam-building nations are the USSR, Canada, the USA, China, Brazil, and India, all large countries with ample water resources and mountain ranges. Australia has relatively little reservoir storage for the simple reason that most of the country is flat and dry. A few countries have relatively large amounts of reservoir storage: Uganda (Owen Falls), Ghana (Akosombo), and Zimbabwe (Kariba) are examples of small countries where gorges in major rivers were "natural" places for large dams and reservoirs to be built early on. It seems that, deserts aside, the average potential storage capacity lies for most continents around 10 cm or about 50% of the total

We analyze interannual to multi-decadal growth variations of 555 oak trees from Central-West Germany. A network of 13 pedunculate oak (Quercus robur L.) and 33 sessile oak (Quercus petraea (Matt.) Liebl.) site chronologies is compared with gridded temperature, precipitation, cloud-cover, vapor-pressure and drought (i.e., Palmer Drought Severity Index, PDSI) fluctuations. A hierarchical cluster analysis identifies three groups for each oak species, differentiated by ecological setting. While high precipitation primarily characterizes one Q. robur and one Q. petraea cluster, the other clusters are differentiated more by prevailing temperature conditions. Correlation analysis with precipitation and vapor pressure reveals statistically significant (P ≤ 0.05) correlations for June (r = 0.51) and annual (r = 0.43) means. Growth of both species at dry sites correlates strongly with PDSI (r = 0.39, P ≤ 0.05), and weakly with temperature and cloud-cover. In natural stands, Q. robur responds more strongly to water depletion than Q. petraea. Twenty-one-year moving correlations show a positive significant growth response to both PDSI and precipitation throughout the 20th century, except for the 1940s, an anomalously warm decade during which all oak sites are characterized by increased growth and an enhanced association with vapor pressure and temperature. We suggest that the wider oak rings exhibited during this period may be indicative of a nonlinear or threshold-induced growth response to drought and vapor pressure, and run counter to the general response of oak to drought and precipitation, which would normally result in suppressed growth in a warmer and drier environment. As the wide rings are formed during the severe drought period of the 20th century, a complex model seems to be required to fully explain the widespread oak growth. Our results indicate uncertainty in estimates of future growth trends of Central European oak forests in a warming and
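
A minimal sketch of the moving-correlation diagnostic described above, assuming annual ring-width indices and a PDSI series as plain arrays; the inputs here are synthetic.

```python
# 21-year moving (centred) Pearson correlation between a ring-width
# index and a climate series such as PDSI.
import numpy as np

def moving_corr(x, y, window=21):
    half = window // 2
    r = np.full(len(x), np.nan)
    for i in range(half, len(x) - half):
        r[i] = np.corrcoef(x[i - half:i + half + 1],
                           y[i - half:i + half + 1])[0, 1]
    return r

rng = np.random.default_rng(1)
pdsi = rng.normal(size=100)
rings = 0.6 * pdsi + rng.normal(scale=0.8, size=100)  # drought-sensitive growth
print(np.nanmean(moving_corr(rings, pdsi)))           # positive on average
```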

1. AERIAL VIEW, LOOKING SOUTH, ALONG 20TH STREET NORTH WITH EMPIRE BUILDING (CENTER RIGHT), WOODWARD BUILDING (CENTER), JOHN HAND BUILDING (TOP LEFT), BROWN MARX BUILDING (BOTTOM LEFT), THE FOUR BUILDINGS THAT COMPRISE THIS NATIONAL REGISTER HISTORIC DISTRICT - Heaviest Corner on Earth (Commercial), First Avenue, North & Twentieth (20th) Street, North, Birmingham, Jefferson County, AL

Math education is as important today as it was 100 years ago, when the early 20th century was transforming the old world into an era of factories, airplanes, atomic energy, and medical breakthroughs. Educational leaders of the era were wrestling with how long children should stay in school, meeting the diverse needs of an influx of immigrants,…

This document presents lesson plans and related materials for teaching about the role of women in the U.S. military from World War I to Desert Storm (the Gulf War). The lesson includes a table showing the number of women who took part in Desert Storm broken down by branch of service. Another chart shows the number of women who served in the…

Women have become increasingly involved in the workforce since World War II. Paid employment of women has shifted from primarily traditional, female-oriented jobs to more non-traditional, previously male-oriented careers. Women's participation in the workforce has led to the study of career aspirations of women. Career aspirations are…

The NCAR Community Climate System Model and Parallel Climate Model have produced one of the largest data sets for the Intergovernmental Panel on Climate Change (IPCC) and its fourth Assessment. There will be some discussion of what is in state-of-the-art climate models. As a result of this and other climate assessments, most of the climate research community now believes that mankind is changing the Earth system and that global warming is taking place. The changes are reflected not only in means but also in extremes. The new IPCC research findings will be presented along with future computational challenges. It is expected that in the future there will be a need for both terascale and petascale computing, which will allow for higher-resolution climate models that have embedded hurricanes and smaller-scale weather features as well as viable biogeochemical cycles. Because of concerns about burning fossil fuels, there will be special emphasis on better estimates of the Earth's carbon cycle, which is a special concern for the DOE. In order to perform future climate change simulations, the computational methods will necessarily undergo a reexamination. Finally, at the end of the talk there will be a discussion of how climate model studies can aid in future policy options, some of which will address 'geoengineering' the climate system.

Malaria has been part of Peruvian life since at least the 1500s. While Peru gave the world quinine, one of the first treatments for malaria, its history is pockmarked with endemic malaria and occasional epidemics. In this review, major increases in Peruvian malaria incidence over the past hundred years are described, as well as the human factors that have facilitated these events, and concerted private and governmental efforts to control malaria. Political support for malaria control has varied, and unexpected events like vector and parasite resistance have adversely impacted morbidity and mortality. Though the ready availability of novel insecticides like DDT and efficacious medications reduced malaria to very low levels for a decade in the post-eradication era, malaria reemerged as an important modern-day challenge to Peruvian public health. Its reemergence sparked collaboration between domestic and international partners towards the elimination of malaria in Peru. PMID:24001096

If cigarettes were introduced as a new consumer product today, it is unlikely they would receive government regulatory approval. Cigarettes have proven biologic toxicities (carcinogenesis, atherogenesis, teratogenesis) and well-established causal links to human disease. Things were very different in 1913 when the R. J. Reynolds Tobacco Company introduced the first modern cigarette, the iconic Camel. By the early 1950s, definitive scientific reports linked cigarettes and human disease, but it was more than a half century later (2006) that cigarette manufacturers were found guilty by a federal court of deceptive product marketing regarding the health hazards of tobacco use. In the United States, cigarette smoking remains a major but slowly declining problem. But in developing countries, cigarette use is expanding tremendously. In global terms, the epidemic of smoking-caused disease is projected to increase rapidly in coming decades, not decline. Society may have begun to slowly win the smoking battle in the developed world, but we are resoundingly losing the global war on smoking. All is not lost! There is some good news! The 2003 Framework Convention on Tobacco Control, supported strongly by the American College of Chest Physicians, is the first global public health treaty of the new millennium. Many developed societies have begun planning to rid their countries of cigarettes in what is called the Endgame Strategy, and now is the time for the international medical community to help change tobacco policy to a worldwide endgame approach to rid all humanity of smoking-related diseases. PMID:25451345

We have used numerical models to test the impact of changes in sea surface temperature (SST) and carbon dioxide (CO2) concentration on the global circulation, focusing particularly on the hydrologic cycle, namely the global cycling of water and the continental recycling of water. We have run four numerical simulations using mean annual SSTs from the early part of the 20th century (1900-1920) and the later part (1980-2000). We also vary the CO2 concentrations for these periods. The duration of the simulations is 15 years, and the spatial resolution is 2 degrees. We use passive tracers to study the geographical sources of water. Surface evaporation from predetermined continental and oceanic regions provides the source of water for each passive tracer. In this way, we compute the percentage of precipitation contributed by each region over the globe. This can also be used to estimate precipitation recycling. In addition, we are using the passive tracers to independently compute the global cycling of water (compared to the traditional Q/P calculation).
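
A minimal sketch (hypothetical inputs, not the authors' model code) of how tagged-tracer output yields source fractions: precipitation at each grid cell is decomposed into the contributions carried by each region's evaporation tracer, and the locally sourced share gives the recycling ratio.

```python
# Convert tagged-tracer precipitation into percent-of-total source
# contributions per grid cell.
import numpy as np

def source_fractions(tagged_precip):
    """tagged_precip: (region, lat, lon) precipitation carried by each
    region's evaporation tracer. Returns percent of total per cell."""
    total = tagged_precip.sum(axis=0)
    return 100.0 * tagged_precip / np.where(total > 0, total, np.nan)

tagged = np.random.default_rng(2).random((5, 90, 180))  # 5 source regions
frac = source_fractions(tagged)
print(np.nanmax(frac.sum(axis=0)))  # sums to ~100% at every cell
```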

The profession of forest pathology evolved in the early decades of the 20th century from a science describing microorganisms that infect trees to a discipline that was required to deal with numerous disease outbreaks. The foundations of the science were carried from Europe to the "New World" and initially dealt with decay and the resource losses it caused. The profession was forced to shift direction quickly because it was called upon to address major diseases caused by the introduction of pathogens from other continents; notably organisms inciting chestnut blight, Dutch elm disease, and white pine blister rust. Changes in natural ecosystems that resulted from a legacy of poor forest practice, land abuse, and an increase in plantation monocultures gave rise to other disease problems when host-pathogen balances in natural ecosystems were disturbed. Further, the need for large numbers of tree seedlings resulted in numerous nursery disease problems. Although many of the principles of general plant pathology had application to the study of forest diseases, the long-term nature of forests requires varied approaches to their study and management. Today, the science continues to evolve as the complexities of forest ecosystems unfold. PMID:18943872

Visual observations of clouds have been performed since the establishment of meteorological observatories during the early instrumental period, and became more systematic and reliable after the mid-19th century with the establishment of the first national weather services. During the last decades a large number of studies have documented trends of total cloud cover (TCC) and cloud types; most of these studies focus on trends since the second half of the 20th century. Due to the lower reliability of earlier observations, and the fact that most of these data are not accessible in digital format, there is a lack of studies focusing on trends of cloudiness since the mid-19th century. In the first part, this work reviews previous studies analyzing TCC changes with information covering at least the first half of the 20th century. Then, the study analyses a database of cloudiness observations in Southern Europe (Spain) since the second half of the 19th century. Specifically, monthly TCC series were reconstructed since 1866 by means of a so-called parameter of cloudiness, calculated from the number of cloudless and overcast days. These estimated TCC series show a high interannual and decadal correlation with the observed TCC series originally measured in oktas. After assessing the temporal homogeneity of the estimated TCC series, the mean annual and seasonal series for the whole of Spain and several subregions were calculated. The mean annual TCC shows a general tendency to increase from the beginning of the series until the 1960s; at this point, the trend becomes negative. The linear trend for the annual mean series, estimated over the 1866-2010 period, is a remarkable (and statistically significant) increase of +0.44% per decade, which implies an overall increase of more than +6% during the analyzed period. These results are in line with the majority of the trends observed in many areas of the world in previous studies, especially
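
The exact form of the 'parameter of cloudiness' is not given above, so the sketch below uses a plausible stand-in (the share of overcast days among overcast-plus-cloudless days) and shows the kind of linear trend, in percent per decade, quoted in the text; all numbers are synthetic.

```python
import numpy as np

def cloudiness_parameter(n_cloudless, n_overcast):
    # Stand-in formula (assumed): share of overcast days among the
    # "extreme" (cloudless + overcast) days, in percent
    return 100.0 * n_overcast / (n_cloudless + n_overcast)

def trend_per_decade(years, series):
    slope = np.polyfit(years, series, 1)[0]  # units per year
    return 10.0 * slope

rng = np.random.default_rng(3)
years = np.arange(1866, 1961)                       # rising segment
cloudless = rng.poisson(60, years.size)
overcast = rng.poisson(70 + 0.1 * (years - 1866))   # slowly clouding over
tcc_est = cloudiness_parameter(cloudless, overcast)
print(round(trend_per_decade(years, tcc_est), 2), "% per decade")  # positive
```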

The second half of this century has seen great developments in a variety of new technologies, such as telecommunications, computers, and space travel. However, an age-old technology, the use of fire to make metals, also developed dramatically during this period: pyrometallurgy was used to produce more metal than the cumulative production of all the millennia before it. The following discusses the extraordinary developments in flash- and bath-smelting technologies; the article also examines the issues that will drive the prospects of pyrometallurgy in the future. Currently, metal markets are on "easy street": metal profits are low, and government and industry in the United States have little interest in metals extraction research and development. This article discusses factors that may change this situation in the near future.

Two women's study of writing by women in the 20th century found unexpectedly bitter and persistent anxiety and female-male conflict, and a common theme of confinement and suppression of emotions and thoughts. (MSE)

At the world scale, floods are the natural disaster that affects the largest fraction of the population. It is a phenomenon whose effects extend to the areas surrounding the hydrographic network (basins, rivers, dams) and the coastline. According to the US FEMA (Federal Emergency Management Agency), a flood can be defined as: "A general and temporary condition of partial or complete inundation of two or more acres of normally dry land area or of two or more properties from: Overflow of inland or tidal waters; Unusual and rapid accumulation or runoff of surface waters from any source; Mudflow; Collapse or subsidence of land along the shore of a lake or similar body of water as a result of erosion or undermining caused by waves or currents of water exceeding anticipated cyclical levels that result in a flood as defined above." A flash flood is the result of intense, long-lasting continuous precipitation and can cause fatalities (e.g., the floods in mainland Portugal in 1967, 1983 and 1997). The speed and strength of floods, whether localized or over large areas, result in enormous social impacts, through both the loss of human lives and the devastating damage to the landscape and human infrastructure. The winter of 2009/2010 in Madeira Island was characterized by several episodes of very intense precipitation (especially in December 2009 and February 2010), setting a new record of accumulated precipitation since records began on the island. In February, two days were especially rainy, with absolute records for the month (daily records since 1949): 111 mm and 97 mm on the 2nd and 20th, respectively. The accumulated precipitation culminated in the terrible floods of 20 February, which caused the loss of dozens of human lives and hundreds of millions of euros in losses. Large precipitation occurrences, either more intense precipitation in a short period or less intense precipitation over a longer period, are sometimes the precursor of

On August 23, scientists will mark the 20th anniversary of the National Science Foundation's Very Large Array (VLA), the most powerful, flexible and widely-used radio telescope in the world. "Twenty years ago, the VLA brought dramatic new observing capabilities to the world's astronomers, and today there is hardly a branch of astronomy that has not been profoundly impacted by the prolific research output of this radio telescope," said Dr. Paul Vanden Bout, Director of the National Radio Astronomy Observatory (NRAO). The anniversary will be marked in a ceremony at NRAO's Array Operations Center in Socorro, NM. The keynote speaker for this ceremony will be U.S. Senator Pete V. Domenici, R-NM. Also speaking will be Dr. Rita Colwell, NSF Director; Dr. Anneila Sargent, president-elect of the American Astronomical Society; Vanden Bout; Dr. Riccardo Giacconi, president of Associated Universities, Inc. (AUI); Dr. Paul Martin, chairman of the AUI board of trustees; and Dr. Miller Goss, NRAO's director of VLA/VLBA operations. "More than 2,200 researchers from hundreds of institutions around the world have used the VLA for more than 10,000 observing projects," said Vanden Bout. "Research conducted at the VLA has had a major impact across the entire breadth of astronomy, from nearby objects such as the Sun and planets of our own Solar System, to forming galaxies and quasars billions of light-years away in the farthest reaches of the Universe," Vanden Bout added. Major discoveries made by the VLA have ranged from the surprising detection of water ice on Mercury, the nearest planet to the Sun, to the first detection of radio emission from a Gamma Ray Burster in 1997. The VLA also discovered the first "Einstein Ring" gravitational lens in 1987, and the first "microquasar" within our own Milky Way Galaxy in 1994. Over the past two decades, the VLA also has made major contributions to our understanding of active regions on the Sun, the physics of superfast "cosmic jets" of material

Founded at the dawn of the 20th century (1908), the Astronomical Observatory of Bucharest had an evolution that followed the destiny of its country. After about half a century during which it lived the usual life of a European institution for research and education, it left education after World War II, becoming an institute of the Academy. A short period of progress was followed by an unprecedented, total isolation, even from the neighbouring countries. The fall of the communist system in the last decade of the 20th century brought new endeavours, but also hard competition with worldwide astronomy. What is important is that in each situation the Romanian astronomers have found the inner resources to resist and to keep the national astronomy competitive, at least in some of its fields. Moreover, by establishing the South-Eastern Branch of the European Astronomical Society, it wishes to contribute to finding local solutions, with the goal of raising the level of astronomy in countries that face the same problems and the same difficulties.

Construction of Canada’s Dominion Astrophysical Observatory (DAO) commenced in 1914, with first light on 6 May 1918. Its varied, rich contributions to the astronomical heritage of the 20th century continue into the 21st century. The first major research observatory built with public funding on the West Coast of North America, it was Canada’s first ‘big science’ project. DAO welcomed scientists from around the world to use its 1.8m telescope designed by John Stanley Plaskett working in close collaboration with the Warner and Swasey Company of Cleveland, OH. Their original design was copied seven times around the globe, the last occasion being in the 1960s. From Day 1 the DAO welcomed the public for viewing and interaction with the small scientific staff, whose early efforts would today be characterized as ‘Key Projects’. Those efforts included measuring the radial velocities of O and B stars that, interpreted through Oort’s ideas of differential rotation, determined the most reliable estimate of the size and mass of the Milky Way available until radio astronomical techniques emerged in the 1950s. The first organic molecule in interstellar space, CH, was discovered by a DAO astronomer. The first, very puzzling estimate of ~3K for the temperature of interstellar space was deduced from interstellar CN observations a year after interstellar CH and CN were discovered. DAO’s heritage of innovative instrumentation continues to the present day, with its expertise in optically efficient, mechanically stable spectrographs and adaptive optics much in evidence at Mauna Kea’s CFHT, Gemini and Subaru observatories. In 2009 the DAO was designated a National Historic Site. This presentation will draw links between the DAO, the development of Canadian astronomy, and the emergence of Mauna Kea as an exceptional global astronomical reserve.

From biographical data sources on ranking scientists, I was able to identify 35 centenarians. Among these, only one (Michel Chevreul from France) lived before the 20th century. Since the remaining 34 individuals became centenarians only from 1965 onwards, I propose that centenarian scientists are an unusual cluster, first formed in the 20th century. Among these, all except one (Alice Hamilton) were men. Six centenarian scientists, including Hamilton, had received professional medical training. The nationality ranks of the 34 centenarian scientists identified in the 20th century show 26 Americans, 6 British, one German and one French. Four of the 26 Americans were immigrants from Europe. At least three centenarians, namely Michael Heidelberger, Nathaniel Kleitman and Victor Hamburger, belong to the 'Nobel class' category, being pioneers in the disciplines of immunochemistry, sleep physiology and neuroembryology, respectively. PMID:11918440

More than a speculative technology, quantum computing seems to challenge our most basic intuitions about how the physical world should behave. In this thesis I show that, while some intuitions from classical computer science must be jettisoned in the light of modern physics, many others emerge nearly unscathed; and I use powerful tools from computational complexity theory to help determine which are which. In the first part of the thesis, I attack the common belief that quantum computing resembles classical exponential parallelism, by showing that quantum computers would face serious limitations on a wider range of problems than was previously known. In particular, any quantum algorithm that solves the collision problem (that of deciding whether a sequence of n integers is one-to-one or two-to-one) must query the sequence Ω(n^{1/5}) times. This resolves a question that was open for years; previously no lower bound better than constant was known. A corollary is that there is no "black-box" quantum algorithm to break cryptographic hash functions or solve the Graph Isomorphism problem in polynomial time. I also show that, relative to an oracle, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform "quantum advice states"; and that any quantum algorithm needs Ω(2^{n/4}/n) queries to find a local minimum of a black-box function on the n-dimensional hypercube. Surprisingly, the latter result also leads to new classical lower bounds for the local search problem. Finally, I give new lower bounds on quantum one-way communication complexity, and on the quantum query complexity of total Boolean functions and recursive Fourier sampling. The second part of the thesis studies the relationship of the quantum computing model to physical reality. I first examine the arguments of Leonid Levin, Stephen Wolfram, and others who believe quantum computing to be fundamentally impossible. I find their arguments unconvincing without a "Sure
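
A toy illustration (not from the thesis) of the collision problem's promise: a sequence is either one-to-one or two-to-one, and the algorithm must decide which; the result above says any quantum algorithm needs Ω(n^{1/5}) queries to the sequence.

```python
# Brute-force classifier for the collision promise problem.
from collections import Counter

def collision_type(seq):
    counts = set(Counter(seq).values())
    if counts == {1}:
        return "one-to-one"
    if counts == {2}:
        return "two-to-one"
    return "promise violated"

print(collision_type([3, 1, 4, 2]))  # one-to-one
print(collision_type([3, 1, 3, 1]))  # two-to-one
```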

This paper seeks new insights concerning the health transition in 20th century Spain by analyzing both traditional (mortality-based) and alternative (anthropometric-based) health indicators. Data were drawn from national censuses, vital and cause-of-death statistics, and seven National Health Surveys dating from 1987 to 2006 (almost 100,000 subjects aged 20-79 were used to compute cohort height averages). A multivariate regression analysis was performed on infant mortality and economic/historical dummy variables. Our results agree with the general timing of the health transition process in Spain as described to date, insofar as we document a rapid improvement in sanitary and health-care-related factors during the second half of the 20th century, reflected by a steady decline in infant mortality and an increase in adult height. However, the association between adult height and infant mortality turned out not to be linear. In addition, remarkable gender differences emerged: mean height increased continuously for male cohorts born after 1940, but meaningful improvements in height among female cohorts were not attained until the late 1950s. PMID:21924964

Explores the growth of the U.S. testing industry since the 1900s. Discusses the technical developments that have encouraged the use of standardized testing and contributed to the growth of the testing industry. Attempts to quantify the expansion of the testing marketplace through the 20th century. Includes references. (CMK)

Modernization efforts in education, which were initiated in the 19th century, can be seen as forerunners of the modernization attempts of the Republic period. In this article, the Greek education system in the Ottoman Empire will be discussed, as well as the effects and importance of the changes observed in Greek girls' education in the 19th and 20th centuries on…

Different representations of "Swedishness," as expressions of altered kinds of imagined kinship in the Swedish educational system during the first half of the 20th century, are discussed. It is argued that even though the curriculum changed, from a more religious one focusing on fostering loyalty and moral commitment to "God, the…

The 20th century saw some of the most important technological and scientific discoveries in the history of humankind. The space shuttle, the internet, and other modern advances changed society forever, and yet many students cannot imagine what life was like before these technologies existed. In the project described here, students take a firsthand…

This paper presents the Superintendent's 20th Annual Report, a comprehensive overview of Hawaii's public schools for school year 2008-09. This report contains essential progress indicators and measures, as well as highlights and comparisons of core educational data presented in a concise and user-friendly format. Appended are: (1) Glossary; (2)…

The 20th conference of the Society on NeuroImmune Pharmacology will be held March 26-29, 2014. It features the latest in research examining the intersection of neuroscience, immunology and pharmacology, relevant for human health and disease. Particular emphases are placed on HIV and other infectious diseases, and abused substances, including illicit drugs and alcohol. PMID:24573530

This paper provides an overview of cataloging in the 20th century. Highlights include: (1) issues in 1901, including the emerging cooperative cataloging system and the work of Charles Ammi Cutter; (2) the 1908 code, i.e., "Catalog Rules: Author and Title Entries," published in British and American editions; (3) the Vatican rules, a code of rules…

10. Photocopy, WATER TOWERS, late 19th or early 20th century. Original Photograph at State Historical Society of North Dakota, file No. TF899 - Fort Totten, 12 miles southwest of Devils Lake City off Route 57, Devils Lake, Ramsey County, ND

The bulk of testimony in the writings and recorded histories of the daughters of immigrants and the first generation of Native Americans educated in American schools in the late 19th and early 20th centuries reveals that, although the ties between female generations became more tangled with the strains of acculturation, the bonds were stretched…

The aim of this study was to investigate differences among young school children in visual preferences for paintings from the 20th century. The study was conducted at 4 elementary schools around Split, Croatia. A total of 200 children participated in the study, of whom 87 were girls and 113 were boys, aged 6-10 years. Visual preference testing…

At the core of the epistemology of black identity in the 20th century United States is the assertion that freedom is a human right, not a privilege to be earned. By the late 19th century, an ideology of racial uplift had emerged that revolved around four concepts--compassion, service, education, and a commitment to social and economic justice for…

In the second half of the 20th century, age-old human fantasies of leaving the Earth and touching the stars have been fulfilled by advances in space science and technology, whose roots are threaded through our history. Current advances are so explosive that the fundamental orientation of Western culture is being radically altered.

This volume comprises the papers presented at the 1998 conference of the Pacific Telecommunications Council. This PTC'98 gathering focused on "Coping with Convergence." These 20th anniversary conference proceedings present at least one contribution on 59 countries and territories. The 120 papers in this volume are arranged chronologically,…

This 50-minute VHS videotape is the second in a 2-volume series that presents 500 years of social dance, music, and fashion. It features dance and music of the 20th century, including: 1910s: animal dances, castle walk, apache, and tango; 1920s: black bottom and charleston; 1930s: marathon, movie musicals, big apple, and jitterbug; 1940s: rumba;…

This thesis aims at increasing the knowledge on past changes in extremes through the analysis of historical records of observations at meteorological stations. The key question addressed is: how did the extremes of daily surface air temperature and precipitation change in Europe's climate of the 20th century, and what can we learn from this? The contents are structured along the lines of four follow-up questions: Are the available observational datasets adequate to analyse extremes? Which trends are observed for the daily extremes of surface air temperature and precipitation? Can the observed changes in temperature extremes in recent decades be regarded as a fingerprint of anthropogenic climate change? Do the observed changes guide the development of temperature scenarios for our future climate? Europe is one of the regions of the world that lacked a readily available and accessible dataset of high-resolution observational series with sufficient density and quality to study extremes. Such a dataset was developed for temperature and precipitation and used to detect statistically significant and non-trivial changes in extremes. The temperature trends indicate a warming of our climate and the precipitation trends indicate an increase of wet extremes. The calculated trends represent changes that can be due to natural internal processes within the climate system and/or external forcing, which can either be natural (solar irradiance, volcanic aerosols, ozone, etc.) or anthropogenic (greenhouse gases, etc.). Comparisons between the trend patterns of temperature extremes in the station records, the patterns associated with natural variability in the observations, and the patterns of future warming and natural variability as simulated by a climate model reveal fingerprints of anthropogenic warming over Europe. The last part of this thesis goes beyond the observations of the climate of the past and speculates on future changes in extremes. It presents a 'what-if' scenario
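
A minimal sketch (assumed inputs; operational indices use a fixed climatological base period) of the kind of station-based extremes index analysed in such work: the annual count of 'warm days' above the station's 90th percentile of daily maximum temperature.

```python
import numpy as np

def warm_day_counts(tmax_by_year):
    """tmax_by_year: dict {year: array of daily Tmax}. The percentile is
    computed over all days here, a simplification of standard practice."""
    p90 = np.percentile(np.concatenate(list(tmax_by_year.values())), 90)
    return {yr: int((t > p90).sum()) for yr, t in tmax_by_year.items()}

rng = np.random.default_rng(4)
data = {yr: rng.normal(12 + 0.02 * (yr - 1900), 8, 365)
        for yr in range(1900, 2000)}
counts = warm_day_counts(data)
print(counts[1900], counts[1999])  # later years tend to have more warm days
```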

Desert dust perturbs climate by directly and indirectly interacting with incoming solar and outgoing long-wave radiation, thereby changing precipitation and temperature, in addition to modifying ocean and land biogeochemistry. While we know that desert dust is sensitive to perturbations in climate and human land use, previous studies have been unable to determine whether humans were increasing or decreasing desert dust in the global average. Here we present observational estimates of desert dust based on paleodata proxies showing a doubling of desert dust during the 20th century over much, but not all, of the globe. Large uncertainties remain in estimates of desert dust variability over the 20th century due to limited data. Using these observational estimates of desert dust change in combination with ocean, atmosphere and land models, we calculate the net radiative effect of these observed changes (top of atmosphere) over the 20th century to be -0.14 ± 0.11 W/m² (1990-1999 vs. 1905-1914). The estimated radiative change due to dust is especially strong between the heavily loaded 1980-1989 and the less heavily loaded 1955-1964 time periods (-0.57 ± 0.46 W/m²), which model simulations suggest may have reduced the rate of temperature increase between these time periods by 0.11 °C. Model simulations also indicate strong regional shifts in precipitation and temperature from desert dust changes, causing a 6 ppm (12 PgC) reduction in model carbon uptake by the terrestrial biosphere over the 20th century. Desert dust carries iron, an important micronutrient for ocean biogeochemistry that can modulate ocean carbon storage; here we show that dust deposition trends increase ocean productivity by an estimated 6% over the 20th century, drawing down an additional 4 ppm (8 PgC) of carbon dioxide into the oceans. Thus, perturbations to desert dust over the 20th century inferred from observations are potentially important for climate and biogeochemistry, and our understanding

The 20th century evolution and spatial patterns of the top-of-atmosphere (TOA), atmospheric, and surface energy budgets (EB) are investigated in this work. These are computed as the balance between the radiative and heat fluxes at the TOA and at the surface. Total, atmospheric and oceanic meridional energy transports are computed from the EBs. Two AMIP-like ensemble simulations are considered: Integrated Forecast System (IFS) simulations of the ERA-20CM experiment, and ECHAM5-HAM model simulations. With the latter, additional sensitivity experiments are carried out by constraining either sea surface temperatures (SST) and sea-ice cover (SIC) or aerosol concentrations to climatological values. The recent-decades estimates of the EB are in reasonable agreement between the two models, while the global-scale evolution is not. In particular, in the 1970s ERA-20CM shows a fast transition from negative to positive EBs at the TOA that is not found in ECHAM5-HAM. The impact of aerosols, as evidenced by the sensitivity experiments with ECHAM5-HAM, is seen to set up an inter-hemispheric gradient in the TOA and surface budgets after 1960. This is also reflected in an increased total poleward transport in the Northern Hemisphere and a decreased one in the Southern Hemisphere. This feature is not found in ERA-20CM. SST variations do not seem to induce long-term variations in the patterns of the TOA budget and the related total meridional transport. Nevertheless, most of the inter-annual variability of the surface and atmospheric budgets and transports is attributable to the evolution of SST, and much more agreement is observed between the two models in this respect. Reference: Lembo V, Doris F, Martin W, and Lionello P (2015) Energy budgets and transports: global evolution and spatial patterns during the 20th Century as estimated in two AMIP-like experiments, Clim. Dyn., subm.
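
A minimal sketch of the standard diagnostic implied above: total meridional energy transport obtained as the cumulative poleward integral of the zonal-mean net TOA flux imbalance (storage neglected; the inputs are hypothetical).

```python
import numpy as np

R = 6.371e6  # Earth radius in metres

def meridional_transport(net_toa, lats_deg):
    """net_toa: zonal-mean net downward TOA flux (W m-2), south to north.
    Returns implied northward transport in petawatts."""
    phi = np.deg2rad(lats_deg)
    integrand = net_toa * 2 * np.pi * R**2 * np.cos(phi)  # W per radian
    transport = np.concatenate(([0.0], np.cumsum(
        0.5 * (integrand[1:] + integrand[:-1]) * np.diff(phi))))
    return transport / 1e15

lats = np.linspace(-90, 90, 181)
net = 100 * np.cos(np.deg2rad(lats)) - 60  # toy: surplus at equator, deficit at poles
print(meridional_transport(net, lats).max())  # a few PW, the observed order
```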

The majority of compute resources in today's scientific grids are based on Unix and Unix-like operating systems. In this world, user and user-group management are based around the concepts of a numeric 'user ID' and 'group ID' that are local to the resource. In contrast, grid concepts of user and group management are centered around globally assigned identifiers and VO membership, structures that are independent of any specific resource. At the fabric boundary, these 'grid identities' have to be translated to Unix user IDs. New job submission methodologies, such as job-execution web services, community-deployed local schedulers, and the late binding of user jobs in a grid-wide overlay network of 'pilot jobs', push this fabric boundary ever further down into the resource. gLExec, a light-weight (and thereby auditable) credential mapping and authorization system, addresses these issues. It can be run both on the fabric boundary, as part of an execution web service, and on the worker node in a late-binding scenario. In this contribution we describe the rationale for gLExec, how it interacts with site authorization and credential-mapping frameworks such as LCAS, LCMAPS and GUMS, and how it can be used to improve site control and traceability in a pilot-job system.
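
A toy illustration (not gLExec's actual interface or implementation) of the core mapping step described above: a grid identity, i.e. a certificate DN plus a VO attribute, is translated to a local Unix pool account, and the decision is logged for traceability.

```python
import logging
logging.basicConfig(level=logging.INFO)

# Hypothetical pool of local accounts per VO attribute
POOL = {"/atlas/Role=pilot": ["atlaspilot01", "atlaspilot02"]}
ASSIGNED = {}  # sticky DN -> account mapping, so a user keeps one account

def map_to_unix(dn, vo_attr):
    if vo_attr not in POOL:
        raise PermissionError(f"no mapping authorized for {vo_attr}")
    accounts = POOL[vo_attr]
    account = ASSIGNED.setdefault(dn, accounts[len(ASSIGNED) % len(accounts)])
    logging.info("mapped %s (%s) -> %s", dn, vo_attr, account)  # audit trail
    return account

print(map_to_unix("/DC=org/DC=example/CN=Some User", "/atlas/Role=pilot"))
```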

WORLD is a multipurpose package to compute geodetic positions in geometry and gravity space. Here, undifferenced GPS carrier beat phase observations are processed in the free-network mode by the prototype program called PUMA. Within two alternative model formulations, the classical Gauß-Markov model and the so-called mixed model, the simultaneously estimated/predicted parameters are (i) Cartesian ground station coordinates (geodetic positioning), (ii) Cartesian satellite coordinates (orbit determination), (iii) receiver- and satellite-specific bias terms, (iv) initial epoch ambiguities and (v) proportional tropospheric corrections. The mixed-model parameters appear from linearization as a point of stochastic prior information; namely, the weight matrices of the stochastic prior information, e.g. for orbit parameters, are assumed to be known. Estimators of type BLUE and predictors of type inhom BLIP and hom BLUP are used. Chapter four discusses in detail the analysis of real GPS satellite networks of free type. Most notable are the estimated bias terms α, β, γ in a twofold classification model. The operability of PUMA is demonstrated by the use of multistation phase observations (Wild-Magnavox WM 101 receivers) in a local six-station Berlin network. It is documented that, in spite of the advanced phase-observation modelling, an internal relative baseline accuracy (maximum length 30 km) of the order of 3 to 5 ppm is achievable. In addition, the influence of orbital prior information on ground station measures, both point position and accuracy, is demonstrated.
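
A minimal sketch of the estimator type named above: the BLUE of the parameters in a linearized Gauss-Markov model y = A x + e with weight matrix P; the numbers are illustrative, not GPS data.

```python
import numpy as np

def blue(A, y, P):
    """Best Linear Unbiased Estimator: x = (A'PA)^-1 A'Py."""
    N = A.T @ P @ A                      # normal-equation matrix
    return np.linalg.solve(N, A.T @ P @ y)

# Tiny example: fit a line to weighted observations
A = np.column_stack([np.ones(4), np.arange(4.0)])
y = np.array([1.1, 2.9, 5.2, 7.1])
P = np.diag([1.0, 1.0, 0.5, 0.5])        # down-weight the last two
print(blue(A, y, P))                     # [intercept, slope]
```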

Purpose: Given the recent paper by Jang et al. on "A Highly Pathogenic H5N1 Influenza Virus", which reported a novel animal model of parkinsonism, we aimed to perform a complete historical review of the 20th and 21st century literature on parkinsonism and neurological manifestations of influenza. Scope: There were at least twelve major flu pandemics reported in the literature in the 20th and 21st centuries. Neurological manifestations most prevalent during the pandemics included delirium, encephalitis, ocular abnormalities, amyotrophy, myelopathy, radiculopathy, ataxia and seizures. Very little parkinsonism was reported, with the exception of the 1917 cases originally described by von Economo. Conclusions: To date there have been surprisingly few cases of neurological issues, inclusive of parkinsonism, associated with influenza pandemics. Given the recent animal model of H5N1 influenza-associated parkinsonism, the medical establishment should be prepared to evaluate for the re-emergence of parkinsonism during future outbreaks. PMID:20650672

The prospect of traveling to the planets was science fiction at the beginning of the 20th Century and science fact at its end. The space age was born of the Cold War in the 1950s and throughout most of the remainder of the century it provided not just an adventure in the exploration of space but a suspenseful drama as the US and USSR competed to be first and best. It is a tale of patience to overcome obstacles, courage to try the previously impossible and persistence to overcome failure, a tale of both fantastic accomplishment and debilitating loss. We briefly describe the history of robotic lunar and planetary exploration in the 20th Century, the missions attempted, their goals and their fate. We describe how this enterprise developed and evolved step by step from a politically driven competition to intense scientific investigations and international cooperation.

The relatively muted warming of the surface and lower troposphere since 1998 has attracted considerable attention. One contributory factor to this "warming hiatus" is an increase in volcanically induced cooling over the early 21st century. Here we identify the signals of late 20th and early 21st century volcanic activity in multiple observed climate variables. Volcanic signals are statistically discernible in spatial averages of tropical and near-global SST, tropospheric temperature, net clear-sky short-wave radiation, and atmospheric water vapor. Signals of late 20th and early 21st century volcanic eruptions are also detectable in near-global averages of rainfall. In tropical average rainfall, however, only a Pinatubo-caused drying signal is identifiable. Successful volcanic signal detection is critically dependent on removal of variability induced by the El Niño-Southern Oscillation.
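
A minimal sketch (assumed inputs) of the final point above: ENSO-induced variability is removed by regressing the climate series on an ENSO index and retaining the residual before volcanic signals are sought.

```python
import numpy as np

def remove_enso(series, enso_index):
    """Subtract the linear fit onto an ENSO index (e.g. Nino3.4)."""
    slope, intercept = np.polyfit(enso_index, series, 1)
    return series - (slope * enso_index + intercept)

rng = np.random.default_rng(5)
enso = rng.normal(size=240)                       # monthly index, 20 years
temp = 0.4 * enso + rng.normal(0, 0.1, 240)       # ENSO-dominated series
print(np.std(temp), np.std(remove_enso(temp, enso)))  # variance drops
```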

The article presents the most important anti-positivist (neo-romantic, socio-cultural and relativistic) currents in 20th century European medical historiography. The author discusses the genesis of the anti-positivist opposition in medical historiography, starting with a presentation of the main points of the positivist programme, and shows the reasons behind the earlier broad reception of this programme by medical historians from the medical profession. The author attributes the emergence of an anti-positivist opposition in this milieu mainly to the medical progress which occurred in the first half of the 20th century and which induced some doctors engaged in research into medical history to reject the positivist model of teaching and replace it with another. This factor played a leading role in the formation of the socio-cultural and relativist currents. According to the author, the second important factor in the birth of this opposition was the 20th century progress in the methodology of historical research, which freed itself from the constraints of positivist scientism as the only acceptable method of interpreting facts. The author also discusses the socio-cultural and political conditions which led to the formation and spread of the neo-romantic movement solely in Germany. The article presents the programme principles and the most important representatives of the anti-positivist currents in European (mainly German) medical historiography, the circumstances behind the reception of each current by the community of medical historians, and the development prospects of Polish medical historiography at the threshold of the 20th and 21st centuries, in connection with the research into this field of science undertaken by a large and continuously growing number of university-educated researchers. PMID:12934578

This historical article considers nursing’s work for social justice in the 1960s civil rights movement through the lens of religious sisters and brothers who advocated for racial equality. The article examines Catholic nurses’ work with African Americans in the mid-20th century that took place amid the prevailing social conditions of poverty and racial disempowerment, conditions that were linked to serious health consequences. Historical methodology is used within the framework of “bearing witness,” a term often used in relation to the civil rights movement and one the sisters themselves employed. Two situations involving nurses in the mid-20th century are examined: the civil rights movement in Selma, Alabama, and the actions for racial justice in Chicago, Illinois. The thoughts and actions of Catholic sister and brother nurses in the mid-20th century are chronicled, including those few sister nurses who stepped outside their ordinary roles in an attempt to change an unjust system entirely. PMID:19461224

In this article, we address how the professionalization process is reflected in the way Danish nursing textbooks present 'nursing' to new members of the profession during the 20th century. The discussion is based on a discourse analysis of seven Danish textbooks on basic nursing published between 1904 and 1996. The analysis was inspired by the work of Michel Foucault, in particular the concepts of rupture and rules of formation. First, we explain how the dominating role of the human body in nursing textbooks disappears in the mid-20th century. This transformation can of course be attributed to changes in tasks and responsibilities for nurses or to the implementation of increasing amounts of knowledge and theories from disciplines other than medicine into the nurses' knowledge base. However, inspired by Foucault, we consider these historical changes to be conditions of possibility and not causes. The second part of the analysis shows that along with 'the disappearance of the body', a second discursive change appears: the role of doctors and medicine changes fundamentally from about the mid-20th century. Finally, we argue that this discursive reorganization enabling new patterns of thought to emerge was driven by a professional interest in describing nursing as an independent profession. PMID:25238323

The IPCC used an experiment in which approximately 20 different climate models fit the temperature history of the 20th century. A remarkably good and convincing fit was obtained by combining selected models into a multi-model ensemble, as may be seen in figure 9.5 of the AR4 Scientific Basis report. The fallacy is that each modeling group used different forcings, effectively simulating a different imaginary planet. Since the IPCC models differ by more than a factor of two in climate sensitivity, it would be quite amazing if they could all agree on temperature in the late 20th century, when CO2 was rapidly increasing. Allowing each model to be excited by different forcings effectively turns each model into a rather complicated curve-fitting program. If one accepts that the models are being used for curve fitting, then the supposedly better results obtained by averaging multiple models are easily explainable as the reduction of error that results from averaging approximations to a function with uncorrelated errors. Finally, the late 20th century temperature rise is too small for a 3-degree climate sensitivity for a doubling of CO2, and the explanations for the warming shortfall that rely on aerosol cooling or ocean warming are easily refuted. There may be alternative explanations for the shortfall, or it may be that climate sensitivity is much lower than projected by the IPCC.
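
A toy demonstration of the statistical point made above: averaging several approximations whose errors are uncorrelated shrinks the error roughly as 1/sqrt(M), regardless of what the approximations represent.

```python
import numpy as np

rng = np.random.default_rng(6)
truth = np.sin(np.linspace(0, 10, 500))
M = 20
models = truth + rng.normal(0, 0.5, (M, 500))  # M "models", uncorrelated errors

rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
print(rmse(models[0]))            # ~0.5 for a single model
print(rmse(models.mean(axis=0)))  # ~0.5/sqrt(20) ~ 0.11 for the average
```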

This investigation replicates previous research into K-12 students' responses to mid-20th-century art music. The study extends that research to include undergraduates and graduates as well as an additional group of graduate students who had taken a 20th-century music class. Children's responses showed remarkable consistency and indicated that…

The Permanent Service for Mean Sea Level (PSMSL) collects quality-controlled sea levels from tide gages on all seas, and tabulates them at www.pol.ac.uk/psmsl/psmsl_individual_stations.html. I examined annual average sea levels (Ra in column 6) for generally open-coast tide gages having data at the years defining the quarter points of the 20th century: 1900, 1925, 1950, 1975, 2000. Gages lacking data for a given date, say 1975, were assumed to qualify if they had data within one year of the missing date, i.e., for 1974 or 1976 in this example. This examination of data from gages on all seas identified 54 gages with data for the last three of the five dates, which included 26 gages with data for the last four of the five dates, which included 7 gages with data for all five dates. This means that sea-level change during the last quarter (Q4) of the 20th century could be compared at 54 sites with sea-level change in Q3, at 26 sites with sea-level change in Q2, and at 7 sites with sea-level change in Q1, providing 87 tests of the widely reported acceleration in the rate of sea-level rise at the end of the 20th century. If sea level is rising at an accelerating rate, then sea-level rise during Q4 should almost always exceed sea-level rises in Q1, Q2, and Q3 of the 20th century. Of the 87 tests, 44 showed more sea-level rise in Q4, and 43 showed less sea-level rise in Q4, compared to the earlier quarters. Thus there is no evidence for an accelerating rise in sea level at the end of the 20th century from these quality-controlled data. The data do indicate that sea-level changes are synchronized over long reaches of shoreline (Sturges, 1990), and sites where gages are imbedded in deposits of clastic sediment have higher apparent sea-level rise attributable to sediment compaction. Beach erosion on the East Coast of the U.S. is widely attributed to the acceleration of sea-level rise, yet all 8 long-term gages at this coast show significantly LESS
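
A minimal sketch of the comparison performed above, with made-up gage values (mm): the rise in each quarter of the century is the difference of annual mean sea level between successive quarter-point years.

```python
def quarter_rises(levels):
    """levels: dict {quarter-point year: annual mean sea level, mm}."""
    years = sorted(levels)
    return {f"Q{i + 1}": levels[b] - levels[a]
            for i, (a, b) in enumerate(zip(years, years[1:]))}

gage = {1900: 6980, 1925: 7012, 1950: 7055, 1975: 7091, 2000: 7123}
rises = quarter_rises(gage)
# Acceleration would require Q4 to exceed the earlier quarters
print(rises, rises["Q4"] == max(rises.values()))
```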

Over the past 150 years, global temperatures have increased by 0.6°C. It has been suggested that this increase in temperature, especially since 1980, has been unprecedented over the past millennium. In order to put the current warming trend into context, various efforts are underway to reconstruct the longer pre-instrumental history of climate variability. Here, we present a sea surface temperature (SST) record of the Indo-Pacific Warm Pool (IPWP) for the Common Era by combining five high-resolution records of Globigerinoides ruber Mg/Ca from different locations within the Indonesian Seas. The much broader spatial coverage and enhanced temporal resolution of this composite record allows us to assess whether the observed 20th century warming and the rate of 20th century temperature change within IPWP have been unprecedented in the past two millennia. The novelty of this study is in our approach to uncertainty quantification, which entails Monte-Carlo simulations that simultaneously take into account both age model and proxy uncertainties. First, we used a Monte-Carlo process (n=10,000) to generate possible age models for each sedimentary record used in the composite. This Monte-Carlo approach takes into consideration the analytical uncertainty in the 14C and 210Pb measurements used for chronology, the uncertainty in the calibration curve and the reservoir age, and the subjective nature of the interpolation scheme. Second, we take into consideration two sources of error in the SST estimates: the analytical uncertainty for the Mg/Ca results, which was assumed to be normally distributed and independent from sample to sample, and the uncertainty in the calibration equation, which was assumed to be dependent (i.e. each Monte-Carlo record is converted using a solution of the calibration equation). To do so, we use a Bayesian approach to enumerate possible solutions of the calibration equation. Finally, we binned the resulting SSTs into 20-year, 50-year, and 100-year non
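
A minimal sketch (with hypothetical calibration constants, not the study's values) of the Monte-Carlo step described above: each realization perturbs the Mg/Ca measurements within analytical error and draws one solution of the calibration equation, so both error sources propagate into the SST ensemble.

```python
import numpy as np

rng = np.random.default_rng(7)
mgca = np.array([4.1, 4.3, 4.0, 4.5])  # mmol/mol, one short record
sigma_meas = 0.05                      # analytical 1-sigma

def sst_realizations(n=10000):
    out = np.empty((n, mgca.size))
    for k in range(n):
        A = rng.normal(0.38, 0.02)     # hypothetical: Mg/Ca = A * exp(B * SST)
        B = rng.normal(0.09, 0.005)
        sample = mgca + rng.normal(0, sigma_meas, mgca.size)
        out[k] = np.log(sample / A) / B
    return out

ens = sst_realizations()
print(ens.mean(axis=0).round(1), ens.std(axis=0).round(1))  # SST and spread
```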

Addresses various issues in educational computing, suggesting that the real crisis facing schools is not a software shortage but a hardware crunch accompanied by misconceptions about how computers can help children learn. (Author/JN)

The development of organic and physical chemistry as specialist fields, during the middle and end of the 19th century respectively, left inorganic chemistry behind as a decidedly less highly regarded subfield. Despite Alfred Werner's groundbreaking studies of coordination chemistry in the early 20th century, that inferior status remained in place, particularly in the US, until the 1950s, when the beginnings of a resurgence that eventually restored its parity with the other subfields can be clearly observed. This paper explores the extent to which Werner's heritage, both direct, in the form of academic descendants, and indirect, contributed to those advances. PMID:24983802

A comparison of observations with simulations of a coupled ocean-atmosphere general circulation model shows that both natural and anthropogenic factors have contributed significantly to 20th century temperature changes. The model successfully simulates global mean and large-scale land temperature variations, indicating that the climate response on these scales is strongly influenced by external factors. More than 80% of observed multidecadal-scale global mean temperature variations and more than 60% of 10- to 50-year land temperature variations are due to changes in external forcings. Anthropogenic global warming under a standard emissions scenario is predicted to continue at a rate similar to that observed in recent decades. PMID:11118145

The 20th International Congress of Theoretical and Applied Mechanics, ICTAM2000, was held in Chicago, IL, from August 27 to September 2, 2000. It had been 32 years since the last of these congresses was held in the USA. A record number of researchers in the mechanical engineering sciences attended and presented their work. The Congress provided an opportunity for the US mechanics community to act as international hosts. Several universities, professional societies, private foundations and individuals, and Federal agencies provided financial support for the Congress.

A million-channel (2^20), 300 MHz bandwidth digital spectrum analyzer was considered. The design, fabrication, and maintenance philosophy of the modular, pipelined, fast Fourier transform (FFT) hardware are described. The spectrum analyzer will be used to examine the region from 1.4 GHz to 26 GHz for radio frequency interference which may be harmful to present and future tracking missions of the Deep Space Network. The design also has application to the search for extraterrestrial intelligence signals and to radio science phenomena.
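
A toy illustration of the analyzer's principle (a software FFT, not the pipelined hardware design): a 2^20-point transform turns a sampled band into a million-channel power spectrum in which narrowband interference stands out above the noise floor.

```python
import numpy as np

N = 2 ** 20                  # ~one million channels
fs = 600e6                   # sample rate covering a 300 MHz real band
t = np.arange(N) / fs
x = np.random.default_rng(8).normal(size=N)    # broadband noise floor
x += 0.05 * np.sin(2 * np.pi * 123.456e6 * t)  # weak narrowband interferer

spectrum = np.abs(np.fft.rfft(x)) ** 2
peak = np.argmax(spectrum[1:]) + 1             # skip the DC bin
print(f"strongest channel near {peak * fs / N / 1e6:.3f} MHz")
```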

References to Dr Charles Thomas Jackson in 20th century anaesthesia literature and in biographical dictionaries and encyclopedias emphasize his maniacal insanity and its relation to his usurpation of the discoveries of others, including the controversy with William TG Morton concerning the honour of the discovery of surgical anaesthesia. In 1873, seven years before his death, he experienced sudden collapse and paralysis requiring hospitalization. Seminal 19th century brain research that predated his hospitalization allows the signs and symptoms of his illness to be correlated with the pathology found at his autopsy. PMID:17641787

Significant climatic changes over Northern Eurasia during the 20th century have been reflected in numerous variables of economic, social, and ecological interest, including the natural frequency of forest fires. For the former USSR, we are now using the Global Daily Climatology Network (GDCN; Gleason et al. 2002) and a new Global Synoptic Data Network archive, GSDN, created jointly by NCDC and RIHMI. Data from these archives are employed to estimate systematic changes in indices used in the United States and Russia to assess potential forest fire danger. Within the boundaries of the former USSR, each of the archives, GDCN and GSDN, includes more than 2100 stations, with only approximately 1500 of them having meteorological time series sufficiently long to be included in our analyses. We use three indices: (1) the Keetch-Byram Drought Index (KBDI; this index uses only daily data on maximum temperature and precipitation and was developed and is widely used in the United States); (2) the Modified Nesterov Index; and (3) the Zhdanko Index (the latter two were developed and are widely used in Russia; their computation requires synoptic daytime data on temperature and humidity, daily precipitation, and snow on the ground). Analyses show that, after calibration, time series of days with increased potential forest fire danger constructed using each of these three indices (a) are well correlated and (b) deliver similar conclusions about systematic changes in the weather conditions conducive to forest fires. Specifically, over the entire eastern half of Northern Eurasia (Siberia and the Russian Far East) we found a statistically significant increase in the indices that characterize weather conditions conducive to forest fires. These areas coincide with the areas of most significant warming during the past several decades south of the Arctic Circle. West of the Ural Mountains, the same indices show a steady decrease in the frequency of "dry weather summer days" during the past sixty years
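
A minimal sketch of the daily KBDI bookkeeping, using the published Keetch-Byram drying equation in its original imperial units; the rainfall interception is simplified here to a per-day 0.20-inch deduction, whereas the full algorithm carries the deduction across consecutive wet days:

    import numpy as np

    def kbdi_series(tmax_f, rain_in, mean_annual_rain_in, q0=100.0):
        """Simplified daily Keetch-Byram Drought Index (0-800, hundredths of
        an inch of soil-moisture deficit). tmax_f: daily max temperature (F);
        rain_in: daily precipitation (inches)."""
        q, out = q0, []
        denom = 1.0 + 10.88 * np.exp(-0.0441 * mean_annual_rain_in)
        for t, r in zip(tmax_f, rain_in):
            # recharge by net rain (simplified 0.20-inch interception per day)
            q = max(q - max(r - 0.20, 0.0) * 100.0, 0.0)
            # Keetch-Byram drying factor, driven by daily maximum temperature
            dq = (800.0 - q) * (0.968 * np.exp(0.0486 * t) - 8.30) / denom * 1e-3
            q = min(max(q + max(dq, 0.0), 0.0), 800.0)
            out.append(q)
        return np.array(out)

    # e.g. a hot, rainless month pushes the index up day by day
    idx = kbdi_series(tmax_f=[95] * 30, rain_in=[0] * 30, mean_annual_rain_in=30.0)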

Many elementary and secondary schools tie in with local colleges and universities and use modems to access the computing power available at these higher education facilities. To help alleviate the financial burden of long-distance phone charges, work had begun on using the airwaves instead of phone lines for computer communication. An interest in…

We describe the implementation of a six-week course to teach Human-Computer Interaction (HCI) to high school students. Our goal was to explore the potential of HCI in motivating students to pursue future studies in related computing fields. Participants in our course learned to make connections between the types of technology they use in their…

For physics, the period from the beginning to the middle of the 20th century was one of great scientific excitement and revolutionary discovery. The analogous era for biochemistry, and its offspring, molecular biology, was the second half of the 20th century. One of the most important and influential leaders of this scientific revolution was Arthur Kornberg. The DNA polymerase, which he discovered in 1955 and showed to have the remarkable capacity to catalyze the template-directed synthesis of DNA, contributed in major ways to the present-day understanding of how DNA is replicated and repaired, and how it is transcribed. The discovery of DNA polymerase also permitted the development of PCR and DNA sequencing, upon which much of modern biotechnology is based. Kornberg's studies of DNA replication, which spanned a period of nearly 30 years, culminated in a detailed biochemical description of the mechanism by which a chromosome is replicated. The final years of Kornberg's life were devoted to the study of polyphosphate, which he was convinced had a crucial role in cellular function. PMID:18467101

We investigated multiple lines of evidence to determine if observed and paleo-reconstructed changes in acid neutralizing capacity (ANC) in Sierra Nevada lakes were the result of changes in 20th century atmospheric deposition. Spheroidal carbonaceous particles (SCPs) (indicator of anthropogenic atmospheric deposition) and biogenic silica and δ(13)C (productivity proxies) in lake sediments, nitrogen and sulfur emission inventories, climate variables, and long-term hydrochemistry records were compared to reconstructed ANC trends in Moat Lake. The initial decline in ANC at Moat Lake occurred between 1920 and 1930, when hydrogen ion deposition was approximately 74 eq ha(-1) yr(-1), and ANC recovered between 1970 and 2005. Reconstructed ANC in Moat Lake was negatively correlated with SCPs and sulfur dioxide emissions (p = 0.031 and p = 0.009). Reconstructed ANC patterns were not correlated with climate, productivity, or nitrogen oxide emissions. Late 20th century recovery of ANC at Moat Lake is supported by increasing ANC and decreasing sulfate in Emerald Lake between 1983 and 2011 (p < 0.0001). We conclude that ANC depletion at Moat and Emerald lakes was principally caused by acid deposition, and recovery in ANC after 1970 can be attributed to the United States Clean Air Act. PMID:25078969

This study asks two related questions about the shifting landscape of marriage and reproduction in US society over the course of the last century with respect to a range of health and behavioral phenotypes and their associated genetic architecture: (i) Has assortment on measured genetic factors influencing reproductive and social fitness traits changed over the course of the 20th century? (ii) Has the genetic covariance between fitness (as measured by total fertility) and other traits changed over time? The answers to these questions inform our understanding of how the genetic landscape of American society has changed over the past century and have implications for population trends. We show that husbands and wives carry similar loadings for genetic factors related to education and height. However, the magnitude of this similarity is modest and has been fairly consistent over the course of the 20th century. This consistency is particularly notable in the case of education, for which phenotypic similarity among spouses has increased in recent years. Likewise, changing patterns of the number of children ever born by phenotype are not matched by shifts in genotype–fertility relationships over time. Taken together, these trends provide no evidence that social sorting is becoming increasingly genetic in nature or that dysgenic dynamics have accelerated. PMID:27247411

Little information is available on the 20th century mortality rates of rural black South African groups, such as the Venda. The purpose of this study was to apply abridged life tables in order to estimate life expectancy from both skeletal remains and death registry information of modern South African communities. Comparisons were also made with prehistoric and contemporary groups as a means to better evaluate life expectancy for this time period. The sample consisted of 160 skeletons of known Venda origin and burial registry information for 1364 black South Africans from the Rebecca Street and Mamelodi Cemeteries in Pretoria, South Africa. Standard anthropological techniques were applied to determine sex and estimate age from the skeletal remains. The stationary and non-stationary life table models were used to analyse the data. A high rate of child mortality and low juvenile and adult mortality, with a steady increase in mortality after the age of 30 years, were observed for both the Venda and the cemetery samples. Throughout the 20th century, life expectancy was shown to increase for black South Africans. However, due to the widespread HIV infection/AIDS of the 21st century, infant and young adult mortality rates continue to rise at such a speed that the decline in mortality seen for South Africans over the last 50 years will most likely be lost in the next decade due to this disease. PMID:18555996
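
A compact sketch of the stationary abridged life-table arithmetic such a study relies on, with hypothetical age intervals and death counts; the separation factor a_frac and the 10-year closing interval are simplifying assumptions, not the study's values:

    import numpy as np

    def life_expectancy(ages, deaths, a_frac=0.5):
        """Stationary abridged life table from a death distribution d_x.
        ages: interval start ages; deaths: deaths per interval (skeletal or
        registry counts); a_frac: average fraction of the interval lived by
        those who die in it. Returns e_x at each start age."""
        ages, d = np.asarray(ages, float), np.asarray(deaths, float)
        n  = np.diff(np.append(ages, ages[-1] + 10))   # interval widths (yr)
        lx = d[::-1].cumsum()[::-1]                    # survivors entering x
        Lx = n * (lx - d) + a_frac * n * d             # person-years in interval
        Tx = Lx[::-1].cumsum()[::-1]                   # person-years above x
        return Tx / lx

    ages   = [0, 1, 5, 15, 30, 45, 60, 75]             # hypothetical intervals
    deaths = [180, 60, 25, 40, 120, 200, 250, 125]     # hypothetical d_x counts
    e0 = life_expectancy(ages, deaths)[0]              # life expectancy at birth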

The Fraser River Basin (FRB) is the largest river basin draining to the Pacific Ocean in British Columbia (BC), Canada, and it supports the world's most abundant salmon populations. With recent climate change, the shifting hydrologic regime of the FRB is evaluated using hydrological modeling results over the period 1949 to 2006. To quantify the contribution of snowmelt to runoff generation, the ratio RSR, defined as the sum of the snowmelt across the watershed divided by the integrated runoff over the water year, is employed. Modeled results for RSR at Hope, BC — the furthest downstream hydrometric station of the FRB — show a significant decrease (from 0.80 to 0.65) over the latter part of the 20th century. The decline in RSR is driven mainly by a decrease in snowmelt across the FRB, which fell by 107 mm (26%) over the simulation period. There is also a prominent shift in the timing of streamflow, with the spring freshet at Hope, BC advancing by 30 days, followed by reduced summer flows for over two months. The peak spring freshet occurs even earlier at stations further upstream in the FRB, owing to the shorter travel time from the snowmelt source areas to the rivers.
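
A minimal sketch of the RSR calculation for a single water year, with synthetic daily basin-average series standing in for the model output:

    import numpy as np

    def snowmelt_runoff_ratio(snowmelt_mm, runoff_mm):
        """RSR for one water year: basin-integrated snowmelt divided by
        basin-integrated runoff, both as daily water-year series in mm."""
        return np.sum(snowmelt_mm) / np.sum(runoff_mm)

    # Hypothetical daily series for one water year
    rng    = np.random.default_rng(7)
    melt   = np.clip(rng.normal(1.2, 1.0, 365), 0.0, None)   # mm/day
    runoff = np.clip(rng.normal(1.6, 1.0, 365), 0.1, None)   # mm/day
    rsr = snowmelt_runoff_ratio(melt, runoff)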

At the beginning of the 20th century the veterinary relations between Turkey and Germany intensified at the military level. After the Balkan Wars there were urgent attempts to reorganize the army, which led to the visit of a German military mission under General L. von Sanders, who was appointed to review the army. Along with other members of the German military mission, A. Thieme (1881-1949) belonged to the staff of the Turkish Army's Supreme Command as orderly officer and as advisory army veterinarian. During the First World War (1914-1918) there were 44 German veterinary officers, 23 in Turkish and 21 in German uniform. The veterinary officers in Turkish formations were assigned to the military mission, where the veterinary major Dr. K. Dreyer served as official adviser. Moreover, mixed German-Turkish special units were formed under German leadership in Turkey, which included German veterinary officers in German uniform. The German veterinary officers were paid higher salaries by the Turkish government than their Turkish colleagues. As far as can be gathered, three German officers perished in Turkey. PMID:1954861

In mid-20th century, several streams of knowledge converged to create the new academic discipline of cardiovascular disease epidemiology and the new practice of preventive cardiology. One stream was modern cardiology, with the ability to diagnose myocardial infarction, to characterize and count its victims, and to report vital statistics on cardiovascular causes of death. Another stream came from burgeoning clinical and laboratory research and greater understanding of the underlying processes of atherosclerosis and hypertension. A third stream came from the observations of intellectually curious "medical Marco Polos," who brought back from travels their tales of unusual population frequencies of heart attacks, along with ideas about sociocultural causes. This led to more formal research about cardiovascular disease risk and causes among populations and about mechanisms in the clinic and laboratory. The broad river of investigation thus formed produced a risk paradigm of the multiple biologic, behavioral, and societal factors in causal pathways to the common cardiovascular diseases. An evidence base was built for sound clinical and public health approaches to prevention. Here, the author tells brief stories about 5 early and particularly observant world travelers and their influence on knowledge and thinking about prevention. PMID:22470931

The HUPO Brain Proteome Project (HUPO BPP) held its 20th workshop in Yokohama, Japan, September 15, 2013. The focus of the autumn workshop was on new insights and prospects of neurodegenerative diseases. PMID:24903697

Presented are abstracts from volume two of the World Conference on Computers in Education, 1985. Four topics are: (1) cognitive and visual style; (2) computer graphics and descriptive geometry; (3) LOGO and educational research; and (4) algorithms, programming, and computer literacy. (JM)

In this thesis we study synchronization phenomena in natural and artificial coupled multi-component systems, applicable to the scalability of parallel discrete-event simulation for systems with asynchronous dynamics. We also study the role of various complex communication topologies as synchronization networks. We analyze the properties of the virtual time horizon or synchronization landscape (corresponding to the progress of the processing elements) of these networks by using the framework of non-equilibrium surface growth. When the communication topology mimics that of the short-range interacting underlying system, the virtual time horizon exhibits Kardar-Parisi-Zhang-like kinetic roughening. Although the virtual times, on average, progress at a nonzero rate, their statistical spread diverges with the number of processing elements, hindering efficient data collection. We show that when the synchronization topology is extended to include quenched random communication links (small-world links) between the processing elements, they make close-to-uniform progress at a nonzero rate, without global synchronization. This leads to a fully scalable parallel simulation for underlying systems with asynchronous dynamics and short-range interactions. We study both short-range and small-world synchronization topologies in one- and two-dimensional systems. We also provide a coarse-grained description for the small-world-synchronized virtual-time horizon and compare the findings to those obtained by "simulating the simulations" based on the exact algorithmic rules. We also present numerical results for the evolution of the virtual-time horizon on scale-free Barabasi-Albert networks serving as communication topology among the processing elements. Finally, we investigate to what extent small-world couplings (extending the original local relaxational dynamics through the random links) lead to the suppression of extreme fluctuations in the synchronization landscape. In the
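
A minimal sketch of the conservative update rule behind the virtual time horizon, assuming a ring of processing elements with optional quenched small-world partners; local virtual times advance by random exponential increments only when a PE is not ahead of the neighbors it must wait for:

    import numpy as np

    rng = np.random.default_rng(3)
    N, steps, p_sw = 1024, 2000, 0.1

    tau = np.zeros(N)                        # local virtual times (the horizon)
    partner = rng.permutation(N)             # quenched random small-world links

    for _ in range(steps):
        left, right = np.roll(tau, 1), np.roll(tau, -1)
        ok = (tau <= left) & (tau <= right)  # conservative PDES update rule
        # with probability p_sw a PE must also wait for its random partner
        check = rng.random(N) < p_sw
        ok &= ~check | (tau <= tau[partner])
        tau[ok] += rng.exponential(1.0, ok.sum())

    width2 = tau.var()   # spread of the horizon: diverges with N on the ring
                         # alone, stays bounded with the small-world links on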

Flood trends were investigated at four stations of the lower Mekong River. Two types of changes were accounted for: trends in the mean and trends in the variance of the time series. A trend in the mean implies that the average flood events changed with time. A trend in variance implies that the frequency of low- and high-magnitude floods changed with time (Merz et al., 2012). Results showed that average flood events decreased during the 20th century. However, due to an increase in variance, the frequency of high-magnitude floods increased towards the end of the 20th century (Delgado et al., 2010). This increase could not be detected by usual trend tests like the Mann-Kendall test or ordinary least squares regression. The results agree with Katz and Brown (1992), who showed that variance changes are more important than changes in the mean when it comes to flood frequency trends. To investigate possible causes for the detected changes in flood variance, we looked at several large-scale atmospheric circulation patterns cited in the literature. The Western Pacific monsoon index (Wang, 2001) showed the greatest resemblance with the flood data. A test of step change in variance was conducted, which revealed a coinciding step change in variance between annual maximum discharge and the Western Pacific monsoon. A statistical model in which monsoon variance forces flood frequency in the 20th century was tested, and the results were statistically significant. This has the advantage of bypassing the use of precipitation, which in this region is collected by a rather sparse network. Concerning climate change projections, a dynamic index like the Western Pacific monsoon index is better simulated by climate models than tropical precipitation (Wang, 2004; Douville et al., 2005). Another important result is the attribution of the detected changes. The Mekong River basin is located in a transition zone between the Indian and the Pacific oceans. Our results showed that the interannual variability
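
One way to see why an ordinary trend test can miss this signal is to track the moving-window variance directly; the sketch below runs on a synthetic series with constant mean and growing spread, and is illustrative only:

    import numpy as np

    def windowed_variance_trend(x, win=30):
        """Least-squares slope of the moving-window variance of an annual
        maximum series: picks up the variance changes that a trend test on
        the mean (e.g. Mann-Kendall) will miss."""
        x = np.asarray(x, float)
        v = np.array([x[i:i + win].var() for i in range(x.size - win + 1)])
        slope = np.polyfit(np.arange(v.size), v, 1)[0]
        return v, slope

    # Synthetic analogue: constant mean, variance tripling over 100 years
    rng = np.random.default_rng(5)
    amax = 10.0 + rng.standard_normal(100) * np.linspace(1.0, 3.0, 100)
    v, slope = windowed_variance_trend(amax)   # slope > 0: floods more extreme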

Analysis of some characteristics of air temperature, air humidity, precipitation and snow cover at selected meteorological stations located in different regions of Slovakia was performed for the period 1951 to 2012. The stations represent lowland regions (up to 300 m), mid-altitude regions (300 to 800 m) and high-altitude mountain regions (above 1000 m). Series of the highest annual maximum air temperature show an obvious gradual increase from the 1980s onward at all selected stations, with new record values occurring in the last years of the analyzed period 1951-2012. At most stations the absolute maximum air temperature for the 1951-2012 period was recorded during July 2007, whereas the highest annual maximum temperatures in Slovakia were recorded predominantly during the month of August. Values of the annual maximum of mean daily temperature show an evident upward trend, and at some stations the highest mean daily temperature was recorded only in recent years (e.g. at the Sliač station in 2012). Positive trends in the lowest annual minimum temperature at the selected stations in the period 1951-2012 are less significant, but it is evident that the lowest annual minimum air temperatures did not drop as low as in the past and did not reach record values as in the 1950s or 1980s. Changes in minimum and maximum monthly precipitation totals during the year indicate that the number of cases with extremely low monthly totals has been increasing. Although the occurrence of extremely high monthly precipitation totals was rather chaotic, in some recent years the observed monthly precipitation totals were the absolutely highest monthly values of the entire analyzed period 1951-2012. On the other hand, towards the end of this period, cases when minimum monthly precipitation totals were close to zero also became more common. The trend in the lowest annual relative humidity over the 1951-2012 period is also very significant.

In this study, we provide a determination of the 20th century (1900-2002) global sea level rise, the associated error budgets, and quantifications of the various geophysical sources of the observed rise, using data and geophysical models. We analyzed significant geographical variations of global sea level, including those caused by the steric component (heat and salinity) in the ocean and by the self-gravitational signal resulting from ice sheet melting, including the effects of glacial isostatic adjustment (GIA) since the Pleistocene. In particular, relative sea level data from long-term records (the longest spanning 150 years) at over 600 tide gauge sites globally, from PSMSL and other sources, and geocentric sea level data from multiple satellite altimeters (1985-2005) have been used to determine and characterize 20th century global sea level rise. Altimeter and selected tide gauge sea level data have been used for the 20th century sea level determination, accounting for relative biases between the altimeters, sea level changes corresponding to oceanic thermal expansion, vertical motions affecting tide gauge measurements, self-gravitation, and the barotropic ocean response. The study is also characterized by the role given to the polar oceans and by addressing the question of whether there is a detectable sea level rise acceleration during the last decade. Vertical motions have been estimated by combining geocentric sea level measurements from satellite altimetry (TOPEX/POSEIDON) and long-term relative (crust-fixed) sea level records from global tide gauges using a Gauss-Markov (GM) model with stochastic constraints. The study provided a demonstration of improved vertical motion solutions in semi-enclosed seas and lakes, including Fennoscandia and the Great Lakes region, showing excellent agreement with independent GPS-observed radial velocities and with predictions from GIA models. In general, the estimated uncertainty of the observed
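
The core differencing idea behind the vertical motion estimate can be sketched in a few lines; this illustrates the principle only, not the Gauss-Markov estimator with stochastic constraints used in the study, and the records are synthetic:

    import numpy as np

    def trend(t, y):
        """Least-squares linear rate (units of y per unit t)."""
        return np.polyfit(t, y, 1)[0]

    # Hypothetical collocated monthly records, 1993-2005, in mm
    rng = np.random.default_rng(11)
    t   = np.arange(1993, 2006, 1 / 12)
    alt = 3.0 * (t - t[0]) + 5 * rng.standard_normal(t.size)  # geocentric SL
    tg  = 5.0 * (t - t[0]) + 5 * rng.standard_normal(t.size)  # relative SL

    # Tide gauges measure sea level relative to the (possibly moving) crust,
    # altimetry measures it geocentrically; the difference of the two trends
    # estimates vertical land motion at the gauge
    vlm = trend(t, alt) - trend(t, tg)   # here ~ -2 mm/yr, i.e. subsidence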

From a programming perspective, Alan Turing's epochal 1936 paper on computable functions introduced several new concepts, including what are today known as self-interpreters and programs as data, and invented a great many now-common programming techniques. We begin by reviewing Turing's contribution from a programming perspective, and then systematize and mention some of the many ways that later developments in models of computation (MOCs) have interacted with computability theory and programming language research. Next, we describe the 'blob' MOC: a recent stored-program computational model without pointers. In the blob model, programs are truly first-class citizens, capable of being automatically compiled, or interpreted, or executed directly. Further, the blob model appears closer to being physically realizable than earlier computation models. In part, this is due to strong finiteness owing to early binding in the program; and a strong adjacency property: the active instruction is always adjacent to the piece of data on which it operates. The model is Turing complete in a strong sense: a universal interpretation algorithm exists that is able to run any program in a natural way and without arcane data encodings. Next, some of the best known among the numerous existing MOCs are described, and we develop a list of traits an 'ideal' MOC should possess from our perspective. We make no attempt to consider all models put forth since Turing's 1936 paper, and the selection of models covered concerns only models with discrete, atomic computation steps. The next step is to classify the selected models by qualitative rather than quantitative features. Finally, we describe how the blob model differs from an 'ideal' MOC, and identify some natural next steps to achieve such a model. PMID:22711860
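
To make 'programs as data' concrete, here is a toy stored-program machine in the spirit of (but far simpler than) the models discussed above; the instruction set and memory layout are invented for illustration:

    def run(mem, pc=0):
        """Minimal stored-program machine: memory holds both code and data,
        so a program is just a list of integers that another program can
        read, transform, or execute -- Turing's 'programs as data'."""
        while True:
            op, a, b = mem[pc], mem[pc + 1], mem[pc + 2]
            if op == 0:                       # HALT
                return mem
            elif op == 1:                     # COPY  mem[a] <- mem[b]
                mem[a] = mem[b]
            elif op == 2:                     # ADD   mem[a] += mem[b]
                mem[a] += mem[b]
            elif op == 3:                     # JNZ   jump to b if mem[a] != 0
                pc = b - 3 if mem[a] else pc
            pc += 3

    prog = [2, 6, 7,   # ADD: mem[6] += mem[7]
            0, 0, 0,   # HALT
            40, 2]     # data cells: the program can modify these -- or itself
    assert run(list(prog))[6] == 42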

The use of structured knowledge representations—ontologies and terminologies—has become standard in biomedicine. Definitions of ontologies vary widely, as do the values and philosophies that underlie them. In seeking to make these views explicit, we conducted and summarized interviews with a dozen leading ontologists. Their views clustered into three broad perspectives that we summarize as mathematics, computer code, and Esperanto. Ontology as mathematics puts the ultimate premium on rigor and logic, symmetry and consistency of representation across scientific subfields, and the inclusion of only established, non-contradictory knowledge. Ontology as computer code focuses on utility and cultivates diversity, fitting ontologies to their purpose. Like computer languages C++, Prolog, and HTML, the code perspective holds that diverse applications warrant custom designed ontologies. Ontology as Esperanto focuses on facilitating cross-disciplinary communication, knowledge cross-referencing, and computation across datasets from diverse communities. We show how these views align with classical divides in science and suggest how a synthesis of their concerns could strengthen the next generation of biomedical ontologies. PMID:21980276

The author of this column describes several instances in which secure data on computers were compromised. In each of these instances, a different route was involved in gaining access to the secure data--one by office-based theft, one by hacking, and one by burglary. It is proposed that the most difficult factor to guarantee in the protection of…

This proceedings contains papers presented at the 20th anniversary conference of ASCE's Water Resources Planning and Management Division held in Seattle, Washington, May 1-5, 1993. The conference theme is an acknowledgement of the need for water resources professionals to face major challenges in managing diminishing supplies for ever-increasing demands, in protecting valuable watersheds from urban and agricultural pollution, and in building and maintaining critical infrastructure with limited financial resources. The papers in this proceedings reflect a practical, problem-solving focus with emphasis on novel solutions for current and near future challenges. Included are papers from three symposia held as part of the conference: (1) Urban runoff and the Environment, (2) Water Supply and Conservation, and (3) the National Drought Study. Other topics covered include: computer-aided decision support systems, the Endangered Species Act impact on major water systems, international disasters, geographic information systems, global warming, and hydropower planning. Individual papers are processed separately for inclusion in the appropriate data bases.

Diagnosis of observational and climate model data reveals that the two major U.S. droughts of the 20th Century had distinct causes. Drought severity over the Southern Plains during 1946-1956 is very likely attributable to remote influences of global sea surface temperatures (SSTs). The Southern Plains and adjacent Southwest are regions particularly sensitive to SST variability, and strong La Niña events that occurred during 1946-1956 exposed that region's drought vulnerability. Drought severity over the Northern Plains during 1932-1939 was likely triggered instead by random atmospheric variability. The Northern Plains lies within a region of comparatively low sensitivity to SST variability, and that region's drought exhibited little sensitivity to SST conditions during the Dust Bowl period. Our results indicate that the southern portions of the Great Plains lie within an epicenter of potentially skillful drought predictions for which an ocean observing system is also a vital drought early warning system.

This study evaluates changes in genetic penetrance—defined as the association between an additive polygenic score and its associated phenotype—across birth cohorts. Situating our analysis within recent historical trends in the U.S., we show that, while height and BMI show increasing genotypic penetrance over the course of 20th Century, education and heart disease show declining genotypic effects. Meanwhile, we find genotypic penetrance to be historically stable with respect to depression. Our findings help inform our understanding of how the genetic and environmental landscape of American society has changed over the past century, and have implications for research which models gene-environment (GxE) interactions, as well as polygenic score calculations in consortia studies that include multiple birth cohorts. PMID:27456657

The EMBnet Conference 2008, focusing on 'Leading Applications and Technologies in Bioinformatics', was organized by the European Molecular Biology network (EMBnet) to celebrate its 20th anniversary. Since its foundation in 1988, EMBnet has been working to promote collaborative development of bioinformatics services and tools to serve the European community of molecular biology laboratories. This conference was the first meeting organized by the network that was open to the international scientific community outside EMBnet. The conference covered a broad range of research topics in bioinformatics with a main focus on new achievements and trends in emerging technologies supporting genomics, transcriptomics and proteomics analyses such as high-throughput sequencing and data managing, text and data-mining, ontologies and Grid technologies. Papers selected for publication, in this supplement to BMC Bioinformatics, cover a broad range of the topics treated, providing also an overview of the main bioinformatics research fields that the EMBnet community is involved in. PMID:19534734

Cajal-Retzius (CR) cells were discovered at the end of the 19th century but, surprisingly, the exploration of their physiological properties is only now beginning, as we near the end of the 20th century. A few papers addressing these properties have appeared recently, but incomplete data generally give the arguably misleading impression that CR cells are similar to other neocortical neurons, and therefore may perform analogous functions. It is one of the motives of this review to dispel such conceptions. Although CR cells display features of 'regular' neurons, including excitability and responsiveness to neurotransmitters, their function is probably limited to the primary implementation of cortical circuits. A strong indication in support of this idea is the fact that CR cells appear at the onset of neocorticogenesis and disappear at the end of neuronal migration. PMID:10600996

The transformations of air pollution in the 20th century are well known. The century opened with urban atmospheres polluted by the combustion products of burning coal: smoke and sulfur dioxide. At the millennium these pollutants had almost vanished, replaced by pollutants, both primary and secondary, arising from fossil-fuelled vehicles. The transitions in terms of health outcomes have been equally dramatic. Fine particulate matter causes notable cardiovascular problems such as increased incidence of stroke and heart attack, although the mechanism remains somewhat unclear. Cancer-inducing air pollutants remain a concern, but more recently there has also been rising interest in the presence of neurotoxins and endocrine-disrupting substances in the environment.

Recent estimates of the contribution of glaciers to sea-level rise during the 20th century are strongly divergent. Advances in data availability have allowed revisions of some of these published estimates. Here we show that outside of Antarctica, the global estimates of glacier mass change obtained from glacier-length-based reconstructions and from a glacier model driven by gridded climate observations are now consistent with each other, and also with an estimate for the years 2003-2009 that is mostly based on remotely sensed data. This consistency is found throughout the entire common periods of the respective data sets. Inconsistencies of reconstructions and observations persist in estimates on regional scales.

Roald Amundsen (1872-1928) is best known as a polar explorer, the first to lead a team to the South Pole in 1911. He did, however, have a serious interest in science, in particular, in geomagnetism. His expedition through the Northwest Passage above Canada in 1903 to 1906 and his Maud expedition through the Arctic ice in 1918 to 1925 included full complements of magnetic instrumentation. He and his magnetic researchers collaborated with the Carnegie Institution of Washington's Department of Terrestrial Magnetism and with the Prussian Geomagnetic Observatory in Potsdam for training, instruments, and research programs. Amundsen's expeditions provided magnetic and other geophysical data for important geographical regions, while gaining support for polar and geophysical research generally. His work is part of a broader 20th-century story that includes the International Polar Years and the International Geophysical Year.

Chlorofluorocarbon-11 inventories for the deep Southern Ocean appear to confirm physical oceanographic and geochemical studies in the Southern Ocean, which suggest that no more than 5 x 10(6) cubic meters per second of ventilated deep water is currently being produced. This result conflicts with conclusions based on the distributions of the carbon-14/carbon ratio and a quasi-conservative property, PO(4)(*), in the deep sea, which seem to require an average of about 15 x 10(6) cubic meters per second of Southern Ocean deep ventilation over about the past 800 years. A major reduction in Southern Ocean deep water production during the 20th century (from high rates during the Little Ice Age) may explain this apparent discordance. If this is true, a seesawing of deep water production between the northern Atlantic and Southern oceans may lie at the heart of the 1500-year ice-rafting cycle. PMID:10550046

The use of radioisotope thermoelectric generators (RTGs) as energy conversion devices for spacecraft designed for weak-sunlight environments is discussed. The two upcoming missions Galileo and Ulysses will both use general-purpose heat source RTGs. Two other missions that are planned for the mid-nineties and will carry RTGs onboard are the comet rendezvous asteroid flyby and Cassini. Another mission that might become a program start in the last decade of the 20th century is Solarprobe, which is most likely to use modular RTGs. Several other missions in various planning stages that need RTGs to meet their power requirements are the Mars rover sample return, planetary (Mars) penetrators, microspacecraft, and the Mars Egg. All of these missions are discussed, with their RTG requirements stressed.

Zaumen (1976) found that spontaneous pair production in a uniform magnetic field should be a feasible process for field strengths at least of the order of 10 to the 20th power gauss. This note points out that a magnetic field of this order of magnitude is most unlikely to occur in realistic astrophysical situations because of the large dynamical and quantum-mechanical effects such a field would produce. It is suggested that Zaumen's calculation would probably have little bearing on the suspected evolution of astrophysical systems since other processes (either dynamical or quantum-mechanical) apparently limit the field strength before such high magnetic fields would be reached. An upper limit of about 10 to the 16th power gauss is obtained by considering the isotropy of the 3-K blackbody radiation, the formation of collapsed objects in very high magnetic fields, and magnetic bremsstrahlung processes in quantum electrodynamics.

Historical reviews suggest that tanning first became fashionable in the 1920s or 1930s. To quantitatively and qualitatively examine changes in tanning attitudes portrayed in the popular women's press during the early 20th century, we reviewed summer issues of Vogue and Harper's Bazaar for the years 1920, 1927, 1928, and 1929. We examined these issues for articles and advertisements promoting skin tanning or skin bleaching and protection. We found that articles and advertisements promoting the fashionable aspects of tanned skin were more numerous in 1928 and 1929 than in 1927 and 1920, whereas those promoting pale skin (by bleaching or protection) were less numerous. These findings demonstrate a clear shift in attitudes toward tanned skin during this period. PMID:19846688

Although there are a number of descriptions of 'blood infusion' in antiquity, it was the publication of the discovery of the circulation of blood in 1628 by William Harvey and the work of Christopher Wren and Robert Boyle in 1663 on the infusion of different materials into dogs that paved the way to the possible practical attempts at actual blood transfusion. Although these early experiments, principally by Richard Lower in England and Jean Denis in France provided valuable information regarding inter-species incompatibility and the problems of blood coagulation, it was not until the work of James Blundell in the early part of the 19th century that blood transfusion was used as a means of blood replacement. However, blood transfusion was not to become an accepted therapeutic possibility until the discovery of practical anticoagulation and the ABO blood groups at the start of the 20th century. PMID:23016954

By all sensible metrics, North Atlantic tropical cyclone activity underwent a pronounced decline from the middle of the 20th century through the 1980s, and then recovered. A rich literature is devoted to the causes of this hurricane drought, with some arguing that it is part of a natural, multi-decadal oscillation of North Atlantic climate, and others pointing to time-varying radiative forcing as the main cause. In this work I show that the net variability of Atlantic tropical cyclone activity in the second half of the 20th century has two spectral peaks: one at a period of about a decade and another at a period close to 70 years. I hypothesize that the longer-period variability, of which the drought was a part, was principally owing to time-varying radiative forcing, while the quasi-decadal signal is part of a natural oscillation of North Atlantic climate. To test this hypothesis, I present evidence that the two main contributors to time-varying radiative forcing over the period were CO2 variations and aerosol forcing brought about by a combination of sulfate originating in European sulfur emissions and natural mineral dust from the Sahara. I then show that most of the longer-period variations can be statistically explained by a combination of these two forcing agents, while demonstrating that the spatial pattern of this variability is consistent with the radiative forcing hypothesis. On the other hand, the spatial pattern of the quasi-decadal variations is closely aligned with the EOFs of natural variability in a large suite of unforced, coupled climate models, suggesting that this quasi-decadal signal is part of a natural oscillation of the North Atlantic climate.
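
Identifying such peaks amounts to inspecting a periodogram; a minimal sketch on a synthetic index containing 10-year and 70-year components (the series and amplitudes are assumptions for the example):

    import numpy as np

    def spectral_peaks(x, dt=1.0):
        """Periodogram of an annual activity index; returns periods (yr)
        sorted by spectral power so that decadal and multidecadal peaks
        stand out."""
        x = np.asarray(x, float) - np.mean(x)
        p = np.abs(np.fft.rfft(x)) ** 2
        f = np.fft.rfftfreq(x.size, d=dt)
        order = np.argsort(p[1:])[::-1] + 1      # skip the zero frequency
        return 1.0 / f[order], p[order]

    # Synthetic index with ~10-yr and ~70-yr components plus noise
    rng = np.random.default_rng(2)
    t = np.arange(140)
    x = np.sin(2 * np.pi * t / 10) + 2 * np.sin(2 * np.pi * t / 70) \
        + rng.standard_normal(140)
    periods, power = spectral_peaks(x)   # leading periods near 70 and 10 yr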

The statistical characteristics of the atmospheric internal variability (hereafter weather noise) for surface pressure (PS) in 20th century simulations of a coupled general circulation model are documented. The weather noise is determined from post-industrial (1871-1998) Community Climate System Model 3 simulations by removing the SST-forced and externally forced responses from the total fields. The forced responses are found from atmosphere-only simulations forced by the SST and external forcing of the coupled runs. The spatial patterns of the main modes of weather noise variability are found for boreal winter and summer from empirical orthogonal function (EOF) analyses performed globally and for various regions, including the North Atlantic, the North Pacific, and the equatorial Pacific. The temporal characteristics of the modes are illustrated by power spectra and probability density functions (PDFs) of the principal components (PCs). Our findings show that, for two different realizations of the weather noise, the variability is dominated by large-scale spatial structures that resemble observed patterns, and that their relative amplitudes in the CGCM and AGCM simulations are very similar. The regional expression of the seasonally dependent AO-like or AAO-like dominant global pattern is also found in the regional analyses, giving similar PCs. The PCs in the CGCM and the corresponding SST-forced AGCM simulations are uncorrelated, but the spectra and PDFs of the CGCM and AGCM PCs are similar. The temporal structure of the PCs is white at timescales longer than a few months, so these modes can be thought of as stochastic forcings (in time) for the climate system. The PDFs of the weather noise PCs are not statistically distinguishable from Gaussian distributions with the same standard deviation, and they do not change substantially between the first and second halves of the 20th century.
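
The noise-isolation and EOF steps can be sketched compactly via a singular value decomposition; synthetic arrays stand in for the model fields here, and the real analysis is seasonal and regional:

    import numpy as np

    def eofs(field, n_modes=3):
        """EOF analysis of an (ntime, nspace) anomaly field via SVD.
        Returns spatial patterns, principal components and the fraction of
        variance explained by each retained mode."""
        f = field - field.mean(axis=0)
        u, s, vt = np.linalg.svd(f, full_matrices=False)
        pcs      = u[:, :n_modes] * s[:n_modes]      # PC time series
        patterns = vt[:n_modes]                      # spatial modes
        var_frac = s[:n_modes] ** 2 / np.sum(s ** 2)
        return patterns, pcs, var_frac

    # Weather noise as in the study: coupled-run PS minus the forced response
    # estimated from companion atmosphere-only runs (synthetic stand-ins)
    rng    = np.random.default_rng(4)
    ps_cpl = rng.standard_normal((1536, 500))        # (months, grid points)
    forced = rng.standard_normal((1536, 500)) * 0.3
    patterns, pcs, var_frac = eofs(ps_cpl - forced)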

The author makes an attempt at considering the most important achievements in psychiatry which have taken place on the global scale during the passing century, and the direction taken up by the development of global psychiatry in the coming century. The 20th century was characterised not only by the impressive development of science, thanks to which completely new possibilities opened up for global psychiatry, but also by the presence of extreme events that took place as a result of false ideologies such as fascism and communism. The 20th century saw homicide, holocaust, the extermination of the mentally ill, and experiments on people that are prohibited by the ethics of medicine and the elementary rules of humanity. The paper covers the achievements in the diagnosis and therapy of psychiatric disorders and the latest organisational solutions, as well as the perspectives for further development of psychiatry. The author also outlines the aims that psychiatry has to take up due to the numerous threats from our civilization: technical-technological development, pollution of the natural environment, negative changes in human values, rising brutality in interhuman relations due to racial, national and religious conflicts and terrorism, the disappearing feeling of being safe in society, the prospect of hunger and poverty in many countries on several continents, the danger of epidemic outbreaks of new unknown diseases caused by viral mutations, and the possible negative effects of genetic engineering (cloning of humans), etc. The author tries to define the role of psychiatry in preventing the threats of civilization. PMID:9920993

Fire is an integral Earth System process that interacts with climate in multiple ways. Here we assessed the parametrization of fires in the Community Land Model (CLM-CN) and improved the ability of the model to reproduce contemporary global patterns of burned area and fire emissions. In addition to wildfires, we extended CLM-CN to account for fires related to deforestation. We compared contemporary fire carbon emissions predicted by the model to satellite-based estimates in terms of magnitude and spatial extent as well as interannual and seasonal variability. Long-term trends during the 20th century were compared with historical estimates. Overall we found the best agreement between simulation and observations for the fire parametrization based on the work of Arora and Boer (2005). We obtained substantial improvement when we explicitly considered human-caused ignition and fire suppression as functions of population density. Simulated fire carbon emissions ranged between 2.0 and 2.4 Pg C/year for the period 1997-2004. Regionally, the simulations had a low bias over Africa and a high bias over South America when compared to satellite-based products. The net terrestrial carbon source due to land use change for the 1990s was 1.2 Pg C/year, with 11% stemming from deforestation fires. During 2000-2004 this flux decreased to 0.85 Pg C/year, with a similar relative contribution from deforestation fires. Between 1900 and 1960 we predicted a slight downward trend in global fire emissions, caused by reduced fuels as a consequence of wood harvesting and also by increases in fire suppression. The model predicted an upward trend during the last three decades of the 20th century as a result of climate variations and large burning events associated with ENSO-induced drought conditions.
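
The population-density dependence can be sketched with schematic saturating functions; the functional forms and coefficients below are illustrative stand-ins, not the calibrated CLM-CN terms:

    import numpy as np

    def escaped_fires(pop_density, lightning, i_max=0.2, alpha=0.05,
                      s_max=0.95, p_half=30.0):
        """Schematic human ignition and suppression as monotone, saturating
        functions of population density P (people/km^2): ignitions rise with
        P but saturate; the suppressed fraction also rises with P.
        All coefficients are illustrative."""
        ignitions  = lightning + i_max * (1.0 - np.exp(-alpha * pop_density))
        suppressed = s_max * pop_density / (pop_density + p_half)
        return ignitions * (1.0 - suppressed)    # fires that escape control

    pop   = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])
    fires = escaped_fires(pop, lightning=0.05)   # peaks at intermediate density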

Identifies reasons for using computers to teach world history. Discusses how instructors can acquire and use digital classroom resources. Describes how to develop and use online courses and course Web pages. (PAL)

The emergence of new technologies such as three-dimensional virtual worlds brings new opportunities for teaching and learning. We conducted an action research approach to the analysis of how teaching and learning of computer programming at the university level could be developed within the Second Life virtual world. Results support the notion that…

Welcome to our special issue on fibre optic sensors. Fibre optic sensors were first suggested in the patent literature in the mid 1960s as an innovative means of making measurements. That early proposal described a high-precision surface finish measurement tool and resulted in an instrument that remains available today. Much has happened since, with significant innovation in the techniques through which light propagating whilst guided in a fibre can be unambiguously, repeatedly and predictably modulated in response to an external phenomenon. The technique offers not only the precision mentioned earlier but also inherent electromagnetic immunity, the capability to sense at long distances, light weight, small size and a multiplicity of network architectures, all of which can be interrogated from a single point. Even so, fibre sensing is a niche technology, attractive only when its very special features offer substantial user benefit. There are, however, many such niches, exemplified in the electrical power supply industry, in gyroscopes for navigational instruments, and in hydrophones and geophones. Then there are the distributed sensing architectures that measure the pressure, strain and temperature fields affecting the optical properties of the fibre itself, mapping these parameters as a function of position along fibre lengths of up to many tens of kilometres. The fibre sensing concept spawned its own research community, and the international conference on Optical Fibre Sensors, first held in London in 1983, grew into a series travelling from Europe to the Americas and into the Asia-Pacific region. The 20th in the series took place in Edinburgh at the end of 2009, and this special issue of Measurement Science and Technology presents extended versions of some of the papers that first appeared at the conference. The science and technology of fibre sensing have evolved significantly over the history of the conference, drawing on developments in optical

The response of the ionosphere to solar eclipses was investigated. Maximum observable frequencies were analyzed during two eclipses, on 29th March 2006 and 20th March 2015, on several oblique sounding paths which were within the range of solar flux obscuration. A model describing local changes in the ionosphere caused by the obscuration of the solar flux during an eclipse is suggested. The computer simulation of HF radiowave propagation during the eclipses was carried out on the basis of this model, while the quiet ionosphere was described by the IRI-2012 model. It is shown that this approach gives an adequate description of the HF channel during eclipses for all propagation paths under consideration, with the parameters of the model kept the same for all paths. The computer simulation yielded the time delays of the ionospheric responses during the eclipses (~1800-2000 s). It was found that the maximum depletion of electron concentration reached 85% in the D-region for both eclipses. The electron density depletions at the height of the F2 peak were 48% and 34% for the eclipses of 29th March 2006 and 20th March 2015, respectively.

The processes that permitted a few SIV strains to emerge epidemically as HIV groups remain elusive. Paradigmatic theories propose factors that may have facilitated adaptation to the human host (e.g., unsafe injections), none of which provides a coherent explanation for the timing, geographical origin, and scarcity of epidemic HIV strains. Our updated molecular clock analyses established relatively narrow time intervals (roughly 1880–1940) for the major SIV transfers to humans. Factors that could favor HIV emergence in this time frame may have been genital ulcer disease (GUD), resulting in high HIV-1 transmissibility (4–43%), largely exceeding parenteral transmissibility; lack of male circumcision, increasing male HIV infection risk; and gender-skewed city growth, increasing sexual promiscuity. We surveyed colonial medical literature reporting incidences of GUD for the relevant regions, concentrating on cities, which suffer fewer reporting biases than rural areas. Coinciding in time with the origin of the major HIV groups, colonial cities showed intense GUD outbreaks with incidences 1.5–2.5 orders of magnitude higher than in the mid 20th century. We surveyed the ethnographic literature and concluded that male circumcision frequencies were lower in the early 20th century than nowadays, with low rates correlating spatially with the emergence of HIV groups. We developed computer simulations to model the early spread of HIV-1 group M in Kinshasa before, during and after the estimated origin of the virus, using parameters derived from the colonial literature. These confirmed that the early 20th century was particularly permissive for the emergence of HIV by heterosexual transmission. The strongest potential facilitating factor was high GUD levels. Remarkably, the direct effects of city population size and circumcision frequency seemed relatively small. Our results suggest that intense GUD in promiscuous urban communities was the main factor driving HIV emergence. Low circumcision rates
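
To illustrate why higher per-contact transmissibility is so decisive for emergence, here is a minimal branching-process sketch; the transmissibility values and contact numbers are purely illustrative, not the parameters of the study's Kinshasa simulations:

    import numpy as np

    def emergence_probability(r0, iters=200):
        """Probability that one index case sparks a major outbreak, for a
        branching process with Poisson offspring (mean r0): 1 - q, where
        the extinction probability q solves q = exp(r0 * (q - 1))."""
        q = 0.5
        for _ in range(iters):
            q = np.exp(r0 * (q - 1.0))
        return 0.0 if r0 <= 1.0 else 1.0 - q

    # Illustrative only: per-act transmissibility (with and without a
    # GUD boost) times a hypothetical number of effective contacts
    contacts = 40
    for beta, label in [(0.004, "no GUD"), (0.08, "GUD-boosted")]:
        print(label, round(emergence_probability(beta * contacts), 3))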

The use of mathematical models of natural phenomena has underpinned science and engineering for centuries, but until the advent of modern computers and computational methods, the full utility of most of these models remained outside the reach of the engineering communities. Since World War II, advances in computational methods have transformed the way engineering and science are undertaken throughout the world. Today, theories of the mechanics of solids and fluids, electromagnetism, heat transfer, plasma physics, and other scientific disciplines are implemented through computational methods in engineering analysis, design, and manufacturing, and in studying broad classes of physical phenomena. The discipline concerned with the application of computational methods is now a key area of research, education, and application throughout the world. In the early 1980s, the International Association for Computational Mechanics (IACM) was founded to promote activities related to computational mechanics and has made impressive progress. The most important scientific event of the IACM is the World Congress on Computational Mechanics. The first was held in Austin (USA) in 1986, followed by Stuttgart (Germany) in 1990, Chiba (Japan) in 1994, Buenos Aires (Argentina) in 1998, Vienna (Austria) in 2002, Beijing (China) in 2004, Los Angeles (USA) in 2006 and Venice (Italy) in 2008. The 9th World Congress on Computational Mechanics is held in conjunction with the 4th Asian Pacific Congress on Computational Mechanics under the auspices of the Australian Association for Computational Mechanics (AACM), the Asian Pacific Association for Computational Mechanics (APACM) and the International Association for Computational Mechanics (IACM). The 1st Asian Pacific Congress was held in Sydney (Australia) in 2001, then in Beijing (China) in 2004 and Kyoto (Japan) in 2007. The WCCM/APCOM 2010 publications consist of a printed book of abstracts given to delegates, along with 247 full-length peer-reviewed papers published with

A cluster of exceptionally large earthquakes in the interior of Asia occurred from 1905 to 1967: the 1905 M8.0 Tsetserleg and M8.4 Bolnai earthquakes, the 1931 M8.0 Fu Yun earthquake, the 1957 M8.1 Gobi-Altai earthquake, and the 1967 M7.1 Mogod earthquake. Each of the larger (M>8) earthquakes involved strike-slip faulting averaging more than 5 meters of slip and rupture lengths of several hundred km. Available geologic data indicate that long-term slip rates are small, no more than 1-2 mm/yr, on each of the east-west trending faults which ruptured in the Bolnai and Gobi-Altai earthquakes, suggesting that earthquakes as large as those of the 20th century have repeat times of several thousand years. This raises the question of how so many large earthquakes could occur in this region within a 62-year interval, especially since distances of about 400 km separate the respective rupture areas. We propose that the occurrences of these and many smaller earthquakes are related, and that their locations and styles of faulting are controlled to a large extent by stress changes generated by the compounded static deformation of the preceding earthquakes and the subsequent viscoelastic relaxation of the lower crust and upper mantle beneath Mongolia. We first estimate a spherically layered viscoelastic model constrained by the 1997-2001 GPS velocity field in western Mongolia [Vergnolle et al., 2001]. A salient feature of the viscoelastic model is a low-viscosity ductile region occupying roughly the top 200 km of the mantle. Using the succession of 20th century earthquakes as sources of deformation, we then analyze the time-dependent change in Coulomb failure stress (ΔCFF). Stress evolution in the viscoelastic model consists of localized large stress steps at the times of the source earthquakes, followed by diffusion of stress pulses away from these sources. At remote interaction distances, static ΔCFF values are very small. However, modeled postseismic
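
The stress metric itself is a one-line formula; a minimal sketch with an assumed effective friction coefficient (all values illustrative):

    def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
        """Change in Coulomb failure stress on a receiver fault:
        dCFF = d_tau + mu' * d_sigma_n, with the shear stress change d_tau
        resolved in the slip direction and d_sigma_n positive for
        unclamping; mu' is the effective (pore-pressure corrected)
        friction coefficient."""
        return d_shear + mu_eff * d_normal

    # e.g. 0.05 MPa of added shear plus 0.02 MPa of unclamping
    dcff = coulomb_stress_change(0.05, 0.02)   # +0.058 MPa: closer to failure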

With the advent of the British National Curriculum, computer-based modeling has become an integral part of the school curriculum. This book is about modeling in education and providing children with computer tools to create and explore representations of the world. Members of the London Mental Models Group contributed their research: (1)…

In case of fire in a building, how will people behave in the crowd? The behavior of each individual affects the behavior of others and, conversely, each one behaves considering the crowd as a whole as well as the individual others. In this article, I propose a three-step method to explore a brand new way to study behavior dynamics. The first step relies on the creation of specific situations with standard techniques (such as mental imagery, text, video, and audio) and an advanced technique, Virtual Reality (VR), to manipulate experimental settings. The second step concerns the measurement of behavior in one, two, or many individuals, focusing on parameter extraction to provide information about the behavior dynamics. Finally, the third step uses the parameters collected and measured in the previous two steps to simulate possible scenarios with computational models, in order to forecast, understand, and explain behavior dynamics at the social level. An experimental study is also included to demonstrate the three-step method and a possible scenario. PMID:26594193

Santorini, the famous stratovolcano in the Aegean Sea, erupted during three periods of the 20th century (1925-1928, 1939-1941, 1950) and has remained dormant since. This study tried to combine and evaluate new and published volcanological, petrological, geochemical, environmental and sociological data on these three phases of Santorini's activity, which was practically restricted to the caldera center on the Nea Kameni Islet. After fieldwork on the dacite flows, pyroclastics and domes that formed, representative rock samples and enclaves were collected and investigated for their texture, physical parameters, mineralogy and chemical composition by polarizing light microscopy, scanning electron microscopy (SEM-EDS), XRD, Raman spectroscopy and ICP-MS. The petrogenetic evaluation of the data obtained suggests slight but significant changes in the solid and aerial phases produced during the three explosion stages, which can be attributed to minor variations in the magmatic differentiation and the physicochemical conditions of the magma chamber. These variations were also expressed in the decreasing duration and intensity of the eruptions, as well as in their volumes of ejecta and lava. The subsequent, relatively long dormant period of the volcano is probably also related to this decreasing tendency. Comparative results collected from the scientific literature, old photographs, and local and regional press and state documents from the different periods of volcanism record the past hazard scenarios and civil defense planning for the individual eruptions. As part of the disaster-management assessment, a pilot survey was also conducted, based on personal interviews with aged local islanders who were eyewitnesses of the events and with elderly people or tourists who experienced them indirectly or had heard about them. This event-tracing, along with air pollution software models using volcanological data, has shown the social impacts and the environmental consequences of the volcanic

studied regions. Comparing all the flood reconstructions over the last 250 years aims at tracking these atmospheric pathways and their possible changes over time. Strong similarities in flood frequency are observed from 1750 to 1900 for the western part (Cévennes, Southern French Alps), while no convincing correlation appears between the other records. Around 1900, a drastic change appears with strong similarities between records of the eastern part (Southern Alps, SE French Alps and NW Italian Alps). In particular, the flood frequency largely increased in the Cévennes during the first part of the 20th century, while this period is one of the most 'quiet' in all other records. Hence, these results suggest a reorganization of the flood-prone atmospheric patterns at the onset of the 20th century.

The Tibetan Plateau has experienced significant increases in temperature over the 20th century, but trends in precipitation are less clear, as station precipitation records are sparse and satellite observations only extend back to 1979. Here we use the sediment record from Ngamring Co, a closed-basin, freshwater lake in south-central Tibet, to assess summer precipitation over the last century. Ngamring Co is located in a watershed without glaciers, so recent changes in runoff and lake level are independent of the influence of glacial ice volume. The first principal component of the Ngamring Co grain size dataset is highly correlated with median grain size and covaries significantly with local July-August precipitation from the CPC Merged Analysis of Precipitation (CMAP). From 1979 to 2007, median grain size decreases with increasing July-August precipitation and increases with decreasing July-August precipitation. There is prominent multidecadal variability in the 20th century grain size record, including a gradual decline in median grain size from 1900 to 1930, a gradual increase from 1930 to 1990, and a rapid decline since 1990. Median grain size values from 2000-2007 are the lowest in the 107-year record, suggesting that the most abundant monsoon precipitation in the last 107 years has occurred in the last decade. Satellite images of the lake confirm an increase in lake area since the early 1990s but also show that the greatest lake area occurred in the 1970s. Thus, although summer precipitation and lake area do covary, precipitation does not seem to control grain size by influencing lake area. We presently hypothesize that increased July-August precipitation causes enhanced erosion of the fine-grained sediment (likely paleolake sediments and loess) that blankets the slopes surrounding the lake. The ensuing runoff and deposition of this sediment into the lake then results in a decline in median grain size within the lake
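
As a sketch of the statistic used above, the first principal component of a depth-by-bin grain-size matrix can be obtained from the SVD of the centered data and then correlated against a July-August precipitation series. The data here are random stand-ins, not the Ngamring Co or CMAP records.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2008)
grain = rng.normal(size=(years.size, 20))   # rows: years; cols: grain-size bins
precip_ja = rng.normal(size=years.size)     # stand-in July-August precipitation

# First principal component via SVD of the centered matrix.
anom = grain - grain.mean(axis=0)
u, s, _ = np.linalg.svd(anom, full_matrices=False)
pc1 = u[:, 0] * s[0]

print(f"PC1 vs July-August precipitation: r = {np.corrcoef(pc1, precip_ja)[0, 1]:.2f}")
```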

The IAP/LASG GOALS coupled model is used to simulate the climate change of the 20th century using historical greenhouse gas concentrations, the mass mixing ratio of sulfate aerosols simulated by a CTM model, and a reconstruction of solar variability spanning the period 1900 to 1997. Four simulations, comprising a control simulation and three forcing simulations, are conducted. Comparison with the observational record for the period indicates that the three forcing experiments simulate reasonable temporal and spatial distributions of the temperature change. The global warming during the 20th century is caused mainly by increasing greenhouse gas concentrations, especially since the late 1980s; sulfate aerosols offset a portion of the global warming, reducing global temperature by up to about 0.11°C over the century; additionally, the effect of solar variability is not negligible in the simulation of climate change over the 20th century.

Zika virus (ZIKV) is a mosquito-borne flavivirus first isolated in Uganda in 1947. Although entomological and virologic surveillance have reported ZIKV enzootic activity in diverse countries of Africa and Asia, few human cases were reported until 2007, when a Zika fever epidemic took place in Micronesia. In the context of West Africa, the WHO Collaborating Centre for Arboviruses and Hemorrhagic Fever at Institut Pasteur of Dakar (http://www.pasteur.fr/recherche/banques/CRORA/) reports the periodic circulation of ZIKV since 1968. Despite several reports on ZIKV, the genetic relationships among viral strains from West Africa remain poorly understood. To evaluate the viral spread and its molecular epidemiology, we investigated 37 ZIKV isolates collected from 1968 to 2002 in six localities in Senegal and Côte d'Ivoire. In addition, we included strains from six other countries. Our results suggested that these two countries in West Africa experienced at least two independent introductions of ZIKV during the 20th century, and that apparently these viral lineages were not restricted by mosquito vector species. Moreover, we present evidence that ZIKV has possibly undergone recombination in nature and that a loss of the N154 glycosylation site in the envelope protein was a possible adaptive response to the Aedes dalzieli vector. PMID:24421913

Various studies have documented the effects of modern-day irrigation on regional and global climate, but none, to date, have considered the time-varying impact of steadily increasing irrigation rates on climate during the 20th century. We investigate the impacts of observed irrigation changes over this century with two ensemble simulations using an atmosphere general circulation model. Both ensembles are forced with transient climate forcings and observed sea surface temperatures from 1902 to 2000; one ensemble includes irrigation specified by a time-varying data set of irrigation water withdrawals. Early in the century, irrigation is primarily localized over southern and eastern Asia, leading to significant cooling in boreal summer (June-August) over these regions. This cooling spreads and intensifies by century's end, following the rapid expansion of irrigation over North America, Europe, and Asia. Irrigation also leads to boreal winter (December-February) warming over parts of North America and Asia in the latter part of the century, due to enhanced downward longwave fluxes from increased near-surface humidity. Precipitation increases occur primarily downwind of the major irrigation areas, although precipitation in parts of India decreases due to a weaker summer monsoon. Irrigation begins to significantly reduce temperatures and temperature trends during boreal summer over the Northern Hemisphere midlatitudes and tropics beginning around 1950; significant increases in precipitation occur in these same latitude bands. These trends reveal the varying importance of irrigation-climate interactions and suggest that future climate studies should account for irrigation, especially in regions with unsustainable irrigation resources.

Methane and ethane are the most abundant hydrocarbons in the atmosphere, and they impact both atmospheric chemistry and climate. Both gases are emitted from fossil fuels and biomass burning, while methane alone has large sources from wetlands, agriculture, landfills and wastewater. Here we use measurements in firn air from Greenland and Antarctica to reconstruct the atmospheric variability of ethane during the 20th century. Ethane levels rose from early in the century until the 1980s, when the trend reversed, with a period of decline over the next 20 years. This variability is primarily driven by changes in ethane emissions from fossil fuels, which peaked in the 1960s and 1970s at 14-16 Tg/yr and dropped to 8-10 Tg/yr before the end of the century. The reduction in fossil-fuel sources is likely related to changes in light hydrocarbon recovery during petroleum production and use. The ethane-based emission history implies that the decline in the fossil-fuel source of methane may have started prior to the 1980s and that the magnitude of the decline is larger than previous estimates.
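
Because ethane's atmospheric lifetime (roughly two months) is short compared with the decadal emission changes discussed above, its burden tracks emissions quasi-instantaneously, burden ≈ E·τ in a one-box picture. The sketch below illustrates this with an invented emission curve loosely shaped like the reconstruction (peaking in the late 1960s); none of the numbers are the paper's values.

```python
import numpy as np

years = np.arange(1900, 2001)
# Hypothetical fossil-fuel ethane emissions (Tg/yr): rising through the century,
# peaking near 1968, then declining toward ~8 Tg/yr by 2000.
E = 7.0 + 9.0 * np.exp(-((years - 1968) / 22.0) ** 2)

# One-box quasi-steady state: with lifetime tau much shorter than the emission
# timescale, burden(t) ~= E(t) * tau.
tau = 0.2                                   # yr (~2-3 month lifetime)
burden = E * tau                            # Tg
print(f"peak burden {burden.max():.1f} Tg in {years[burden.argmax()]}")
```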

The unique observing conditions allowed by total solar eclipses made them a highly desirable target of 19th and early 20th century astronomical expeditions, particularly after 1842. Due to the narrowness of the lunar shadow at the Earth's surface, this usually implied traveling to faraway locations with all the attendant inconveniences, in particular high costs and complex logistics, a situation that improved as travel became faster, cheaper and more reliable. The possibility of observing an eclipse in one's own country implied no customs, no language barriers, usually shorter travelling distances and the likely support of local and central authorities. The proximity of an eclipse also provided a strong argument to press the government to support its observation. Sometimes the scientific elite would use such high-profile events to rhetorically promote broader goals. In this paper we analyse the motivation, goals, negotiating strategies and outcomes of the Portuguese eclipse expeditions made between 1860 and 1914. We focus, in particular, on the observations of the solar eclipses of 22 December 1870 and 17 April 1912. The former allowed the start-up of astrophysical studies in the country, while the film obtained at the latter led Francisco da Costa Lobo to unexpectedly propose a polar flattening of the Moon.

Solar radiation sensors can be carried on standard weather balloon packages and provide additional information about the atmosphere's vertical structure beyond the traditional thermodynamic measurements [1]. An interesting set of circumstances for such sensors occurs during a solar eclipse, which provides a rapidly changing solar radiation environment within the duration of a typical free balloon flight. Coordinating several launches of solar radiation measuring radiosondes brings a good likelihood of at least one being above any cloud during the maximum eclipse, allowing solar eclipse radiation measurements for comparison with theory. For the 20th March 2015 solar eclipse, a coordinated campaign of balloon-carried solar radiation measurements was undertaken from Reading (51.44N, 0.94W), Lerwick (60.15N, 1.13W) and Reykjavik (64.13N, 21.90W), straddling the path of the eclipse. All three balloons reached sufficient altitude at the eclipse time for eclipse-induced variations in solar radiation and solar limb darkening to be measured above cloud. Because the sensor platforms were free to swing, techniques have been evaluated to correct the measurements for their changing orientation. These approaches, which are essentially independent, give values that agree with theoretical expectations for the eclipse-induced radiation changes. [1] K.A. Nicoll and R.G. Harrison, Balloon-borne disposable radiometer, Rev. Sci. Instrum. 83, 025111 (2012), doi: 10.1063/1.3685252
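
A common way to handle a swinging sensor, sketched below, is to compute the cosine of the angle between the instantaneous sensor normal and the sun vector and rotate the measurement back to a horizontal reference. This is a generic direct-beam correction under assumed orientation angles, not necessarily the specific technique evaluated in the campaign.

```python
import numpy as np

def unit_vector(zenith_deg, azimuth_deg):
    """Unit vector from zenith/azimuth angles (degrees), z-axis pointing up."""
    z, a = np.radians(zenith_deg), np.radians(azimuth_deg)
    return np.array([np.sin(z) * np.sin(a), np.sin(z) * np.cos(a), np.cos(z)])

def to_horizontal(measured, tilt_deg, tilt_az_deg, sun_zen_deg, sun_az_deg):
    """Convert a tilted direct-beam measurement to its horizontal equivalent.

    Assumes the signal is dominated by the direct beam; diffuse light would
    need separate treatment. All angles are hypothetical inputs in degrees.
    """
    cos_inc = unit_vector(tilt_deg, tilt_az_deg) @ unit_vector(sun_zen_deg, sun_az_deg)
    cos_inc = max(cos_inc, 0.05)      # guard against grazing-incidence blow-up
    return measured / cos_inc * np.cos(np.radians(sun_zen_deg))

# Example: 412 W/m^2 seen by a sensor tilted 15 deg while the sun is at 60 deg zenith.
print(round(to_horizontal(412.0, 15.0, 90.0, 60.0, 180.0), 1))
```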

Tornadoes have occurred in the territory of the Czech Republic throughout history. Although their frequency and intensity are not as high as in the USA, they can cause severe damage as well. That is why a systematic effort to document individual occurrences of this dangerous meteorological phenomenon as far back into the past as possible began in the 1990s. The aim of this investigation is to extend the first European catalogue of tornadoes, originally published by Wegener [Wegener A., 1917. Wind- und Wasserhosen in Europa], by the addition of cases from the Czech Republic. This paper adds further to Setvák, Šálek and Munzar [Setvák M., Šálek M., Munzar J., 2003. Tornadoes within the Czech Republic—from medieval chronicles to the internet society. Atmos. Res. 67-68, 589-605], who reported the earliest documented tornado in the land of the Czech Republic, which occurred in AD 1119 in Prague. In so doing, it presents recently discovered tornado cases from the 16th to the early 20th centuries, found in a variety of historical sources since the last ECSS conference held in Prague in 2002. In particular we focus on the case from Jablonec nad Nisou (northern Bohemia) in 1925, which was probably the first case in the Czech Republic with accompanying photographic documentation of damage caused by a tornado.

Interdisciplinary frameworks for studying natural hazards and their temporal trends have an important potential in data generation for risk assessment, land use planning, and therefore the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events for the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardized criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, Llobregat and Segre floods in Spain, from 1300 to 1980. The analysis of the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred during the 20th century in these rivers. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900 for the Segre and Ter rivers.
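
A standardized historical flood classification of the kind mentioned above can be reduced, schematically, to a few ordered categories driven by documentary damage descriptors. The sketch below uses the ordinary / extraordinary / catastrophic ladder common in historical hydrology; the exact criteria applied to the Isère, Drac, Ter, Llobregat and Segre series are not reproduced here, so the thresholds are illustrative only.

```python
def classify_flood(overbank: bool, severe_damage: bool, regional_extent: bool) -> str:
    """Toy standardized classification of a documented historical flood.

    Illustrative descriptors only; real schemes weigh water levels, damage
    reports and spatial extent as recorded in the documentary sources.
    """
    if overbank and severe_damage and regional_extent:
        return "general catastrophic flood"
    if overbank and severe_damage:
        return "catastrophic flood"
    if overbank:
        return "extraordinary flood"
    return "ordinary flood"

print(classify_flood(True, True, True))    # -> general catastrophic flood
print(classify_flood(True, False, False))  # -> extraordinary flood
```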

Measles mortality fell prior to the introduction of vaccines or antibiotics. By examining historical mortality reports, we sought to determine how much measles mortality was due to epidemiological factors such as isolation from major population centres or increased age at the time of infection. Age-specific records were available from Aberdeen, Scotland; New Zealand; and the states of Australia at the end of the 19th and beginning of the 20th centuries. Despite the relative isolation of Australia, measles mortality was concentrated in very young children, as in Aberdeen. In the more isolated states of Tasmania, Western Australia and Queensland, adults made up 14-15% of measles deaths, as opposed to 8-9% in Victoria, South Australia and New South Wales. Mortality in Iceland and the Faroe Islands during the 1846 measles epidemic was used as an example of islands isolated from respiratory pathogens. The transition from crisis mortality across all ages to deaths concentrated in young children occurred prior to the earliest age-specific mortality data collected. Factors in addition to adult age at infection and epidemiological isolation, such as nutritional status and viral virulence, may have contributed to measles mortality outcomes a century ago. PMID:25865777

A high-resolution record of polycyclic aromatic hydrocarbon (PAH) deposition in Rhode Island over the past approximately 180 years was constructed using a sediment core from the anoxic Pettaquamscutt River basin. The record showed significantly more structure than has hitherto been reported and revealed four distinct maxima in PAH flux. The characteristic increase in PAH flux at the turn of the 20th century was captured in detail, leading to an initial maximum prior to the Great Depression. The overall peak in PAH flux in the 1950s was followed by a maximum that immediately preceded the 1973 Organization of Petroleum Exporting Countries (OPEC) oil embargo. During the most recent portion of the record, an abrupt increase in PAH flux between 1996 and 1999 was found to follow a period of near-constant fluxes. Because source-diagnostic ratios indicate that petrogenic inputs are minor throughout the record, these trends are interpreted in terms of past variations in the magnitude and type of combustion processes. For the most recent PAH maximum, energy consumption data suggest that diesel fuel combustion, and hence traffic of heavier vehicles, is the most probable cause of the increase in PAH flux. Systematic variations in the relative abundance of individual PAHs in conjunction with the above changes in flux are interpreted in relation to the evolution of combustion processes. Coronene, retene, and perylene are notable exceptions, exhibiting unique down-core profiles. PMID:12542290

The Sahel, the transition zone between the Saharan desert and the rainforests of Central Africa and the Guinean Coast, experienced a severe drying trend from the 1950s to the 1980s, from which there has been partial recovery. Continuation of either the drying trend or the more recent ameliorating trend would have far-ranging implications for the economy and ecology of the region. Coupled atmosphere/ocean climate models being used to simulate the future climate have had difficulty simulating Sahel rainfall variations comparable to those observed, thus calling into question their ability to predict future climate change in this region. We describe simulations using a new global climate model that capture several aspects of the 20th century rainfall record in the Sahel. An ensemble mean over eight realizations shows a drying trend in the second half of the century of nearly half of the observed amplitude. Individual realizations can be found that display striking similarity to the observed time series and drying pattern, consistent with the hypothesis that the observations are a superposition of an externally forced trend and internal variability. The drying trend in the ensemble mean of the model simulations is attributable to anthropogenic forcing, partly to an increase in aerosol loading and partly to an increase in greenhouse gases. The model projects a drier Sahel in the future, due primarily to increasing greenhouse gases. PMID:16322101

Since Dr. Fogh-Andersen's legendary 1942 thesis, the Danish facial cleft population has been one of the most extensively studied in terms of epidemiology and genetic epidemiology. The etiology of cleft lip and/or palate (CLP) is still largely an enigma, and different results concerning environmental and genetic risk factors are obtained in different countries and regions. This may be due to etiological heterogeneity between settings. Therefore, a thoroughly studied area with an ethnically homogeneous population, such as Denmark, has provided one of the best opportunities for progress in CLP etiological research. The present review summarizes epidemiological and genetic-epidemiological studies conducted in the 20th century Danish facial cleft population. Furthermore, analyses of sex differences, time trends and seasonality for more than 7000 CLP cases born in Denmark in the period 1936 to 1987 are presented. The review also points toward the excellent opportunities for continued etiological CLP research in Denmark in the 21st century, using already established resources and an ongoing prospective cohort study of 100,000 pregnant women. PMID:10213053

This paper presents the results of the dental analysis performed on a Khoesan skeletal sample representing the late 19th and early 20th century Cape Colony in southern Africa. Skeletal material from two European collections (Vienna and Paris) was selected to compile a total sample of 116 specimens. Dental pathology frequencies were calculated for caries (28.4%), antemortem tooth loss (37.9%), periapical abscesses (29.3%), periodontal disease (26.7%), calculus (44.0%) and impacted canines (4.3%). Attrition scores indicated that the group under study had an average rate of attrition compared to other southern African populations. Frequency and intensity data were compared to several other samples from both the pre-contact and contact phases by means of chi-squared analysis. The outcome of the study suggested that the group under study was most likely in a state of transition between a diet and lifestyle of hunting-and-gathering and agriculture. Results were also consistent with those of groups from a low socio-economic status. PMID:25882044

Sediment processes in estuaries are controlled by the interaction of factors that include tides, fresh water inputs, bed morphology, sediment supply, and hydrodynamics. The interaction of these factors strongly influences the pattern of sediment deposition. The ability to quantify sediment deposition on a regional scale will improve the understanding of the underlying processes, and provide valuable information for managing estuarine systems. This paper describes our approach for obtaining the deposition pattern and quantifying the amount of 20th century impacted sediments in the Haverstraw Bay section of the Hudson River Estuary. Through the combination of high-resolution seismic data and rapidly acquired geochemical information from numerous sediment cores, we estimate that our study site experiences an average sediment accumulation rate of ~3 mm/y and that ~75,000 t/y, or ~10% of the annual total sediment input measured at the Poughkeepsie, NY gauging station (USGS), is stored in this reach of the Hudson River on ~100 y timescales. A detailed analysis of the depositional pattern indicates that the accumulation rate varies considerably throughout the study area, ranging from non-depositional to >8 mm/y. Our data also clearly indicate that the dredged channel in Haverstraw Bay is currently the main focus of deposition in this area.

The purpose of this investigation is to demonstrate the usability of objective methods to study the variability of precipitation and hence to contribute to a better understanding of the spatial and seasonal variability of Austria's precipitation climate during the 20th century. This is achieved by regionalizing the intra-annual variability of seasonal precipitation distributions during three non-overlapping 33-year samples (1901-33, 1934-66, 1967-99). Monthly precipitation totals were extracted at 31 Austrian stations from a homogenized long-term climate dataset provided by the Austrian weather service. Three statistical techniques, namely cluster analysis (CLA), rotated empirical orthogonal functions (REOFs) and an unsupervised learning procedure of artificial neural networks (ANNs), were utilized to find homogeneous precipitation regions. The results for the summer (June, July, August (JJA)) and winter (December, January, February (DJF)) seasons are presented. The resulting homogeneous precipitation regions depend on season, period and method, in this order. Hence, differences introduced by using different methods are small compared with those inferred by investigating different episodes, and especially with those related to the seasons. During winter, three homogeneous precipitation regions are found, independent of the period considered. These regions can be assigned to the different airflows dominating Austria's climate and triggering precipitation events during the cold season. The situation during summer is more complicated: at least four clusters are necessary to capture the conditions caused by spatially inhomogeneous convective events such as thunderstorms.
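
Of the three techniques named above, cluster analysis is the simplest to sketch: stations are grouped by the shape of their annual precipitation cycle. The snippet below runs k-means on synthetic stand-ins for the 31 stations, with k = 3 and k = 4 echoing the winter and summer results; it is not the study's CLA, REOF or ANN implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# Stand-in for 31 stations x 12 monthly precipitation totals, normalized so
# clustering reflects the shape of the annual cycle rather than its amount.
monthly = rng.gamma(2.0, 40.0, size=(31, 12))
annual_cycle = monthly / monthly.sum(axis=1, keepdims=True)

for k in (3, 4):   # three winter regions; at least four in summer
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(annual_cycle)
    print(f"k={k}: stations per region {np.bincount(labels)}")
```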

Droughts are becoming the most expensive natural disasters in China and have exerted serious impacts on local economic development and the ecological environment. The fifth phase of the Coupled Model Intercomparison Project (CMIP5) provides a unique opportunity to assess scientific understanding of climate variability and change over a range of historical and future periods. In this study, fine-resolution multimodel climate projections over China are developed from 7 CMIP5 climate models under the RCP8.5 emissions scenario by means of bilinear interpolation and bias correction. The downscaled CMIP5 models are evaluated over China by comparing the model outputs with the CRU TS3.1 dataset of the Climatic Research Unit (University of East Anglia) from 1951 to 2000. The outputs of the downscaled models are then used to calculate the Standardized Precipitation Index (SPI), and the SPI time series are used to identify drought from the 20th to the 21st century over China. The results show that most areas of China are projected to become wetter as a consequence of increasing precipitation under the RCP8.5 scenario. Detailed examination shows that the SPI exhibits a slightly increasing trend in the future period for most parts of China, but drought in the southwest region of China will become the norm under the RCP8.5 scenario.
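
The SPI named above standardizes precipitation by fitting a distribution (conventionally a gamma) to the accumulation series and mapping its cumulative probabilities onto a standard normal. A minimal sketch follows, on synthetic data; zero-precipitation handling and multi-month accumulation windows are omitted.

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for a 1-D series of precipitation totals.

    Fits a gamma distribution (location fixed at zero) and transforms the
    cumulative probabilities to standard-normal quantiles.
    """
    a, loc, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(3)
monthly_precip = rng.gamma(2.0, 30.0, size=600)    # synthetic 50-year series
print(spi(monthly_precip)[:5].round(2))            # negative values = drier than normal
```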

We present paleoclimate evidence for rapid (<100 years) shifts of ~2-4°C in Chesapeake Bay (CB) temperature ~2100, 1600, 950, 650, 400 and 150 years before present (years BP), reconstructed from magnesium/calcium (Mg/Ca) paleothermometry. These include large temperature excursions during the Little Ice Age (~1400-1900 AD) and the Medieval Warm Period (~800-1300 AD), possibly related to changes in the strength of North Atlantic thermohaline circulation (THC). Evidence is presented for a long period of sustained regional and North Atlantic-wide warmth with low-amplitude temperature variability between ~450 and 1000 AD. In addition to centennial-scale temperature shifts, the existence of numerous temperature maxima between 2200 and 250 years BP (average ~70 years) suggests that multi-decadal processes typical of the North Atlantic Oscillation (NAO) are an inherent feature of late Holocene climate. However, late 19th and 20th century temperature extremes in Chesapeake Bay associated with NAO climate variability exceeded those of the prior 2000 years, including the interval 450-1000 AD, by 2-3°C, suggesting anomalous recent behavior of the climate system.

Pollock, Warhol, Basquiat and Haring made international reputations with their art, foremost among the American artists of the 20th century, and became pop-cultural icons for the man in the street and for the media alike. In keeping with the habits of the consumer society, their art and even they themselves became products and consumer goods. Their unmistakable individual styles, which also became their trademarks, made that possible. What connects the four artists is that each had a dependent personality, their fine-art activity was disputed in its period, and both they and their artworks ultimately attracted the increased attention of the media. These four artists embody a brand-new artist type who steps into star status; besides the artworks, the artist also comes into the focus of interest. From a psychological perspective, their artworks tell a great deal about their way of life, their personalities, and the social estate around them. All four were catalysts who set new art trends. The influence of Basquiat and Haring has stretched into the 21st century and continues in graffiti street art, which has at last entered "high art" and captivated galleries and critics as well. PMID:20938058

Nowhere is the problem of understanding the complex linkages between organisms and their environments more apparent than in the science of plants. Today, efforts by scientists to predict and manage the biological consequences of shifting global and regional climates depend on understanding how organisms respond morphologically, physiologically, and behaviorally to changes in their environments. Investigating organismal "adaptability" (or "plasticity") is rarely straightforward, prompting controversy and discourse among and between ecologists and agricultural scientists. Concepts like agro-climatic adaptation, phenotypic plasticity, and genotype-environment interaction (GxE) are key to those debates, and their complex histories have imbued them with assumptions and meanings that are consequential but often opaque. This special section explores the diverse ways in which organismal adaptability has been conceptualized and investigated in the second half of the 20th century, and the multifarious political, economic, environmental, and intellectual contexts in which those conceptions have emerged and evolved. The papers in this section bring together perspectives from the histories of agriculture, population ecology, evolutionary theory, and plant physiology, cutting across Asian, North American, and British contexts. As a whole, this section highlights not only the diversity of meanings of "adaptability" and "plasticity," but also the complex linkages between those meanings, the scientific practices and technologies in which they are embedded, and the ends toward which those practices and technologies are employed. PMID:25641218

This study reports on the 1871-2010 trends in significant wave heights (Hs) in the North Atlantic, as statistically reconstructed from the 20th century reanalysis (20CR) ensemble of mean sea level pressure (SLP) fields. The 20CR SLP data set for the North Atlantic has been reported to be homogeneous since 1871, although it has discontinuities before 1949 in other regions. A multivariate regression model with a lagged dependent variable is used to represent the SLP-Hs relationship. It is calibrated and validated using the ERA-Interim reanalysis of Hs and SLP for the period 1981-2010. Trends in the reconstructed annual mean and maximum Hs are found to be consistent with those derived from two dynamical wave reanalysis data sets (MSC50 and ERA40), which indicates the robustness of the trend estimates. The trend patterns of extreme Hs generally feature increases in the northeast North Atlantic with decreases in the mid-latitudes, but there are seasonal variations. The main features of the patterns of trends over the last half century or so are also seen in the full 140-year period (1871-2010). However, the trend magnitudes are much greater in the last half century than over the 140 years as a whole.
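
The statistical model described above, a regression with a lagged dependent variable, has the generic form Hs(t) = a·Hs(t-1) + Σ b_i·X_i(t) + ε with SLP-derived predictors X_i. The sketch below fits that form to synthetic data and recovers the coefficients; the predictors are invented stand-ins, not the study's SLP fields.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
# Invented SLP-derived predictors (e.g. a gradient term and its square).
slp1, slp2 = rng.normal(size=n), rng.normal(size=n)
hs = np.zeros(n)
for t in range(1, n):
    hs[t] = 0.6 * hs[t - 1] + 0.3 * slp1[t] + 0.2 * slp2[t] + rng.normal(0.0, 0.1)

# Regression with lagged dependent variable: Hs(t) ~ 1, Hs(t-1), SLP terms.
X = np.column_stack([np.ones(n - 1), hs[:-1], slp1[1:], slp2[1:]])
beta, *_ = np.linalg.lstsq(X, hs[1:], rcond=None)
print(beta.round(2))   # approximately [0.0, 0.6, 0.3, 0.2]
```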

René Leriche (1879-1955) was a 20th century French surgeon generally known in medicine for the syndrome that carries his name, the Leriche syndrome in aorto-iliac occlusive disease. This paper is published to mark the commemoration of 60 years since Leriche's death. Although Dr. Leriche's legacy resides in the domain of vascular medicine, his research encompassed bone pathology and the surgical management of pain. His surgical training under professors Mathieu Jaboulay and Antonin Poncet, together with his friendship and association with Alexis Carrel and William Halsted, contributed to René Leriche's development as a surgeon, researcher and thinker. Following in the footsteps of his mentors, he produced outstanding clinical and academic work which earned him a good reputation among his students and colleagues. Surgeons such as Jean Kunlin, João Cid dos Santos, Michael DeBakey, René Fontaine and others came to study and learn from him. These future generations of surgeons would themselves contribute much to the understanding and treatment of vascular diseases and medicine in general. René Leriche pioneered medicine with his research and ideas. His assiduous work of teaching, research and clinical practice has made his influence last to the present day. PMID:27004042

Objective. To study the computed tomography (CT) images of royal Ancient Egyptian mummies dated to the 18th to early 20th Dynasties for the claimed diagnoses of ankylosing spondylitis (AS) and diffuse idiopathic skeletal hyperostosis (DISH) and to correlate the findings with the archaeology literature. Methods. We studied the CT images of 13 royal Ancient Egyptian mummies (1492-1153 BC) for evidence of AS and DISH and correlated our findings with the archaeology literature. Results. The findings of the CT scans excluded the diagnosis of AS, based on the absence of sacroiliac joint erosions or fusion of the facet joints. Four mummies fulfilled the diagnostic criteria for DISH: Amenhotep III (18th Dynasty), Ramesses II, his son Merenptah, and Ramesses III (19th to early 20th Dynasties). The diagnosis of DISH, a commonly asymptomatic disease of old age, in the 4 pharaohs is in concordance with their longevity and active lifestyles. Conclusion. CT findings excluded the diagnosis of AS in the studied royal Ancient Egyptian mummies and brought into question the antiquity of the disease. The CT features of DISH during this ancient period were similar to those commonly seen in modern populations, and it is likely that they will also be similar in the future. The affliction of Ramesses II and his son Merenptah supports familial clustering of DISH. The process of mummification may induce changes in the spine that should be considered during investigations of disease in ancient mummies. PMID:25329920

Much of the 20th century multi-decadal variability in the NAO-winter precipitation relationship over the N. Atlantic / European sector can be ascribed to the combined effects of the North Atlantic Oscillation (NAO) and either the East Atlantic pattern (EA) or the Scandinavian pattern (SCA). The NAO, EA and SCA indices employed here are defined as the three leading vectors of the cross-correlation matrix calculated from monthly sea-level pressure anomalies for 138 complete winters from the 20CRv2 dataset (Compo et al., 2011). Winter precipitation data over Europe for the entire 20th century are derived from the high-resolution CRU-TS3.1 climate dataset (Mitchell and Jones, 2005). Here we document for the first time that different NAO/EA and NAO/SCA combinations systematically influence winter precipitation conditions in Europe as a consequence of NAO dipole migrations. We find that the zero-correlation line of the NAO-winter precipitation relationship migrates southwards when the EA is in the opposite phase to the NAO. This can be related to a south-westwards migration of the NAO dipole under these conditions, as shown by teleconnectivity maps. Similarly, a clockwise movement of the NAO-winter climate correlated areas occurs when the phase of the SCA is opposite to that of the NAO, reflecting a clockwise movement of the NAO dipole under these conditions. An important implication of these migrations is that they influence the spatial and temporal stationarity of climate-NAO relationships. As a result, the link between winter precipitation patterns and the NAO is not straightforward in some regions such as the southern UK, Ireland and France. For instance, much of the inter-annual variability in the N-S winter precipitation gradient in the UK, originally attributed to inter-annual and inter-decadal variability of the NAO, reflects the migration of the NAO dipole, linked to linear combinations of the NAO and the EA. Our results indicate that when the N-S winter
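
The index definition quoted above (leading vectors of the cross-correlation matrix of monthly SLP anomalies) amounts to an EOF analysis of standardized anomalies. A sketch on random stand-in data, not the 20CRv2 fields:

```python
import numpy as np

rng = np.random.default_rng(5)
# Stand-in for monthly winter SLP anomalies: 138 winters x 3 months = 414
# samples over a toy 200-point grid.
slp_anom = rng.normal(size=(414, 200))

# EOFs of standardized anomalies = eigenvectors of the cross-correlation matrix.
std_anom = (slp_anom - slp_anom.mean(axis=0)) / slp_anom.std(axis=0)
corr = np.corrcoef(std_anom, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1]
patterns = eigvec[:, order[:3]]    # stand-ins for the NAO, EA and SCA patterns
indices = std_anom @ patterns      # corresponding monthly index series
print(indices.shape)               # (414, 3)
```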

Switzerland has experienced a number of severe precipitation events during the last few decades, such as on 14-16 November 2002 and 21-22 August 2005. Both events, and the subsequent extreme floods, caused fatalities and severe financial losses, and have been well studied both in terms of the atmospheric conditions leading to extreme precipitation and of their consequences [e.g. Hohenegger et al., 2008, Stucki et al., 2012]. These examples highlight the need to better characterise the frequency and severity of flooding in the Alpine area. In a larger framework we will ultimately produce a high-resolution data set covering the entire 20th century, to be used for detailed hydrological studies, including all atmospheric parameters relevant for flooding events. In a first step, we downscale the aforementioned two events of 2002 and 2005 to assess the model performance regarding precipitation extremes. The complexity of the topography in the Alpine area demands high-resolution datasets. To achieve sufficient detail in resolution we employ the Weather Research and Forecasting regional climate model (WRF). A set of 4 nested domains is used, with a 2-km horizontal resolution over Switzerland. The NCAR 20th century reanalysis (20CR), with a horizontal resolution of 2.5°, serves as the boundary condition [Compo et al., 2011]. First results of downscaling the 2002 and 2005 extreme precipitation events show that, compared to station observations provided by the Swiss Meteorological Office MeteoSwiss, the model strongly underestimates the strength of these events. This is mainly due to the coarse resolution of the 20CR data, which underestimates the moisture fluxes during these events. We tested driving WRF with the higher-resolution NCEP reanalysis and found a significant improvement in the amount of precipitation for the 2005 event. In a next step we will downscale the precipitation and wind fields during the 6-year period 2002-2007 to investigate and

Lens and cataract research from a clinical, biophysical, biological and mainly biochemical point of view has a long tradition. Research relating to the chemical composition and metabolism of the lens has been conducted since the beginning of the 20th century. With these analyses an attempt was made to understand the existence and maintenance of lens transparency and the mechanisms leading to lens opacities. Around the middle of the century, stationary analyses measuring the content of certain substances in the lens were more and more replaced by the search for dynamic metabolic processes responsible for lens growth, maintenance of transparency and possibly active participation in lens function (such as accommodation). The disturbances resulting from ageing or the formation of lens opacities have also been investigated, partially resulting in the elucidation of reaction chains leading from a trigger to the formation of a cataract. Lens biochemistry is no longer a closed book to us, but there are still many question marks. Why were we not able to solve more problems around the lens and cataract? The research effort, with a remarkable financial input and a great number of scientists worldwide during the second half of the century, does not correspond to the results obtained. There must be something wrong with our strategy, our interpretation of the results, or both. We would like to stress some points which might be regarded as errors or misunderstandings in the lens research community, thus preventing a better outcome of the enormous investment of work and money. A great disadvantage is the missing cooperation between clinicians and epidemiologists on the one hand and basic lens researchers on the other. Especially the ignorance of basic researchers regarding the clinical problems of the lens and of cataracts might be to blame for several 'errors and misunderstandings'. It is not even so long ago since the slitlamp microscope examination of animals belonged to the

The effects at the regional scale of decadal fluctuations of the NAO/AO on 20th century precipitation over Sardinia will be analyzed. Decadal variations of precipitation will initially be described by use of the Standardized Anomaly Index (Katz & Glantz, 1986), based on two indicators: the cumulated precipitation (the classical approach) and the number of rainy days. A clear decreasing trend in the last two decades, statistically significant at the 1% level, will be highlighted. A short survey of connections with MSLP and 500 hPa geopotential height fields will be used to give an overview of the dependence of Sardinia's (regional) precipitation on synoptic-scale and planetary-scale features. In the following part, three different paradigms of the NAO/AO will be used: the classical two-point oscillation, the PCA analysis of MSLP (Thompson & Wallace, 1998) and the centers-of-action approach (Mächel et al., 1998). The results of the analysis of the effects of the NAO/AO (described in the former three ways) on precipitation will enable us to discuss how such a teleconnection influences regional precipitation in this part of the Mediterranean. Statistical significance of each result will be provided during the presentation. Katz, R., Glantz, M., 1986. "Anatomy of a Rainfall Index". Mon. Wea. Rev., 114, 764-771. Mächel, M., Kapala, A., Flohn, H., 1998. "Behaviour of the Centers of Action above the Atlantic since 1881. Part I: Characteristics of Seasonal and Interannual Variability". Int. J. Climatol., 18, 1-22. Thompson, D. W. J., Wallace, J. M., 1998. "The Arctic Oscillation signature in the wintertime geopotential height and temperature fields". Geophys. Res. Lett., 25, 1297-1300.

Significant climatic changes over Northern Eurasia during the 20th century have been reflected in numerous variables of economic, social, and ecological interest, including the natural frequency of forest fires. For the former USSR, we are now using the Global Daily Climatology Network and a new Global Synoptic Data Network archive, GSDN, created jointly by the U.S. National Climatic Data Center and the Russian Research Institute for Hydrometeorological Information. Data from these archives (approximately 1500 stations with sufficiently long meteorological time series suitable for our analyses) are employed to estimate systematic changes in indices used in the United States and Russia to assess potential forest fire danger. We use four indices: (1) the Keetch-Byram Drought Index (KBDI; developed and widely used in the United States), and the (2) Nesterov, (3) Modified Nesterov, and (4) Zhdanko indices (developed and widely used in Russia). Analyses show that, after calibration, time series of the days with increased potential forest fire danger constructed using each of these indices (a) are well correlated and (b) deliver similar conclusions about systematic changes in the weather conditions conducive to forest fires. Specifically, over the eastern half of Northern Eurasia (Siberia and the Russian Far East), statistically significant increases in indices that characterize the weather conditions conducive to forest fires were found. These areas coincide with the areas of most significant warming during the past several decades south of the Arctic Circle. West of the Ural Mountains, the same indices show a steady decrease in the frequency of "dry weather summer days" during the past 60 years. This study is corroborated by available statistics of forest fires and by observed changes in drought statistics in agricultural regions of Northern Eurasia.
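
Of the four indices, the Nesterov index has the simplest published form: it accumulates T·(T - Td) (midday temperature times its dew-point depression, in °C) day by day and resets when daily precipitation exceeds about 3 mm. A sketch with invented weather data follows; the calibration steps used in the study are not reproduced.

```python
import numpy as np

def nesterov(temp_midday, dewpoint, precip, reset_mm=3.0):
    """Nesterov fire-danger index for daily series (degrees C, mm)."""
    ni = np.zeros(len(temp_midday))
    for i, (t, td, p) in enumerate(zip(temp_midday, dewpoint, precip)):
        prev = ni[i - 1] if i > 0 else 0.0
        ni[i] = 0.0 if p > reset_mm else prev + t * (t - td)  # reset on wetting rain
    return ni

rng = np.random.default_rng(6)
t = rng.uniform(15.0, 30.0, 30)               # hypothetical summer month
td = t - rng.uniform(2.0, 15.0, 30)           # dew point below temperature
p = rng.choice([0.0, 0.0, 0.0, 5.0], 30)      # occasional wetting rain
print(nesterov(t, td, p).round(0))
```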

The 20th century has seen an enormous growth in population and industrialization. These changes are accompanied, among other things, by a substantial increase in aerosol emissions. To learn more about the associated consequences for the climate system, we have carried out a comparatively large set of transient sensitivity studies with the global atmosphere-only climate model ECHAM5-HAM, using aerosol emission data from NIES (National Institute for Environmental Studies, Japan) and prescribed, observation-based sea surface temperatures (SSTs) from the Hadley Centre. The sensitivity studies cover the period from 1870 to 2005 and comprise ensembles of simulations (up to 13 members per ensemble), which allow us to address the roles of different aerosol species, greenhouse gases, and prescribed sea surface temperatures. We present a preliminary analysis of these global simulation data for the Sahel region (land within 20W / 35E / 10N / 20N). The annual cycle as well as the overall temporal evolution of precipitation in the Sahel according to CRU (Climatic Research Unit, UK) is captured well by the model simulations: two comparatively wet phases in the 1930s and 1950s, a more or less continuous decline thereafter, and a renewed increase in precipitation since the 1980s. This decline and renewed increase since the 1950s is, however, about twice as strong in the CRU data as in the model data. The sensitivity studies reveal SSTs as a prominent factor in the time evolution of precipitation, while the atmosphere-only effect of aerosols plays a minor role for the modeled precipitation. The observation-based prescribed SSTs may, however, already encapsulate an aerosol effect.

Concepts and the relations between concepts are the basis for all our scientific understanding and explanation of the wide variety of constituents and phenomena in nature. Some of the fundamental concepts, like space, time, matter, radiation and causality, had remained unchanged for almost four hundred years from the dawn of science. However, all of these underwent a drastic transformation in the 20th century, for two reasons. First, in the light of certain experimental findings, two radical theories, namely the theory of relativity and quantum mechanics, replaced the classical theory that had dominated since Newton's time. Second, the science-technology spiral resulted in the discovery of a great many new features of the universe, on both the micro scale and the mega scale. There was an exponential increase in our knowledge. These new facts could not be fitted into the old concepts. Apart from drastic revision, many new concepts had to be brought in. Despite all this, one very encouraging trend has been to discern a holistic synthesis and unification of the different concepts -- an endeavor that has been helped by experiments over a wide range of energies and distances and, most importantly, by theoretical insights triggered by mathematical underpinnings. These developments in physics and astrophysics point to one grand concept, namely the "quantum vacuum," endowed with certain special properties, as the substratum from which all the constituents and processes of the universe emerge, including the creation of the universe itself. This is the view, at least, of some scientists. In this brief article the essence of these approaches toward unification is highlighted. Maybe the life sciences can take a clue from these developments in the physical sciences.

The present paper describes the main results obtained from the characterization of a wide range of natural and synthetic ochre samples used in Portugal from the 19th to the 20th century, including powder and oil painting samples. The powder ochre samples came from several commercial distributors and from the collection of Joaquim Rodrigo (1912-1997), a leading Portuguese artist, particularly active during the sixties and seventies. The micro-samples of oil painting tubes came from the Museu Nacional de Arte Contemporânea-Museu do Chiado (National Museum of Contemporary Art-Chiado Museum) in Lisbon and were used by Columbano Bordalo Pinheiro (1857-1929), one of the most prominent naturalist Portuguese painters. These tubes were produced by the main 19th century colourmen: Winsor & Newton, Morin et Janet, Maison Merlin, and Lefranc. The samples have been studied using μ-Fourier Transform Infrared Spectroscopy (μ-FTIR), Raman microscopy, μ-Energy Dispersive X-ray Fluorescence (μ-EDXRF), and X-ray diffraction (XRD). The analyzed ochres were found to be mixtures of several components: iron oxides and hydroxides in matrices of kaolinite, gypsum and chalk. The results obtained allowed us to identify and characterize the ochres according to their matrix and chromophores. The main chromophores were identified by Raman microscopy as hematite, goethite and magnetite. The infrared analysis of the ochre samples allowed us to divide them into groups according to the composition of the matrix. It was possible to separate ochres containing a kaolinite and/or sulfate matrix from ochres in which only iron oxides and/or hydroxides were detected. μ-EDXRF and Raman microscopy were the best techniques to identify umber, since the presence of elements such as manganese is characteristic of these pigments. μ-EDXRF also revealed the presence of significant amounts of arsenic in all Sienna tube paints.

Since systematic measurements of Louisiana continental-shelf waters were initiated in 1985, hypoxia (oxygen content <2 mg L-1) has increased considerably in an area termed the dead zone. Monitoring and modeling studies have concluded that the expansion of the Louisiana shelf dead zone is related to increased anthropogenically derived nutrient delivery from the Mississippi River drainage basin, physical and hydrographical changes of the Louisiana Shelf, and possibly coastal erosion of wetlands in southern Louisiana. In order to track the development and expansion of seasonal low-oxygen conditions on the Louisiana shelf prior to 1985, we used a specific low-oxygen foraminiferal faunal proxy, the PEB index, which has been shown statistically to represent the modern Louisiana hypoxia zone. We constructed a network of 13 PEB records with excess 210Pb-derived chronologies to establish the development of low-oxygen and hypoxic conditions over a large portion of the modern dead zone for the last 100 years. The PEB index record indicates that areas of low-oxygen bottom water began to appear in the early 1910s in isolated hotspots near the Mississippi Delta and rapidly expanded across the entire Louisiana shelf beginning in the 1950s. Since ~1950, the percentage of PEB species has steadily increased over a large portion of the modern dead zone. By 1960, subsurface low-oxygen conditions were occurring seasonally over a large part of the geographic area now known as the dead zone. The long-term trends in the PEB index are consistent with the 20th-century observational and proxy data for low oxygen and hypoxia.

Geological, paleontological and geomorphological studies show that the Earth's climate has been changing ever since it came into existence. Climate change itself is self-evident. The far more serious question is therefore how much mankind strengthens or weakens these changes beyond the natural fluctuations of climate. The aim of the present study was to reconstruct the historical land cover changes and to simulate the meteorological consequences of these changes. Two land cover maps for Hungary were created in vector data format using GIS technology. The land cover map for 1900 was reconstructed from statistical data and two historical maps: the derived map of the 3rd Military Mapping Survey of the Austro-Hungarian Empire and the Synoptic Forestry Map of the Kingdom of Hungary. The land cover map for 2000 was derived from the CORINE land cover database. According to the examination of these maps and statistical databases, significant land cover changes occurred in Hungary during the 20th century. The MM5 non-hydrostatic dynamic model was used to evaluate the meteorological effects of these changes. The lower boundary conditions for this mesoscale model were generated for the two selected time periods (1900 and 2000) based on the reconstructed maps. The dynamic model was run with the same detailed meteorological conditions of selected days from 2006 and 2007, but with modified lower boundary conditions. The set of 26 selected initial conditions represents the whole range of macrosynoptic situations for Hungary. In this way, 2×26 "forecasts" were made with 48 hours of integration. The effects of land cover changes under different weather situations were further weighted by the long-term (1961-1990) mean frequency of the corresponding macrosynoptic types, to estimate the climatic effects from these stratified averages. The detailed evaluation of the model results was made for three different meteorological
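
The stratified averaging described above reduces, computationally, to a frequency-weighted mean: each macrosynoptic type's simulated land-cover effect is weighted by its 1961-1990 climatological frequency. A sketch with invented numbers follows; the per-type effects and frequencies are not the study's values.

```python
import numpy as np

rng = np.random.default_rng(7)
n_types = 26
# Hypothetical per-type land-cover effect on 2 m temperature (2000 minus 1900
# lower boundary conditions), and hypothetical 1961-1990 type frequencies.
effect_per_type = rng.normal(0.0, 0.3, n_types)   # K
freq = rng.dirichlet(np.ones(n_types))            # climatological weights, sum to 1

climatic_effect = float(np.sum(freq * effect_per_type))
print(f"stratified mean land-cover effect: {climatic_effect:+.2f} K")
```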

Analysis of spaceborne radiometry has shown that the total solar irradiance variation over the past two activity cycles was approximately proportional to the weighted difference between the areas of dark spots and of bright faculae and enhanced network. Empirical models of ultraviolet irradiance variation indicate that its behavior is dominated by changes in the area of the bright component alone, whose photometric contrast increases at shorter wavelengths. This difference in the time behavior of total and UV irradiances could help to discriminate between their relative importance in the forcing of global warming. Our recent digitization of archival Ca K images from Mt Wilson and NSO provides the first direct measurement of variations in the area of the bright component, extending between 1915 and 1999 (previous models have relied on the sunspot number or other proxies to estimate the bright-component contribution). We use these more direct measurements to derive the time behavior of solar total and UV irradiance variation over this period. We find that they are significantly different; the total irradiance variation accounts for over 80 percent of the variance in global temperature during this period, while the ultraviolet irradiance variation accounts for only about 20 percent. The amplitude of total irradiance variation in our model is smaller than required to influence global warming in current climate models. Also, the impact of sulfate aerosol variations on the extended cooling between the 1940s and 1970s must be better understood before the significance of correlations between 20th century global warming and any solar activity index can be properly assessed. Despite these caveats, the lower correlation we find between global temperature and UV, compared to total, irradiance requires consideration in the search for physical mechanisms linking solar activity and climate. This work was supported in part under NASA grant NAG5-7607 to CRI, Inc., and NAG5-10998 to the Applied Physics
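
The proportionality stated in the first sentence suggests a two-component regression, TSI ≈ c + a·(facular area) - b·(spot area). The sketch below fits that form to synthetic series and recovers the assumed coefficients; the areas, coefficients and noise are invented, not the Mt Wilson/NSO measurements.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
spot_area = rng.gamma(2.0, 1.0, n)                        # invented spot-area index
facular_area = 2.0 * spot_area + rng.gamma(2.0, 1.0, n)   # correlated bright component
tsi = 1361.0 + 0.03 * facular_area - 0.05 * spot_area + rng.normal(0.0, 0.01, n)

# Weighted-difference model: TSI ~ a*facular_area - b*spot_area + c.
X = np.column_stack([facular_area, spot_area, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, tsi, rcond=None)
print(f"facular {coef[0]:+.3f}, spot {coef[1]:+.3f} (W m^-2 per area unit)")
```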

Recent scholarship regarding psychiatric epidemiology has focused on shifting notions of mental disorders. In psychiatric epidemiology in the last decades of the 20th century and the first decade of the 21st century, mental disorders have been perceived and treated largely as discrete categories denoting an individual’s mental functioning as either pathological or normal. In the USA, this grew partly out of evolving modern epidemiological work responding to the State’s commitment to measure the national social and economic burdens of psychiatric disorders and subsequently to determine the need for mental health services and to survey these needs over time. Notably absent in these decades have been environmentally oriented approaches to cultivating normal, healthy mental states, approaches initially present after World War II. We focus here on a set of community studies conducted in the 1950s, particularly the Midtown Manhattan study, which grew out of a holistic conception of mental health that depended on social context and had a strong historical affiliation with: the Mental Hygiene Movement and the philosophy of its founder, Adolf Meyer; the epidemiological formation of field studies and population surveys beginning early in the 20th century, often with a health policy agenda; the recognition of increasing chronic disease in the USA; and the radical change in orientation within psychiatry around World War II. We place the Midtown Manhattan study in historical context—a complex narrative of social institutions, professional formation and scientific norms in psychiatry and epidemiology, and social welfare theory that begins during the Progressive era (1890-1920) in the USA. PMID:25031047

In 2002, Munk defined an important enigma of 20th century global mean sea-level (GMSL) rise that has yet to be resolved. First, he listed three canonical observations related to Earth's rotation [(i) the slowing of Earth's rotation rate over the last three millennia inferred from ancient eclipse observations, and changes in the (ii) amplitude and (iii) orientation of Earth's rotation vector over the last century estimated from geodetic and astronomic measurements] and argued that they could all be fit by a model of ongoing glacial isostatic adjustment (GIA) associated with the last ice age. Second, he demonstrated that prevailing estimates of the 20th century GMSL rise (~1.5 to 2.0 mm/year), after correction for the maximum signal from ocean thermal expansion, implied mass flux from ice sheets and glaciers at a level that would grossly misfit the residual GIA-corrected observations of Earth's rotation. We demonstrate that the combination of lower estimates of the 20th century GMSL rise (up to 1990), improved modeling of the GIA process, and correction of the eclipse record for a signal due to angular momentum exchange between the fluid outer core and the mantle reconciles all three Earth rotation observations. This resolution adds confidence to recent estimates of individual contributions to 20th century sea-level change and to projections of GMSL rise to the end of the 21st century based on them. PMID:26824058

Prepared as part of "Project Improving and Extending the Junior High School Orchestra Repertory," this volume contains curricular materials representing the 20th century period. A musical history of the period is given, as well as histories of the composers and their individual compositions. The materials are prepared for 3 degrees of technical…

Children's fiction in school libraries has played and still plays a role in mediating representations of technology and attitudes towards technology to schoolchildren. In early 20th century Sweden, elementary education, including textbooks and literature that were used in teaching, accounted for the main mediation of technological knowledge…

This collection development aid lists more than 10,000 titles of children's materials available in a variety of formats (in addition to print materials, the guide also includes sound recordings, video cassettes, microcomputer software programs, CD-ROM products, and videodiscs). This 20th anniversary edition contains several special features,…

This study focuses on how human origins were taught in the French Natural Sciences syllabuses of the 19th and 20th centuries. We evaluate the interval between the publication of scientific concepts and their emergence in syllabuses, i.e., didactic transposition delay (DTD), to determine how long it took for scientific findings pertaining to our…

This article presents the findings of a historically-informed comparative study that juxtaposes the lives of three missionary educators in China in the early 20th century with three Christian educators in China today. Data sources included hundreds of letters from the women written in China to their families and friends over several decades and…

This historical study focuses on how John Dewey's theory of education as socialization and Mordecai Kaplan's theory of Judaism as a civilization together served as an ideological base and pedagogical framework for the creation of "progressive," "reconstructed" American Jewish school programs in the early 20th century (1910s-1930s). In the main,…

As numerous Homans' lecturers have attested, Amy Morris Homans was a significant and visionary leader who set the foundation for women's physical education for the first half of the 20th century. Her reign at the Boston Normal School for Gymnastics (BNSG) was ironclad, and through the BNSG she controlled her students' lives, including their…

This paper traces the roots of the current library preservation movement and its evolution during the second half of the 20th century. It also looks at some of the contemporary shifts in thinking about preservation and changes in practice that are being explored by libraries. Finally, the paper draws some conclusions as to the scope of programs in…

The length of the vegetation period (VP) plays a central role in the interannual variation of carbon fixation of terrestrial ecosystems. Observational data analysis has indicated that the length of the VP has increased in the last decades in the northern latitudes, mainly due to an advancement of bud burst (BB). This phenomenon has been widely discussed in the context of Global Warming because phenology is correlated with temperature. Analyzing the patterns of spring phenology over the last century in Southern Germany provided two main findings: - The strong advancement of spring phases, especially in the decade before 1999, is not a singular event in the course of the 20th century. Similar trends were also observed in earlier decades. Distinct periods of varying trend behavior for important spring phases could be distinguished. - Marked differences in trend behavior between the early and late spring phases were detected. Early spring phases changed in the magnitude of their negative trends from strong negative trends between 1931 and 1948 to moderate negative trends between 1948 and 1984 and back to strong negative trends between 1984 and 1999. Late spring phases showed a different behavior: negative trends between 1931 and 1948 are followed by marked positive trends between 1948 and 1984 and then strong negative trends between 1984 and 1999. This marked difference in trend development between early and late spring phases was also found all over Germany for the two periods 1951 to 1984 and 1984 to 1999. The dominating influence of temperature on spring phenology and its modifying effect on autumn phenology was confirmed in this thesis. However, temperature functions determining spring phenology were not significantly correlated with a global annual CO2 signal, which was taken as a proxy for a Global Warming pattern, and an index for large-scale regional circulation patterns (NAO index) could explain only a small part of the observed phenological variability in

The active involvement of young researchers in scientific processes and the acquisition of scientific experience by gifted youth currently have great value for the development of science. One of the research activities of National Research Tomsk Polytechnic University aimed at preparing and forming the next generation of scientists is the International Conference of Students and Young Scientists "Modern Techniques and Technologies", held in 2014 for the twentieth time. Great experience in the organization of scientific events has been acquired through the years of running the conference. All the necessary resources are available: a team of organizers - employees of Tomsk Polytechnic University - premises provided with modern office and demonstration equipment, leading scientists - professors of TPU - and the status of the university as a leading research university in Russia. In this way the conference is able to attract leading scientists from around the world for collaboration. Over the previous years the conference has proved itself a major international scientific event, attracting more than 600 students and young scientists from Russia, the CIS and other countries. The conference comprises oral plenary and section reports and is organized around lectures in which leading Russian and foreign scientists deliver plenary presentations to young audiences. An important indicator of this scientific event is the breadth of the scientific fields covered: energy, heat and power, instrument making, engineering, systems and devices for medical purposes, electromechanics, material science, computer science and control in technical systems, nanotechnologies and nanomaterials, physical methods in science and technology, control and quality management, and design and technology of artistic materials processing. The main issues considered by young researchers at the conference were related to the analysis of contemporary

The 3rd World Congress on Integrated Computational Materials Engineering (ICME) was a forum for presenting the "state-of-the-art" in the ICME discipline, as well as for charting a path for future community efforts. The event concluded with an interactive panel-led discussion that addressed such topics as integrating efforts between experimental and computational scientists, uncertainty quantification, and identifying the greatest challenges for future workforce preparation. This article is a summary of this discussion and the thoughts presented.

The Adriatic and Black Seas are semi-enclosed basins characterized by densely populated coasts, industrial compounds and a rich cultural and historical heritage. For the management and protection of their coastlines, it is crucial to understand how much they will be impacted by the global sea level (SL) rise projected by the end of this century. The aim of this work is to develop a method for estimating the extent to which the SL of the two basins will depart from the mean global level. The future evolution of global sea level is not a meaningful indicator at this regional scale, and past deviations of the Adriatic and Black Sea levels from the global one, due to local factors, have been observed. The Adriatic Sea is the basin of the Mediterranean Sea best covered by past SL observations. For the Adriatic Sea it is possible to obtain, by a statistical method based on PCA and least squares, a seamless and long time series (from 1900 to 2009) using records of 7 mareographic stations located along the Italian and Croatian coasts (from the PSMSL database). Satellite data of SL are available for the whole Mediterranean from 1993 to 2012 and show a very high correlation (rho > 0.9) with the Adriatic time series based on mareographic records. The SL time series of the 20th century in the Black Sea is computed using data of 4 stations, available in the PSMSL (Permanent Service for Mean Sea Level) archive, located on the north-east coast. This time series shows a lower correlation (rho about 0.5) with satellite data than in the case of the Adriatic Sea, as well as a higher interannual variability. All the time series are considered after subtraction of the Inverse Barometer (IB) effect. A statistical approach, based on a multivariate linear regression model, is used to investigate the link between the SL anomaly, computed as the difference between the regional SL and global SL, and three large scale climate variables (sea level pressure
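
A minimal sketch of the kind of PCA-plus-least-squares merging described above, on synthetic data (real PSMSL records would first need gap handling and datum alignment; series lengths and noise levels below are invented): the leading principal component of the seven gauge records is rescaled by least squares to serve as the basin-mean series.

    import numpy as np

    rng = np.random.default_rng(2)
    n_months, n_gauges = 1320, 7               # 1900-2009, 7 stations
    t = np.arange(n_months)
    common = 0.002 * t + 20 * np.sin(2 * np.pi * t / 12)  # shared signal (mm)
    gauges = common[:, None] + rng.normal(0, 15, (n_months, n_gauges))

    anom = gauges - gauges.mean(axis=0)
    # leading principal component via SVD
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    pc1 = u[:, 0] * s[0]

    # least-squares rescaling of PC1 onto the mean of the gauge anomalies
    gauge_mean = anom.mean(axis=1)
    alpha = np.dot(pc1, gauge_mean) / np.dot(pc1, pc1)
    basin_sl = alpha * pc1

    print(f"corr(PC1 composite, gauge mean) = "
          f"{np.corrcoef(basin_sl, gauge_mean)[0, 1]:.3f}")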

The world, and China in particular, has seen tremendous population growth and industrialization in the 20th century. These changes were accompanied, among others, by a substantial increase in aerosol emission. To learn more about the associated consequences for the climate system we have carried out a comparatively large set of transient sensitivity studies with the global atmosphere-only climate model ECHAM5-HAM, using aerosol emission data from NIES (National Institute of Environmental Studies, Japan) and prescribed, observation-based sea surface temperatures (SSTs) from the Hadley Center. The sensitivity studies cover the period from 1870 to 2005 and comprise ensembles of simulations (up to 13 members per ensemble), which allow us to address the role of different aerosol species, greenhouse gases, and prescribed sea surface temperatures. Here we analyze these simulation data with particular focus on surface solar radiation, temperature, and the hydrological cycle in China. Physical mechanisms able to explain the results will be discussed. We generally find the strongest effects in the east of the country, where urbanization and industrialization are strongest and emissions increased most. The decrease of surface solar radiation (SSR) under clear sky conditions reaches up to around -8 W/m2 per decade from 1950 to 1990. Comparable values are found for all sky conditions. Dimming ceases in the second half of the 1990s, when we even see a renewed increase in SSR in some regions. Overall, these findings are in line with observation-based estimates. Modeled surface temperatures reflect the decrease in SSR but also carry a substantial SST signature. After remaining roughly constant from 1870 to 1900, modeled surface temperatures increase by about 1 degree Celsius until 1950, then decrease again by 0.2 to 1.2 degrees Celsius until 1990, before a renewed increase sets in. Precipitation decreases in our model results from 1950 to 2000 by up to 10% or 150 mm per year

Statistical information on disaster occurrence, impacts and losses has been recorded and analysed worldwide in recent years. The development of natural disaster databases is crucial for risk management purposes, because it allows improving systems of indicators on disaster risk and vulnerability at national and sub-national scales. During the last century, Portugal was affected by several destructive natural disasters, namely of hydrologic (floods) and geomorphologic (landslides) origin. However, only recently was risk prevention and management assumed to be a national priority by the Portuguese Government. The basic information on past floods and landslides that occurred in Portugal is disperse and incomplete, and this is a shortcoming for the implementation of effective disaster mitigation measures, particularly when an increase in the frequency, magnitude, dimension and complexity of hydro-geomorphologic phenomena is expected as a result of climate change. In this work we present a preliminary assessment of hydro-geomorphologic disasters that occurred in Portugal during the 20th century, based on a systematic survey of daily national newspapers. We included in a database those floods and landslides that produced deaths, injuries, missing people, evacuations or homelessness. A total of 937 hydro-geomorphologic events were registered. In addition to physical and material damages, these events produced economic losses amounting to millions of Euros. Our attention focuses on the geographic distribution and the temporal dimension of disastrous floods and landslides in Portugal, and the temporal trends of hydro-geomorphologic disasters will be presented. The preliminary results show that disastrous floods and landslides have been more frequent in the most populated regions of Portugal: the metropolitan areas of Lisbon and Oporto. In addition, the data show that disastrous hydrologic and geomorphologic phenomena were more

Two to four billion people may face high water stress in the future due to economic growth and climate change (e.g., global warming). Agricultural water use, which accounts for about 70% of global water consumption, is likely to continue to increase with the production of food and biofuels driven by future population growth. Water demand and food and biofuel production are inextricably linked, and it is very important to evaluate this relationship for sustainable water use from the past to the future. In this study, our objective was to assess the impact of water withdrawal from various sources (stream flow, medium-sized reservoirs and nonrenewable nonlocal blue water) in the 20th century, considering irrigation area and climate change. Irrigation is the most important water use sector, accounting for about 90% of total water withdrawal. First, we built a global spatial database of changes in equipped irrigation area and medium-sized reservoir capacity. Then, water withdrawal from each source over the 50 years from 1950 to 2000 was simulated at global scale at a resolution of 1.0 degree x 1.0 degree using an integrated global water resources model (hereafter, the H08 model). The H08 model can simulate both natural and anthropogenic water flows and anthropogenic water withdrawals. For comparison with our results, distributions of agricultural, industrial and domestic water withdrawals from 1950 to 2000 were estimated by distributing the country-based withdrawal data from AQUASTAT according to irrigation area, urban population and total population, respectively. Groundwater withdrawal was then estimated by distributing the country-based withdrawal data, based on statistical data from WRI, IGRAC and AQUASTAT, according to the total water withdrawal. As a result, the change in agricultural water withdrawal from nonrenewable nonlocal blue water during the past 50 years agreed well with the observed groundwater abstraction based on statistical data. In
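
A hedged sketch of the disaggregation step used for the comparison datasets (grid and values are invented; the H08 model itself is far more elaborate): a country-total withdrawal is distributed over the country's grid cells in proportion to a proxy field such as irrigated area or population.

    import numpy as np

    def downscale(country_total, proxy, country_mask):
        # distribute a national withdrawal total over the cells of one
        # country proportionally to the proxy field (e.g. irrigated area)
        weights = np.where(country_mask, proxy, 0.0)
        total = weights.sum()
        if total == 0:
            return np.zeros_like(proxy)
        return country_total * weights / total

    # toy 4x4 "grid": one country occupying the left half
    proxy = np.array([[1.0, 2.0, 0.0, 0.0],
                      [3.0, 4.0, 0.0, 0.0],
                      [0.0, 0.0, 1.0, 1.0],
                      [0.0, 0.0, 1.0, 1.0]])
    mask = proxy > 0
    mask[:, 2:] = False                      # keep only the left-hand country
    grid_withdrawal = downscale(100.0, proxy, mask)  # 100 km3/yr national total
    print(grid_withdrawal)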

The 20th century surface air temperature (SAT) records of China from various sources are analyzed using data which include the recently released Twentieth Century Reanalysis Project dataset. Two key features of the Chinese records are confirmed: (1) significant 1920s and 1940s warming in the temperature records, and (2) evidence for a persistent multidecadal modulation of the Chinese surface temperature records in co-variations with both incoming solar radiation at the top of the atmosphere as well as the modulated solar radiation reaching ground surface. New evidence is presented for this Sun-climate link for the instrumental record from 1880 to 2002. Additionally, two non-local physical aspects of solar radiation-induced modulation of the Chinese SAT record are documented and discussed. Teleconnections that provide a persistent and systematic modulation of the temperature response of the Tibetan Plateau and/or the tropospheric air column above the Eurasian continent (e.g., 30°N-70°N; 0°-120°E) are described. These teleconnections may originate from the solar irradiance-Arctic-North Atlantic overturning circulation mechanism proposed by Soon (2009). Also considered is the modulation of large-scale land-sea thermal contrasts both in terms of meridional and zonal gradients between the subtropical western Pacific and mid-latitude North Pacific and the continental landmass of China. The Circum-global teleconnection (CGT) pattern of summer circulation of Ding and Wang (2005) provides a physical framework for study of the Sun-climate connection over East Asia. Our results highlight the importance of solar radiation reaching the ground and the concomitant importance of changes in atmospheric transparency or cloudiness or both in motivating a true physical explanation of any Sun-climate connection. We conclude that ground surface solar radiation is an important modulating factor for Chinese SAT changes on multidecadal to centennial timescales. Therefore, a

The history of rumours is as old as human history. Even in remote antiquity, rumours, gossip and hoaxes were always in circulation - in good or bad faith - to influence human affairs. Today, with the development of mass media and the rise of the internet and social networks, rumours are ubiquitous. Earthquakes, because of their strong emotional impact and unpredictability, are among the natural events that most readily cause the birth and spread of rumours. For this reason the earthquakes that occurred in the Po Valley on the 20th and 29th May 2012 generated, and still continue to generate, a wide variety of rumours regarding issues related to the earthquake, its effects, the possible causes, and future predictions. As during the L'Aquila earthquake sequence in 2009, following the events of May 2012 in Emilia Romagna a complex training and information initiative was created that, at various stages between May and September 2012, involved the population, partly living in the camps, and then the school staff of the municipalities affected by the earthquake. This experience was organized and managed by the Department of Civil Protection (DPC), the National Institute of Geophysics and Volcanology (INGV) and the Emilia Romagna region, in collaboration with the Network of University Laboratories for Earthquake Engineering (RELUIS), the Emilia Romagna Regional Health Service and voluntary civil protection organizations in the area. Within this initiative, over 240 rumours were collected and catalogued in the period June-September 2012. In this work the rumours of the Po Valley are studied in their specific characteristics, and strategies and methods to fight them are also discussed. This work of collecting and discussing the rumours was particularly important for promoting good communication strategies and fighting the spread of the rumours. Only in this way was it possible to create a full intervention able to support both the local institutions and

The study of global patterns of wind and pressure gradients, and more specifically their effect on sea level variation (storm surge), is a key issue in the understanding of recent climate changes. The local effect of storm surges on coastal areas (zones particularly vulnerable to climate variability and changes in sea level) is also of great interest in, for instance, flooding risk assessment. Studying the spatial and temporal variability of storm surges from observations is a difficult task to accomplish, since observations are not homogeneous in time, are scarce in space, and have limited temporal coverage. The development of a global storm surge database (DAC, Dynamic Atmospheric Correction by Aviso, Carrère and Lyard, 2003) addresses the lack of data in terms of spatial coverage, but not regarding time extent, since it only includes the last couple of decades (1992-2014). In this work, we propose the use of the 20CR ensemble (Compo et al., 2011), which spans from 1871 to 2010, to statistically reconstruct storm surge at a global scale and for a long period of time. The temporal and spatial variability of storm surges can therefore be fully studied, with much less effort than performing a dynamical downscaling. The statistical method chosen to carry out the reconstruction is based on multiple linear regression between an atmospheric predictor and the storm surge level at daily scale (Camus et al., 2014). The linear regression model is calibrated and validated using daily mean sea level pressure fields (and gradients) from the ERA-Interim reanalysis and daily maximum surges from DAC. The obtained daily database of maximum daily surges has allowed us to estimate global trends at a centennial scale and analyse the effect of the changing climate on storm surges during the 20th century. Hence, this work improves the knowledge of historical storm-surge conditions and provides helpful information to the community concerned with marine climate evolution and
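
A minimal sketch of the regression step (synthetic predictors and coefficients throughout; the actual method of Camus et al., 2014 builds the predictor from sea-level-pressure fields and gradients over a spatial domain): the surge is fitted by ordinary least squares over a short calibration period and then reconstructed from the long predictor record.

    import numpy as np

    rng = np.random.default_rng(3)
    n_cal, n_rec = 7000, 50000               # calibration vs reconstruction days

    def make_predictor(n):
        slp = rng.normal(1013, 8, n)         # sea-level pressure (hPa)
        gx = rng.normal(0, 1, n)             # zonal SLP gradient
        gy = rng.normal(0, 1, n)             # meridional SLP gradient
        return np.column_stack([np.ones(n), slp, gx, gy])

    X_cal = make_predictor(n_cal)
    beta_true = np.array([1013.0, -1.0, 3.0, 5.0])   # cm; invented coefficients
    surge_cal = X_cal @ beta_true + rng.normal(0, 4, n_cal)

    # ordinary least-squares calibration
    beta, *_ = np.linalg.lstsq(X_cal, surge_cal, rcond=None)

    # apply to the long "reanalysis" predictor to reconstruct the surge
    X_rec = make_predictor(n_rec)
    surge_rec = X_rec @ beta
    print(f"fitted coefficients: {np.round(beta, 2)}")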

"Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

This is the final scientific report for grant DOE-FG02-08ER64588, "The Interhemispheric Pattern in 20th Century and Future Abrupt Change in Regional Tropical Rainfall." The project investigates the role of the interhemispheric pattern in surface temperature – i.e. the contrast between the northern and southern temperature changes – in driving rapid changes in tropical rainfall over the 20th century and future climates. Previous observational and modeling studies have shown that the tropical rainband – the Intertropical Convergence Zone (ITCZ) over marine regions, and the summer monsoonal rainfall over land – is sensitive to the interhemispheric thermal contrast, but this link has not been applied to interpreting long-term tropical rainfall changes over the 20th century and future. The specific goals of the project were to i) develop dynamical mechanisms to explain the link between the interhemispheric pattern and abrupt changes of West African and Asian monsoonal rainfall; ii) undertake a formal detection and attribution study on the interhemispheric pattern in 20th century climate; and iii) assess the likelihood of changes to this pattern in the future. In line with these goals, our project has produced the following significant results: 1. We have developed a case that suggests that the well-known abrupt weakening of the West African monsoon in the late 1960s was part of a wider co-ordinated weakening of the West African and Asian monsoons, driven by an abrupt cooling of the high latitude North Atlantic sea surface temperature at the same time. Our modeling work suggests that the high-latitude North Atlantic cooling is effective in driving monsoonal weakening, through driving a cooling of the Northern hemisphere that is amplified by positive radiative feedbacks. 2. We have shown that anthropogenic sulfate aerosols may have partially contributed to driving a progressively southward displacement of the Atlantic Intertropical

The tsunami generated by the 1st November 1755 (Mw ~8.5) earthquake off Portugal affected mainly the coasts of the Iberian Peninsula and Northwest Morocco, and was observed in some places on the North Atlantic coasts, towards the West Indies, but also towards Ireland and Great Britain, in Cornwall. However, no evidence of observations along the French Atlantic coastline has been found so far. In a first step, to determine whether there could be effects due to tsunamis on the French coastline, we conducted a study to search for actual tsunami signals in all historical tide gauge stations of the French Atlantic coast available during the 20th century, specifically for the 1969 and 1975 tsunamis that were well observed in Portugal. Because many recordings are available from the French Hydrographic Service in La Rochelle (west French Atlantic coastline), we focus our study on this harbor. The analysis of these historical tide gauge data shows no evidence for tsunamis in La Rochelle, in either 1969 or 1975. Then, to confirm this lack of tsunami, we simulated the tsunamis from the 1969 and 1975 sources, using nonlinear shallow water equations and a series of nested bathymetric grids focusing on the French coastline, and then on the harbor of La Rochelle: the modeling results confirm unnoticeable amplitudes. In the following step, tsunamis from three different scenarios for the 1755 earthquake were similarly modeled to estimate the impact of such a tsunami on the French Atlantic coast, with a focus on La Rochelle harbor. The results show that, while the harbor is well protected (amplitudes computed on a synthetic tide gauge in the harbor do not exceed 20 to 30 cm crest-to-trough), several areas may have undergone a more important, yet moderate, impact, from 0.5 to 1 m, especially in the western part of the island of Ré and the northern coast of the island of Oléron. This may have caused local inundations in lowland areas, all the more since the tide

Demeter Georgievitz-Weitzer (1873-1949), called "Surya", Sanskrit for "sun", was an important representative of medical occultism in the first half of the 20th century. He worked as a journal editor and published a 13-volume book series about occult medicine, mainly written by himself. His hypotheses were closely related to the "Lebensreform" movement around 1900. Regarding diagnostics, he relied on astrology, cheiromancy, and clairvoyance, while therapeutics were dominated by diet and spagyric remedies according to Cesare Mattei (1809-1896) and Carl-Friedrich Zimpel (1801-1879). In his later years, he developed his own healing system, initially comprising eight, later only two preparations. Surya remedies were commercially available until the end of the 20th century. PMID:22822609

Recent global warming has not been ubiquitous - there might be seasons, regions, and time periods with clearly discernible zero or downward air temperature trends. Regions that are not warming or are even cooling - also known as "warming holes" - have been previously detected mainly in autumn in the second half of the 20th century in large parts of North America as well as in central and eastern Europe. In this study we use daily maximum and minimum temperature (TX and TN, respectively) and daily temperature range (DTR) at 136 stations from the ECA&D database in Europe and the Mediterranean in the period 1961-2000 to precisely locate their seasonal and sub-seasonal trends in space and within the course of the year, and to assess the effect of circulation changes on these observed trends. Linear trends are calculated for moving "seasons" of differing lengths (10, 20, 30, 60, and 90 days), each shifted by one day. Thus we obtain 365 values of "moving trends" for each station and each variant of season length. The day-to-day variability of these trends is greatest for short "seasons" of 10 and 20 days. Trends of the 90-day seasons are the most stable throughout the year and also bear the lowest trend magnitudes. Cluster analysis of the annual course of "moving trends" reveals relatively well-defined regions with similar trend behavior. Over most of Europe, the observed warming is greatest in winter, and the highest trend magnitudes are reached by TN in eastern Europe. Two regions stand out of this general picture: in Iceland and the Mediterranean, winter shows almost no trends, while in summer we see a pronounced warming. Significant autumn cooling centered on mid-November was found in eastern and southeastern Europe for both TX and TN; in many other regions trends are close to zero in the same period. Other clearly non-warming (or even cooling) periods occur in western and central Europe in April and June. Trends of DTR are largely inconclusive and no general picture
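
A sketch of the "moving trend" construction on a synthetic daily temperature series (the wrap-around across the year boundary is simplified here; in the real analysis, seasons crossing the year end span consecutive years): one linear trend is fitted per start day of the year, for a chosen season length.

    import numpy as np

    rng = np.random.default_rng(4)
    years = np.arange(1961, 2001)
    daily = rng.normal(10, 3, (years.size, 365))          # [year, day-of-year]
    daily += 0.02 * (years - years[0])[:, None]           # imposed warming

    def moving_trends(data, season_len):
        # return one trend per start day (wrap-around over the year's end)
        n_years, n_days = data.shape
        trends = np.empty(n_days)
        for start in range(n_days):
            idx = np.arange(start, start + season_len) % n_days
            seasonal_mean = data[:, idx].mean(axis=1)     # one value per year
            slope, _ = np.polyfit(np.arange(n_years), seasonal_mean, 1)
            trends[start] = slope * 10                    # degrees per decade
        return trends

    trends_10 = moving_trends(daily, 10)   # noisy, short "seasons"
    trends_90 = moving_trends(daily, 90)   # smoother, lower magnitudes
    print(f"10-day trends std: {trends_10.std():.3f}, "
          f"90-day trends std: {trends_90.std():.3f}")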

Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
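
A minimal ABC rejection sketch in the spirit of the paper (the small world network simulator is replaced here by a toy branching-process outbreak model purely for illustration; priors, summary statistics and tolerance are invented): parameter draws are kept when summaries of the simulated outbreak fall close to the observed ones.

    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_outbreak(p_infect, n_contacts, n_days=60, n0=5, cap=10**6):
        # toy branching process standing in for the network model; the cap
        # keeps supercritical runs numerically bounded
        cases = [n0]
        for _ in range(n_days - 1):
            new = int(rng.binomial(min(cases[-1], cap) * n_contacts, p_infect))
            cases.append(new)
            if new == 0:
                break
        return np.array(cases)

    observed = simulate_outbreak(0.08, 8)          # pretend these are the data
    obs_summary = np.array([observed.sum(), observed.argmax()])

    accepted = []
    for _ in range(20000):
        p = rng.uniform(0.01, 0.2)                 # prior: infection probability
        k = int(rng.integers(2, 15))               # prior: contacts per case
        sim = simulate_outbreak(p, k)
        sim_summary = np.array([sim.sum(), sim.argmax()])
        if np.linalg.norm(sim_summary - obs_summary) < 50:   # tolerance
            accepted.append((p, k))

    accepted = np.array(accepted)
    print(f"{len(accepted)} accepted draws; "
          f"posterior mean p = {accepted[:, 0].mean():.3f}")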

Oceanic heat transport variations, carried by the northward flowing Atlantic Water, strongly influence Arctic sea-ice distribution, ocean-atmosphere exchanges, and pan-Arctic temperatures. Paleoceanographic reconstructions from marine sediments near Fram Strait have documented a dramatic increase in Atlantic Water temperatures over the 20th century, unprecedented in the last millennium. Here we present results from Earth system model simulations over the last millennium that reproduce and explain reconstructed integrated quantities such as pan-Arctic temperature evolution during the pre-industrial millennium as well as the exceptional Atlantic Water warming in Fram Strait in the 20th century. The associated increase in ocean heat transfer to the Arctic can be traced back to changes in the ocean circulation in the sub-polar North Atlantic. An interplay between a weakening overturning circulation and a strengthening sub-polar gyre as a consequence of 20th century global warming is identified as the driving mechanism for the pronounced warming along the Atlantic Water path toward the Arctic. Simulations covering the late Holocene provide a reference frame that allows us to conclude that the changes during the last century are unprecedented in the last 1150 years and cannot be explained by internal variability or natural forcing alone.

Approaches to the organization and conduct of cancer research changed dramatically throughout the 20th century. Despite marked differences between the epidemiological approaches of the first half of the century and the molecular techniques that gained dominance in the 1980s, prominent 20th-century researchers investigating the link between sexual activity and anogenital cancers continuously invoked the same 1842 treatise by Italian surgeon Domenico Rigoni-Stern, who is said to have originated the problem of establishing a causal link between sex and cancer. In this article, I investigate 20th-century references to Rigoni-Stern as a case of a broader phenomenon: scientists situating their work through narratives of venerated ancestors, or originators. By explaining shifting versions of originator narratives in light of their authors' cultural context and research practices, we can reimagine as meaningful cultural symbols the references that previous scholars have treated as specious rhetorical maneuvers. In this case, references to Rigoni-Stern provide an interpretive anchor for American scientists to construct continuity between their work and a diverse historical legacy of cancer research. PMID:26477198

Cholera is an acute infectious disease with high mortality if left untreated. Historically, seven great pandemics of cholera occurred during the 19th and 20th centuries, and thousands of people died worldwide. Based on an old theory, cholera was considered an airborne disease, and the emergence of its outbreaks was attributed to bad weather or miasma. However, in the mid-19th century, British physician John Snow (1813-1858) traced a terrible cholera outbreak in London in 1849 to contamination of the drinking water supply with human excreta. Despite his finding, the causative agent of this dreaded illness remained unidentified until later in the 19th century. In 1854, Filippo Pacini (1812-1883), an anatomist from Italy, and then in 1883, Robert Koch (1843-1910), the German bacteriologist, discovered 'Vibrio cholerae' as the etiologic agent. During the major pandemics of cholera in the 19th and 20th centuries this illness reached Iran and led to vast depopulation and a crucial impact on the country's socioeconomic status. Poor public health conditions, lack of a well-organized public health authority for implementing preventive and quarantine measures, as well as Iran's specific geographic location were the main facilitating factors of the emergence of various epidemics, including cholera, in Iran. The present paper briefly reviews the cholera outbreaks in Iran during the 19th and 20th centuries. PMID:25197514

A long-term decrease in downward surface solar radiation from the 1950s to the 1980s ("global dimming") followed by a multi-decadal increase up to the present ("brightening") has been detected in many regions worldwide. In addition, some researchers have suggested the existence of an "early brightening" period in the first half of the 20th century. However, this latter phenomenon is an open issue due to the opposite results found in literature and the scarcity of solar radiation data during this period. This paper contributes to this relevant discussion analyzing, for the first time in Southern Europe, the atmospheric column transparency derived from pyrheliometer measurements in Madrid (Spain) for the period 1911-1928. This time series is one of the three longest datasets during the first quarter of the 20th century in Europe. The results showed the great effects of the Katmai eruption (June 1912, Alaska) on transparency values during 1912-1913 with maximum relative anomalies around 8%. Outside the period affected by this volcano, the atmospheric transparency exhibited a stable behavior with a slight negative trend without any statistical significance on an annual and seasonal basis. Overall, there is no evidence of a possible early brightening period in direct solar radiation in Madrid. This phenomenon is currently an open issue and further research is needed using the few sites with available experimental records during the first half of the 20th century.

This essay considers the developments in education for management in 20th-century Britain. In the late 19th and early 20th centuries, that is, the highpoint of the United Kingdom's economic success, management was considered more of an art than a science, and formal education specifically for management was limited. After the Second World…

The premise of this work is simple: the history of college admission since World War II is a consumer history. The way in which this history unfolds is far more complex. College admission is a contested good. It is simultaneously a consumable good (students purchase a college education for personal and familial reasons) and a social good (various…

This book tracks the dramatic outcomes of the federal government's growing involvement in higher education between World War I and the 1970s, and the conservative backlash against that involvement from the 1980s onward. Using cutting-edge analysis, Christopher Loss recovers higher education's central importance to the larger social and political…

Juxtaposition of Hans Christian Andersen's "The Little Sea Maid" (1837) and Disney's home video "The Little Mermaid" (1989) illustrates how the adolescent princesses have evolved with changing views of women's roles. The mermaid of the twentieth century, part of the world of men, is still in a subservient role. (SLD)

This paper examines the free speech fights, one of the more revolutionary tactics of the Industrial Workers of the World (I.W.W.), founded in 1905 by a small group of socialists, anarchists, industrial unionists, and dissident trade unionists. Two considerations guide this examination. The first is the rhetorical nature of the free speech fight…

The Dust Bowl of the 1930's focused the attention of the US on soil erosion and land conservation. The Universal Soil Loss Equation (USLE) was the result of this effort and has remained one of the most widely used equations for soil erosion prediction world-wide. This empirical relationship has been...
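
For reference, the USLE itself is a simple product of factors; the sketch below (factor values are illustrative placeholders, not from any survey) computes average annual soil loss as A = R * K * LS * C * P.

    def usle(R, K, LS, C, P):
        # A = R * K * LS * C * P
        # R  : rainfall-runoff erosivity factor
        # K  : soil erodibility factor
        # LS : slope length and steepness factor
        # C  : cover-management factor
        # P  : support practice factor
        # returns average annual soil loss A (mass per area per year)
        return R * K * LS * C * P

    # e.g. a moderately erodible cropped field on a gentle slope (made-up values)
    print(f"A = {usle(R=170, K=0.32, LS=1.2, C=0.25, P=0.5):.1f} t/ha/yr")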

The advent of the big data era creates both opportunities and challenges for traditional Chinese medicine (TCM). This study describes the origin, concept, connotation, and value of studies regarding the scientific computation of TCM. It also discusses the integration of science, technology, and medicine under the guidance of the paradigm of real-world clinical scientific research. TCM clinical diagnosis, treatment, and knowledge were traditionally confined to the literature and to the level of sensation; however, methods such as feature subset optimization, multi-label learning, and complex networks, grounded in the complexity, intelligence, data, and computing sciences, are now used to convert them into statistics. Furthermore, these methods are applied in the modeling and analysis of the various complex relationships in individualized clinical diagnosis and treatment, as well as in decision-making related to such diagnosis and treatment. Thus, these methods strongly support the real-world clinical research paradigm of TCM. PMID:25190349

This Computer Software Requirements Specification defines the software requirements for the world model of the Light Duty Utility Arm (LDUA) System. It is intended to be used to guide the design of the application software, to be a basis for assessing the application software design, and to establish what is to be tested in the finished application software product. (The LDUA deploys end effectors into underground storage tanks by means of a robotic arm on the end of a telescoping mast.)

I will present some facets of Hans Bethe’s life to illustrate how I have used biography to narrate certain aspects of the history of twentieth century physics. I will focus on post World War II quantum field theory, on the relation between solid state/condensed matter physics and high energy physics, and make some observations regarding certain “top down” views in solid state physics in postmodernity.

OPEC was formed 20 years ago when the producers perceived that major oil companies were conspiring to lower prices, but OPEC's domination of total world production and the industrialized world's dependence on imported oil have led to the cartel's major triumphs of the past eight years. OPEC's power has caused major price shocks after 1973 and 1978. A major contribution to the growth of cartel power was the decline in US oil production, which peaked in 1970. Supply and demand in terms of oil prices and the world economic condition have fluctuated several times since the 1973 supply disruption and remain unstable in 1981. At the time this article was written, the author foresaw that OPEC prices could still rise as much as 30% in 1981 in spite of conservation efforts and more stable supply and demand conditions. Government policies to offset US vulnerability to OPEC control include building a strategic oil reserve and deregulation of oil prices, but the tradeoff has been higher inflation. (DCK)

Natural forests store vast amounts of carbon in the terrestrial biosphere, and play an important role in the global carbon cycle. Given the significance of natural forests, there is a lack of carbon accounting of primary forests that are undisturbed by human activities. One reason for this lack of interest stems from the ecological orthodoxy that primary forests should be close to dynamic equilibrium, with Net Ecosystem Production (NEP) approaching zero. However, recent results from the northern hemisphere and tropics, using eddy covariance flux towers, indicate that primary forests are a greater sink than first thought. The role of evergreen primary forests in Australian carbon balance studies remains uncertain, and these forests may function differently from their deciduous counterparts in the Northern Hemisphere. In order to address the lack of baseline carbon accounts, an undisturbed, 300 year old Mountain Ash (Eucalyptus regnans) ecosystem, located in the Central Highlands of Victoria (Australia), was selected as a permanent study site to investigate carbon and water budgets over diurnal, seasonal and annual cycles. Mountain Ash trees are the world's tallest angiosperms (flowering plants), and the ecosystem is one of the largest carbon reservoirs in the biosphere, with an estimated 1900 tC ha-1. A 110 m tall micrometeorological tower that includes eddy covariance instrumentation was installed in August 2005. An independent biometric approach quantifying the annual net gain or loss of carbon was also applied within close proximity to the flux tower. Analysis of NEP in 2006 suggests that the ecosystem acted as a carbon sink of 2.5 tC ha-1 yr-1. Woody and soil biomass increment for the same year was estimated to be 2.8 tC ha-1 yr-1, with nearly half of the biomass production partitioned into aboveground woody tissue. These results indicate that temperate primary forests act as carbon sinks, and are able to maintain their carbon sink status due to their uneven stand

Belarus became a Soviet Socialist Republic in the USSR in 1921. Belarus is now an independent country between Poland and Lithuania and Russia. The pharmacy sector of Belarus improved in fits and starts from 1921 to the present but serious quantitative and qualitative problems were evident until the 21st century. A number of factors caused this situation. The Soviet Republic of Belarus started with handicaps. The area, comprised of several provinces of western Russia, had no pharmaceutical factories during the imperial period and, while pharmacies were of high quality in the cities all over the Russian Empire--including Minsk, which became the capital of Belarus--pharmacies were sparse and primitive in rural areas and Belarus was basically rural. Belarus was devastated by wars--World War I, the Russian-Polish war of 1920-21, and of course, by World War II. The Bolshevik policy of nationalizing private pharmacies adversely affected dispensing between 1918 and 1921. Dispensing improved during the New Economic Policy of 1921 to 1927 with re-introduction of private enterprise and the establishment of BelMedTorg and the Mogilev Experimental Station of Medicinal Plants. The number of pharmacies and medical facilities increased during the 1930s and again after World War II. However, utopian plans to provide free or low-cost medicines to all citizens never came to fruition. Inadequate amounts of state-of the-art and even basic medicines persisted through the 1990s. The number of pharmacists also was inadequate and their education and training was on a low level. Because of shortages, citizens of Belarus often self-medicated with medicinal plants. The transition to a market economy in the 1990s made medicines expensive for citizens but opened the door to greater interaction with Western pharmaceutical practices and physical improvements in pharmacies and pharmaceutical production. PMID:25577887

More than 93% of the energy excess associated with anthropogenic climate change is stored in the ocean. The resulting ocean warming and thermal expansion is a leading contributor to global mean sea level (GMSL) rise. Confidence in projections of GMSL rise therefore depends on the ability of climate models to reproduce global mean thermosteric sea level (GMTSL) rise over the 20th century. In this study, we compare the GMTSL of climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP5) to observations over 1961-2005. Although the model-ensemble mean is within the uncertainty of observations, the model ensemble exhibits a large spread. We aim at explaining the departure of CMIP5 climate models' 20th century GMTSL from observations. We show that climate models' GMTSL rise depends linearly on the time-integrated radiative forcing F (under continuously increasing radiative forcing). The constant of proportionality (nu) expresses the transient thermosteric sea level response of the climate system. nu depends on the fraction of excess heat stored in the ocean, the expansion efficiency of heat, the climate feedback parameter and the ocean heat uptake efficiency. Most models show nearly the same fraction of excess heat stored in the ocean and the same expansion efficiency of heat, both consistent with observations. This is unlike the climate feedback parameter and the ocean heat uptake efficiency, which differ significantly across climate models. These differences in climate feedback parameter and ocean heat uptake efficiency, along with differences in time-integrated F across models, explain most of the departure of CMIP5 climate models' 20th century GMTSL from observations.
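
A hedged sketch of the scaling described above (round illustrative numbers throughout, and one plausible assembly of nu from the four quantities named in the abstract; the paper's exact expression may differ): GMTSL(t) is computed as nu times the time-integrated forcing.

    import numpy as np

    years = np.arange(1961, 2006)
    F = 0.04 * (years - 1961)                # forcing ramp (W/m2), invented

    f_ocean = 0.93                           # fraction of excess heat in ocean
    epsilon = 0.12                           # expansion efficiency (m per YJ)
    lam = 1.3                                # climate feedback (W/m2/K)
    kappa = 0.6                              # ocean heat uptake eff. (W/m2/K)

    # one plausible assembly of the transient thermosteric response (assumed):
    # heat uptake is the fraction kappa/(lam+kappa) of F, of which f_ocean
    # enters the ocean and expands it at epsilon metres per yottajoule
    nu = f_ocean * epsilon * kappa / (lam + kappa)

    # 1 W/m2 over Earth's surface for one year, expressed in yottajoules
    W_PER_M2_YEAR_TO_YJ = 5.1e14 * 3.156e7 / 1e24
    gmtsl = nu * np.cumsum(F) * W_PER_M2_YEAR_TO_YJ    # metres

    print(f"modelled 1961-2005 GMTSL rise: {gmtsl[-1] * 1000:.1f} mm")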

Sediment profiles of two alpine lakes located in the Tatra Mountains, Toporowy Staw Nizni (TSN) and Zielony Staw Gasienicowy (ZSG), were studied for their chronology, lithology, and diatom and cladoceran remains. The sediment sequences, 50 cm long from TSN and 30 cm long from ZSG, were deposited during the last 1000 and 300 years, respectively. Vertical changes in lithology, diatoms and Cladocera allow the reconstruction of three periods in the lakes' evolution: mild climatic conditions during the Medieval Warm Period (MWP, only in TSN), severe conditions between the end of the 14th and the 19th centuries, identified as the Little Ice Age (LIA), and 20th century warming. The LIA was recorded in the sediments of both lakes in the form of intensified erosion and lower lake ecosystem productivity, as indicated by lower organic matter content, changes in diatom species composition, and a decline in Daphnia. The 20th century was a time of acidification in both lakes. The scale of acidification was assessed based on the decline in diatom-inferred pH (DI-pH). DI-pH dropped by 1.2 pH units during the last century in TSN and by 0.4 pH units in ZSG. The decline of DI-pH was noted in both lakes, but its intensity was clearly higher in TSN due to the lower acid neutralisation capacity (ANC) of this lake. The lower pH during the final decades of the 20th century was lethal to some water organisms while attracting others, such as Daphnia. The Daphnia population increased after the pH drop, probably due to the high food flexibility of this genus. A similar increase was not observed in ZSG, where planktivorous fishes were introduced in the 1940s, which effectively limited the crustacean plankton density. PMID:19896170
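
For illustration, DI-pH values of the kind quoted above are typically obtained with a weighted-averaging transfer function; in the sketch below (taxa, optima and counts are invented placeholders) the inferred pH of a sample is the abundance-weighted mean of the taxon pH optima.

    import numpy as np

    # taxon pH optima, normally estimated from a modern training set
    ph_optimum = {"taxon_a": 5.2, "taxon_b": 6.1, "taxon_c": 6.8}

    def di_ph(counts):
        # counts: {taxon: relative abundance}; returns inferred pH
        taxa = [t for t in counts if t in ph_optimum]
        w = np.array([counts[t] for t in taxa], dtype=float)
        opt = np.array([ph_optimum[t] for t in taxa])
        return float(np.sum(w * opt) / np.sum(w))

    pre_industrial = {"taxon_a": 10, "taxon_b": 30, "taxon_c": 60}
    late_20th      = {"taxon_a": 55, "taxon_b": 35, "taxon_c": 10}
    drop = di_ph(pre_industrial) - di_ph(late_20th)
    print(f"DI-pH decline: {drop:.2f} pH units")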

The Intelligent Ground Vehicle Competition (IGVC) is one of four unmanned-systems student competitions founded by the Association for Unmanned Vehicle Systems International (AUVSI). The IGVC is a multidisciplinary exercise in product realization that challenges college engineering student teams to integrate advanced control theory, machine vision, vehicular electronics and mobile platform fundamentals to design and build an unmanned system. Teams from around the world focus on developing a suite of dual-use technologies to equip ground vehicles of the future with intelligent driving capabilities. Over the past 20 years, the competition has challenged undergraduate, graduate and Ph.D. students with real world applications in intelligent transportation systems, the military and manufacturing automation. To date, teams from over 80 universities and colleges have participated. This paper describes some of the applications of the technologies required by this competition and discusses the educational benefits. The primary goal of the IGVC is to advance engineering education in intelligent vehicles and related technologies. The employment and professional networking opportunities created for students and industrial sponsors through a series of technical events over the four-day competition are highlighted. Finally, an assessment of the competition based on participation is presented.

Forward modeling of climate proxies enables identification of uncertainties in the interpretation of high resolution proxy archives in a manner that is complementary to classical inverse methods. By coupling proxy models to climate model output driven with realistic external forcings, a framework for assessment of their consistency with proxy observations over long timescales is created. Here we model reef coral oxygen isotopic composition (δ18O) as a function of sea-surface temperature (SST) and sea-surface salinity (SSS), the latter a linear proxy for the isotopic composition of seawater in the tropics. We first validate this model against a comprehensive network of 20th century coral δ18O measurements. When driven with historical SST and SSS data for the last ~50 years, the forward model is able to capture the spatial and temporal pattern of ENSO variability observed in the corals better than a univariate SST-based model, hence establishing the skill of this bivariate model. We then drive the forward model with SST and SSS from 20th-century simulations of state-of-the-art climate models, such as GFDL CM2.0 (run 1 c3m; Delworth et al, 2006, Wittenberg et al, 2006), to assess the ability of climate models to reproduce tropical climate variability. GFDL CM2.0 forward modeled coral δ18O accurately reproduces the spatio-temporal pattern of the observed coral δ18O trend over the 20th century, and furthermore suggests the trend was primarily driven by increased SSTs. However, the magnitude of the modeled trend is smaller than observed in the corals, suggesting that coral physiology has reacted in a non-linear fashion to the observed climatic forcing, or that GFDL CM2.0's tropical response to external climate forcing is too small. Instead, variance in GFDL modeled corals is dominated by strong interannual variability that is much greater than that observed in corals. The temporal evolution of modeled ENSO variance and frequency over the 20th century is also markedly
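
A minimal sketch of the bivariate forward model (coefficient magnitudes are typical published values used here as assumptions, not taken from this study; the input series are synthetic): coral δ18O anomalies are formed as a linear combination of SST and SSS anomalies, with the SSS term standing in for seawater δ18O.

    import numpy as np

    def pseudocoral_d18o(sst_anom, sss_anom, a1=-0.22, a2=0.27):
        # δ18O anomaly (permil): a1 in permil/°C (thermodynamic SST slope),
        # a2 in permil/psu (assumed regional δ18Osw-salinity slope)
        return a1 * sst_anom + a2 * sss_anom

    rng = np.random.default_rng(6)
    months = 600                               # ~50 years of monthly data
    phase = 2 * np.pi * np.arange(months) / 12
    sst = 1.2 * np.sin(phase) + rng.normal(0, 0.3, months)
    sss = 0.2 * np.sin(phase + 1.0)

    coral = pseudocoral_d18o(sst, sss)
    # a univariate SST-only model for comparison, as in the validation step
    coral_sst_only = pseudocoral_d18o(sst, np.zeros(months))
    print(f"corr(bivariate, SST-only) = "
          f"{np.corrcoef(coral, coral_sst_only)[0, 1]:.3f}")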

The El Niño-Southern Oscillation (ENSO) drives large changes in global climate patterns from year to year, yet its sensitivity to continued anthropogenic greenhouse forcing is poorly constrained by both climate models and observational data. Here we analyze over 670 years worth of new, monthly-resolved fossil coral records of ENSO spanning the last 7,000 years. The new records are based on the oxygen isotopic composition (δ18O) of U/Th-dated fossil corals from Christmas (2°N, 157°W) and Fanning (4°N, 160°W) Islands, located in the heart of the ENSO region. The corals document a highly variable ENSO, with reductions in ENSO variance of up to 60% documented in the last millennium and at several points through the reconstruction. Maximum ENSO variance is observed in the early 17th century, when values were up to 15% larger than present. There is no evidence for a precessionally-driven trend in ENSO variance from 6,000 years ago to present that some paleodata and coupled global climate models document. If there is an insolation-related signal in ENSO variance, our data suggest that it is overwhelmed by the large range of natural variability in ENSO. The average ENSO variance for the fossil coral database is 40% lower than late 20th century ENSO variance estimates derived from the fossil corals' modern counterparts. In order to test whether late 20th century ENSO variance is significantly larger than the long-term average implied by the fossil coral data, we employ a Monte Carlo-based approach that uses output from a 2,000-yr-long unforced simulation of the GFDL CM2.1 coupled GCM (Wittenberg, 2009). We first confirm that the model-based distribution of ENSO variance is consistent with the fossil coral-based distribution, which suggests that at least some of the current class of coupled climate models are capable of simulating the intrinsic variability in ENSO reasonably well. We then generate 10,000 pairs of modern and fossil 'pseudocoral' databases, drawing
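
A sketch of the Monte Carlo test described above, with a synthetic red-noise stand-in for the 2,000-year control run (window length, AR(1) parameters and the "observed" variance are invented): non-overlapping windows from the control series give a null distribution of ENSO variance against which a late 20th century estimate can be ranked.

    import numpy as np

    rng = np.random.default_rng(7)

    # synthetic 2,000-year monthly "control run" NINO3.4-like index (AR(1))
    n = 2000 * 12
    ctrl = np.empty(n)
    ctrl[0] = 0.0
    for i in range(1, n):
        ctrl[i] = 0.9 * ctrl[i - 1] + rng.normal(0, 0.4)

    win = 30 * 12                              # 30-year windows
    null_var = np.array([ctrl[s:s + win].var()
                         for s in range(0, n - win, win)])

    obs_var = 1.4 * null_var.mean()            # pretend late-20th-c. estimate
    p_value = np.mean(null_var >= obs_var)     # fraction of windows as large
    print(f"fraction of control windows with variance >= observed: {p_value:.3f}")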

We provide empirical evidence on the existence of the Pigou-Dalton principle, which holds that aggregate welfare is - ceteris paribus - maximized when the incomes of all individuals are equalized (and therefore marginal utility from income is as well). Using anthropometric panel data on 101 countries during the 19th and 20th centuries, we determine that there is a systematic negative and concave relationship between height inequality and average height. The robustness of this relationship is tested by means of several robustness checks, including two instrumental variable regressions. These findings help to elucidate the impact of economic inequality on welfare. PMID:23352274

The SAFRAN-ISBA-MODCOU (SIM) system is a combination of three different components: an atmospheric analysis system (SAFRAN) providing the atmospheric forcing, a land surface model (ISBA) that computes surface water and energy budgets, and a hydrological model (MODCOU) that provides river flows and the levels of several aquifers. The variables generated by the SIM chain constitute the SIM reanalysis, and the current version only covers the 1958-2012 period. However, long climate datasets are required for the evaluation and verification of climate hindcasts/forecasts and to isolate the contribution of natural decadal variability from that of anthropogenic forcing to climate variations. The aim of this work is to extend the fine-mesh SIM reanalysis to the entire 20th century, focusing especially on temperature and rainfall over France, but also on soil wetness and river flows. This extension will first allow a detailed investigation of the influence of decadal variability on France at very fine spatial scales and will provide crucial information for climate model evaluation. Before 1958, the density of available observations from Météo-France necessary to force SAFRAN (rainfall, snow, wind, temperature, humidity, cloudiness) is much lower than today, and not sufficient to produce a correct SIM reanalysis. That is why it has been decided to use the available atmospheric observations over the past decades combined with a statistical downscaling algorithm to overcome the lack of observations. The DSCLIM software package, implemented by CERFACS and using a weather-typing-based statistical methodology, will be used as the statistical downscaling method to reconstruct the atmospheric variables necessary to force the ISBA-MODCOU hydrological component. The first stage of this work was to estimate and compare the biases and strengths of the two approaches in their ability to reconstruct the past decades. In this sense, SIM hydro-meteorological experiments were performed for some recent

At the Second World Conference on Computers in Education nearly 150 papers were presented. The author gives an overview of the conference, and selects a few of the papers for further discussion. Vendor-supplied computer education, "information science" as a high school elective, and computer-assisted instruction in a broader spectrum of course…

The term "proteome" was first introduced into the scientific literature in July 1995. Almost 20 years ago attempts to characterize the "total protein complement able to be encoded by a given genome" only became possible due to privileged access to what were then the world's most complete sets of genomic data. Today, proteomics has become an important pillar in the fields of disease diagnosis and drug research and development, while also playing a critical role in the much larger field of Healthcare Analytics and Biomarker Discovery and Detection. It is important to note that this industry originated mostly from building blocks in analytical science that predated the term "proteomics" by many decades. However, proteomics, as a discipline, has allowed protein scientists to more favorably compete in the face of highly fashionable Big Science and, more specifically, genomics. PMID:25689367

The major consumer electronics in U.S. homes accounted for over 10 percent of U.S. residential electricity consumption, which is comparable to the electricity consumed by refrigerators or lighting. We attribute 3.6 percent to video products, 3.3 percent to home office equipment, and 1.8 percent to audio products. Televisions use more energy than any other single product category, but computer energy use now ranks second and is likely to continue growing. In all, consumer electronics consumed 110 TWh in the U.S. in 1999, over 60 percent of which was consumed while the products were not in use.

How did the education of surgical pathology, and pathology in general, differ at Mount Sinai? Passing the examination of the American Board of Pathology was never the focus of the department. Learning criteria or quoting references was de-emphasized, but mastery of macroscopic pathology was required, supported in both word and action by two brilliant surgical pathologists, Otani and Kaneko, and by two extraordinary medical pathologists, Klemperer and Popper. Meticulous microscopy emphasized pattern rather than reliance on lists of discrete features. Otani developed a regular "problem case" meeting for a community of pathologists, made up of alumni and other interested pathologists, as well as active department members. These monthly sessions provided the highest level of "continuing medical education." Otani and Kaneko unequivocally believed in learning from cases, and Mount Sinai residents were fortunate both in the one-to-one teaching and in the wealth of material, in all systems, that came to surgical pathology. Outstanding pathologists who came from Mount Sinai settled throughout the country and provided the highest level of diagnoses, but, with the exception of Bernard Wagner, Emanuel Rubin, Fiorenzo Paronetto, Richard Horowitz, Michael Gerber, Marc Rosenblum, Bruce Wenig, Jaishree Jagirdar, Swan Thung, Cesar Moran, Hideko Kamino, Philip LeBoit, Alberto Marchevsky, and others, there were relatively few academic leaders. Otani and Kaneko did not have national reputations. Klemperer, although world renowned, was relatively unassuming, and his disciples numbered almost as many nonpathologists as pathologists. Popper did establish a major center for liver pathology, with students coming from around the world, but did not particularly promote general surgical pathology. Can the Mount Sinai approach still be applied? The decline in the numbers of autopsies performed, the demands for rapid turnaround time, the de-emphasis of gross pathology as newer technologies (eg

Reform of medical education in early 20th-century America had consequences in various aspects of the medical field beyond the improvement of medical education itself, such as the reinforcement of laboratory training in the basic science courses and of hospital instruction in the clinical courses. The reform brought about the direct or indirect elimination of irregular sectarian practitioners and of minority groups such as blacks and women from the medical marketplace, established the concrete position of regular physicians in American society, and reinforced the biomedical orientation that would become the general tendency of 20th-century Western medicine. The author stresses that the reform was neither initiated nor invoked but merely accelerated by the so-called Flexner Report of 1910; rather, it was carried out through processes of interaction and struggle among various contradictory trends, tendencies, and forces, such as the American Medical Association (AMA), some leading medical educators and scientists, medical colleges, and philanthropic foundations, in a socio-cultural milieu gradually moving to favor "science". PMID:11618917

This study evaluated changes in the profiles of African American women presented in fashion magazines during the 20th century. Twenty-six variables were measured on a total of 119 profile photographs collected from various fashion magazines published in the 1940s through the 1990s. The photographs were divided into 6 groups corresponding to the decade in which they were published. A 1-way analysis of variance was performed, and between-group differences were examined with a Tukey multiple comparison procedure. Significant between-group differences (P < .05) were found; the profiles changed over the 20th century and, similar to the standards for the white profile, show a trend toward fuller and more anteriorly positioned lips. PMID:15067255

In this paper, evidence of anthropogenic influence on the warming of the 20th century is presented, and the debate regarding the time-series properties of global temperatures is addressed in depth. The 20th-century global temperature simulations produced for the Intergovernmental Panel on Climate Change's Fourth Assessment Report and a set of the radiative forcing series used to drive them are analyzed using modern econometric techniques. Results show that both the temperature and radiative forcing series share similar time-series properties and a common nonlinear secular movement. This long-term co-movement is characterized by the existence of time-ordered breaks in the slope of their trend functions. The evidence presented in this paper suggests that while natural forcing factors may help explain the warming of the first part of the century, anthropogenic forcing has been its main driver since the 1970s. In terms of Article 2 of the United Nations Framework Convention on Climate Change, significant anthropogenic interference with the climate system has already occurred, and current climate models are capable of accurately simulating the response of the climate system to changes in external forcing factors, even if it consists of rapid or abrupt change. This paper presents a new methodological approach for conducting time-series-based attribution studies. PMID:23555866
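
As a toy illustration of detecting a break in the slope of a trend function, the sketch below fits a continuous piecewise-linear trend to a synthetic temperature-like series by least-squares grid search over candidate break years; it is not the econometric procedure used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900.0, 2001.0)
# Synthetic series: weak early trend, steeper slope after an assumed 1970 break, plus noise.
temp = 0.002 * (years - 1900) + 0.015 * np.clip(years - 1970, 0, None)
temp = temp + 0.05 * rng.standard_normal(years.size)

def sse_with_break(y, t, tb):
    """Residual sum of squares of a continuous piecewise-linear trend with a slope break at tb."""
    X = np.column_stack([np.ones_like(t), t - t[0], np.clip(t - tb, 0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

candidates = years[10:-10]          # keep some data on each side of the break
best = min(candidates, key=lambda tb: sse_with_break(temp, years, tb))
print("estimated break year:", int(best))
```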

Have there been changes in the epidemiology of sexual abuse of children during the 20th century? To explore this question, a comparison was made between the results of the survey conducted by Alfred Kinsey and colleagues in the 1940s of women in the United States, the majority of whom were born between 1900 and 1929, and the results of more recent studies of the epidemiology of sexual abuse. In Kinsey's study, 24% of 4,441 women reported at least one episode of sexual abuse before adolescence; in 49% of the episodes, the perpetrator was not known to the child and, in 50%, the abuse did not involve bodily contact. These findings are compared to the results of more recent epidemiologic surveys, particularly Russell's study conducted in San Francisco in 1978. Although there are important differences in the methodologies used by Kinsey and Russell, it is likely that changes have occurred in the prevalence and nature of sexual abuse in the 20th century. Russell described a higher prevalence of sexual abuse, a greater proportion of perpetrators who were relatives or adults known to the child, and a greater proportion of serious types of sexual abuse. PMID:3186358

The ridge and slough landscape of the Florida Everglades consists of a mosaic of linear sawgrass ridges separated by deeper-water sloughs with tree islands interspersed throughout the landscape. We used pollen assemblages from transects of sediment cores spanning sawgrass ridges, sloughs, and ridge-slough transition zones to determine the timing of ridge and slough formation and to evaluate the response of components of the ridge and slough landscape to climate variability and 20th-century water management. These pollen data indicate that sawgrass ridges and sloughs have been vegetationally distinct from one another since initiation of the Everglades wetland in mid-Holocene time. Although the position and community composition of sloughs have remained relatively stable throughout their history, modern sawgrass ridges formed on sites that originally were occupied by marshes. Ridge formation and maturation were initiated during intervals of drier climate (the Medieval Warm Period and the Little Ice Age) when the mean position of the Intertropical Convergence Zone shifted southward. During these drier intervals, marsh taxa were more common in sloughs, but they quickly receded when precipitation increased. Comparison with regional climate records suggests that slough vegetation is strongly influenced by North Atlantic Oscillation variability, even under 20th-century water management practices. © 2009 by the Ecological Society of America.

This article is the fruit of research into the cultural heritage of healthcare in Minas Gerais (Brazil) and explores the construction of hospitals supported by Catholic charities from the 18th to 20th century. Catholicism has always been strong in Minas Gerais, partly because the Portuguese Crown prohibited the free travel of priests, who were suspected of illegally trading in gold from the mines. A brotherhood was responsible for creating the first Santa Casa, in Vila Rica. Another very important religious group in Brazil, the Vincentians, was also devoted to charitable works and propagated the ideas on charity of Frederico Ozanan, based on the work of St. Vincent de Paul. This group comprised both a lay movement, supported by conferences organized by the St. Vincent de Paul Society, and a religious order, the Vincentian priests and nuns. Catholic physicians make up the third group studied here, organized in a professional association promoted by the Catholic Church. The brotherhoods, Vincentians, and associations, with their Santa Casas, represent a movement that is recognized worldwide. The enormous Catholic participation in these charitable works brought in the physicians, who would often make no charge and exerted efforts to create hospitals that served the population. Although the capital of Minas Gerais was the creation of republicans and positivists in the 20th century, with their ideas of modernity, it remained dependent on Christian charity for the treatment of the poor. PMID:21936227

The aim of this work is to present the different aspects of modern high-complexity electric beds from 1940 until 2000. The chronology of the product is divided into three stages: electric and semi-electric beds (until the 1990s), mechatronic beds (1990s until 2000), and mechatronic intelligent beds of the last 15 years; the latter are not considered in this work because of the scope their analysis would require. The classification of the product is justified under the concepts of medical, assistive, and mobility devices. Relevant aspects of the common immobility problems of the different types of patients for whom the beds are mainly intended are shown in detail. The basic functioning of the patient-movement generator and the implementation of actuators, together with the IT programs, specific accessories, and means of connectivity and network communication shown in this work, were what gave rise to current mechatronic beds. We present the historical evolution of high-complexity electric beds by illustrating cases extracted from a meticulous timeline based on patents, inventions, and publications in newspapers and magazines around the world. The criteria adopted to evaluate innovation were: characteristics of controls; accessories (mattresses, lighting, siderails, etc.); aesthetic and morphological properties; and outstanding functionalities.

New focus on the internationalization of universities occurred in the late 20th century, and higher education in nursing has been quick to embrace the opportunities. In this manuscript, the authors provide a brief overview of the nursing and more general literature from the late 20th century relating to key dimensions of internationalization, as well as present data from a survey, conducted in 1995-96, of the international activities and dimensions at Canadian faculties/schools of nursing. While it is clear that nurses in Canadian universities were engaged in significant international endeavours in the 20th century, the literature and our experience suggest that the extent of such activity has increased substantially in recent years. Discussion centres on examination of how knowledge generated in the 20th century can inform current internationalization initiatives and on identification of key questions that merit consideration as we move forward in the 21st century. PMID:17402933

By 1900 there was a definite need to update the great star catalogues of the 19th century. First of all, the coordinates in them were pinned to the epoch of 1875. Then, too, the magnitudes were not on any systematic scale. Finally, they were all published before astronomers had any ideas or data for classifying large numbers of stars by their spectra. Under the leadership of Edward Pickering and the financing of Anna Palmer Draper, the work on the Henry Draper Memorial began at Harvard College Observatory in the late 1880s. Its primary goal was to gather and classify the photographic spectra of about 100,000 stars. As a test case, Pickering had Williamina Fleming develop a simple classification scheme and apply it to the spectra of about 10,000 stars. The Draper Catalogue was published in 1890. For the next two decades, Pickering worked to have astronomers approve the Harvard Classification scheme. After that happened in 1910 at the meeting of the International Solar Union, Annie Jump Cannon began the project of classifying 100,000 stars. She was so efficient that she completed the work in two years. Rather than stopping at that number, she continued classifying spectra for another two years, for a total of 225,300 stars. Although Cannon completed the classification in 1915, it was not until 1918 that the first of nine volumes of the Henry Draper Catalogue was published. At that time, it was greeted with enthusiasm and congratulations from eminent astronomers around the world: Jacobus Kapteyn in the Netherlands, Herbert Hall Turner in England, Harlow Shapley in California, to name a few. Today the HD Catalogue is online and endures as a source of coherent data for a wide variety of ongoing investigations of the Milky Way.

In the central nervous systems of animals such as pigeons and locusts, neurons have been identified which signal objects approaching the animal on a direct collision course. To initiate escape behavior in time, these neurons must recognize a possible approach (or at least differentiate it from similar but non-threatening situations) and estimate the time-to-collision (ttc). Unraveling the neural circuitry for collision avoidance, and identifying the underlying computational principles, should thus be promising for building vision-based neuromorphic architectures, which in the near future could find applications in cars or planes. Unfortunately, a corresponding computational architecture able to handle real situations (e.g. moving backgrounds, different lighting conditions) is still not available (successful collision avoidance by a robot has been demonstrated only in a closed environment). Here we present two computational models for signalling impending collision. These models are parsimonious, since they possess only the minimum number of computational units essential to reproduce the corresponding biological data. Our models show robust performance in adverse situations, such as with approaching low-contrast objects or with highly textured backgrounds. Furthermore, a condition is proposed under which the responses of our models match the so-called eta-function. We finally discuss which components need to be added to our model to convert it into a full-fledged real-world collision detector.
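
The eta-function mentioned above is commonly written as η(t) = C·θ̇(t−δ)·exp(−α·θ(t−δ)), a delayed product of angular expansion velocity and a negative exponential of angular size. A minimal sketch of this looming-detector response for an object approaching at constant speed, with illustrative parameter values that are not taken from the paper:

```python
import numpy as np

R, v, tc = 0.1, 10.0, 2.0          # object half-size (m), approach speed (m/s), collision time (s)
C, alpha, delta = 1.0, 3.0, 0.03   # gain, angular-size penalty, neural delay (s)

t = np.linspace(0.0, tc - 0.05, 2000)
theta = 2.0 * np.arctan(R / (v * (tc - t)))   # full angular size of the approaching object (rad)
theta_dot = np.gradient(theta, t)             # angular expansion velocity (rad/s)

# Delayed multiplicative interaction: eta(t) = C * theta_dot(t-delta) * exp(-alpha * theta(t-delta))
shift = int(round(delta / (t[1] - t[0])))
eta = np.zeros_like(t)
eta[shift:] = C * theta_dot[:-shift] * np.exp(-alpha * theta[:-shift])

t_peak = t[eta.argmax()]
print(f"eta peaks {tc - t_peak:.3f} s before collision")
```

The characteristic property of this form is that the response peaks a fixed delay after the angular size crosses a threshold set by alpha, which is one way such neurons could encode time-to-collision.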

This long-awaited synthesis approaches the past three centuries with an eye to highlighting the importance of significant schools, as well as important women educators in the emergence of secondary education for girls. At the same time, each contributor pays careful attention to the specific political, cultural, and socio-economic factors that…

Drought is a natural hazard that occurs all over the world and can have large economic, social, and environmental impacts. Large-scale models, Global Hydrological Models (GHMs) and Land Surface Models (LSMs), have been developed over the last decades to simulate the global and continental terrestrial water cycle. It is not clear whether these models provide suitable data for the analysis of hydrological extremes, e.g. drought, on a global scale. Therefore, in this study, a multi-model analysis has been carried out to evaluate whether a suite of these models can reproduce major drought events. Time series of monthly total runoff from five LSMs and five GHMs (0.5° x 0.5°) have been used to focus on hydrological drought. They were run with identical forcing (WATCH forcing data) within the framework of the EC-FP6 project WATCH for the period 1963-2001. A new drought identification tool that combines the variable threshold level method and the consecutive dry period method was used to characterize drought events in the ensemble median of the ten global models, as sketched below. After the temporal drought identification per cell, a clustering method was applied to determine the spatial extent of drought events and to exclude small spatial events. From the ensemble median, the temporal development of the area in drought for the globe and various subregions was investigated. The results show that known major drought events on all continents were identified. From the time series of drought occurrence, several severe drought events were selected and studied in the spatial domain. The duration and spatial extent of these events can differ substantially from documented studies. Among the individual models, the spatial extent of major drought events in particular differed; however, all models captured them. In general, the modelled runoff shows a fast response to rainfall, which leads to deviations from observed drought events in slowly responding systems. By taking the multi-model ensemble median
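
A minimal sketch of drought identification for a single cell, combining a monthly variable threshold with a consecutive-dry-period rule in the spirit of the method named above; the runoff series, the 20th-percentile threshold, and the 3-month pooling rule are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_years = 39                                     # e.g., 1963-2001
runoff = rng.gamma(2.0, 1.0, size=n_years * 12)  # placeholder monthly total runoff

monthly = runoff.reshape(n_years, 12)
q20 = np.percentile(monthly, 20, axis=0)         # variable threshold per calendar month
below = (monthly < q20).ravel()                  # True where the cell is in deficit

# Pool consecutive deficit months into events; drop events shorter than 3 months.
events, start = [], None
for i, b in enumerate(below):
    if b and start is None:
        start = i
    elif not b and start is not None:
        if i - start >= 3:
            events.append((start, i - 1))
        start = None
if start is not None and below.size - start >= 3:
    events.append((start, below.size - 1))

print(f"{len(events)} drought events of >= 3 months identified")
```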

Spider World is an interactive program designed to help individuals with no previous computer experience to learn the fundamentals of programming. The program emphasizes cognitive tasks which are central to programming and provides significant problem-solving opportunities. In Spider World, the user commands a hypothetical robot (called the…

Reliable future projection by climate or Earth system models is crucial for the issue of future climate change. For reliable projections, the uncertainty of the aerosol effect on climate change must be reduced, because this uncertainty has been large. It is therefore essential to understand the effect of anthropogenic aerosol forcing on climate change in the 20th century. In this study, we have assessed this effect by comparing two 20th-century historical simulations with MIROC-ESM: one in which the aerosol forcing fluctuates realistically over time (20C) and one in which it is fixed at the pre-industrial condition (piAero). We focus on the climate change in the North Pacific Ocean (NPO) due to anthropogenic aerosol emitted from China in the late 20th century. In the comparison between the two simulations, there is little difference in global mean surface air temperature (SAT) from 1851 to 1900. The difference then appears and reaches about 0.2 deg. C in the 1950s. After 1960, the difference in SAT between the two experiments becomes large. For SST change in the NPO, a small positive trend is found after 1900 in piAero, but not in 20C. Thus, the SST difference in the NPO between the two experiments is significant after 1900. While the positive SST trend in the NPO is large in piAero after 1960, SST in the central NPO shows a negative trend in 20C. These trends enlarge the SST difference between the two experiments. The negative SST trend in the central NPO in 20C is likely attributable to an increase of aerosol emission from China. The aerosol increase, which is also found over the NPO, reduces solar insolation at the surface mainly through the aerosol indirect effect. This effect decreases SST and is seen in boreal spring and summer. However, it is not found in piAero. The Pacific Decadal Oscillation (PDO), the principal natural variability in the NPO, has also been investigated. Linear trend of

We inferred climate drivers of 20th-century years with regionally synchronous forest fires in the U.S. northern Rockies. We derived annual fire extent from an existing fire atlas that includes 5038 fire polygons recorded from 12,070,086 ha, or 71% of the forested land in Idaho and Montana west of the Continental Divide. The 11 regional-fire years, those exceeding the 90th percentile in annual fire extent from 1900 to 2003 (>102,314 ha or approximately 1% of the fire atlas recording area), were concentrated early and late in the century (six from 1900 to 1934 and five from 1988 to 2003). During both periods, regional-fire years were ones when warm springs were followed by warm, dry summers and also when the Pacific Decadal Oscillation (PDO) was positive. Spring snowpack was likely reduced during warm springs and when PDO was positive, resulting in longer fire seasons. Regional-fire years did not vary with El Niño-Southern Oscillation (ENSO) or with climate in antecedent years. The long mid-20th century period lacking regional-fire years (1935-1987) had generally cool springs, generally negative PDO, and a lack of extremely dry summers; also, this was a period of active fire suppression. The climate drivers of regionally synchronous fire that we inferred are congruent with those of previous centuries in this region, suggesting a strong influence of spring and summer climate on fire activity throughout the 20th century despite major land-use change and fire suppression efforts. The relatively cool, moist climate during the mid-century gap in regional-fire years likely contributed to the success of fire suppression during that period. In every regional-fire year, fires burned across a range of vegetation types. Given our results and the projections for warmer springs and continued warm, dry summers, forests of the U.S. northern Rockies are likely to experience synchronous, large fires in the future. PMID:18459335

Twenty years ago, W. Lawrence (Larry) Gates approached the U.S. Department of Energy (DOE) Office of Energy Research (now the Office of Science) with a plan to coordinate the comparison and documentation of climate model differences. This effort would help improve our understanding of climate change through a systematic approach to model intercomparison. Early attempts at comparing results showed a surprisingly large range in control climate from such parameters as cloud cover, precipitation, and even atmospheric temperature. The DOE agreed to fund the effort at the Lawrence Livermore National Laboratory (LLNL), in part because of the existing computing environment and because of a preexisting atmospheric science group that contained a wide variety of expertise. The project was named the Program for Climate Model Diagnosis and Intercomparison (PCMDI), and it has changed the international landscape of climate modeling over the past 20 years. In spring 2009 the DOE hosted a 1-day symposium to celebrate the twentieth anniversary of PCMDI and to honor its founder, Larry Gates. Through their personal experiences, the morning presenters painted an image of climate science in the 1970s and 1980s that generated early support from the international community for model intercomparison, thereby bringing PCMDI into existence. Four talks covered Gates's early contributions to climate research at the University of California, Los Angeles (UCLA), the RAND Corporation, and Oregon State University through the founding of PCMDI to coordinate the Atmospheric Model Intercomparison Project (AMIP). The speakers were, in order of presentation, Warren Washington [National Center for Atmospheric Research (NCAR)], Kelly Redmond (Western Regional Climate Center), George Boer (Canadian Centre for Climate Modelling and Analysis), and Lennart Bengtsson [University of Reading, former director of the European Centre for Medium-Range Weather Forecasts (ECMWF)]. The afternoon session emphasized

Data from Single Photon Emission Computed Tomography (SPECT) studies are blurred by inevitable physical phenomena occurring during data acquisition. These errors may be compensated by means of reconstruction algorithms which take into account accurate physical models of the data acquisition procedure. Unfortunately, this approach involves high memory requirements as well as a high computational burden which cannot be afforded by the computer systems of SPECT acquisition devices. In this work, the possibility of accessing High Performance Computing and Networking (HPCN) resources through a World Wide Web interface for the advanced reconstruction of SPECT data in a clinical environment was investigated. An iterative algorithm with an accurate model of the variable system response was ported to the Multiple Instruction Multiple Data (MIMD) parallel architecture of a Cray T3D massively parallel computer. The system was accessible even from low-cost PC-based workstations through standard TCP/IP networking. A speedup factor of 148 was predicted by the benchmarks run on the Cray T3D. A complete brain study of 30 (64 x 64) slices was reconstructed from a set of 90 (64 x 64) projections with ten iterations of the conjugate gradient algorithm in 9 s, which corresponds to an actual speedup factor of 135. The technique was extended to a more accurate 3D modeling of the system response for a true 3D reconstruction of SPECT data; the reconstruction time for the same data set with this more accurate model was 5 min. This work demonstrates the possibility of exploiting remote HPCN resources from hospital sites by means of low-cost workstations using standard communication protocols and a user-friendly WWW interface, without particular problems for routine use. PMID:9506406
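
For illustration, a minimal sketch of iterative reconstruction by conjugate gradients, solving the normal equations A^T A x = A^T b for an image x given projections b; the random system matrix below is a toy placeholder, not a real SPECT projector with the variable system response used in the study:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(4)
n_pix, n_proj = 32 * 32, 48 * 32                  # image pixels, projection bins (toy sizes)
A = rng.random((n_proj, n_pix)) * (rng.random((n_proj, n_pix)) < 0.05)  # sparse-ish placeholder

x_true = rng.random(n_pix)
b = A @ x_true                                     # noiseless synthetic projections

# CG requires a symmetric positive (semi)definite system, so apply it to A^T A x = A^T b.
normal = LinearOperator((n_pix, n_pix), matvec=lambda x: A.T @ (A @ x))
x_hat, info = cg(normal, A.T @ b, maxiter=10)      # ten iterations, as in the study
print("CG exit flag:", info, "| relative error:",
      float(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)))
```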

At the request of the Intergovernmental Panel on Climate Change (IPCC), simulations of 20th-century climate have been performed recently with some 20 global coupled ocean-atmosphere models. In view of its central importance for biological and socio-economic systems, model-simulated continental precipitation is evaluated relative to three observational estimates at both global and regional scales. Many models are found to display systematic biases, deviating markedly from the observed spatial variability and amplitude/phase of the seasonal cycle. However, the point-wise ensemble mean of all the models usually shows better statistical agreement with the observations than does any single model. Deficiencies of current models that may be responsible for the simulated precipitation biases as well as possible reasons for the improved estimate afforded by the multi-model ensemble mean are discussed. Implications of these results for water-resource managers also are briefly addressed.
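
The finding that the multi-model ensemble mean usually agrees with observations better than any single model can be illustrated in a few lines; the synthetic fields below stand in for observed and simulated precipitation climatologies:

```python
import numpy as np

rng = np.random.default_rng(5)
obs = rng.random((72, 144))                               # placeholder observed climatology
models = obs + 0.3 * rng.standard_normal((20, 72, 144))   # 20 models with independent errors

rmse = lambda field: float(np.sqrt(((field - obs) ** 2).mean()))
individual = [rmse(m) for m in models]
ensemble = rmse(models.mean(axis=0))

print(f"best single model RMSE: {min(individual):.3f}")
print(f"ensemble mean RMSE:     {ensemble:.3f}")   # smaller: independent errors average out
```

In this idealized case the improvement follows directly from error cancellation across models; in practice shared systematic biases limit, but rarely eliminate, the benefit.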

The Prairie Pothole Region (PPR) of North America is a globally important resource that provides abundant and valuable ecosystem goods and services in the form of biodiversity, groundwater recharge, water purification, flood attenuation, and water and forage for agriculture. Numerous studies have found these wetlands, which number in the millions, to be highly sensitive to climate variability. Here, we compare wetland conditions between two 30-year periods (1946-1975; 1976-2005) using a hindcast simulation approach to determine if recent climate warming in the region has already resulted in changes in wetland condition. Simulations using the WETLANDSCAPE model show that 20th century climate change may have been sufficient to have a significant impact on wetland cover cycling. Modeled wetlands in the PPR's western Canadian prairies show the most dramatic effects: a recent trend toward shorter hydroperiods and less dynamic vegetation cycles, which already may have reduced the productivity of hundreds of wetland-dependent species. PMID:24223283

The road to the acknowledgement of personality psychology as a field of scientific psychology that has individuality as its object began in the 1920s with the founding of the discipline by Gordon W. Allport. Historians of psychology have made serious attempts to reconstruct the cultural, political, institutional, and chronological beginnings of this field in America in the 20th century. In this literature, however, an important European tradition of psychological studies of personality, which developed in France in the second half of the 19th century, has been overlooked. The aim of this article is to cast some light on this unexplored tradition of psychological personality studies and to discuss its influence on the development of the scientific study of personality in the United States. PMID:12817602

Throughout the course of the 20th century, many observers have noted important tensions and antipathies between public health and medicine. At the same time, reformers have often called for better engagement and collaboration between the 2 fields. This article examines the history of the relationship between medicine and public health to examine how they developed as separate and often conflicting professions. The historical character of this relationship can be understood only in the context of institutional developments in professional education, the rise of the biomedical model of disease, and the epidemiologic transition from infectious disease to the predominance of systemic chronic diseases. Many problems in the contemporary burden of disease pose opportunities for effective collaborations between population-based and clinical interventions. A stronger alliance between public health and medicine through accommodation to a reductionist biomedicine, however, threatens to subvert public health's historical commitment to understanding and addressing the social roots of disease. PMID:10800418

The 20th Nantes Actualités Transplantation (NAT) meeting was held on June 11, 2015, and June 12, 2015. This year, the local organizing committee selected an update on infectious diseases in solid organ and hematopoietic stem cell transplantation. The meeting was well attended, with close to 170 clinicians, researchers, students, engineers, technicians, invited speakers, and guests from North and South America, Germany, Switzerland, the Netherlands, and France. Invited speakers' expertise covered basic as well as translational microbiology, immunology, transplantation, and intensive care medicine. This report identifies a number of advances presented during the meeting in the care and management of infectious diseases in transplantation and immunocompromised patients. New antiviral immune responses and their modulation by pathogens in addition to novel antimicrobial therapeutic strategies, cell therapies, and genomic analysis were discussed. PMID:26627674

This article examines for the case of yellow fever research in Germany in the first half of the 20th century how political and military conditions affected the interests of scientific research. As a tropical disease, yellow fever was unknown in Germany and very rare in the German colonies and thus of little scientific or political interest. But this changed once the National Socialists began pursuing their wars of conquest. In preparation for a new colonial empire and a military mission in Africa, yellow fever research became increasingly important. The chief beneficiary of this development was the Robert Koch-Institute in Berlin, where Eugen Haagen worked after his time as a researcher in the Yellow Fever Laboratory in New York. In collaboration with the army and the industry, he used human experiments in his efforts to develop a vaccine for mass production. Ultimately, the vaccine's application was prevented by the German surrender in North Africa. PMID:19137979

One of the major Dutch landscapes is formed by lowland rivers. They divide the country into a southern and a northern part, both physically and culturally. We screened the freely available database of 19th- and early 20th-century paintings of Simonis & Buunk, www.simonis-buunk.com, looking for lowland river landscapes depicting geodiversity and cultural heritage relationships (see References for other landscapes). Emperor Napoleon declared that The Netherlands naturally belonged to his empire, as its lands were formed from muds originating in France and transported there by the big rivers, a description that may have given rise to the idea of the Netherlands as a delta. From a geomorphological perspective, however, The Netherlands consists of a series of river plains of terrestrial origin, of which the north-western part is subsiding and invaded by the sea. Now, the rivers Meuse and Rhine (including its branches Waal and IJssel) meander through ever larger river plains before reaching the North Sea. They end in estuaries, something one would not expect of rivers whose catchments drain a large part of Western Europe. Apart from the geological subsidence, the estuaries might be due to human interference, the exploitation of peat and building of dikes since the 11th century, heavy storms, and the strong tidal currents. Archaeological finds show that Vikings and Romans already used the river Rhine system for trading and transporting goods. During the Roman Empire the Rhine was part of the Limes, the northern defence line of the empire. The Romans already influenced the distribution of water over the different river branches. Since the middle of the 19th century, groins and canalization have drastically changed the character of the rivers. The 19th- and early 20th-century landscape paintings illustrate this change as well as changes in land use. Examples of geodiversity and cultural heritage relationships shown: - meanders and irregular banks disappear as river management increases, i.a. bends

During the 19th century, numerous figures with different qualifications claimed to practice orthopedics: doctors, surgeons, inventors of equipment and instruments, and other empiricists. They performed certain types of techniques, massages, and surgical operations, and/or fitted prostheses. The polysemous notion of orthopedics created conflicts of interest that would reach their height at the end of the 19th century. The integration of orthopedics into training at the university level enhanced its proximity to surgery, a discipline that has dominated so-called modern medicine. During the 20th century, various medical branches defended the legitimacy of certain orthopedic practices, thereby threatening to a degree the very title of this specialization. By examining the challenges that have shaped the history of orthopedics in Switzerland, this article also seeks to shed light on the strategies that were implemented in adopting a medical and technical discipline within a transforming society. PMID:26111839

20th Annual Meeting of the Society for Neuro-Oncology, San Antonio, TX, USA, 18-22 November 2015. The Society for Neuro-Oncology's annual meeting is the largest neuro-oncology meeting in the USA, a multiday venue that showcases new brain cancer clinical trial results and basic research primarily pertaining to gliomas. The 2015 meeting, comprising one education day, two days of premeetings, and three days of presentations, with over 200 oral presentations and 900 abstracts, provided an overview of contemporary neuro-oncology that includes metastatic disease of the central nervous system as well as primary brain tumors. This review attempts to highlight select abstracts presented at this year's meeting in a short summary that provides a synopsis of a large and multifaceted meeting. PMID:27230972

The positive phase of the El Niño-Southern Oscillation (ENSO) increases global mean temperature, and contributes to a negative phase of the Southern Annular Mode (SAM), the dominant mode of climate variability in the Southern Hemisphere. This interannual relationship of a high global mean temperature associated with a negative SAM, however, is opposite to the relationship between their trends under greenhouse warming. We show that over much of the 20th century this relationship undergoes multidecadal fluctuations depending on the intensity of ENSO. During the period 1925-1955, subdued ENSO activities weakened the relationship. However, a similar weakening has occurred since the late 1970s despite the strong ENSO. We demonstrate that this recent weakening is induced by climate change in the Southern Hemisphere. Our result highlights a rare situation in which climate change signals emerge against an opposing property of interannual variability, underscoring the robustness of the recent climate change. PMID:23784087

Purpose of review: To provide an overview of conceptualizations of female sexual problems, and 'Female Sexual Dysfunction' in particular, throughout the 20th century, especially in relation to psychiatry and mental illness. Recent findings: In the past 15 years, there has been an increase in both medical and public discourse about 'Female Sexual Dysfunction'. I discuss a variety of literature sources dealing with female sexual problems, where these are understood variously as problems of developmental psychopathology, as technical phenomena to be resolved through education, or as medical problems to be addressed pharmaceutically. Summary: The stigma of mental illness shapes much recent discussion of female sexual problems, as does the legacy of the postwar critique of psychodynamic psychiatry. PMID:20802336

Alpine glacier retreat resulting from global warming since the close of the Little Ice Age in the 19th and 20th centuries has increased the risk and incidence of some geologic and hydrologic hazards in mountainous alpine regions of North America. Abundant loose debris in recently deglaciated areas at the toe of alpine glaciers provides a ready source of sediment during rainstorms or outburst floods. This sediment can cause debris flows and sedimentation problems in downstream areas. Moraines built during the Little Ice Age can trap and store large volumes of water. These natural dams have no controlled outlets and can fail without warning. Many glacier-dammed lakes have grown in size, while ice dams have shrunk, resulting in greater risks of ice-dam failure. The retreat and thinning of glacier ice has left oversteepened, unstable valley walls and has led to increased incidence of rock and debris avalanches. © 1993 Kluwer Academic Publishers.

The smoking prevalence by age of women in China is distinct from most other countries in showing more frequent smoking among older women than younger. Using newly developed birth cohort histories of smoking, the authors demonstrate that although over one quarter of women born 1908–1912 smoked, levels of smoking declined across successive cohorts. This occurred despite high rates of smoking by men and the wide availability of cigarettes. The analysis shows how this pattern is counter to that predicted by the leading theoretical perspectives on the diffusion of smoking and suggests that it arose out of a mix of Confucian traditions relating to gender and the socio-economic and political events early in the 20th century which placed emerging women's identities in conflict with national identities. That a similar pattern of smoking is evident in Japan and Korea, two countries with strong cultural affinities to China, is used to buttress the argument. PMID:22904585

During the first half of the 20th century numerous drugs, foodstuffs and cosmetics were brought on the market, whose supposed effects were explained with their weak radioactivity. Their subtle radiation was believed to stimulate the vital forces of the body, thus leading to recovery from illness, or to an improvement in beauty and to rejuvenation. Among others, bath and drinking waters enriched with radioactive materials were advertised for this purpose. The then known radioactive medicines included preparations of healing earth, the so-called Salus-Oil, the TRUW preparations, and "Radithor", which was popular in the United States. There were also radioactive foodstuffs (butter, chocolate, rusk) and cosmetics. This mild radiotherapy may be characterised as a form of bio-dynamistic healing. PMID:16382691

Throughout the 20th century, US public health and immigration policies intersected with and informed one another in the country's response to Mexican immigration. Three historical episodes illustrate how perceived racial differences influenced disease diagnosis: a 1916 typhus outbreak, the midcentury Bracero Program, and medical deportations that are taking place today. Disease, or just the threat of it, marked Mexicans as foreign, just as much as phenotype, native language, accent, or clothing. A focus on race rendered other factors and structures, such as poor working conditions or structural inequalities in health care, invisible. This attitude had long-term effects on immigration policy, as well as on how Mexicans were received in the United States. PMID:21493932

More than 30 Chinese students went to Great Britain to study physics during the first half of the 20th century. They were concentrated at London University (13), Cambridge University (9), Edinburgh University (5), and Manchester University (3), among others. All these students returned to China after finishing their studies, and most of them later became excellent physicists who contributed to the development of physics in China. Based on newly discovered primary materials concerning these Chinese physicists, I examine their studies in the UK and their subsequent accomplishments after returning to China. I then analyze these British-trained Chinese physicists and compare them with those who studied in Japan and America. I argue that the Chinese physicists educated in Britain had a high degree of specialization as a whole and formed a unique style, making certain distinctive contributions to the development of Chinese physics.

Results are presented for studies of the microstructure and chemical composition of a number of articles made of iron and cast iron at the Kamensk plant, covering the period from the start of iron production on the territory of the city of Kamensk-Ural'skii at the turn of the 17th-18th centuries to the beginning of the 20th century. Differences in composition between the Kamensk cast iron and modern grades of foundry cast iron have been established. Possible sources of technological difficulties and production waste at the Kamensk plant have been revealed. The potential of metallographic studies for the attribution of historical articles made of ferrous metals is shown.

A principal tenet of the scientific method is that experiments must be repeatable, relying on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

Individual stature depends on multifactorial causes and is often used as a proxy for investigating the biological standard of living. While the majority of European studies on 19th- and 20th-century populations are based on conscript heights, statures derived from skeletal remains are scarce. For the first time in Denmark, this study compares skeletal stature with contemporary Danish conscript heights and investigates the stature of males and females temporally and between socially distinct individuals and populations in 19th- and early 20th-century Copenhagen. A total of 357 individuals (181 males, 176 females) excavated at the Assistens cemetery in Copenhagen is analyzed. Two stature regression formulae (Trotter, 1970; Boldsen, 1990) are applied using femur measurements and evaluated against conscript heights. The results indicate that mean male stature using Boldsen follows a similar trend to the Danish conscript heights and that Trotter overestimates stature by ca. 6 cm relative to Boldsen. At an inter-population level, statistically significant differences in male stature are observed between the first and second halves of the 19th century, towards a slight stature decrease and larger variation, while no significant changes are observed in female stature. There are insignificant differences in stature between middle- and high-class individuals, but male stature differs statistically (p=0.000) between cemeteries representing middle/high class, paupers, and navy employees, respectively. Female stature had no significant wealth gradient (p=0.516). This study provides new evidence on stature among males and females during the 19th century and suggests that males may have been more sensitive than females to changes in environmental living conditions and nutrition. PMID:26256129
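
For illustration, a sketch of femur-based stature estimation of the kind applied above; the coefficients are the commonly cited Trotter (1970) regressions for white males and females, quoted from memory and to be checked against the source, and the Boldsen (1990) formula is not reproduced here:

```python
# Hypothetical helper; coefficients quoted from memory, verify against Trotter (1970).
def stature_trotter(femur_cm: float, sex: str) -> float:
    """Estimate living stature (cm) from maximum femur length (cm)."""
    if sex == "male":
        return 2.38 * femur_cm + 61.41    # white males
    if sex == "female":
        return 2.47 * femur_cm + 54.10    # white females
    raise ValueError("sex must be 'male' or 'female'")

print(f"{stature_trotter(45.0, 'male'):.1f} cm")   # ~168.5 cm for a 45 cm femur
```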

The Cariaco Basin, Venezuela, is well-positioned to record a detailed history of surface ocean changes along the southern margin of the Caribbean and the tropical Atlantic. Varved, high deposition rate sediments deposited under anoxic conditions and an abundance of well-preserved microfossils result in one of the few marine records capable of preserving evidence of interannual- to decadal-scale climate variability in the tropical Atlantic. Here we present Mg/Ca and stable oxygen isotope data with sub-decadal resolution derived from sediments deposited over the last 800 years. Mg/Ca measured on the planktic foraminifer Globigerina bulloides from a Cariaco Basin sediment core strongly correlates with spring (March-May) instrumental SSTs between AD 1870 and 1990. The long-term record displays a surprising amount of variability for a tropical location. The temperature swings are not necessarily related to local upwelling variability, but instead represent wider conditions in the Caribbean and western tropical Atlantic. The Mg/Ca-SST record also captures the decadal and multidecadal variability observed in global land and sea surface temperature anomalies, and correlates with Atlantic tropical storm and hurricane frequency over the late-19th and 20th centuries. On average, 20th century temperatures are not the warmest in the entire record, but they do show the largest increase in magnitude and fastest rate of SST change over the last eight hundred years. Stable oxygen isotope data also correlate well with instrumental SSTs, but not over the full instrumental record. Poor correlations with early instrumental SST data suggest a salinity overprint. However, reconstructing δ18O-of-seawater variability using combined Mg/Ca and δ18O data is not straightforward, as the δ18O-of-seawater/salinity relationship varies seasonally in the Cariaco Basin. Comparisons with percent titanium data suggest intervals of both local and regional surface salinity changes over the length of the record.
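
The Mg/Ca paleothermometer underlying such records is conventionally calibrated with an exponential relation, Mg/Ca = B·exp(A·SST), fit as a linear regression in log space and then inverted downcore. A minimal sketch on synthetic data (the actual G. bulloides calibration used in the study may differ):

```python
import numpy as np

rng = np.random.default_rng(6)
sst = np.linspace(22.0, 28.0, 60)                        # instrumental SSTs (deg C)
mgca = 0.5 * np.exp(0.09 * sst) * np.exp(0.03 * rng.standard_normal(60))  # synthetic Mg/Ca

A, lnB = np.polyfit(sst, np.log(mgca), 1)                # linear fit in log space
sst_recon = (np.log(mgca) - lnB) / A                     # inversion applied downcore
print(f"A = {A:.3f} per deg C, B = {np.exp(lnB):.3f} mmol/mol")
```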

Background: Time trends in infant mortality for the 20th century show a curvilinear pattern that most demographers have assumed to be approximately exponential. Virtually all cross-country comparisons and time series analyses of infant mortality have studied the logarithm of infant mortality to account for the curvilinear time trend. However, there is no evidence that the log transform is the best fit for infant mortality time trends. Methods: We use maximum likelihood methods to determine the best transformation to fit time trends in infant mortality reduction in the 20th century and to assess the importance of the proper transformation in identifying the relationship between infant mortality and gross domestic product (GDP) per capita. We apply the Box-Cox transform to infant mortality rate (IMR) time series from 18 countries to identify the best-fitting value of λ for each country and for the pooled sample. For each country, we test the value of λ against the null that λ = 0 (logarithmic model) and against the null that λ = 1 (linear model). We then demonstrate the importance of selecting the proper transformation by comparing regressions of ln(IMR) on same-year GDP per capita against Box-Cox transformed models. Results: Based on chi-squared test statistics, infant mortality decline is best described as an exponential decline only for the United States. For the remaining 17 countries we study, IMR decline is best modelled neither as a logarithmic nor as a linear process. Imposing a logarithmic transform on IMR can lead to bias in fitting the relationship between IMR and GDP per capita. Conclusion: The assumption that IMR declines are exponential is enshrined in the Preston curve and in nearly all cross-country as well as time series analyses of IMR data since Preston's 1975 paper, but this assumption is seldom correct. Statistical analyses of IMR trends should assess the robustness of findings to transformations other than the log transform. PMID:19698144
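
A minimal sketch of the Box-Cox step described above: estimate the maximum-likelihood λ for an IMR series and compare it against the logarithmic (λ = 0) and linear (λ = 1) special cases with likelihood-ratio statistics. The synthetic series and the marginal (rather than trend-regression) formulation are simplifying assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
years = np.arange(1900, 2000)
# Synthetic IMR: roughly exponential decline with multiplicative noise.
imr = 150 * np.exp(-0.03 * (years - 1900)) * np.exp(0.05 * rng.standard_normal(years.size))

_, lam = stats.boxcox(imr)                      # MLE of the Box-Cox lambda
llf = lambda l: stats.boxcox_llf(l, imr)        # profile log-likelihood at a given lambda
lr_log = 2 * (llf(lam) - llf(0.0))              # likelihood ratio vs. logarithmic model
lr_lin = 2 * (llf(lam) - llf(1.0))              # likelihood ratio vs. linear model
print(f"lambda_hat = {lam:.2f}, "
      f"p(log) = {stats.chi2.sf(lr_log, df=1):.3f}, "
      f"p(linear) = {stats.chi2.sf(lr_lin, df=1):.3f}")
```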

Identifying 20th-century periodic coastal surge variation is strategic for 21st-century coastal surge estimates, as surge periodicities may amplify or reduce future MSL-enhanced surge forecasts. Extreme coastal surge data from Belfast Harbour (UK) tide gauges are available for 1901-2010 and provide the potential for decadal-plus periodic coastal surge analysis. Annual extreme surge-elevation distributions (sampled every 10 min) are analysed using PCA and cluster analysis to decompose variation within and between years, to assess the similarity of years in terms of Surge Climate Types, and to establish the significance of any transitions in Type occurrence over time using non-parametric Markov analysis. Annual extreme surge variation is shown to be periodically organised across the 20th century. Extreme surge magnitude and distribution show a number of significant cyclone-induced multi-annual (2, 3, 5 and 6 years) cycles, as well as dominant multi-decadal (15-25 years) cycles of variation superimposed on an 80-year fluctuation in atmospheric-oceanic variation across the North Atlantic (relative to NAO/AMO interaction). The top 30 extreme surge events show some relationship with NAO per se, given that 80% are associated with westerly dominant atmospheric flows (+NAO), but 20% of the events are associated with blocking air masses (-NAO). Although 20% of the top 30 ranked positive surges occurred within the last twenty years, there is no unequivocal evidence of recent acceleration in extreme surge magnitude beyond the scale of natural periodic variation.
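
A minimal sketch of the decomposition described above: PCA on a matrix of annual extreme-surge percentile profiles followed by k-means clustering of years into types; the synthetic data, the number of retained components, and the number of types are all illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
n_years = 110                                   # e.g., 1901-2010
pctl = np.arange(5, 100, 5)
# One annual profile = percentiles of that year's surge elevations (Gumbel placeholder).
profiles = np.array([np.percentile(rng.gumbel(0.3, 0.1, 500), pctl)
                     for _ in range(n_years)])

scores = PCA(n_components=3).fit_transform(profiles)            # within/between-year structure
types = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("years per Surge Climate Type:", np.bincount(types))
```

The yearly sequence of type labels is then the natural input to a Markov-style transition analysis of Type occurrence over time.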

This special section of the Journal of Micromechanics and Microengineering is devoted to the 20th European Workshop on Micromechanics (MME 2009), which was held in Toulouse, France, 20-22 September 2009. The MME workshop series started in 1989 in Twente and was the first European event created in the field of micro machining technology for developing micro components, micro sensors, micro actuators, and micro systems. Over the last two decades the MEMS community has grown considerably, and the MME workshops have sustained this progress through annual meetings all around Europe: Twente (The Netherlands, 1989), Berlin (Germany, 1990), Leuven (Belgium, 1992), Neuchatel (Switzerland, 1993), Pisa (Italy, 1994), Copenhagen (Denmark, 1995), Barcelona (Spain, 1996), Southampton (United Kingdom, 1997), Ulvik (Norway, 1998), Gif-sur-Yvette (France, 1999), Uppsala (Sweden, 2000), Cork (Ireland, 2001), Sinaia (Romania, 2002), Delft (The Netherlands, 2003), Leuven (Belgium, 2004), Goteborg (Sweden, 2005), Southampton (United Kingdom, 2006), Guimaraes (Portugal, 2007) and Aachen (Germany, 2008). For twenty years, MME conferences have provided an excellent opportunity to bring together many, predominantly European, scientists and engineers to present and discuss the latest developments in this field. For the 20th anniversary of the MicroMechanics Europe Workshop, 115 papers from 23 countries were submitted. Selected contributions were presented during four poster sessions, including short oral presentations. A very interesting feature of the MME workshops is their ability to promote young researchers. Six invited speakers from research centres and industry also gave an overview on advanced technological, characterization and simulation tools. The two day workshop was attended by 185 delegates from 22 countries all over Europe, and from Japan, Taiwan, USA and Mexico. On behalf of the MME 2009 Program Committee, I would like to express my sincere gratitude to all authors of

The Russian climate is more sensitive to global warming than the climate in many other parts of the world. According to the Second Climate Change National Assessment, since the mid-1970s the average temperature has been rising at a rate of 0.43°C per decade, which is more than two times higher than the rate of global warming. In the Altai region, the rate of temperature change is higher than the average for Russia, with an annual surface air temperature increase equal to 1.8°C over the 20th century. The maximum value of this increase over the past 50 years (1963-2013) was found in the intermountain basins of Altai (+2.6°C), mainly due to winter and spring warming, with changes in the summer season being considerably smaller. This warming is accompanied by negative tendencies in annual precipitation over the entire Altai Krai. The mountain ranges of Altai are called the "water tower" of Northern Eurasia. The northward flow of numerous rivers streaming down from these ranges into the basins of the Ob and Irtysh Rivers is formed by melting of Altai glaciers and snowfields. Since the middle of the 19th century the largest glaciers in the Altai have retreated by 1.5-2 km and the thickness of their tongues has decreased by 50-70 m. The reduction of mountain glaciers poses a threat of depleted water flow to major agricultural regions downstream, affecting human activity and even drinking water availability. Permafrost in the Altai Mountains is actively degrading (thawing), which endangers infrastructure (above all roads and pipelines) and increases the risk of catastrophic events (landslides, mudflows). Continued warming could contribute to a significant reduction of water resources, biodiversity and other negative processes in the region. The reported study was partially supported by the Russian Foundation for Basic Research (project No. 15-45-04450).

Profiling the lives of children around the world at the end of the 20th century, this report calls on the international community to undertake the urgent actions necessary to realize the rights of every child. Part 1 of the report summarizes the progress made over the last decade in meeting the goals established at the 1990 World Summit for…

Southeastern South America (SESA) rainfall presents large variability from interannual to multidecadal time scales and is influenced by the tropical Pacific, Atlantic and Indian oceans. At the same time, these tropical oceans interact with each other, inducing sea surface temperature anomalies in remote basins through atmospheric and oceanic teleconnections. In this study we employ a tool from complex networks to analyze the collective influence of the three tropical oceans on austral spring rainfall variability over SESA during the 20th century. To do so, we construct a climate network considering as nodes the observed Niño3.4, Tropical North Atlantic (TNA), and Indian Ocean Dipole (IOD) indices, together with an observed or simulated precipitation (PCP) index over SESA. The mean network distance is taken as a measure of synchronization among all these phenomena during the 20th century. The approach allowed us to uncover large interannual and interdecadal variability in the interaction among nodes. In particular, there are two main synchronization periods characterized by different interactions among the oceanic and precipitation nodes. Whereas in the 1930s El Niño and the TNA were the main tropical oceanic phenomena influencing SESA precipitation variability, during the 1970s they were El Niño and the IOD. Simulations with an Atmospheric General Circulation Model reproduced the overall behavior of the collective influence of the tropical oceans on rainfall over SESA, and allowed us to study the circulation anomalies that characterized the synchronization periods. In agreement with previous studies, the influence of El Niño on SESA precipitation variability might be understood through an increase in the northerly transport of moisture at lower levels and advection of cyclonic vorticity at upper levels. On the other hand, the interaction between the IOD and PCP can be interpreted in two possible ways. One possibility is that both nodes (IOD and PCP) are forced
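
A hedged sketch of the climate-network synchronization measure described above: four nodes (Niño3.4, TNA, IOD, SESA precipitation), edges from running correlations, and the mean "distance" as a synchronization index. The window length and the distance definition (1 - |r|) are illustrative assumptions, not the paper's exact choices:

```python
# Mean network distance over a sliding window: small values indicate that the
# oceanic and precipitation nodes vary together (synchronization periods).
import numpy as np

rng = np.random.default_rng(1)
n_years = 100
series = {name: rng.normal(size=n_years)          # placeholder index series
          for name in ["nino34", "tna", "iod", "pcp"]}

def mean_network_distance(window):
    names = list(window)
    dists = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = np.corrcoef(window[names[i]], window[names[j]])[0, 1]
            dists.append(1.0 - abs(r))            # short distance = strong link
    return np.mean(dists)

win = 15                                          # sliding-window length (years)
sync = [mean_network_distance({k: v[t:t + win] for k, v in series.items()})
        for t in range(n_years - win)]
# Minima of `sync` would mark candidate synchronization periods.
print(min(sync), max(sync))
```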

Paleoceanographic reconstructions have revealed recent changes in oceanic heat transport into the Arctic that are unprecedented over the past 2000 to 3000 years [Spielhagen et al., 2011; Dylmer et al., 2013]. The 20th century increase in heat transport is related to intensified Atlantic Water advection and manifests itself in terms of changes in foraminiferal assemblages and geochemical tracers. Here we present results from an ensemble of three simulations covering the last millennium (850 - 2005 CE). The experiments were conducted in the framework of PMIP3/CMIP5 using the Max Planck Institute Earth System Model for paleo applications (MPI-ESM-P). The model features the same grids (ECHAM6: T63/L47, MPIOM:1.5°/L40) as the standard CMIP5 model (MPI-ESM-LR [Jungclaus et al., 2013]) but does not include dynamic vegetation. External forcings are applied following the PMIP3 protocol [Schmidt et al., 2011]. Combining forced simulations over pre-industrial times with those over the last 150 years enables us to put changes observed in the modern period in context with a period where only natural forcing agents were active. Additional long unperturbed control simulations serve to discriminate between forced and internally-generated variability. For the northward oceanic heat transports in the northern North Atlantic and in the Nordic Seas all simulations show pronounced multi-centennial variations and an unprecedented increase in the 20th century. The changes in heat transport can be traced back to changes in the gyre circulation in the subpolar North Atlantic with some local amplification. We investigate the reason for the changes in ocean circulation and atmospheric variability modes and discuss the implications for the role of ocean lateral transports for Arctic amplification of global climate change. Dylmer, C.V. et al. [2013], Northward advection of Atlantic water in the eastern Nordic Seas over the last 3000yr., Clim. Past., 9, 1505-1518. Jungclaus, J.H. et al. [2013

Social and economic changes in Lódź, transforming the town into a strong textile industry centre at the turn of the 19th and 20th century, stimulated the development of medical sciences and professional activity. Between 1800 and 1914 a 600-fold increase in the number of inhabitants occurred, a demographic phenomenon unknown in Europe at that time. This led to specific sanitary and hygiene conditions, negatively affecting the population's health. In these social and economic conditions, and with the Russian occupying administration unwilling to act, medical professionals and other social groups, in particular founders of the textile industry (K. Scheibler, I. Poznanski, I. Heinzel, I. Kunitzer and others), acted spontaneously to solve urgent medical problems. In this period a number of modern private and public hospitals, such as the neuro-psychiatric "Kochanówka" hospital, the paediatric "Anna-Maria" hospital, factory-owned hospitals and other social aid centres, were organised. These institutions, apart from clinical services to society, carried out scientific research. A number of scientists were active in Lódź at that time: Stanisław Bartoszewicz, Józef Brudzinski, Józef Kollnski, Jan Mazurkiewicz, Witold Chodźko, Tadeusz Mogielnicki, Józef Maybaum-Marzynski, Stanislaw Serkowski, Stanislaw Skalski, Emanuel Sonnenberg and others. They were involved in new fields of medical research, such as paediatric diseases, neurological and psychiatric diseases, neoplastic diseases, infectious diseases and occupational medicine. All these activities were co-ordinated by the Lódź Medical Association, the Lódź Division of the Warsaw Society of Hygiene, the Industrial Medicine Association and the Tuberculosis League. At the turn of the 19th and 20th century Lódź was listed among well-known centres of modern diagnostic and therapeutic methods in the management of tuberculosis, childhood diseases and neuropsychiatric diseases. Professional skills and scientific activity of physicians from Lódź, their efforts in

During most of the 20th century, precipitation has been continuously measured by means of the so-called "pluviographs", i.e. rain gauges including a mechanical apparatus for continuously recording the depth of water from precipitation on specific strip charts, usually on a weekly basis. The signal recorded on such strips was visually examined by trained personnel on a regular basis, in order to extract the daily precipitation totals and the maximum precipitation intensities over short periods (from a few minutes to hours). The rest of the high-resolution information contained in the signal was usually not extracted, except in specific cases. A systematic recovery of the entire high-temporal-resolution information contained in these precipitation signals would provide a fundamental database for improving the characterization of historical rainfall climatology during the previous century. The Department of Land Engineering of the University of Cagliari has recently developed and tested automatic software, based on image analysis techniques, which acquires the scanned images of the pluviograph strip charts, automatically digitises the signal and produces a digital database of continuous precipitation records at the highest possible temporal resolution, i.e. 5 to 10 minutes. Along with that, a significant amount of daily precipitation totals from the late 19th and the 20th century, either elaborated from pluviograph strip charts or simply derived from bucket rain gauges, still exists in paper form but has never been digitised. Within a project partly funded by the Operational Programme of the European Union "Italia-Francia Marittimo", the Regional Environmental Protection Agency of Sardinia and the University of Cagliari will recover both the high-resolution rainfall signals and the older time series of daily totals recorded by a large number of pluviographs belonging to the historical monitoring networks of the island of Sardinia. Such data
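
An illustrative sketch of the column-wise trace extraction that underlies this kind of strip-chart digitisation: threshold the ink, then take the ink position in each pixel column as the pen trace. Real charts need de-skewing, siphon-reset handling and per-chart calibration; the synthetic image and calibration constants below are toy assumptions, not the Cagliari software:

```python
# Extract a pen trace from a (synthetic) scanned strip chart and convert
# pixel coordinates to physical units.
import numpy as np

# Synthetic "scan": dark trace (0 = black ink) on light paper (255).
h, w = 200, 600
img = np.full((h, w), 255, dtype=np.uint8)
rows = (100 + 60 * np.sin(np.linspace(0, 6, w))).astype(int)
img[rows, np.arange(w)] = 0

ink = img < 128                                   # binary ink mask
trace_px = np.array([np.mean(np.where(col)[0]) if col.any() else np.nan
                     for col in ink.T])           # mean ink row per column

# Map pixels to physical units (chart-specific calibration, assumed here).
mm_per_px_depth, minutes_per_px = 0.1, 1.4
depth_mm = (h - trace_px) * mm_per_px_depth
time_min = np.arange(w) * minutes_per_px
print(depth_mm[:5], time_min[:5])
```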

In the present study, an isotope-incorporated GCM simulation for AD 1871 to AD 2008, nudged toward the so-called "20th Century Reanalysis (20CR)" atmospheric fields, is conducted. Before the long-term integration, a method to downscale ensemble-mean fields is proposed, since 20CR is a product of 56-member ensemble Kalman filter data assimilation. The method applies a correction to one of the ensemble members such that its seasonal mean equals that of the ensemble mean; the corrected member is then input into the isotope-incorporated GCM (i.e., IsoGSM) with the global spectral nudging technique. Use of the method clearly improves the skill compared with using only a single member or the ensemble mean; the skill becomes equivalent to directly using 3-6 members. By comparison with the GNIP precipitation isotope database, it is confirmed that the 20C Isotope Reanalysis's performance for the latter half of the 20th century is comparable to other recent studies. For comparisons over older periods, proxy records including corals, tree rings, and tropical ice cores are used. First, for corals: the 20C Isotope Reanalysis successfully reproduced the δ18O in surface sea water recorded in corals at many sites covering large parts of the global tropical oceans. The comparison suggests that coral records represent past hydrologic balance information where interannual variability in precipitation is large. Secondly, for tree rings: δ18O of cellulose extracted from the annual rings of the long-lived Bristlecone Pine from White Mountain in Southern California is well reproduced by the 20C Isotope Reanalysis. Similar good performance is obtained for Cambodia, too. However, the mechanisms driving the isotopic variations differ between California and Cambodia; for California, the Hadley cell's expansion, the consequent meridional shift of the subsiding dry zone and changes in water vapor source are the dominant controls, but in Cambodia
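
A minimal sketch of the downscaling correction described above: adjust one ensemble member so its seasonal means match those of the ensemble mean, while preserving the member's high-frequency variability. The array shapes and the crude seasonal grouping are illustrative assumptions:

```python
# Correct one ensemble member toward the ensemble-mean seasonal means.
import numpy as np

rng = np.random.default_rng(2)
n_members, n_months = 56, 120
ensemble = rng.normal(size=(n_members, n_months))   # one field's time series

member = ensemble[0]
ens_mean = ensemble.mean(axis=0)

season = (np.arange(n_months) // 3) % 4             # crude 4-season tag
corrected = member.copy()
for s in range(4):
    idx = season == s
    # Shift the member so its seasonal mean equals the ensemble-mean value.
    corrected[idx] += ens_mean[idx].mean() - member[idx].mean()

# Seasonal means now agree; sub-seasonal variability is the member's own.
assert np.allclose(corrected[season == 0].mean(), ens_mean[season == 0].mean())
```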

In December 2011, over 800 people experienced the exhibit <1>:"der"//pattern for a virtual environment, created for the fully immersive CAVE™ at the University of Wisconsin-Madison. This exhibition took my nature-based photographic work and reinterpreted it for virtual reality (VR). Varied responses such as "It's like a moment of joy," "I had to see it twice," or "I'm still thinking about it weeks later" were common. Although an implied goal of my 2D artwork is to create a connection that makes viewers more aware of what it means to be a part of the natural world, these six VR environments opened up an unexpected area of inquiry that my 2D work has not. Even as the experience was mediated by machines, there was a softening at the interface between technology and human sensibility. Somehow, for some people, through the unlikely auspices of a computer-driven environment, the project spoke to a human essence that they connected with in a way that went beyond all expectations and felt completely out of my hands. Other interesting behaviors were noted: in some scenarios some visitors spoke of intense anxiety, acrophobia, claustrophobia, even fear of death when the scene took them underground. These environments were believable enough to cause extreme responses and disorientation for some people; were fun, pleasant and wonder-filled for most; and were liberating, poetic and meditative for many others. The exhibition seemed to promote imaginative skills, creativity, emotional insight, and environmental sensitivity. It also revealed the CAVE™ to be a powerful tool that can encourage uniquely productive experiences. Quite by accident, I watched as these nature-based environments revealed and articulated an essential relationship between the human spirit and the physical world. The CAVE™ is certainly not a natural space, but there is clear potential to explore virtual environments as a path to better and deeper connections between people and nature. We've long associated

The pigment composition of six painted metal knight shields of the Order of the Elephant, dating from the second half of the 20th century and belonging to the Danish royal collection, was studied using Raman microscopy. By focusing a 785 nm laser with a 50× objective on particles in paint cross sections, it was possible to identify the following 20 compounds: hematite, goethite, chrome red/orange, chrome yellow, zinc chrome yellow, carbon black, toluidine red PR3, chlorinated para red PR4, dinitroaniline orange PO5, phthalocyanine blue PB15, indanthrone blue PB60, ultramarine, Prussian blue, lead white, anatase, rutile, calcium carbonate, barium sulphate, gypsum and dolomite. The components were frequently present in complex pigment mixtures. Additional information was obtained by elemental analysis with scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX) to identify cobalt blue, zinc white and cadmium red, as well as to indicate the presence of zinc white in some pigment mixtures. The study allowed a comparison between the industrially applied preparation layers and the artistic paint layers applied by the heraldic painter. Differences in the choice of paint and pigment types were observed on the earliest knight shields, demonstrating a general delay in the uptake of industrial materials into artists' paints. PMID:26023056

The Great Plains of North America are susceptible to multi-year droughts, such as the 1930s 'Dust Bowl'. The droughts have been linked to SST variability in the Pacific and Atlantic basins. This observationally rooted analysis shows the SST influence on multi-year droughts and wet episodes over the Great Plains to be significantly more extensive than previously indicated. The remarkable statistical reconstruction of the major hydroclimate episodes attests to the extent of the SST influence in nature, and facilitated evaluation of the basin contributions. We find the Atlantic SSTs to be especially influential in forcing multi-year droughts, often more so than the Pacific ones. The Atlantic Multidecadal Oscillation (AMO), in particular, contributed the most in two of the four reconstructed episodes (Dust Bowl spring, 1980s fall wetness), accounting for almost half the precipitation signal in each case. Circulation context for the AMO influence on continental precipitation was provided by analysis of NOAA's 20th Century Atmospheric Reanalysis. A hypothesis for how the AMO atmospheric circulation anomalies are generated from AMO SSTs is proposed to advance discussion of the influence pathways of the mid-to-high-latitude SST anomalies. Our analysis suggests that the La Niña-US Drought paradigm, operative on interannual time scales, has been conferred excessive relevance on decadal time scales in the recent literature.
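
A hedged sketch of one way such a statistical reconstruction can be set up: regress Great Plains precipitation on Pacific and Atlantic SST indices, then read off each basin's contribution to a reconstructed episode. The indices, coefficients and data below are synthetic placeholders, not the study's actual inputs or method:

```python
# Linear reconstruction of precipitation from two SST indices, with a
# basin-wise attribution of the reconstructed variance.
import numpy as np

rng = np.random.default_rng(3)
n = 100                                           # years
pacific, amo = rng.normal(size=n), rng.normal(size=n)
precip = 0.3 * pacific + 0.6 * amo + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), pacific, amo])
beta, *_ = np.linalg.lstsq(X, precip, rcond=None)

recon = X @ beta
amo_part = beta[2] * amo                          # AMO contribution
print(f"AMO share of reconstructed variance: "
      f"{np.var(amo_part) / np.var(recon):.2f}")
```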

Wilhelm Bölsche (1861-1939) is the author of a poetic history of the evolution of love entitled Das Liebesleben in der Natur (1898-1903). This work, inspired by the writings of biologist Ernst Haeckel, was greatly successful in Germany at the beginning of the 20th century. Freud kept a copy of the three volumes in his London library and cites the text in his lectures on an Introduction to Psychoanalysis. Bölsche develops an Entwicklungsgeschichte (history of evolution) of sexuality, distinguishing several types of love (oral, anal and urinary). In addition, he describes the "zoological reactionary" homosexual and ties this sexual behaviour to the history of the development of anal sexuality. This paper addresses an excerpt on this topic from Bölsche's text that has been translated for the occasion. The task at hand is to prepare the ground for a study of German evolutionism, both popular and scientific, and its ties to psychoanalysis. PMID:15368944

During the first half of the 19th century, idealistic morphology developed into an influential research program in the German biosciences. This program was based on the concept of an ideal connection existing between various living beings. The growth of Darwinian thought and its new paradigm of historical explanation supplanted idealistic morphology. Yet in the first half of the 20th century the principles of idealistic morphology experienced a powerful revival. Wilhelm Troll (1897-1978) was one of the most significant figures in this renaissance. Guided by the ideas of J.W. von Goethe, Troll established a research program rejecting causal, functional, and phylogenetic explanations as well as the idea of evolutionary adaptation. Instead, he attempted to create a 'pure' morphology based on the descriptions of various plant species. Governed by some explicitly metaphysical presumptions, Troll based his theory on the description of the organismal Gestalt. In consequence, his theory was actually a return to the proper idealistic morphology as it was known in the early 19th century. It led German botanical morphology into a period of methodological and epistemological regression. PMID:16602487

For many centuries, the progress of Anatomy was based on the persevering and laborious work of anatomists. Their work can be considered a fight for Anatomy. A particular problem arises when this fight is carried to excess and limits the ability to correctly assess the contributions made by other scientists in the same domain. This situation was identified in personalities who lived at the same time and were involved in the same fundamental field of research. If, theoretically, the similarities between scientists should bring them closer together and foster better communication, antagonisms can lead them to rivalry. Our paper exemplifies a historical case whose protagonists are Victor Papilian (1888-1956) and Grigore T. Popa (1892-1948). The resemblances between these two famous Romanian anatomists of the first half of the 20th century induced an evident scientific rivalry. Papilian and Popa both made significant contributions to anatomy. Each of them was much appreciated by his students. It is interesting that both of them also had achievements in literature. We present the reasons for their disagreement and its consequences. Paradoxically, it was not the contrasts but the resemblances between their strong characters that produced a sort of animosity between them. This attitude diminished in time, and their successors (Ioan Albu from the Cluj Faculty of Medicine and Ion Iancu from the Jassy Faculty of Medicine) had a long-lasting and successful cooperation. PMID:27151732

Trends over the 20th century were examined in streamflow, river ice, and snowpack for coastal river basins in Maine. Trends over time were tested in the timing and magnitude of seasonal river flows, the occurrence and duration of river ice, and changes in snowpack depth, equivalent water content, and density. Significant trends toward earlier spring peak flow and earlier center-of-volume runoff dates were found in the extended streamflow record spanning 1906-21 and 1929-2000. Only one of the six coastal rivers analyzed for trends in cumulative runoff had a significant change in total annual runoff volume. Last spring river-ice-off dates at most coastal streamflow-gaging stations examined are trending toward earlier dates. Trends toward later initial fall onset of ice are also evident, although these trends are significant at fewer stations than observed for ice-off dates. Later ice-on dates in the fall and (or) earlier ice-off dates in the spring contribute to a statistically significant decrease over time in the total number of days of ice occurrence at most gaging stations on coastal rivers in Maine. The longest, most complete snow records in coastal Maine indicate an increase in snow density for the March 1 snow-survey date during the last 60 years. The historical trends in streamflow, ice, and snow are all consistent with an earlier onset of hydrologic spring conditions in coastal Maine.
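
The abstract does not name its trend test; the Mann-Kendall test is a standard non-parametric choice for hydrologic series like these, so a minimal implementation is sketched here purely for illustration (no tie correction; the example data are synthetic):

```python
# Mann-Kendall test for a monotonic trend in a time series.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the MK S statistic and a two-sided p-value (no tie correction)."""
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

# Example: day-of-year of spring peak flow drifting earlier over 70 years.
rng = np.random.default_rng(4)
peak_doy = 110 - 0.2 * np.arange(70) + rng.normal(scale=5, size=70)
print(mann_kendall(peak_doy))   # negative S => trend toward earlier dates
```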

Comparison of single-forcing variants of 20th century historical experiments in a subset of models from the Fifth Coupled Model Intercomparison Project (CMIP5) reveals that South Asian summer monsoon rainfall increases towards the present day in GHG-only experiments relative to pre-industrial levels, while it decreases in anthropogenic aerosol-only experiments. Comparison of these single-forcing runs with the all-forcings historical experiment suggests aerosol emissions have dominated South Asian monsoon rainfall trends in recent decades. By examining the 25 available all-forcings historical experiments, we show that models including aerosol indirect effects dominate this negative trend. Indeed, models including only the direct radiative effect of aerosol show an increase in monsoon rainfall, suggesting the dominance of increasing greenhouse gas emissions and planetary warming on monsoon rainfall in those models. The mechanism may be due to the indirect and direct effects acting in unison to suppress the monsoon, or to stronger local aerosol loading in the group of models containing indirect effects. The disparity between the two groups of models needs to be urgently investigated in the event that the suggested future decline in Asian anthropogenic aerosol emissions inherent to the representative concentration pathways (RCPs) used for future climate projection turns out to be optimistic.

Belgian Marbrite Fauquez opalescent glass is an innovative material developed at the start of the 20th century. S.A. Verreries de Fauquez produced it from 1922 onwards. Low maintenance requirements and appropriate hygienic properties made these mass-coloured glass plates popular as finishes for bathrooms, kitchens, hospitals, store fronts, decorative façade cladding, and more. Since production of Marbrite stopped in the 1960s, replacing it in the framework of restoration procedures is almost impossible. Limited knowledge of its composition and production has resulted in the lack of a proper conservation strategy. In order to improve the existing knowledge, historical research was conducted. Archive records including patents, literature, building specifications, plans, journals, advertisements, and interviews with former directors of the glassworks were considered and evaluated. Samples of Marbrite glass lifted on site were subjected to laboratory investigation using optical and scanning electron microscopy. The results revealed crucial information about the original formula and components. This paper discusses the composition of Marbrite, a peculiar Belgian opalescent glass. This interdisciplinary research, a cooperation between the Vrije Universiteit Brussel and the Royal Institute for Cultural Heritage, aims to improve the knowledge of Marbrite glass in order to develop repair and renovation techniques.

The decadal variability of the North Pacific gyre oscillation (NPGO) over the 20th century is examined from a long-term integration of the Simple Ocean Data Assimilation (SODA) reanalysis. The NPGO is reflected in the second dominant pattern of sea surface height (SSH) variability in SODA, with a north-south dipole structure over the northeast Pacific. SSH anomalies in this region exhibit distinct decadal variability with a significant spectral peak at approximately 18 years. The upper-ocean heat budget reveals that this dipole structure associated with the NPGO is predominantly due to anomalous Ekman pumping and Ekman advection induced by the surface wind. The NPGO mode in the SODA reanalysis originates from atmospheric stochastic noise (the North Pacific Oscillation), which has a meridional dipole pattern but no preferred time scale. Oceanic planetary-wave integration of this atmospheric stochastic noise, particularly through the advective baroclinic mode, leads to a spatial resonance with a preferred decadal time scale. The limitations of the current study are also discussed.
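
A minimal sketch of extracting the second dominant SSH mode (the NPGO-like pattern in the text) by EOF analysis via SVD. The data matrix here is a synthetic stand-in; with real SODA fields one would also apply latitude weighting:

```python
# EOF analysis of an SSH anomaly field: mode 2 pattern and its time series.
import numpy as np

rng = np.random.default_rng(5)
ssh = rng.normal(size=(600, 500))                # months x grid points (toy)
ssh -= ssh.mean(axis=0)                          # anomalies about the time mean

# Rows of vt are spatial EOF patterns; u * s gives the principal components.
u, s, vt = np.linalg.svd(ssh, full_matrices=False)
eof2 = vt[1]                                     # 2nd spatial mode (NPGO-like)
pc2 = u[:, 1] * s[1]                             # its time series

explained = s**2 / np.sum(s**2)
print(f"variance explained by mode 2: {explained[1]:.3f}")
# Spectral analysis of pc2 would then test for the ~18-year peak noted above.
```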

A collection of 157 Triticum aestivum accessions, representative of wheat breeding in Italy during the 20(th) century, was assembled to describe the evolutionary trends of cultivated varieties throughout this period. The lines were cultivated in Italy, in two locations, over two growing seasons, and evaluated for several agronomical, morphological and qualitative traits. Analyses were conducted using the most common univariate approach on individual plant traits coupled with a correspondence multivariate approach. ANOVA showed a clear trend from old to new varieties, leading towards earliness, plant height reduction and denser spikes with smaller seeds. The average protein content gradually decreased over time; however, this trend did not affect bread-making quality, because it was counterbalanced by a gradual increase in SDS sedimentation volume, achieved through the incorporation of favourable alleles into recent cultivars. Correspondence analysis allowed an overall view of the breeding activity. A clear-cut separation was observed between ancient lines and all the others, matched with a two-step gradient: the first step, corresponding roughly to the period 1920-1940, can be ascribed mostly to genetics; the second, from the 1940s onward, can also be ascribed to farming-practice innovations, such as improved mechanical devices and optimised use of fertilizers. PMID:25712271

Given the severe impacts of extreme heat on natural and human systems, we attempt to quantify the likelihood that rising greenhouse gas concentrations will result in a new, permanent heat regime in which the coolest warm-season of the 21(st) century is hotter than the hottest warm-season of the late 20(th) century. Our analyses of global climate model experiments and observational data reveal that many areas of the globe are likely to permanently move into such a climate space over the next four decades, should greenhouse gas concentrations continue to increase. In contrast to the common perception that high-latitude areas face the most accelerated response to global warming, our results demonstrate that in fact tropical areas exhibit the most immediate and robust emergence of unprecedented heat, with many tropical areas exhibiting a 50% likelihood of permanently moving into a novel seasonal heat regime in the next two decades. We also find that global climate models are able to capture the observed intensification of seasonal hot conditions, increasing confidence in the projection of imminent, permanent emergence of unprecedented heat. PMID:22707810

The paper describes a new methodology for predicting the behavior of macroeconomic variables. The approach is based on System Dynamics and Fuzzy Inductive Reasoning. A four-layer pseudo-hierarchical model is proposed. The bottom layer makes predictions about population dynamics, age distributions among the populace, and demographics. The second layer makes predictions about the general state of the economy, including such variables as inflation and unemployment. The third layer makes predictions about the demand for certain goods or services, such as milk products, used cars, mobile telephones, or internet services. The fourth and top layer makes predictions about the supply of such goods and services and about their prices. Each layer can be influenced by control variables whose values are only determined at higher levels. In this sense, the model is not strictly hierarchical. For example, the demand for goods at level three depends on the prices of these goods, which are only determined at level four. Yet the prices are themselves influenced by the expected demand. The methodology is exemplified by means of a macroeconomic model that makes predictions about US food demand during the 20th century.
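
A toy sketch of the cross-layer coupling just described: demand (layer 3) depends on price (layer 4), while price depends on expected demand, so the two layers must be reconciled, here by fixed-point iteration. All coefficients are invented for illustration; the original work uses System Dynamics and Fuzzy Inductive Reasoning, not this closed-form toy:

```python
# Reconcile mutually dependent demand (layer 3) and price (layer 4).
population = 100.0          # layer 1 output (illustrative units)
income_per_capita = 30.0    # layer 2 output (illustrative units)

price = 2.0                 # initial guess for the layer-4 price
for _ in range(50):
    # Layer 3: demand falls with price, rises with population and income.
    demand = 0.5 * population * income_per_capita / price
    # Layer 4: price responds to expected demand (crude supply response).
    new_price = 0.8 * price + 0.2 * (demand / 600.0)
    if abs(new_price - price) < 1e-9:
        break
    price = new_price

print(f"equilibrium price={price:.3f}, demand={demand:.1f}")
```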

Louis Nico Marie (L. N. M.) Duijsens (Duysens) was one of the giants of the biophysics of photosynthesis. His PhD thesis "Transfer of Excitation Energy in Photosynthesis" (Duysens, 1952) is a classic; he introduced light-induced absorption difference spectroscopy to photosynthesis research and proved the existence of reaction centers, introducing advanced methods from physics to understand biological processes. Further, it is his 1959-1961 seminal work, with Jan Amesz, that provided evidence for the series scheme of the two light reactions in oxygenic photosynthesis. In a word, he was one of the master biophysicists of the 20th century, who provided direct measurements of many key intermediates and made us understand the intricacies of photosynthesis with a simplicity that no one else ever did. We present here our personal perspective on the scientist that Lou Duysens was. For an earlier perspective, see van Grondelle and van Gorkom (Photosynth Res 120: 3-7, 2014). PMID:27039907

During the last half of the 20th century within Western civilization, for the first time in human history, divorce replaced death as the most common endpoint of marriage. In this article I explore the history of this death-to-divorce transition, the forces associated with the transition, and what the transition may have revealed about the human capacity for monogamous, lifelong pair-bonding. The impact and consequences of the transition for the generations that came of age during it and immediately afterwards are examined, with particular attention to the emergence of new, alternative pair-bonding structures such as cohabitation and nonmarital co-parenting. The article highlights the inability of the dichotomous marriage-versus-being-single paradigm to encompass the new pair-bonding structures and the normalizing of divorce. Precepts for a new, more encompassing, veridical and humane pair-bonding paradigm are presented, and some of their implications for social policy, family law, social science, and couple and family therapy are elaborated. PMID:12140958

This special issue consists of papers that are associated with invited lectures, workshop papers and hot topic papers presented at the 20th European Sectional Conference on Atomic and Molecular Physics of Ionized Gases (ESCAMPIG XX). This conference was organized in Novi Sad (Serbia) from 13 to 17 July 2010 by the Institute of Physics of the University of Belgrade. It is important to note that this is not a conference 'proceedings'. Following the initial selection process by the International Scientific Committee, all papers were submitted to the journal by the authors and have been fully peer reviewed to the standard required for publication in Plasma Sources Science and Technology (PSST). The papers are based on presentations given at the conference but are intended to be specialized technical papers covering all or part of the topic presented by the author during the meeting. The ESCAMPIG conference is a regular biennial Europhysics Conference of the European Physical Society focusing on collisional and radiative aspects of atomic and molecular physics in partially ionized gases as well as on plasma-surface interaction. The conference focuses on low-temperature plasma sciences in general and includes the following topics: atomic and molecular processes in plasmas; transport phenomena, particle velocity distribution function; physical basis of plasma chemistry; plasma-surface interaction (boundary layers, sheath, surface processes); plasma diagnostics; plasma and discharges theory and simulation; self-organization in plasmas, dusty plasmas; upper atmospheric plasmas and space plasmas; low-pressure plasma sources; high-pressure plasma sources; plasmas and gas flows; laser-produced plasmas. During ESCAMPIG XX special sessions were dedicated to workshops on: atomic and molecular collision data for plasma modeling, organized by Professors Z Lj Petrovic and N Mason; and plasmas in medicine, organized by Dr N Puac and Professor G Fridman. The conference topics were represented in the

Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and the order and timing of divergence events. Our ABC analysis supports a spread of populations from North to South in the Americas, in at least two distinct phases. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations at the transition between the Pleistocene and the Holocene (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, we found these patterns to be highly coincident with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests. PMID:24098436
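
A toy illustration of ABC rejection sampling for a divergence time, in the spirit of the analysis above. The real study compares coalescent simulations of sequence data against observed summary statistics; the scalar "simulator" below stands in for that machinery and is purely illustrative:

```python
# ABC rejection sampling: keep prior draws whose simulated summary statistic
# falls within a tolerance of the observed value.
import numpy as np

rng = np.random.default_rng(6)

# Broad prior on the divergence time (years before present).
prior_draws = rng.uniform(5_000, 25_000, size=100_000)

# Pretend simulator: a summary statistic that grows with divergence time,
# plus noise (stands in for coalescent simulation of sequence data).
sims = prior_draws / 1000.0 + rng.normal(scale=1.0, size=prior_draws.size)

observed = 17.0                                   # observed summary (toy value)
accepted = prior_draws[np.abs(sims - observed) < 0.5]   # tolerance epsilon

# Accepted draws approximate the posterior of the divergence time.
print(f"posterior mean ~ {accepted.mean():.0f} YBP")
print("95% interval ~", np.percentile(accepted, [2.5, 97.5]).round(0))
```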

Purpose: To investigate the effects of eye diseases on several important artists who have received little attention from a medical-historical viewpoint. The examples chosen demonstrate problems artists have had to face from different types of eye disease, including cataract, glaucoma, and retinal diseases. The ophthalmological care provided is described in terms of scientific knowledge at the time. Methods: Investigation of primary and secondary source material; discussion with art historians and ophthalmic historians; examination of work by the artists. Results: Artists can be markedly affected by ocular diseases that change their ability to see the world. The individuals described here worked during the 19th century and first half of the 20th century. Homer Martin suffered from cataracts, and his works reveal changes in details and color as he aged. Henri Harpignies, who had an extremely long career, undoubtedly had cataracts and may also have had macular degeneration. Angle-closure glaucoma blinded Jules Chéret. Auguste Ravier suffered from neovascular glaucoma in one eye and was able to work with his remaining eye, which developed a cataract. Louis Valtat suffered from what was in all likelihood open-angle glaucoma, but specific changes due to this disease are not apparent in his work. Roger Bissière developed glaucoma and did well following filtration surgery. George Du Maurier lost one eye from what was probably a retinal detachment and later suffered from a central retinal problem in the other eye. Conclusions: Diseases of the eye may profoundly influence artists by altering their perception of the world. The specific effects may vary, depending on the disease, its severity, and the psychology of the artist. Cataracts typically affect an artist's ability to depict color and detail. The effect of glaucoma generally depends on whether central vision is preserved. Disease that affects the center of the retina has a substantial effect on an artist's ability

Handheld computers give students access to the larger world of information beyond the classroom. Wireless access allows students real-time access to all network resources, including downloading e-books and documents, searching the library media center's catalog, and searching databases to which the library has subscribed.

Describes development and evaluation of an error analysis procedure for a computer-assisted language learning program using natural language processing techniques. The program can be used for learning passive voice in Japanese on any World Wide Web browser. The program enables learners to type sentences freely, detects errors, and displays…

We examine the maximum (Tmax) and minimum (Tmin) temperature changes in the San Juan Mountain (SJM) region of southwestern Colorado between 1895 and 2005. We analyze monthly averaged observations from 6 National Weather Service (NWS) stations for 1895-1949, and 25 NWS stations for 1950-2005. These changes are evaluated on annual, seasonal and monthly bases. Annually, our results suggest a long-term gradual warming trend in Tmin and no discernible trend in Tmax. However, between 1990 and 2005, the region experienced a rapid warming trend, with both Tmax and Tmin increasing by 1°C. Between 1950 and 1985, there was a regional cooling trend during which there were significant decreases in Tmax and almost no trend in Tmin. Similar to the annual trends, only Tmin shows a gradual warming trend during the 20th century in all seasons. Furthermore, during fall and summer, there is a lower correlation between Tmax and Tmin as compared to winter and spring. Between 1990 and 2005, Tmax increased more than Tmin during summer and spring, whereas Tmin showed greater increases during winter. We also examine Tmax and Tmin trends from 23 Snow Telemetry (SNOTEL) sites in the region for 1984-2005. We find strong correlation between NWS and SNOTEL observations, both annually and seasonally. Between 1990 and 2005, the largest warming at the SNOTEL sites occurred during summer, while it was largest during winter at the NWS sites. Spatially, there are similar increases in Tmax and Tmin except in the central mountain region, where increases in Tmin started earlier and are greater. Physical mechanisms for these observed trends in Tmax and Tmin will be discussed.

The 20th century has seen tremendous population growth and industrialization on a global scale. One particular 'hot spot' of these developments is Europe. These changes were accompanied, among others, by a substantial increase in aerosol emissions. To learn more about the associated consequences for the climate system, we have carried out a comparatively large set of transient sensitivity studies with the global atmosphere-only climate model ECHAM5-HAM, using aerosol emission data from NIES (National Institute of Environmental Studies, Japan) and prescribed, observation-based sea surface temperatures (SSTs) from the Hadley Center. The sensitivity studies cover the period from 1870 to 2005 and comprise ensembles of simulations (up to 13 members per ensemble), which allow us to address the role of different aerosol species, greenhouse gases, and prescribed sea surface temperatures. Analyzing these simulation data for Europe, we find a clear decrease in surface solar radiation (SSR) from about 1950 to 1980, followed by a renewed increase. This dimming/brightening is well known from observational data. The modeled and observed magnitudes of the phenomenon are in good agreement, although dimming in the model ceases too early. One possible explanation for the latter could lie with the prescribed aerosol emissions, in particular too-weak SO2 emissions or a too-early reduction of black carbon emissions. Modeled SSR changes show substantial regional differences in magnitude and timing, again in line with observations. The model data further suggest a substantial random/natural variability/cloud component with regard to SSR changes under all-sky conditions. While some ensemble members show a much more pronounced dimming than the ensemble average, others show hardly any dimming. Interestingly, the brightening signal after 1990 is found to be more robust in this respect. Surface temperatures bear some imprint of the SSR changes, especially in Eastern Europe, but the dominant

Early statistical methods focused on pre-data probability statements (i.e., data as random variables) such as P values; these are not really inferences, nor are P values evidential. Statistical science clung to these principles throughout much of the 20th century as a wide variety of methods were developed for special cases. Looking back, it is clear that the underlying paradigm (i.e., testing and P values) was weak. As Kuhn (1970) suggests, new paradigms have taken the place of earlier ones: this is a goal of good science. New methods have been developed and older methods extended, and these allow proper measures of strength of evidence and multimodel inference. It is time to move forward with sound theory and practice for the difficult practical problems that lie ahead. Given data, the useful foundation shifts to post-data probability statements such as model probabilities (Akaike weights) or related quantities such as odds ratios and likelihood intervals. These new methods allow formal inference from multiple models in the a priori set. These quantities are properly evidential. The past century was aimed at finding the "best" model and making inferences from it. The goal in the 21st century is to base inference on all the models, weighted by their model probabilities (model averaging). Estimates of precision can include model selection uncertainty, leading to variances conditional on the model set. The 21st century will be about the quantification of information, proper measures of evidence, and multimodel inference. Nelder (1999:261) concludes, "The most important task before us in developing statistical science is to demolish the P-value culture, which has taken root to a frightening extent in many areas of both pure and applied science and technology". PMID:24804444
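
The model probabilities (Akaike weights) mentioned above have a simple closed form: w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2), where Δ_i = AIC_i - min AIC over the candidate set. A minimal sketch with made-up AIC values:

```python
# Akaike weights from AIC values of a candidate model set.
import numpy as np

aic = np.array([102.3, 100.1, 104.8, 100.9])   # AIC for each candidate model
delta = aic - aic.min()
weights = np.exp(-delta / 2)
weights /= weights.sum()

print(weights.round(3))          # relative evidence for each model in the set
# Model-averaged inference weights each model's estimate by these values; the
# evidence ratio between two models is simply w_i / w_j.
```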

Major changes made in the configuration of San Leandro Bay, Alameda County, California, during the 20th century have caused rapid sedimentation within parts of the Bay. Opening of the Oakland tidal channel and removal of 97% of the marshlands formerly surrounding the Bay have decreased tidal velocities and volumes. Marshland removal has decreased the tidal prism by about 25%. Comparison of bathymetric surveys indicates that sedimentation in the vicinity of the San Leandro Bay channel averaged 0.7 cm/annum between 1856 and 1984. Lead-210 data collected at four shallow-water sites east of the San Leandro Bay channel indicated that sedimentation rates have averaged between 0.06 and 0.28 cm/annum. Because bioturbation of bottom sediments cannot be discounted, better definition of this range in sedimentation rates would require measuring the activity of lead-210 on incoming sediments. In addition to sediment deposited in the vicinity of the San Leandro Bay channel and open, shallow areas to the east, 850,740 cu m of sediment was deposited between 1948 and 1983 in an area dredged at the mouth of San Leandro Creek. All available data indicate that between 1,213,000 and 1,364,000 cu m of sediment was deposited in San Leandro Bay between 1948 and 1983. Sediment yield data from an adjacent drainage basin, when combined with inventories of lead-210 and cesium-137, indicate that most of the sediment deposited in San Leandro Bay is coming from resuspension of bottom sediments or from erosion of marshes or shorelines of San Leandro or San Francisco Bay. (Author's abstract)

In this study, we assess how the anthropogenically induced increase in greenhouse gas concentrations affects the climate of central and southern South America. We utilise two regional climate simulations for present day (PD) and pre-industrial (PI) times. These simulations are compared to historical reconstructions in order to investigate the driving processes responsible for climatic changes between the different periods. The regional climate model is validated against observations for both re-analysis data and GCM-driven regional simulations for the second half of the 20th century. Model biases are also taken into account for the interpretation of the model results. The added value of the regional simulation over global-scale modelling relates to a better representation of hydrological processes that are particularly evident in the proximity of the Andes Mountains. Climatic differences between the simulated PD minus PI period agree qualitatively well with proxy-based temperature reconstructions, albeit the regional model overestimates the amplitude of the temperature increase. For precipitation the most important changes between the PD and PI simulation relate to a dipole pattern along the Andes Mountains with increased precipitation over the southern parts and reduced precipitation over the central parts. Here only a few regions show robust similarity with studies based on empirical evidence. However, from a dynamical point-of-view, atmospheric circulation changes related to an increase in high-latitude zonal wind speed simulated by the regional climate model are consistent with numerical modelling studies addressing changes in greenhouse gas concentrations. Our results indicate that besides the direct effect of greenhouse gas changes, large-scale changes in atmospheric circulation and sea surface temperatures also exert an influence on temperature and precipitation changes in southern South America. These combined changes in turn affect the relationship between

Eight 1-m sediment cores were extracted from across the basin of Friary Lough, a 5.4-ha eutrophic lake in a wholly grassland agricultural catchment in Co. Tyrone, Northern Ireland. Sedimentary TP, diatom-inferred TP, Ca, Na, Fe, Mn, loss-on-ignition (LOI), dry weight and density were determined in the core profiles. Core dating and correlation gave a 210Pb, 137Cs and 241Am chronology from 1906 to 1995 and enabled a whole-basin estimate of chemical and sediment accumulation rates over the 20th century. The major changes for all parameters occurred after c. 1946. Sediment accumulation rate was most influenced by organic matter accumulation, probably of planktonic origin, and increased after c. 1946. Inorganic sediment accumulation rate was found to be largely unchanging through the century at 10 t km(-2) yr(-1) when expressed as catchment exports. All chemical accumulation rate changes occurred after c. 1946. Total phosphorus accumulation rate, however, was found to be the only chemical parameter increasing throughout the epilimnion and hypolimnion areas of the sedimentary basin, at an average of 22.5 mg m(-2) yr(-1) between 1946 and 1995. The other chemical parameters showed increasing accumulation rates after c. 1946 in the epilimnion part of the basin only. Interpreted in terms of whole-basin sedimentation and catchment export processes over time, it is suggested that diffuse TP inputs are independent of sediment inputs. This corresponds to hydrochemical models that suggest soluble P as the primary fraction lost from grassland catchments. The increases in sedimentary TP accumulation rate and DI-TP concentration are also explained with regard to current models that suggest increases in runoff P concentrations from elevated soil P concentrations. Increases in epilimnion chemical and sediment accumulation rates after c. 1946 may be due to local erosion that has limited impact on lake basin sedimentation. PMID:12389788

Three early 20th-century attempts at unifying separate areas of biology, in particular development, genetics, physiology, and evolution, are compared in regard to their success and fruitfulness for further research: Jacques Loeb's reductionist project of unifying approaches by physico-chemical explanations; Richard Goldschmidt's anti-reductionist attempts to unify by integration; and Sewall Wright's combination of reductionist research and vision of hierarchical genetic systems. Loeb's program, demanding that all aspects of biology, including evolution, be studied by the methods of the experimental sciences, proved highly successful and indispensable for higher-level investigations, even though evolutionary change and properties of biological systems still cannot be fully explained on the molecular level alone. Goldschmidt has been appraised as a pioneer of physiological and developmental genetics and of a new evolutionary synthesis that transcended neo-Darwinism. However, this study concludes that his anti-reductionist attempts to integrate genetics, development and evolution have to be regarded as failures or dead ends. His grand speculations were based, on the one hand, on concepts and experimental systems that were too vague to stimulate further research, and on the other, on experiments whose core parts turned out not to be reproducible. In contrast, Sewall Wright, apart from being one of the architects of the neo-Darwinian synthesis of the 1930s, opened up new paths of testable quantitative developmental genetic investigations. He placed his research within a framework of logical reasoning, which resulted in the farsighted speculation that examinations of biological systems should be related to the regulation of hierarchical genetic subsystems, possibly providing a mechanism for development and evolution. I argue that his suggestion of basing the study of systems on clearly defined properties of the components has proved superior to

Regional climate change is the combined response to global and regional radiative forcing, circulation and interactions between the atmosphere and Earth surface. A potentially key component of regional climate change derives from natural and human-caused land use and land-use change (LULC) and related atmosphere-surface feedbacks. In the first step of disentangling and quantifying the effect of LULC over the southeastern United States, we conducted a series of simulations with the RegCM4 regional climate model which includes the Biosphere Atmosphere Transfer Scheme (BATS) surface physics package. Land use and vegetation types determine the specified values of a number of biophysical and physical parameters in BATS such as albedo, roughness length, stomatal resistance, and rooting depth. We ran four simulations in which we specified BATS land use types derived from harmonized reconstructions for 1650, 1850, 1920 and Landsat-based observations for 1992 (Steyaert and Knox, 2008). The simulations were run over a model domain comprising a 20 km × 20 km horizontal grid and 23 atmospheric levels for the period 1990-2000. The boundary conditions for the simulations were derived from the NCAR-NCEP Reanalysis and the NOAA Optimum Interpolated Sea Surface Temperature global data sets. Depending on the time periods considered, the simulations indicate regionally coherent patterns of April-September changes in surface radiation, warming and cooling of up to 2 °C or more, substantial changes in soil moisture and atmospheric humidity associated with conversion of native vegetation to agriculture and agriculture back to forest, draining of wetlands and marshes, and urbanization. Extensive draining of wetlands in the lower Mississippi Valley during the 20th century induced a robust feedback with the atmosphere which suppressed convective summertime precipitation. Our joint analysis of LULC classes and the model simulations suggests a need to be able to disaggregate land use to a

Freshwater lakes play a key role in the global carbon cycle as sinks (organic carbon sequestration) and sources (greenhouse gas emissions). Understanding the carbon cycle response to environmental changes is becoming a crucial challenge in the context of global warming and the preponderance of human pressures. We reconstructed the long-term (1500 years) evolution of the trophic functioning of the benthic food web, based on methanotrophic ancient DNA and chironomid isotope analyses. In addition, human land use was reconstructed for three different lakes (Jura Mountains, eastern France). Our findings confirm that the benthic food web can be highly dependent on methane-derived carbon (up to 50% of the chironomid biomass) and reveal that the activation of this process can correspond to natural functioning or be a consequence of anthropic perturbation. The studied lakes also showed a similar temporal evolution over the last century, with the disappearance of the profundal aquatic insects (Chironomidae, Diptera), considered keystone for the whole lake food web (e.g., benthic-pelagic coupling), inducing a potential collapse in the transfer of methane to top consumers. This functional state, also called dead zone expansion, was caused by the change in human land use occurring at the beginning of the 20th century. The strong modification of agro-pastoral practices (e.g., fertilization practices, intensive grazing, and sewage effluent) modified the influx of nutrients (by diffuse and/or point-source inputs) and induced a significant increase in trophic status and organic matter sedimentation, reaching unprecedented values. Further studies should be planned to assess dead zone expansion and, in line with regime shift theory, to provide environmental tipping points for sustainable resource management.

Long-term changes in nitrate concentration and flux between the middle of the 20th century and the first decade of the 21st century were estimated for the Des Moines River and the Middle Illinois River, two Midwestern Corn Belt streams, using a novel weighted regression approach that is able to detect subtle changes in solute transport behavior over time. The results show that the largest changes in flow-normalized concentration and flux occurred between 1960 and 1980 in both streams, with smaller or negligible changes between 1980 and 2004. Contrasting patterns were observed between (1) nitrate export linked to non-point sources, specifically runoff of synthetic fertilizer or other surface sources, and (2) nitrate export presumably associated with point sources such as urban wastewater or confined livestock feeding facilities, with each of these modes of transport important under different domains of streamflow. Surface runoff was estimated to be consistently most important under high-flow conditions during the spring in both rivers. Nitrate export may also have been considerable in the Des Moines River even under some conditions during the winter when flows are generally lower, suggesting the influence of point sources during this time. Similar results were shown for the Middle Illinois River, which is subject to significant influence of wastewater from the Chicago area, where elevated nitrate concentrations were associated with the lowest flows during the winter and fall. By modeling concentration directly, this study highlights the complex relationship between concentration and streamflow that has evolved in these two basins over the last 50 years. This approach provides insights about changing conditions that only become observable when stationarity in the relationship between concentration and streamflow is not assumed.
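
The abstract does not spell out the regression form, but weighted-regression approaches of this kind (e.g., WRTDS) typically fit the logarithm of concentration on time, discharge, and season, down-weighting observations far from the estimation point; a minimal sketch, assuming tricube weights and arbitrary window half-widths:

    import numpy as np

    def tricube(d, h):
        """Tricube weight: 1 at distance 0, falling to 0 at distance h."""
        u = np.clip(np.abs(d) / h, 0.0, 1.0)
        return (1.0 - u**3) ** 3

    def weighted_conc_estimate(t, q, c, t0, q0, h_t=7.0, h_q=2.0):
        """Estimate ln-concentration at time t0 (decimal years) and discharge q0.

        Model: ln C = b0 + b1*t + b2*ln Q + b3*sin(2*pi*t) + b4*cos(2*pi*t),
        fit by weighted least squares, with observations down-weighted by their
        distance from (t0, ln q0); the half-widths h_t, h_q are arbitrary here.
        """
        lnq = np.log(q)
        w = tricube(t - t0, h_t) * tricube(lnq - np.log(q0), h_q)
        X = np.column_stack([np.ones_like(t), t, lnq,
                             np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], np.log(c) * sw, rcond=None)
        x0 = np.array([1.0, t0, np.log(q0),
                       np.sin(2 * np.pi * t0), np.cos(2 * np.pi * t0)])
        return float(x0 @ beta)

Flow-normalized concentration for a given date can then be obtained by averaging such estimates over the historically observed discharges for that date, which is what removes the assumption of a stationary concentration-streamflow relationship.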

Anthropogenic lead (Pb) from industrial activities has greatly altered the distribution of Pb in the present-day oceans, but no continuous temporal Pb evolution record is available for the Indian Ocean despite rapidly emerging industries around the region. Here, we present the coral-inferred annual history of Pb concentration and isotope ratios in the surface Indian Ocean since the mid-20th century (1945-2010). We analyzed Pb in corals from the Chagos Archipelago, western Sumatra, and the Strait of Singapore, which represent the open central Indian Ocean as well as nearshore sites. Overall, coral Pb/Ca increased from the mid-1970s at all the sites. However, coral Pb isotope ratios evolved distinctly at each site, suggesting that the Pb contamination arises from different sources in each case. The major source of Pb in the Chagos coral appears to be India's Pb emission from leaded gasoline combustion and coal burning, whereas Pb in western Sumatra seems to be largely affected by Indonesia's gasoline Pb emission with additional Pb inputs from other sources. Pb in the Strait of Singapore has complex sources and its isotopic composition does not reflect Pb from leaded gasoline combustion. The higher 206Pb/207Pb and 208Pb/207Pb ratios found at this site may reflect the contribution of Pb from coals and ores from southern China, Indonesia, and Australia, and local Pb sources in the Strait of Singapore. It is also possible that the Pb isotope ratios of Singapore seawater were elevated through isotope exchange with natural fluvial particles, considering its delta setting.

This article addresses the emergence of the "motoneuron concept," i.e., the idea that this cell had properties of particular advantage for its control of muscle activation. The motor function of the ventral roots was established early in the 19th century, and the term "motor cell" (or "motor nerve cell") was introduced shortly thereafter by Albrecht von Kölliker and some other histologists. They knew that motor cells were among the neurons with the largest soma in vertebrates, and for this reason they were, and remained for many decades, the best and most studied neuronal model. The work of clinicians like Guillaume Duchenne de Boulogne and Jean-Martin Charcot on motor degenerative syndromes began before a clear description of motor cells was available, because it was initially more difficult to establish whether the deficits of paralysis and muscle weakness were due to neuronal or muscular lesions. Next, the pioneering physiologist Charles Sherrington, who was influenced greatly by the anatomical contributions and speculations of Santiago Ramón y Cajal, used the term "motor neuron," rather than motor cell, for the neuron that he considered functionally "the final common path" for providing command signals to the musculature. In the early 20th century he proposed that activation of a motor neuron resulted from the sum of its various excitatory and inhibitory CNS inputs. The contraction of "motor neuron" to "motoneuron(e)" was put into common usage by John Fulton (possibly among others) in 1926. The motoneuron concept is still evolving, with new discoveries on the horizon. PMID:21723536

Traditionally, Patagonia has been seen as a very pristine area, an important reserve of wildlife and clean waters. Nonetheless, it was dramatically affected by the first settlers at the beginning of the 20th century, who set large fires to clear land originally covered by native forest. Those fires had a dramatic impact, leaving behind thousands of dead trees, increasing soil erosion, and altering nutrient inputs to the aquatic ecosystems, which in turn affected the aquatic biota. However, those impacts have been barely assessed; therefore, through the study of the sediment record of lake La Esponja (45°S), we aim to evaluate the magnitude of the changes produced by the fires and to determine the resilience capacity of the lake. We analyzed magnetic susceptibility, organic content, charcoal, total phosphorus and a biological proxy (Chironomidae) in a 60-cm-long sediment sequence. Magnetic susceptibility shows a very variable behavior along the profile, with a decreasing trend from the bottom to the most recent part of the record. The organic content behaves oppositely, showing a noticeable increase toward the surficial sediments. The number of charcoal particles, a direct indicator of fire occurrence, shows a peak at approximately seven cm depth, diminishing toward the recent part. Total phosphorus follows the trend recognized in the charcoal, from which a probable increase in the trophic state of the lake can be inferred. This trend is partially recognized in the chironomid assemblages through the increase of some taxa typical of a higher nutrient status. Acknowledgements: Fondecyt projects 1120765 and 1120807.

Within this paper the author examines the current nostalgia for a never-present past through critical analysis of images of the mid 20th century American classroom in media culture. The author uses theories of nostalgia and the history of the photographic image to trouble the numerous equity issues surrounding the unchallenged canonization of the…

This document includes proceedings, abstracts, and papers of the 20th Annual Conference on Undergraduate Teaching of Psychology, which was held on March 29-31, 2006 in Monticello, NY. The conference, which dealt with ideas and innovations in the teaching of psychology, was sponsored by the Psychology Department of Farmingdale State University.…

This annotated bibliography is dedicated to rural women workers and their roles during the 20th century (1875-1971). It is concerned with materials which consider both the questions of rural manpower and rural womanpower. There are a variety of source materials (books, articles, research papers, etc.) and some 338 entries. Divided into six…

Educating About Social Issues in the 20th and 21st Centuries: A Critical Annotated Bibliography comprises critical essays accompanied by annotated bibliographies on a host of programs, models, strategies and concerns vis-a-vis teaching and learning about social issues facing society. The primary goal of the book is to provide undergraduate…

The paper analyses and compares developments in history teaching in Germany, England, and the Netherlands in the 19th and 20th centuries. The development of history teaching in the three countries shows striking similarities. National politics have always used history education for purposes which did not necessarily tally with distanced critical…

The 17 papers in this collection all deal with 20th-century journalism, journalists, and mass media. The papers and their authors are: "Building One's Own Gallows: The Trade Publications' Reaction to a Federal Shield Law, 1972-1974" (Karla Gower); "The Useful Ogre: Sweden's Use and Views of American Television, 1956-62" (Ulf Jonas Bjork); "Black…

By the end of the 20th century, a modern-day clarion call echoed throughout the Christian collegiate community for religious institutions to divert the trend of secularization and to recover their Christian identity. This study, rather than addressing recovery, explores the distinctive, enduring elements of Wheaton College, a midwestern Christian…

The papers of this proceedings are presented in three parts. The four papers in part 1, "Issues in 20th Century American Education", discuss the women graduates of Oberlin College, 1836-1860, the Americanization of art museums, the business community's response to government's support of the labor movement, and the response of liberalism to the…

During the first half of the 20th century, Catholic educators in the United States used theological arguments both to resist and embrace the progressive educational reform effort of educational measurement. The significant expansion of Catholic schooling and the increased number of students attending them, along with increased state oversight, led…

This study reports some preliminary analyses of multichannel lidar measurements taken at Rome Tor Vergata (Italy) during the partial solar eclipse of 20 March 2015. The objective is to assess the capability of the instrument to document the effect of the eclipse in the lower troposphere, with particular emphasis on the information content at relatively small temporal and spatial scales.

In 2002, Munk defined an important enigma of 20th century global mean sea-level (GMSL) rise that has yet to be resolved. First, he listed three canonical observations related to Earth’s rotation [(i) the slowing of Earth’s rotation rate over the last three millennia inferred from ancient eclipse observations, and changes in the (ii) amplitude and (iii) orientation of Earth’s rotation vector over the last century estimated from geodetic and astronomic measurements] and argued that they could all be fit by a model of ongoing glacial isostatic adjustment (GIA) associated with the last ice age. Second, he demonstrated that prevailing estimates of the 20th century GMSL rise (~1.5 to 2.0 mm/year), after correction for the maximum signal from ocean thermal expansion, implied mass flux from ice sheets and glaciers at a level that would grossly misfit the residual GIA-corrected observations of Earth’s rotation. We demonstrate that the combination of lower estimates of the 20th century GMSL rise (up to 1990), improved modeling of the GIA process, and a correction of the eclipse record for a signal due to angular momentum exchange between the fluid outer core and the mantle reconciles all three Earth rotation observations. This resolution adds confidence to recent estimates of individual contributions to 20th century sea-level change and to projections of GMSL rise to the end of the 21st century based on them. PMID:26824058

Historical records from London hospitals in the late 19th-early 20th centuries were analyzed for their depiction of nursing trainees. Analysis reveals a strong emphasis on character traits rather than intellectual ability. In contrast, the literature of the last 3 decades shows a contemporary concern for nurses as knowledgeable doers. (Contains 31…

Welcome to this special issue marking our 20th birthday. Physics World first appeared in October 1988 as the successor to - for those who remember it - Physics Bulletin. Since then, Physics World has grown in importance and stature, and, without wishing to blow our own trumpet too much, it is now one of the best specialist science magazines around. Its success is largely due to the talented teams of staff who have worked for the magazine over the last two decades - in particular founding editor Philip Campbell and his successor Peter Rodgers - and to the editorial independence given to us by our publishers, the Institute of Physics.

The World-Wide LHC Computing Grid encompasses a set of heterogeneous information systems, from central portals such as the Open Science Grid's Information Management System and the Grid Operations Centre Database, to the WLCG information system, where the information sources are the Grid services themselves. Providing a consistent view of the information, which involves synchronising all these information systems, is a challenging activity that has led the LHC virtual organisations to create their own configuration databases. This experience, whereby each virtual organisation's configuration database interfaces with multiple information systems, has resulted in the duplication of effort, especially relating to the use of manual checks for the handling of inconsistencies. The Global Service Registry aims to address this issue by providing a centralised service that aggregates information from multiple information systems. It shows information on both registered resources (i.e. what should be there) and available resources (i.e. what is there). The main purpose is to simplify the synchronisation of the virtual organisations' own configuration databases, which are used for job submission and data management, through the provision of a single interface for obtaining all the information. By centralising the information, automated consistency and validation checks can be performed to improve the overall quality of information provided. Although internally the GLUE 2.0 information model is used for the purpose of integration, the Global Service Registry is not dependent on any particular information model for ingestion or dissemination. The intention is to allow the virtual organisations' configuration databases to be decoupled from the underlying information systems in a transparent way and hence simplify any possible future migration due to the evolution of those systems. This paper presents the Global Service Registry architecture, its advantages compared to the
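
The registered-versus-available distinction maps naturally onto a set comparison; a minimal sketch of the kind of automated consistency check described, with hypothetical service identifiers:

    # Hypothetical service identifiers aggregated from several information systems.
    registered = {"srv-ce-01", "srv-se-02", "srv-frontier-03"}      # what should be there
    available  = {"srv-ce-01", "srv-frontier-03", "srv-xrootd-09"}  # what is there

    missing    = registered - available   # registered but not publishing
    unexpected = available - registered   # publishing but never registered

    for name in sorted(missing):
        print(f"WARNING: registered service {name} missing from information systems")
    for name in sorted(unexpected):
        print(f"WARNING: unregistered service {name} is publishing information")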

Smart man-machine interfaces are turning out to be a key technology for service robots, for automation applications in industrial environments, and in future scenarios for applications in space. In either field, the use of virtual reality (VR) techniques has shown great potential. At the IRF, a virtual reality system was developed and implemented which allows the intuitive control of a multi-robot system and different automation systems under one unified VR framework. As the developed multi-robot system is also employed for space applications, the intuitive commanding of inspection and teleoperation sequences is of great interest. In order to facilitate teleoperation and inspection, we make use of several metaphors and a vision system as an 'intelligent sensor'. One major metaphor presented in the paper is the 'TV-view into reality', where a TV-set is displayed in the virtual world with images of the real world mapped onto its screen as textures. The user can move the TV-set in the virtual world and, as the image-generating camera is carried by a robot, the camera viewpoint changes accordingly. Thus the user can explore the physical world 'behind' the virtual world, which is ideal for inspection and teleoperation tasks. By means of real-world images and the different measurement services provided by the underlying 3D vision system, the user can interactively build up or refine the virtual world according to the physical world he is watching through the TV-set.
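
The 'TV-view into reality' metaphor reduces to keeping the robot-carried camera slaved to the virtual TV-set's pose while its images are pasted back onto the virtual screen; a minimal sketch with hypothetical stand-in interfaces (not the IRF system's actual APIs):

    class Robot:
        """Stand-in for the robot-mounted camera (hypothetical API)."""
        pose = None
        def move_camera_to(self, pose):
            self.pose = pose                       # re-target the real camera
        def grab_image(self):
            return f"frame@{self.pose}"            # placeholder for a camera frame

    class TVSet:
        """Stand-in for the virtual TV-set scene node (hypothetical API)."""
        pose = (1.0, 0.5, 2.0, 0.0, 30.0, 0.0)     # x, y, z, roll, pitch, yaw
        screen = None
        def set_screen_texture(self, img):
            self.screen = img                      # paste image onto virtual screen

    def update_tv_view(robot, tv):
        robot.move_camera_to(tv.pose)              # virtual pose drives the camera
        tv.set_screen_texture(robot.grab_image())  # real image textures the screen

    robot, tv = Robot(), TVSet()
    update_tv_view(robot, tv)  # called whenever the user moves the TV-set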

erection. The concepts from animal experimentation in Europe in the 19th century contributed significantly to the current understanding of penile erection. van Driel MF. Physiology of penile erection—a brief history of the scientific understanding up till the eighties of the 20th century. Sex Med 2015;3:343–351. PMID:26797073

The effect of 20th-century drought on groundwater and lakes in a climatically sensitive area of the Northern Great Plains (Grant County, western Minnesota) was investigated by analysis of lake sediments and historical air photos, in conjunction with groundwater flow simulations. Drought caused an observed 4.0 to 5.1 m decline in lake levels between 1923 and 1938; all but three lakes either dried completely or declined to <1.7 m depth. From the deepest (Elk Lake), a 210Pb-dated sediment core was analyzed for ostracode species distribution and the geochemistry of C. rawsoni shells (δ18O, δ13C, Mg/Ca, Sr/Ca). Mg/Ca and δ13C increased during drought at the same time that salinity-tolerant ostracodes thrived, suggesting increasing salinity. However, δ18O decreased during drought, anti-correlative with Mg/Ca. A numerical model for the transient response of groundwater flow to drought, modeled after that of 1923-38, was constrained by drought-end recessional strand lines observed on air photos of one lake, with strand line elevations inferred from bathymetry. Simulated lake-water-level declines in the first 15 drought years were consistent with strand line elevations. The rate of decline was exponential at drought onset, slowing as water in lakes and wetlands receded to below land surface and lake evaporation declined. A near-steady state was attained between 40 and 50 simulation years after onset of drought, with the water table 1-5 m below the level of dry lakebeds and only the two deepest lakes still holding water. They were, however, greatly constricted in area and depth. Simulated evaporation fluxes were reduced by over 50% within 15 years, despite increased evapotranspiration rates, due to lake area constriction and water table declines. Both relate to land surface morphology beneath and around lakes. Model results suggest that lake-bed elevations exert control on the rate and ultimate level to which groundwater is depressed by drought. The deepest lakes in any region will tend to become
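
The simulated onset behavior can be idealized as a first-order relaxation toward the drought steady state; a sketch, where the response time \tau is a free parameter rather than a value fitted in the study:

    h(t) = h_{ss} + (h_0 - h_{ss}) e^{-t/\tau}

Decline is fastest at onset and slows as h approaches h_{ss}, consistent with the near-steady state reached after 40-50 simulation years.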

Comparison of single-forcing varieties of 20th century historical experiments in a subset of models from the Fifth Coupled Model Intercomparison Project (CMIP5) reveals that South Asian summer monsoon rainfall increases towards the present day in Greenhouse Gas (GHG)-only experiments with respect to pre-industrial levels, while it decreases in anthropogenic aerosol-only experiments. Comparison of these single-forcing experiments with the all-forcings historical experiment suggests aerosol emissions have dominated South Asian monsoon rainfall trends in recent decades, especially during the 1950s to 1970s. The variations in South Asian monsoon rainfall in these experiments follow approximately the time evolution of the inter-hemispheric temperature gradient over the same period, suggesting a contribution from the large-scale background state relating to the asymmetric distribution of aerosol emissions about the equator. By examining the 24 available all-forcings historical experiments, we show that models including aerosol indirect effects dominate the negative rainfall trend. Indeed, models including only the direct radiative effect of aerosol show an increase in monsoon rainfall, consistent with the dominance of increasing greenhouse gas emissions and planetary warming on monsoon rainfall in those models. For South Asia, reduced rainfall in the models with indirect effects is related to decreased evaporation at the land surface rather than from anomalies in horizontal moisture flux, suggesting the impact of indirect effects on local aerosol emissions. This is confirmed by examination of aerosol loading and cloud droplet number trends over the South Asia region. Thus, while remote aerosols and their asymmetric distribution about the equator play a role in setting the inter-hemispheric temperature distribution on which the South Asian monsoon, as one of the global monsoons, operates, the addition of indirect aerosol effects acting on very local aerosol emissions also

Water Year 2006 (October 1, 2005, to September 30, 2006) was a year of extreme hydrologic drought and the driest year in the recent 2002-2006 drought in Oklahoma. The severity of this recent drought can be evaluated by comparing it with four previous major hydrologic droughts: water years 1929-41, 1952-56, 1961-72, and 1976-81. The U.S. Geological Survey, in cooperation with the Oklahoma Water Resources Board, completed an investigation to summarize the Water Year 2006 hydrologic drought and compare it to the four previous major hydrologic droughts in the 20th century. The period of water years 1925-2006 was selected as the period of record because before 1925 few continuous-record streamflow-gaging sites existed, and there were gaps when no streamflow-gaging sites were operated. Statewide annual precipitation in Water Year 2006 was the second driest and statewide annual runoff in Water Year 2006 was the sixth driest in the 82 years of record. Annual area-averaged precipitation totals by the nine National Weather Service Climate Divisions for Water Year 2006 are compared to those during the four previous major hydrologic droughts to show how rainfall deficits in Oklahoma varied by region. Only two of the nine climate divisions, Climate Division 1 Panhandle and Climate Division 4 West Central, had minor rainfall deficits, while the rest of the climate divisions had severe rainfall deficits in Water Year 2006, with only 65 to 73 percent of normal annual precipitation. Regional streamflow patterns for Water Year 2006 indicate that Oklahoma was part of the regionwide below-normal streamflow conditions for the Arkansas-White-Red River Basin, the sixth driest since 1930. The percentage of long-term stations in Oklahoma (with at least 30 years of record) having below-normal streamflow reached 80 to 85 percent for some days in August and November 2006. Twelve long-term streamflow-gaging sites with periods of record ranging from 62 to 78 years were selected to show how streamflow

The region surrounding Palmerton, PA has been affected by airfall of metals from the NJ Zinc Co. smelter along the Lehigh River just north of the Kittatinny Ridge. The deposition of zinc, cadmium, lead, and arsenic led to the destruction of a forest ecosystem in the immediate vicinity and metals contamination in the town and surrounding area. Although the smelter was closed in the 1980s, concerns linger over whether the soil remains contaminated with elevated levels of metals. This study was directed at determining the validity of these concerns. The present concentration and distribution of metals in the soil is the result of the initial (20th century) concentration and the processes of leaching, erosion, and biological uptake and dispersal that have proceeded since the smelter was shut down. At the site of the smelter, analyses of samples from shallow soil pits showed zinc concentrations up to 25,500 mg/kg, lead concentrations up to 380 mg/kg, and cadmium up to 25 mg/kg. We analyzed soil samples from 52 locations in the region. Zinc, the most obvious metal from the zinc smelter, does not exceed residential concentration standards anywhere in the surrounding "far field" region; it peaks in the vicinity of the smelter, falling to background levels within 20 km. Lead follows the same decay curve with distance, but exceeds residential standards in the West Plant (the smelter) itself and the immediate surroundings. Cadmium follows the same decay curve. Concentrations decay with distance from the smelter, but are found in contrasting concentrations in the O, A and B soil horizons. The regional average metal concentrations for all metals analyzed are higher in the O and A horizons than in the underlying B horizon. Zinc is focused in the O-horizon, suggesting that plants have taken up the zinc and concentrated it in leaf litter. Lead is also focused in the O-horizon, but this is more likely due to its lack of downward mobility through the soil. Arsenic

The Arveyron of the Mer de Glace is the emissary stream of the most famous and largest French glacier. The latter has dramatically shrunk since the end of the Little Ice Age (LIA), like every alpine glacier: the front has retreated 2.7 km since 1820, and recent modelling showed a likely retreat of an extra kilometre by 2040. The Arveyron and its surroundings are deeply impacted by the retreat. Accordingly, the dynamics of proglacial streams and of lateral moraines have been studied at different time and space scales through various methods: airborne and terrestrial Lidar DEM comparisons, mapping from orthophotos, 2D and 3D monoplotting to quantify past events from old terrestrial pictures, etc. By coupling studies on moraines and on stream morphology, we wanted to better understand the influence of glacier retreat on sediment supply and transport downstream. Results show the evolution of the stream sediment sources linked to the glacier retreat. Before the middle of the 20th century, till was the main sediment source and was released by major flood events such as GLOFs. Now, geomorphic activity is especially important on the right lateral moraine in the recently deglaciated hanging valley of the Mer de Glace, but also in the moraine flanks of the current glacier tongue (many landslides occurred during the summer of 2014). The recent glacier retreat has also formed sediment sinks such as two proglacial lakes, which are progressively filling. These lakes work as big sediment traps until they disappear (around 2017). The fluvial dynamics of the Arveyron depend on the connectivity with potential sediment sources. This is why we crossed upstream studies with the channel evolution on its fan. The Arveyron channel has narrowed and incised for at least a century. Such an evolution should mean a decreasing sediment yield, but anthropic factors also play an important role in stream morphology. The main anthropic impact is the complex subglacial harnessing of the Mer de Glace. The

Two simulations with a regional climate model are analyzed for climatic changes between the late 20th century and a pre-industrial period over central and southern South America. The model simulations have been forced with large-scale boundary data from the global simulation performed with a coupled atmosphere-ocean general circulation model. The regional simulations have been carried out on a 0.44° × 0.44° grid (approx. 50 km × 50 km horizontal resolution). The differences in the external forcings are related to a changed greenhouse gas content of the atmosphere, which is higher in the present-day simulation. For validation purposes, the climate model is analyzed using a five-year-long simulation between 1993 and 1997 forced with re-analysis data. The climate model reproduces the main climatic features reasonably well, especially when comparing model output co-located with observational station data. However, the comparison between observed and simulated climate is hampered by the sparse meteorological station network in South America. The present-day simulation is compared with the pre-industrial simulation for atmospheric fields of near-surface temperature, precipitation, sea level pressure and zonal wind. Higher temperatures in the present-day simulation are evident over the whole of South America, most pronounced over the southern region of the Andes Mountains and the Parana basin. During southern winter the higher temperatures prevail over the entire continent, with the largest differences over the central Andes Mountains and the Amazonian basin. Precipitation differences show a more heterogeneous pattern, especially over tropical regions. This might be explained by changes in convective processes acting on small scales. During southern summer, wetter conditions are evident over the Amazonian and Parana basins in the present-day simulation. Precipitation increases are evident over Patagonia together with decreases to the north along the western slope of the Andes

Molecular computing based on enzymes or nucleic acids has attracted a great deal of attention due to the perspectives of controlling living systems in the way we control electronic computers. Enzyme-based computational systems can respond to a great variety of small molecule inputs. They have the advantage of signal amplification and highly specific recognition. DNA computing systems are most often controlled by oligonucleotide inputs/outputs and are capable of sophisticated computing as well as controlling gene expressions. Here, we developed an interface that enables communication of otherwise incompatible nucleic-acid and enzyme-computational systems. The enzymatic system processes small molecules as inputs and produces NADH as an output. The NADH output triggers electrochemical release of an oligonucleotide, which is accepted by a DNA computational system as an input. This interface is universal because the enzymatic and DNA computing systems are independent of each other in composition and complexity. PMID:25864379
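
Functionally, the interface chains three layers: an enzymatic layer consuming small molecules, an NADH-thresholded electrochemical release, and a DNA circuit consuming the released strand. A minimal boolean sketch of this signal flow (levels, threshold, and names are illustrative, not the actual chemistry):

    def enzyme_layer(glucose: bool, nad: bool) -> float:
        """Small-molecule inputs transduced into an NADH level (arbitrary units)."""
        return 0.8 if (glucose and nad) else 0.05

    def electrochemical_interface(nadh: float, threshold: float = 0.5) -> bool:
        """NADH above threshold triggers electrochemical release of the oligo."""
        return nadh > threshold

    def dna_layer(oligo_released: bool) -> bool:
        """The released oligonucleotide serves as the DNA circuit's input strand."""
        return oligo_released  # downstream strand-displacement logic would go here

    print(dna_layer(electrochemical_interface(enzyme_layer(True, True))))  # True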

The Adriatic Sea and the Black Sea are two semienclosed basins connected to the Mediterranean Sea by the Otranto and the Bosporus straits, respectively. This work aims to reconstruct the sea level for both basins in the 20th century and to investigate the main sources of interannual variability. Using 7 tide gauge time series located along the Adriatic coast and 5 along the Black Sea coast, provided by the PSMSL (Permanent Service for Mean Sea Level), a seamless sea level time series (1900-2009) has been obtained for each basin on the basis of a statistical procedure involving PCA and the Least Squares Method. The comparison with satellite data in the period 1993-2009 confirms that these are reliable representations of the observed sea level for the whole basin, showing good agreement with correlation values of 0.87 and 0.72 for the Adriatic and Black Sea, respectively. The sea level has been decomposed into various contributions in order to analyze the role of the factors responsible for its interannual variability. The annual cycles of the local effect of pressure (inverse barometer effect, IB), of the steric effect due to temperature and salinity variation, and of the wind effect have been computed. The largest contribution for the Adriatic Sea is due to the wind, whilst the inverse barometer effect plays a minor role and the steric effect seems to be almost negligible. For the Black Sea, on the contrary, the wind effect is negligible, and the largest source of variability is due to the Danube river, which is estimated from the available discharge data of Sulina (one of the exits of the Danube delta). Steric and IB effects both play a minor role in this basin. A linear regression model, built considering as predictor the large-scale SLP gradient identified through correlation analysis, can explain a further 20-25% of the variability of the sea level after subtracting all the factors considered above. Finally, residual sea levels show a
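
The local inverse barometer contribution mentioned here follows the standard hydrostatic relation, sketched as:

    \eta_{IB} = -(P_{atm} - \bar{P}) / (\rho g)

With \rho of about 1025 kg m^-3 and g of 9.81 m s^-2, sea level falls by roughly 1 cm for each hectopascal of local pressure excess over the mean.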

Agent-based modeling allows researchers to investigate theories of complex social phenomena and subsequently use the model to generate new hypotheses that can then be compared to real-world data. However, computer modeling has been underutilized in regard to the understanding of religious systems, which often require very complex theories with multiple interacting variables (Braxton et al. in Method Theory Study Relig 24(3):267-290, 2012. doi: 10.1163/157006812X635709; Lane in J Cogn Sci Relig 1(2):161-180, 2013). This paper presents an example of how computer modeling can be used to explore, test, and further understand religious systems, specifically looking at one prominent theory of religious ritual. The process is continuous: theory building, hypothesis generation, testing against real-world data, and improving the model. In this example, the output of an agent-based model of religious behavior is compared against real-world religious sermons and texts using semantic network analysis. It finds that most religious materials exhibit unique scale-free small-world properties and that a concept's centrality in a religious schema best predicts its frequency of presentation. These results reveal that adjustments need to be made to existing models of religious ritual systems and provide parameters for future models. The paper ends with a discussion of implications for a new multi-agent model of doctrinal ritual behaviors as well as propositions for further interdisciplinary research concerning the multi-agent modeling of religious ritual behaviors. PMID:25851082
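
Testing a semantic network for small-world structure conventionally means comparing its clustering and characteristic path length with a size-matched random graph; a minimal sketch using networkx, where the graph is a synthetic stand-in for the sermon-derived networks:

    import networkx as nx

    def avg_path_len(H):
        """Average shortest path length on the largest connected component."""
        comp = H.subgraph(max(nx.connected_components(H), key=len))
        return nx.average_shortest_path_length(comp)

    # Stand-in semantic network; in the study, nodes would be concepts and
    # edges their co-occurrence links in sermons and texts.
    G = nx.connected_watts_strogatz_graph(n=200, k=6, p=0.1, seed=42)

    # Random baseline with matched numbers of nodes and edges.
    R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=42)

    sigma = (nx.average_clustering(G) / nx.average_clustering(R)) / \
            (avg_path_len(G) / avg_path_len(R))
    print(f"small-world index sigma = {sigma:.2f}")  # sigma >> 1: small-world

    # Centrality as a predictor of a concept's frequency of presentation.
    central = nx.degree_centrality(G)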

Population aging was one of the most distinctive events of the 20th century and will remain important throughout the 21st century. Initially, a phenomenon of more developed countries, the process has recently become apparent in much of the developing world as well. The shift in age structure associated with population aging has a profound impact…

way in which people were reproduced. People depicted are possibly not representative of the general Bernese population as they constituted a socioeconomically advantaged group. What is already known on this topic: Reviews of individual portraits from the past have found clinical signs of illness that have led to discussions of underlying diseases. Goitre probably affected in excess of 80% of the population of the canton of Berne up to the beginning of the 20th century. What this study adds: In a large series of portraits from the Bernese region, goitre and other diseases are under-represented. Findings of age-dependent overweight (a survival advantage in times of potential famine) were probably more realistic. Likely explanations for this include idealisation depending on sex and age, artistic skills, fashion, and the sociocultural significance of illness. A decline in depicted signs of illness from the 19th century onward may indicate progress in preventive medicine and hygiene. PMID:12493682

Geodiversity is the natural and cultural range of geological, geomorphological and soil features. We analysed the large database of 19th and early 20th century paintings of Simonis and Buunk (www.Simonis-Buunk.com) to track changes in the geodiversity of Dutch peatlands since pre-photographic times. Peat dominated in two of the eight main landscapes of the Netherlands: the Lowland peats in the Holocene west and the Highland peats in the sandy Pleistocene eastern parts. Painters were mainly attracted by the lowland peats. For more than a thousand years, peat has played a major role in Dutch military security, economy, ecology and cultural life. Natural variety and cultural use resulted in a geodiversity that is unique in Europe. There are more than 100 place names with 'veen' (= peat), and surnames with 'veen' are common. Proof of the exploitation of peat for salt and fuel exists from Roman times onwards. In the 9th century, peatlands were drained and reclaimed for growing wheat. Already in the 11th century, it was necessary to build dikes to prevent flooding, to control water levels to avoid further oxidation, and to convert land use to grassland. But subsidence continued, and in the 14th century windmills were needed to drain the lands and pump the water out. In the 16th century, industrial peat exploitation fuelled the rise of industries and cities. All this draining and digging caused the peat surface to shrink. The few remaining living peats are conserved by nature organisations. Turning to geodiversity and landscape paintings: in the peat landscapes, popular painting motifs were high water levels, the grasslands of the 'Green Heart', the winding streams and remaining lakes. The paintings of landscapes where peat had been removed show water-management adaptations: windmills, different water levels, canals made for the transport of fuel, bridges, tow paths and the 'plassen', i.e. the lakes left after peat exploitation. The droogmakerijen (reclaimed lakes), now 2 to 5 m below

Water year 2011 (October 1, 2010, through September 30, 2011) was a year of hydrologic drought (based on streamflow) in Oklahoma and the second-driest year to date (based on precipitation) since 1925. Drought conditions worsened substantially in the summer: in July 2011 Oklahoma set the highest monthly average temperature on record for any State (89.1 degrees Fahrenheit), and June was the second hottest and August the hottest on record for those months for the State since 1895. Drought conditions continued into the fall, with all of the State remaining in severe to exceptional drought through the end of September. In addition to effects on streamflow and reservoirs, the 2011 drought increased damage from wildfires; led to declarations of states of emergency, water-use restrictions, and outdoor burning bans; caused at least $2 billion of losses in the agricultural sector and higher prices for food and other agricultural products; caused losses of tourism and wildlife; reduced hydropower generation; and lowered groundwater levels in State aquifers. The U.S. Geological Survey, in cooperation with the Oklahoma Water Resources Board, conducted an investigation to compare the severity of the 2011 drought with four previous major hydrologic drought periods during the 20th century – water years 1929–41, 1952–56, 1961–72, and 1976–81. The period of water years 1925–2011 was selected as the period of record because few continuous-record streamflow-gaging stations existed before 1925, and gaps in time existed when no streamflow-gaging stations were operated before 1925. In water year 2011, statewide annual precipitation was the 2nd lowest, statewide annual streamflow was the 16th lowest, and statewide annual runoff was the 42nd lowest of those 87 years of record. Annual area-averaged precipitation totals by the nine National Weather Service climate divisions for water year 2011 were compared to those during four previous major hydrologic drought

The most essential achievements in 20th century biology are analyzed, and it is considered how, throughout the last century, physicists, chemists and biologists answered the question "What is life?". The most considerable scientific achievement of 20th century biology, and perhaps of all science, is considered by many to be the discovery by the biologist J Watson and the physicists F Crick and M Wilkins that resulted in establishing the structure of DNA. The related work of well-known scientists of the USA and Europe, E Schrödinger, L Pauling, M Perutz, J Kendrew, and of the Russian scientists N K Koltsov, N V Timofeeff-Ressovsky, G A Gamow, A M Olovnikov, is analyzed. Presently, when the structure of DNA, the process of gene expression and even the genomes of human beings are already known, scientists realize that we still do not know many of the most important things. In our opinion, the 20th century studies of nucleic acids largely ignored the principle of the cyclic organisation of DNA. In this connection, we analyze the principle of cyclicity, which in its generality may well complement the concept of the atomic structure of matter.

Statistics compiled by the National Cancer Institute indicate that, between 1935 and 1974, age-adjusted mortality from most 'Western' cancers (those of the breast, colon, prostate, pancreas, ovary, and kidney) rose dramatically in African-Americans. This phenomenon is paralleled by marked increases in the incidence of these cancers in Asia and Southern Europe during the latter 20th century, in conjunction with increased intakes of dietary animal products. A credible case can be made that diets rich in animal products work in various complementary ways to up-regulate serum levels of insulin, free IGF-I, and free sex hormones: hormones that appear to have important promotional activity for Western cancers. It seems likely that dietary animal product intake by black Americans increased substantially during the 20th century, and that this fact is primarily responsible for their concurrent marked increase in mortality from Western cancers. A whole-food vegan diet rich in fruits and vegetables, especially if coupled with regular exercise and smoking avoidance, could be expected to have a remarkably positive impact on African-American cancer risk, reversing the increases in cancer risk incurred during the 20th century. PMID:11461167

Google Glass was deployed in an Urban Studies field course to gather videographic data for team-based student research projects. We evaluate the potential for wearable computing technology such as Glass, in combination with other mobile computing devices, to enhance reflexive research skills, and videography in particular, during field research.…

Managers of Apple Computer, the company that pioneered campus personal computing and later lost most of its share of the market, are again focusing energies on academic buyers. Campus technology officials, even those fond of Apples, are greeting the company's efforts with caution. Some feel it may be too late for Apple to regain a significant…

A current personal computer has a high clock speed, large cache and disc memory, and it can run graphics and multimedia software systems that were originally designed exclusively for graphics workstations. The goal of the current research is to investigate a personal computer-based visualization system that has its own data visualization software, processing units, and data storage units. The architecture of this personal computer-based visualization system is similar to that of a visualization system using a graphics workstation. Since the personal computer used in the current visualization system is a laptop computer, it is highly portable, and with its network capability, it can be used for some real-time simulation and interactive applications wherever internet access is available. The data visualization software system is AVS/Express, which can run on personal computers and workstations alike. In this paper, design considerations of a personal computer-based visualization system will be examined. The performance of this system will be compared with that of a workstation-based visualization system. Integrated visualization and web techniques with tools such as VRML and JAVA for interactive and cooperative visualization practices will be explored. A visualization case study with data generated by a numerical simulation model will be presented. A live visualization session using a laptop personal computer will be demonstrated.

Mathematics educators often despair at math's austere, "abstract" reputation. This paper describes recent work in developing an application named "HyperGami," which is designed to integrate both the abstract and "real-world" aspects of mathematics by allowing children to design and construct polyhedral models and sculptures. Children use formal…

A 137 m ice core drilled in 1999 from the Eastern Bolivian Andes at the summit of Nevado Illimani (16°37'S, 67°46'W, 6350 m asl) was analyzed at high temporal resolution, allowing a characterization of trace elements in Andean aerosol trapped in the ice during the 20th century. The upper 50 m of the ice core were dated by multi-proxy analysis of stable isotopes (δ18O and δ2H), 137Cs and Ca2+ content, electrical conductivity, and insoluble microparticle content, together with reference historical horizons from atmospheric nuclear tests and known volcanic eruptions. This 50 m section corresponds to a record of environmental variations spanning about 80 years, from 1919 to 1999. It was cut into 744 sub-samples under laminar flow in a clean bench, which were analyzed by ion chromatography for major ionic concentrations, by a particle counter for insoluble aerosol content, and by inductively coupled plasma mass spectrometry (ICP-MS) for the concentration of 45 chemical species from Li to U. This paper focuses on results of trace element concentrations measured by ICP-MS. The high temporal resolution used in the analyses allowed samples to be classified as belonging to dry or wet seasons. During the wet season elemental concentrations are low and samples show high crustal enrichment factors. During the dry season the situation is the opposite, with high elemental concentrations and low crustal enrichments. For example, with salt lakes as the main sources in the region, the average Li concentration during the 20th century is 0.035 ng g-1 for wet seasons and 0.90 ng g-1 for dry seasons. Illimani average seasonal concentration ranges cover the spectrum of elemental concentration measurements at another Andean ice core site (Sajama) for most soil-related elements. The regional crustal dust load in the deposits was found to be overwhelming during the dry season, obscuring the contribution of biomass burning material. Marked temporal trends from the onset of the 20th century to more recent years were identified
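
The crustal enrichment factors referred to here are conventionally computed relative to a reference crustal element such as Al:

    EF_X = (X / Al)_{ice} / (X / Al)_{crust}

EF_X near 1 indicates an essentially crustal (dust) origin for element X, while EF_X much greater than 1 points to additional non-crustal contributions (e.g., volcanic or anthropogenic).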

A modern massive Porites coral was collected from Longwan Bay (19°20'N, 110°39'E) on the east coast of Hainan Island, China. The coral was sectioned perpendicular to the growth axis into discs of double density-bands representing annual growth. The samples were analyzed for the Sr/Ca ratio by inductively coupled plasma atomic emission spectrometry. The history of winter sea-surface temperature (SST) is reconstructed using the Sr/Ca ratio in the winter bands of the corals. The winter SST at Xisha in the middle of the South China Sea (SCS) is weakly correlated with the instrument-measured winter monsoon velocity (WMV), with a correlation coefficient of 0.19. The winter SST data from corals at Longwan Bay, Hainan, in the northern SCS are moderately correlated with the WMV (r = 0.40). Interestingly, we found that the difference in winter SSTs between the two sites (Xisha and Longwan Bay, Hainan) (the X-H index) is significantly negatively correlated with the WMV (r = -0.73). This negative correlation may be related to the intrusion of the warm Kuroshio Current into the SCS through the Luzon Strait, promoted by the strong northeastern monsoon winds in winter. Using the relationship between our coralline data and the observed WMV, a calculated winter monsoon velocity (WMVc) was obtained for 87 years. This data set, in combination with the instrument-measured data between 1993 and 1998, generates a record of WMVc for a period of 93 years from 1906 to 1998. The WMVc in the 20th century shows significant interannual and decadal variability with a trend of persistent decline over the whole 20th century, at a rate of decrease of -0.02 (m/s)/a. The lowest wind velocities occurred during the last two decades of the 20th century. The WMVc decreased significantly, by about 30%, from the early to the late 20th century. The 20th century decline of winter monsoon velocity evidenced by the SCS coral records is consistent with the atmosphere-ocean general circulation models
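
The reconstruction step amounts to a least-squares calibration of the X-H index against observed wind velocity over the overlap period, then applying the fit to the full coral record; a minimal sketch with synthetic stand-in numbers:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the overlap period: the coral-derived X-H index
    # (winter SST difference, Xisha minus Longwan Bay) and observed winter
    # monsoon velocity (WMV); the negative slope mimics the reported r = -0.73.
    xh_obs = rng.normal(0.0, 0.5, 30)
    wmv_obs = 5.0 - 1.2 * xh_obs + rng.normal(0.0, 0.3, 30)

    # Calibrate WMV = a + b * XH by least squares over the overlap years...
    b, a = np.polyfit(xh_obs, wmv_obs, deg=1)

    # ...then apply the fit to the full 93-year coral X-H record (stand-in here).
    xh_full = rng.normal(0.0, 0.5, 93)
    wmvc = a + b * xh_full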

With the advent of systematic high-resolution satellite photography, striking geometric shapes of banded vegetation several km2 in size, but not apparent from the ground, have been documented for many areas of the arid and semiarid world. Banded vegetation, in which dense perennial vegetation altern...

Many societies view the world as composed of two distinct and complementary spheres: the female (domestic) sphere and the male (public) sphere. Because science was part of the male sphere, women were inhibited from pursuing a career in scientific research. However, the more limited female sphere often found within university departments of home…

The development of the World Wide Web has led to an explosion of educational and clinical resources available via the Internet with minimal effort or special training. However, most of these Web pages contain only static information; few offer dynamic information shaped around clinical or laboratory test findings. In this report we show how this goal can be achieved with the design and construction of Medical Algorithm Web Pages (MAWP). Specifically, using the Internet technologies known as forms and CGI scripts, we demonstrate how one can implement medical algorithms remotely over the Internet's World Wide Web. To use a MAWP, one enters the URL for the site and then enters information according to the instructions presented there, usually by entering numbers and other information into fields displayed on screen. When all the data is entered, the user clicks on the SUBMIT icon, resulting in a new Web page being constructed "on-the-fly" containing diagnostic calculations and other information pertinent to the patient's clinical management. Four sample applications are presented in detail to illustrate the concept of a Medical Algorithm Web Page: computation of the alveolar-arterial oxygen tension difference using the alveolar gas equation; computation of renal creatinine clearance; drug infusion calculation (micrograms/kilogram/minute); and computation of the renal failure index. PMID:10179110
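
Server-side, each such page is a short formula evaluated over submitted form fields; a minimal sketch of the first listed application, using the standard alveolar gas equation (the default water vapor pressure and respiratory quotient are the usual textbook assumptions):

    def aa_gradient(fio2, pb, paco2, pao2, ph2o=47.0, rq=0.8):
        """Alveolar-arterial O2 tension difference via the alveolar gas equation.

        PAO2 = FiO2 * (Pb - PH2O) - PaCO2 / RQ ;  A-a = PAO2 - PaO2.
        Pressures in mmHg; RQ ~0.8 and PH2O = 47 mmHg are standard assumptions.
        """
        pao2_alveolar = fio2 * (pb - ph2o) - paco2 / rq
        return pao2_alveolar - pao2

    # In a MAWP, these values would arrive as submitted form fields and the
    # result would be written into a results page generated on the fly.
    print(aa_gradient(fio2=0.21, pb=760.0, paco2=40.0, pao2=90.0))  # ~9.7 mmHg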

Global interaction during the 19th and 20th centuries is examined using an historical dialectic approach. Interventions in world affairs are suggested that would promote cooperation between developing and industrialized nations. The dialectical approach maintains that the status quo is constantly moving toward upheaval as a result of pressure for…

Virtual worlds (VWs), in which participants navigate as avatars through three-dimensional, computer-generated, realistic-looking environments, are emerging as important new technologies for distance health education. However, there is relatively little documented experience using VWs for international healthcare training. The Geneva Foundation for Medical Education and Research (GFMER) conducted a VW training for healthcare professionals enrolled in a GFMER training course. This paper describes the development, delivery, and results of a pilot project undertaken to explore the potential of VWs as an environment for distance healthcare education for an international audience that has generally limited access to conventionally delivered education. PMID:24555833

This talk, which was the keynote address of the NAS Colloquium on Human-Machine Communication by Voice, discusses the past, present, and future of human-machine communications, especially speech recognition and speech synthesis. Progress in these technologies is reviewed in the context of the general progress in computer and communications technologies. PMID:7479726

Describes a computer program called RiskMap. Explains that after completing an assignment on rural economics and hunger dynamics in Africa, students showed an increased level of understanding and felt that using RiskMap was helpful in learning the material. Includes references. (DAJ)

This paper proposes that those traits which handicap visually oriented dyslexics in a verbally oriented educational system may confer advantages in new fields which rely on visual methods of analysis, especially those in computer applications. It is suggested that such traits also characterized Albert Einstein, Michael Faraday, James Maxwell, and…

The emergence of the information society introduces the academic community to the most significant revolution since the invention of the printing press. The growing use of computers can lead to a depreciation of self-worth. Since the machine can handle complex logical applications with considerably more speed and accuracy than most people, many…

This paper will explore the development of medical education in the Soviet Union, its underlying principles and the subsequent migration of this format into the countries of the Soviet Bloc following World War II. The impact of Perestroika and the collapse of the Warsaw Pact on university training and medical education in particular will be reviewed. The need for external funding as a factor in the emergence of English Parallel courses in Hungary, Czechoslovakia and subsequently in other countries will also be considered. PMID:23596029

Fire is the primary terrestrial ecosystem disturbance agent on a global scale. It affects the carbon balance of global terrestrial ecosystems by emitting carbon to the atmosphere directly and immediately from biomass burning (i.e., the fire direct effect), and by changing net ecosystem productivity and land-use carbon loss in post-fire regions due to biomass burning and fire-induced vegetation mortality (i.e., the fire indirect effect). Here, we provide the first quantitative assessment of the impact of fire on the net carbon balance of global terrestrial ecosystems for the 20th century, and investigate the roles of the fire direct and indirect effects. This is done by quantifying the difference between 20th century fire-on and fire-off simulations with the NCAR Community Land Model CLM4.5 as the model platform. Results show that fire decreased the net carbon gain of global terrestrial ecosystems by 1.0 Pg C yr-1 on average across the 20th century, as a result of the fire direct effect (1.9 Pg C yr-1) partly offset by the indirect effect (-0.9 Pg C yr-1). Fire generally decreases the average carbon gain of terrestrial ecosystems in post-fire regions, significantly so over tropical savannas and parts of the forests of North America and eastern Asia. The general decrease of carbon gain in post-fire regions arises because the fire direct and indirect effects have similar spatial patterns and the former (which decreases carbon gain) is generally stronger. Moreover, the effect of fire on the net carbon balance declines significantly prior to ~1970, with a trend of 8 Tg C yr-1, due to the increasing fire indirect effect, and increases afterward, with a trend of 18 Tg C yr-1, due to the increasing fire direct effect.
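
With reductions in terrestrial carbon gain counted as positive, the reported decomposition is simply:

    \Delta C_{fire} = \Delta C_{direct} + \Delta C_{indirect} = 1.9 + (-0.9) = 1.0 Pg C yr^{-1}

the direct emission term dominating, partly offset by the indirect post-fire ecosystem response.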

Groundwater is a vital resource throughout the Northeast corridor and is an important water source for domestic, industrial and irrigation purposes. During the 20th century, suburban groundwater withdrawals intensified with increasing population growth, the advent of rural electrification and sophisticated pumping technologies; thus, effective groundwater management has become increasingly important in the region. Data from the United States Geological Survey National Water-Use Information Program document this concentrated use of groundwater in suburban areas, which is particularly prominent across the majority of New Jersey. Focusing on New Jersey as an area of significant groundwater use and increasing demand, this project investigates total groundwater withdrawals in conjunction with a policy-based framework, facilitating an awareness of groundwater impacts as informed through existing policy during the 20th century. The objectives of this study are to identify the relevant federal, statewide and municipal policies that evolved in the state of New Jersey during the 20th century, and to examine the groundwater withdrawal trends for the state of New Jersey between 1950 and 2005. Preliminary results revealed that increased restrictions on groundwater policy between 1982 and 1997 had an observable effect in reducing total groundwater withdrawals. Multivariate regression analyses using indicator variables, i.e., a mixed-effects model, will be used to explore relationships between county-specific withdrawals and significant policy that may have influenced groundwater usage. A strong correlation is anticipated between groundwater withdrawals and the effectiveness of the implemented groundwater policies. Future collaborative work will further investigate the effectiveness of policy as hydrologically evidenced by alterations in baseflow contribution to streamflow, and groundwater persistence.
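
The mixed-effects analysis described above can be prototyped in a few lines. The sketch below is a minimal stand-in, not the study's model: the table layout, column names, and values are all hypothetical, with counties as the random-effect grouping and a policy indicator as a fixed effect.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per county-year; policy_active flags whether
# a restrictive withdrawal policy was in force (all values illustrative).
df = pd.DataFrame({
    "withdrawal_mgd": [42.0, 40.5, 33.1, 55.2, 54.0, 44.8],
    "year":           [1980, 1985, 1990, 1980, 1985, 1990],
    "policy_active":  [0, 1, 1, 0, 1, 1],
    "county":         ["Bergen", "Bergen", "Bergen", "Camden", "Camden", "Camden"],
})

# Fixed effects for year and policy; random intercept per county.
model = smf.mixedlm("withdrawal_mgd ~ year + policy_active", df, groups=df["county"])
print(model.fit().summary())
```

With so few rows this only demonstrates the mechanics; the sign and size of the policy_active coefficient are what the study's hypothesis concerns.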

Background: The consumption of omega-3 (n-3) and omega-6 (n-6) essential fatty acids in Western diets is thought to have changed markedly during the 20th century. Objective: We sought to quantify changes in the apparent consumption of essential fatty acids in the United States from 1909 to 1999. Design: We calculated the estimated per capita consumption of food commodities and availability of essential fatty acids from 373 food commodities by using economic disappearance data for each year from 1909 to 1999. Nutrient compositions for 1909 were modeled by using current foods (1909-C) and foods produced by traditional early 20th century practices (1909-T). Results: The estimated per capita consumption of soybean oil increased >1000-fold from 1909 to 1999. The availability of linoleic acid (LA) increased from 2.79% to 7.21% of energy (P < 0.000001), whereas the availability of α-linolenic acid (ALA) increased from 0.39% to 0.72% of energy by using 1909-C modeling. By using 1909-T modeling, LA was 2.23% of energy, and ALA was 0.35% of energy. The ratio of LA to ALA increased from 6.4 in 1909 to 10.0 in 1999. The 1909-T but not the 1909-C data showed substantial declines in dietary availability (percentage of energy) of n-6 arachidonic acid, eicosapentaenoic acid (EPA), and docosahexaenoic acid (DHA). Predicted net effects of these dietary changes included declines in tissue n-3 highly unsaturated fatty acid status (36.81%, 1909-T; 31.28%, 1909-C; 22.95%, 1999) and declines in the estimated omega-3 index (8.28, 1909-T; 6.51, 1909-C; 3.84, 1999). Conclusion: The apparent increased consumption of LA, which was primarily from soybean oil, has likely decreased tissue concentrations of EPA and DHA during the 20th century. PMID:21367944
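
As a rough illustration of how "percent of energy" figures like those above are derived (our arithmetic only, not the paper's commodity-disappearance pipeline), fatty acid availability in g/day converts to percent of energy via the standard ~9 kcal/g energy density of fat; the gram and calorie inputs below are back-of-envelope assumptions chosen only to land near the 1999 estimates.

```python
# Convert fatty acid availability (g/day) to percent of daily energy,
# assuming the usual 9 kcal per gram of fat.
def percent_energy(fat_g_per_day, total_kcal_per_day):
    return 100.0 * fat_g_per_day * 9.0 / total_kcal_per_day

# Illustrative inputs (assumed, not from the paper): ~2900 kcal/day availability.
la = percent_energy(23.2, 2900)    # ~7.2% of energy, close to the 1999 LA figure
ala = percent_energy(2.3, 2900)    # ~0.7% of energy, close to the 1999 ALA figure
print(round(la, 2), round(ala, 2), round(la / ala, 1))  # LA:ALA ratio ~10
```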

Fire is the primary form of terrestrial ecosystem disturbance on a global scale. It affects the net carbon balance of terrestrial ecosystems by emitting carbon directly and immediately into the atmosphere from biomass burning (the fire direct effect), and by changing net ecosystem productivity and land-use carbon loss in post-fire regions due to biomass burning and fire-induced vegetation mortality (the fire indirect effect). Here, we provide the first quantitative assessment of the impact of fire on the net carbon balance of global terrestrial ecosystems during the 20th century, and investigate the roles of fire's direct and indirect effects. This is done by quantifying the difference between the 20th century fire-on and fire-off simulations with the NCAR Community Land Model CLM4.5 (prescribed vegetation cover and uncoupled from the atmospheric model) as a model platform. Results show that fire decreases the net carbon gain of global terrestrial ecosystems by 1.0 Pg C yr-1 averaged across the 20th century, as a result of the fire direct effect (1.9 Pg C yr-1) partly offset by the indirect effect (-0.9 Pg C yr-1). Post-fire regions generally experience decreased carbon gains, which is significant over tropical savannas and some North American and East Asian forests. This decrease is due to the direct effect usually exceeding the indirect effect, while they have similar spatial patterns and opposite sign. The effect of fire on the net carbon balance significantly declines until ∼1970 with a trend of 8 Tg C yr-1 due to an increasing indirect effect, and increases subsequently with a trend of 18 Tg C yr-1 due to an increasing direct effect. These results help constrain the global-scale dynamics of fire and the terrestrial carbon cycle.
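
A minimal sketch of the fire-on minus fire-off bookkeeping described above (synthetic numpy placeholders, not CLM4.5 output): the total fire effect on net carbon gain is the difference between the paired runs, and the indirect effect is what remains after subtracting direct biomass-burning emissions.

```python
import numpy as np

# Placeholder annual net carbon gain (Pg C/yr) from paired runs; synthetic
# values tuned only to echo the magnitudes quoted in the abstract.
years = np.arange(1901, 2001)
rng = np.random.default_rng(0)
gain_fire_off = rng.normal(3.0, 0.3, years.size)   # hypothetical no-fire world
gain_fire_on = gain_fire_off - 1.0                 # fire lowers gain by ~1.0
direct = np.full(years.size, 1.9)                  # direct biomass-burning flux

total_effect = gain_fire_off - gain_fire_on        # ~1.0 Pg C/yr
indirect = total_effect - direct                   # ~-0.9 Pg C/yr (offsetting)
print(total_effect.mean(), direct.mean(), indirect.mean())

# Sub-period trend, e.g. pre-1970, as a least-squares slope (Pg C/yr per yr):
pre = years < 1970
print(np.polyfit(years[pre], total_effect[pre], 1)[0])
```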

Since the publication of Plant, Hammond, and Turner (2004), which highlighted a pressing need for researchers to pay more attention to sources of error in computer-based experiments, the landscape has undoubtedly changed, but not necessarily for the better. Readily available hardware has improved in terms of raw speed; multi-core processors abound; graphics cards now have hundreds of megabytes of RAM; main memory is measured in gigabytes; drive space is measured in terabytes; ever larger thin-film transistor displays capable of single-digit response times, together with newer Digital Light Processing multimedia projectors, enable much greater graphic complexity; and new 64-bit operating systems, such as Microsoft Vista, are now commonplace. However, have millisecond-accurate presentation and response timing improved, and will they ever be available in commodity computers and peripherals? In the present article, we used a Black Box ToolKit to measure the variability in the timing characteristics of hardware commonly used in psychological research. PMID:19587169
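
By way of illustration only: a Black Box ToolKit measures stimulus and response events with external hardware, whereas the snippet below can probe only the OS and interpreter layer. It still gives a feel for commodity-hardware timing variability by recording the overshoot of a requested 10 ms delay.

```python
import time
import statistics

# Request a fixed delay many times and measure how late each return is.
target_s = 0.010
errors_ms = []
for _ in range(300):
    t0 = time.perf_counter()
    time.sleep(target_s)
    errors_ms.append((time.perf_counter() - t0 - target_s) * 1000.0)

print(f"mean overshoot {statistics.mean(errors_ms):.3f} ms, "
      f"sd {statistics.stdev(errors_ms):.3f} ms, max {max(errors_ms):.3f} ms")
```

On a typical desktop OS the spread can reach a millisecond or more, which is exactly why hardware-level verification is needed for millisecond-accuracy claims.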

The 20th century climate for the Southeastern United States and surrounding areas as simulated by global climate models used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) was evaluated. A suite of statistics that characterize various aspects of the regional climate was calculated from both model simulations and observation-based datasets. CMIP5 global climate models were ranked by their ability to reproduce the observed climate. Differences in the performance of the models between regions of the United States (the Southeastern and Northwestern United States) warrant a regional-scale assessment of CMIP5 models.
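
One simple instance of the kind of ranking described above (our sketch; the study used a fuller suite of regional statistics) is to score each model's simulated climatology against an observational dataset by RMSE and sort:

```python
import numpy as np

# Synthetic stand-ins: an "observed" 12-month regional climatology and
# five "models" with increasing error levels (names illustrative).
rng = np.random.default_rng(1)
obs = rng.normal(size=12)
models = {f"model_{i}": obs + rng.normal(0, 0.05 + 0.1 * i, 12) for i in range(1, 6)}

scores = {name: float(np.sqrt(np.mean((sim - obs) ** 2))) for name, sim in models.items()}
for rank, (name, rmse) in enumerate(sorted(scores.items(), key=lambda kv: kv[1]), 1):
    print(rank, name, f"RMSE={rmse:.3f}")
```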

Prejudice and stigma against people with mental illness can be seen throughout history. The worst instance of this prejudice was connected to the rise of the eugenics movement in the early 20th century. Although the Nazi German T-4 program of killing people with mental illness was the most egregious culmination of this philosophy, the United States has its own dark eugenics history, nearing a slippery slope all too similar to that of the Nazis. Mental health care clinicians need to examine this period to honor the memory of the victims of eugenics and to guarantee that nothing like this will ever happen again. PMID:23197125

During the Balkan wars, and particularly after the national disaster in Asia Minor, Thessaloniki had to lodge many tens of thousands of refugees. The town authorities then created settlements to house them. These settlements were rapidly transformed into urban quarters, leading to an extension of the town in all directions, particularly to the NW and SE, without any urban planning. These settlements, and later urban quarters, were mapped in different charts of the town during the 20th century. The study of these maps is the subject of this paper. (In Greek.)

This study reconstructs a series of droughts and high flow volumes of the Bermejo River from the 17th to 20th century based on a content analysis of historic documentary evidence, which is calibrated with instrumental climate data. The historic data series shows an increase in the frequency of extraordinarily high waters beginning in the 19th century and a significant decrease in extreme droughts beginning in 1890. The data are compared to variations in the Mendoza River for the same period, which show that there was a long-standing lack of correlation between the rivers.

Fire is the dominant natural disturbance in southeastern Australia. For millennia it has been the driving force shaping terrestrial ecosystems in the region -- simultaneously killing vegetation and initiating regeneration across whole landscapes. Fire regimes across the region are driven by several factors including climate, vegetation, and ignition sources. Humans have been a significant contributing factor to past and present fire regimes. Prior to European settlement in the late 1700s, Aboriginal Australians used frequent, low-intensity fires to manage vegetation across much of the landscape. European settlement led to the displacement of Aboriginal communities and a shift to active fire suppression and control. This changing approach to fire management is widely believed to have initiated a fundamental shift towards extreme, high-intensity fire events as fuel loads increased. In addition, during the 20th Century prolonged periods of warm, dry conditions have occurred with greater frequency and intensity. The relative importance of climate and fire management practices on contemporary fire regimes is vigorously debated in Australia and is directly relevant to land management policies and their implementation. To put the current fire regime into historical context, we used a multi-proxy approach combining palaeo-charcoal and tree-ring analyses to assess how fire regimes have changed over the last 3000 years in the Snowy Mountains region of southeastern Australia. We found almost no evidence of high-intensity fires in the 3000 years that preceded the 20th Century. However, in the mid-20th Century there is a sudden and dramatic increase in the presence of charcoal and the pulsed establishment of trees across the landscape, suggesting a recent shift from low-intensity fires with minimal charcoal signatures to moderate- to high-intensity fires with substantial charcoal inputs. Importantly, the tree-ring data demonstrate that most of these fires were not stand

Topological classification of 4-manifolds bridges computation theory and physics. A proof of the undecidability of the homeomorphy problem for 4-manifolds is outlined here in a clarifying way. It is shown that an arbitrary Turing machine with an arbitrary input can be encoded into the topology of a 4-manifold, such that the 4-manifold is homeomorphic to a certain other 4-manifold if and only if the corresponding Turing machine halts on the associated input. Physical implications are briefly discussed.
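
Schematically (our notation, not the paper's), the construction amounts to a computable map from machine-input pairs to 4-manifolds such that homeomorphy to a fixed reference manifold encodes halting:

```latex
\[
  (M, w) \;\longmapsto\; K(M,w), \qquad
  K(M,w) \,\cong\, K_0 \;\Longleftrightarrow\; M \text{ halts on } w,
\]
```

where K_0 is a fixed 4-manifold and the relation is homeomorphy; an algorithm deciding homeomorphy of 4-manifolds would therefore decide the halting problem, which is impossible.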

We study the scalability of parallel discrete-event simulations for arbitrary short-range interacting systems with asynchronous dynamics. When the synchronization topology mimics that of the short-range interacting underlying system, the virtual time horizon (corresponding to the progress of the processing elements) exhibits Kardar-Parisi-Zhang-like kinetic roughening. Although the virtual times, on average, progress at a nonzero rate, their statistical spread diverges with the number of processing elements, hindering efficient data collection. We show that when the synchronization topology is extended to include quenched random communication links between the processing elements, they make a close-to-uniform progress with a nonzero rate, without global synchronization. We discuss in detail a coarse-grained description for the small-world synchronized virtual time horizon and compare the findings to those obtained by simulating the simulations based on the exact algorithmic rules.
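
To make the virtual time horizon picture concrete, here is a toy version of the basic conservative update rule on a ring (a sketch under simplifying assumptions: a synchronous sweep approximates the asynchronous dynamics, and all parameters are illustrative). A processing element advances only when it is a local minimum of the horizon, and the horizon's spread is what hinders efficient data collection.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pe, n_steps = 1000, 5000
tau = np.zeros(n_pe)                       # local virtual times of the PEs
for _ in range(n_steps):
    left, right = np.roll(tau, 1), np.roll(tau, -1)
    active = (tau <= left) & (tau <= right)         # local minima may advance
    tau[active] += rng.exponential(1.0, active.sum())

print("utilization (mean progress per step):", tau.mean() / n_steps)
print("horizon width (spread of tau):", tau.std())  # grows with n_pe here
```

Adding sparse random "small-world" links to the neighbor check, as the abstract describes, bounds the width while keeping a nonzero progress rate.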

The Biennial Conferences on Chemical Education have developed into the largest gathering of chemical educators in the world because they reflect the wide interests of people who teach chemistry in schools, colleges, and universities as well as all those with an interest in chemical education—at the American Chemical Society, in industry, and in government. What makes a BCCE exciting is the opportunity to gather in an informal setting to socialize and to share ideas, expertise, and experience with colleagues who are committed to excellence in chemical education. You may sit down at lunch with someone you just heard give a presentation of interest or chat with the author or publisher of the textbook or lab manual you are using.

In this paper, evidence of anthropogenic influence on the warming of the 20th century is presented, and the debate regarding the time-series properties of global temperatures is addressed in depth. The 20th century global temperature simulations produced for the Intergovernmental Panel on Climate Change's Fourth Assessment Report, and a set of the radiative forcing series used to drive them, are analyzed using modern econometric techniques. Results show that both the temperature and radiative forcing series share similar time-series properties and a common nonlinear secular movement. This long-term co-movement is characterized by the existence of time-ordered breaks in the slopes of their trend functions. The evidence presented in this paper suggests that while natural forcing factors may help explain the warming of the first part of the century, anthropogenic forcing has been its main driver since the 1970s. In terms of Article 2 of the United Nations Framework Convention on Climate Change, significant anthropogenic interference with the climate system has already occurred, and the current climate models are capable of accurately simulating the response of the climate system, even if it consists of a rapid or abrupt change, to changes in external forcing factors. This paper presents a new methodological approach for conducting time-series-based attribution studies. PMID:23555866
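
A minimal illustration of a trend function with a slope break (a toy version: the paper dates breaks with formal econometric tests, whereas here the 1970 break date and all data are assumed):

```python
import numpy as np

# Synthetic "temperature" series with an extra slope after an assumed 1970 break.
rng = np.random.default_rng(7)
years = np.arange(1900, 2000)
t = years - years[0]
kink = np.maximum(t - 70, 0)
series = 0.003 * t + 0.012 * kink + rng.normal(0, 0.1, t.size)

# Least-squares fit of intercept, base slope, and post-break slope change.
X = np.column_stack([np.ones_like(t), t, kink])
beta, *_ = np.linalg.lstsq(X, series, rcond=None)
print(f"pre-break slope {beta[1]:.4f}, post-break slope {beta[1] + beta[2]:.4f} per yr")
```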

Although atherosclerosis is widely thought to be a disease of modernity, computed tomographic evidence of atherosclerosis has been found in the bodies of a large number of mummies. This article reviews the findings of atherosclerotic calcifications in the remains of ancient people, humans who lived across a very wide span of human history and over most of the inhabited globe. These people had a wide range of diets and lifestyles, and traditional modern risk factors do not thoroughly explain the presence and easy detectability of this disease. Nontraditional risk factors such as the inhalation of cooking fire smoke and chronic infection or inflammation might have been important atherogenic factors in ancient times. Study of the genetic and environmental risk factors for atherosclerosis in ancient people may offer insights into this common modern disease. PMID:25667088

A brain-computer interface that enables navigation through a virtual environment (VE) using four different navigation commands (turn right, turn left, move forward and move back) is presented. A graphical interface allows subjects to select a specific command. In this interface, the different navigation commands surround a circle. A bar in the center of the circle rotates continuously. Using only two mental tasks, the subject controls the bar's extension to reach the chosen command. In this study, after an initial training based on three sessions, 8 out of 15 naive subjects were able to navigate through the VE by discriminating between the imagination of right-hand movements and a relaxed state. All subjects (except one) improved their performance in each run, and a mean error rate of 23.75% was obtained. PMID:19469662

A 137 m ice core drilled in 1999 in the Eastern Bolivian Andes at the summit of Nevado Illimani (16° 37' S, 67° 46' W, 6350 m a.s.l.) was analyzed at high temporal resolution, allowing a characterization of trace elements in Andean aerosol trapped in the ice during the 20th century. The upper 50 m of the ice core were dated by multi-proxy analysis of stable isotopes (δ18O and δ2H), 137Cs and Ca2+ content, electrical conductivity, and insoluble microparticle content, together with reference historical horizons from atmospheric nuclear tests and known volcanic eruptions. This 50 m section corresponds to a record of environmental variations spanning about 80 years, from 1919 to 1999. It was cut into 744 sub-samples under laminar flow in a clean bench, which were analyzed by ion chromatography for major ionic concentration, by a particle counter for insoluble aerosol content, and by Inductively Coupled Plasma Mass Spectrometry (ICP-MS) for the concentration of 45 chemical species from Li to U. This paper focuses on results of trace element concentrations measured by ICP-MS. The high temporal resolution used in the analyses allowed classifying samples as belonging to dry or wet seasons. During the wet season elemental concentrations are low and samples show high crustal enrichment factors. During dry seasons the situation is the opposite, with high elemental concentrations and low crustal enrichments. For example, with salt lakes as the main sources in the region, the average Li concentration during the 20th century is 0.035 and 0.90 ng g-1 for wet and dry seasons, respectively. Illimani average seasonal concentration ranges cover the spectrum of elemental concentration measurements at another Andean ice core site (Sajama) for most soil-related elements. Regional crustal dust load in the deposits was found to be overwhelming during the dry season, obscuring the contribution of biomass burning material. Marked temporal trends from the onset of the 20th century to more recent years were
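
For reference, crustal enrichment factors of the kind quoted above are conventionally computed by normalizing to a reference crustal element; assuming Al as the reference (a common choice, not stated in this record), the form is

```latex
\[
  \mathrm{EF}_X \;=\;
  \frac{\left( C_X / C_{\mathrm{Al}} \right)_{\text{sample}}}
       {\left( C_X / C_{\mathrm{Al}} \right)_{\text{crust}}},
\]
```

where EF values near 1 indicate a purely crustal origin and values much greater than 1 indicate enrichment from non-crustal sources such as biomass burning, volcanism, or anthropogenic emissions.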

Sakurajima volcano is a post-caldera volcano of the Aira caldera and has repeatedly produced large plinian eruptions, separated by dormant periods, in AD 1471, AD 1779 and AD 1914. After the AD 1914 eruption, a medium-scale lava effusion occurred in AD 1946. Since AD 1955, frequent vulcanian eruptions have continued until now. Thus, the mode of eruptive activity of the volcano has changed since the 20th century. Based on the temporal change of petrological features of these eruptive materials, we discuss the relationship between the mode of eruptive activity and the magma system, in order to forecast future eruptive activity. The rocks of the AD 1471 and AD 1779 eruptions are CPX-OPX dacite, in which normally and reversely zoned pyroxene and plagioclase phenocrysts coexist. In addition, the compositional distribution of plagioclase phenocrysts is bimodal. These features suggest that these rocks are mixing products of dacitic and andesitic magmas, which is consistent with the compositional variations of the whole-rock chemistry for these rocks. On the other hand, the rocks of the AD 1914 and AD 1946 eruptions often contain olivine phenocrysts. Plagioclase and pyroxene phenocrysts in these rocks show features similar to those of the AD 1471 and AD 1779 eruptions, suggesting that these rocks are also mixing products of two end-member magmas, dacitic and andesitic ones. However, the olivine phenocrysts are much more magnesian than the pyroxene phenocrysts, indicating that these olivine phenocrysts are derived from another, basaltic magma. Thus, basaltic magma was injected into the magma formed by mixing of the dacitic and andesitic ones. Mixing among three magmas has been recognized since the 20th century. The rocks from the frequent eruptions since AD 1955 also contain a minor amount of olivine phenocrysts, suggesting that the injection of basaltic magma has continued. In the 1970s and in AD 1987, relatively larger-scale vulcanian eruptions occurred. The rocks from these periods contain a considerable amount of olivine phenocrysts, indicating that the mixing ratio of the

Our study reports computer software that simulates the work of a single glycine receptor (GlyR). GlyRs have been found in various types of tissues, but their most important role seems to be in neurons, where they hyperpolarise membranes by opening chloride transmembrane channels. The software is based on a combination of two blocks. One block describes the Brownian dynamics of charged particle motion in a dielectric medium, and the other block determines the probability and timing of receptor activation. Using this software, the voltage-current dependencies and time curves of the transmembrane current were obtained. The mean value of the simulated anion current (4.5 ± 0.3 pA) is in good agreement with values measured under identical conditions ([Formula: see text] pA). It was shown that there is a condition under which the GlyR anion channel remains active despite a negligible chloride gradient. Virtual experiments allow evaluation of the half maximal effective concentration (EC50) of the GlyR ([Formula: see text] [Formula: see text]M) and confirm that this receptor activates according to a mechanism involving three ligand binding sites. The advantage of the model is the ability to adjust parameters to the precise demands of experimental researchers. Moreover, the introduced algorithm has low computational power demands; therefore, it can be used as a research tool for assistance with structural experiments and applied aspects of neurophysiology. PMID:27412156
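
The three-binding-site activation mentioned above is commonly summarized by a Hill relation. Below is a minimal sketch (our illustration, not the paper's two-block Brownian-dynamics simulator) with an assumed EC50, since the fitted value does not survive in this record:

```python
import numpy as np

# Hill-type activation probability for a receptor with three ligand
# binding sites (Hill coefficient n = 3); ec50_um is a free parameter here.
def open_probability(glycine_um, ec50_um, n_sites=3):
    return glycine_um ** n_sites / (ec50_um ** n_sites + glycine_um ** n_sites)

conc = np.logspace(-1, 3, 9)                     # 0.1 to 1000 uM
for c, p in zip(conc, open_probability(conc, ec50_um=100.0)):
    print(f"{c:8.1f} uM -> P(open) = {p:.3f}")   # P = 0.5 exactly at EC50
```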

Between the end of the 19th century and the beginning of the 20th, numerous asbestos industries began operations in various parts of the world. At the time of the First World War there is ample evidence of the use of this mineral in shipbuilding, the aircraft industry and the construction industry. In the years 1912-17 the writer Franz Kafka was co-proprietor of a small asbestos factory in Prague. Some of the writer's novels and journal pages were inspired by this experience. In this way asbestos entered the history of 20th century European literature. In 1917 asbestos extraction was started at the quarry in Balangero, near Turin, Italy. Risks related to the use of asbestos were known at the beginning of the 20th century, and legislation aimed at preventing the harmful effects of the mineral was approved in Italy. PMID:26621063

Malaria case statistics were analysed for the period 1926 to 1960 to identify inter-annual variations in malaria cases for the Uganda Protectorate. The analysis shows the mid-to-late 1930s to be a period of increased reported cases. After World War II, malaria cases trended down to a relative minimum in the early 1950s, before increasing rapidly after 1953 to the end of the decade. Data for the Western Province confirm these national trends, which at the time were attributed to a wide range of causes, including land development and management schemes, population mobility, interventions and misdiagnosis. Climate was occasionally proposed as a contributor to enhanced case numbers, and unusual precipitation patterns were held responsible; temperature was rarely, if ever, considered. In this study, a dynamical malaria model was driven with available precipitation and temperature data from the period for five stations located across a range of environments in Uganda. In line with the historical data, the simulations produced relatively enhanced transmission in the 1930s, although there is considerable variability between locations. In all locations, malaria transmission was low in the late 1940s and early 1950s, steeply increasing after 1954. Results indicate that past climate variability explains some of the variations in numbers of reported malaria cases. The impact of multiannual variability in temperature, while only on the order of 0.5°C, was sufficient to drive some of the trends observed in the statistics, and thus the role of climate was likely underestimated in the contemporary reports. As the elimination campaigns of the 1960s followed this partly climate-driven increase in malaria, this emphasises the need to account for climate when planning and evaluating intervention strategies. PMID:27063740

Climate change is a multi-faceted (or 'wicked') problem. True climate literacy therefore requires understanding not only the workings of the climate system, but also the current and potential future impacts of climate change and sea level rise on individuals, communities and countries around the world, as noted in the US Global Change Research Program's (2009) Climate Literacy: The Essential Principles of Climate Sciences. The asymmetric nature of climate change impacts, whereby the world's poorest countries have done the least to cause the problem but will suffer disproportionate consequences, has also been widely noted. Education in climate literacy therefore requires an element of ethics in addition to the physical and social sciences. As if addressing these multiple aspects of climate change were not challenging enough, polling data have repeatedly shown that many members of the public tend to see climate change as a faraway problem affecting people remote from them at a point in the future, but not themselves. This perspective is likely shared by many students. Computer gaming provides a possible solution to the combined problems of, on the one hand, addressing the multi-faceted nature of climate change and, on the other hand, making the issue real to students. Fate of the World, a game produced by the company Red Redemption, has been used on several occasions in a small (20-30 students) introductory-level general education course on global warming at Weber State University. Players are required to balance difficult decisions about energy investment while managing regional political disputes and attempting to maintain minimum levels of development in the world's poorer countries. By providing a realistic "total immersion" experience, the game has the potential to make climate change issues more immediate to players, and presents them with the ethical dilemmas inherent in climate change. This presentation reports on the use of Fate of the World in an educational

History is made up of individual events. Modern ocean wind wave research has been active for nearly 70 years, since the early 1940s, when World War II was still being fought in earnest and Sverdrup and Munk embarked on an unprecedented attempt to predict wave conditions for Navy amphibious forces carrying out landing operations. That was certainly a monumental event that started modern ocean wind wave research. Here I wish to present a set of other monumental events in the intervening years which, in my personal view, are vital to the formation of our present-day conventional ocean wind wave research: • Circa 1945: The wartime invention of the underwater pressure wave gage that measures pressure fluctuations induced by surface waves, which also marked the start of the single-point wave measurements prevalent today. • Circa 1950: When oceanographer Pierson met statistician Tukey, and ocean wave spectrum analysis was thereby born. • Circa 1952: Something old, something new - Longuet-Higgins introduced the distribution function of Lord Rayleigh to the emerging ocean wave data analysis, and the Rayleigh distribution has been a mainstay of ocean wind wave research ever since. • Circa 1953: Neumann started the quest to formulate a wind wave spectrum with his impressive first empirical spectrum, before spectra were widely measured. • Circa 1957: Phillips worked out the resonance theory for wind wave generation. • Circa 1957: Miles simultaneously developed the shear flow model for wind wave generation, complementary to Phillips' theory. • Circa 1959: Hasselmann formulated the source function to start the first framework of comprehensive wind wave modeling. These are all the basic innovative milestones from which the bulk of conventional ocean wind wave research today evolved. While the monumental status of these works may represent merely the personal opinion of a single aficionado, I do feel that they

In 2008, there were more than 5,000 motorcycle crash fatalities in the United States. Many states have motorcycle helmet laws that are meant to protect riders during a crash. After motorcycle occupants injured in crashes were recruited, a protocol was established to scan three different types of commonly worn motorcycle helmets (cap, ¾ shield, and full face shield) using a computed tomography (CT) scanner. The protocol was developed for a GE 64-slice PET/CT Discovery VCT scanner, with axial images from the anterior to the posterior of the helmet acquired in helical mode. It used a 512x512 resolution, and the full face and ¾ face shield helmets were scanned with more voxels in the axial plane than the skull cap helmets. New helmets were scanned as exemplary images for comparison with helmets involved in motorcycle crashes. After the CT scans were gathered, three-dimensional reconstructions were made to visualize scratches and impacts on the exterior of the helmets. Initial work was also conducted in analyzing interior components, and a trend of decreased thickness between the interior foam and shell was seen on the sides of the helmet exterior thought to have contacted roadside barriers or the ground during motorcycle crashes. These helmet analysis methods have been established and will be used to investigate multiple motorcycle crashes in conjunction with occupant injuries and direct head impacts, to improve helmet design and the understanding of head injuries. This work also establishes the basis for development of finite element models of three of the most common helmet types. PMID:21525626

Advances in observational systems and reductions in their cost are allowing us to explore, monitor, and digitally represent our environment in unprecedented detail and over large areas. Low-cost in situ sensors, unmanned autonomous vehicles, imaging technologies, and other new observational approaches, along with airborne and spaceborne systems, are allowing us to measure nearly everything, almost everywhere, and at almost all times. Under the aegis of observatories, they are enabling an integrated view across space and time scales ranging from storms to seasons to years and, in some cases, decades. The rapid convergence of computational, communication and information systems, and their interoperability through advances in technologies such as the semantic web, provides opportunities to further facilitate the fusion and synthesis of heterogeneous measurements with knowledge systems. This integration can enable us to break disciplinary boundaries and bring sensor data directly to desktop or handheld devices. We describe a cyberinfrastructure effort that is being developed through projects such as Earthcube Geosemantics (http://geosemantics.hydrocomplexity.net), SEAD (http://sead-data.net/), and Browndog (http://browndog.ncsa.illinois.edu/) so that data across all of earth science can be easily shared and integrated with models. This also includes efforts to enable models to become interoperable among themselves and with data, using technologies that enable human-out-of-the-loop integration. Through such technologies, our ability to use real-time information for decision-making and scientific investigations will increase multifold. Data go through a sequence of steps, often iterative, from collection to long-term preservation. Similarly, a scientific investigation and its associated outcomes are composed of a number of iterative steps, from problem identification to solutions. However, the integration between these two pathways is rather limited. We describe

Extra-tropical cyclones and wind storms are responsible for a large portion of damages all around the globe. Thus, knowledge about the temporal variability of these events during the past is of high socio-economic importance. In this study, the temporal variability of extra-tropical cyclones and wind storms during the past century is analysed using the ERA-20C and NOAA-20CR reanalysis datasets. Cyclones are identified using six-hourly mean sea level pressure fields, whereas wind storms are identified based on near-surface wind speeds. Analyses focus on wintertime events over both hemispheres and also on several subregions. Long-term trends as well as higher-frequency variability are investigated. To this end, the cyclone and wind storm time series are low-pass filtered with a cut-off frequency of 1/31 years using 31 weights, and the high-frequency time series are obtained as the residual of the original and low-pass filtered time series. To analyse long-term trends, a linear regression model is fitted to the original time series for three different periods: 1901-1930, 1931-1960 and 1961-1999. Results suggest substantial differences regarding long-term trends between ERA-20C and NOAA-20CR for cyclones and wind storm events, especially during the first half of the 20th century. In general, better agreement is found for extreme cyclones than for all cyclones. Furthermore, high-frequency variability over the Northern Hemisphere is in good agreement for cyclones and wind storms over most regions and throughout the century, with the highest correlations found at the end of the 20th century. Analyses for the Southern Hemisphere show smaller agreement between ERA-20C and NOAA-20CR for cyclones and wind storms regarding their high-frequency variability. The results of this study indicate that no reliable conclusion regarding long-term variability of cyclones and wind storms can be drawn based solely on these two 20th century reanalysis products. However, analyses
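
As a toy version of the low-/high-frequency split described above (an equal-weight 31-point running mean stands in for the study's unspecified 31-weight filter; the annual cyclone counts are placeholders):

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1901, 2000)
counts = 100 + rng.normal(0, 10, years.size)        # placeholder wintertime counts

window = 31
kernel = np.ones(window) / window                   # 31 equal weights
lowpass = np.convolve(counts, kernel, mode="same")  # edge effects ignored here
highfreq = counts - lowpass                         # residual = high-frequency part

# Linear trend over one sub-period, e.g. 1901-1930:
mask = (years >= 1901) & (years <= 1930)
print(f"1901-1930 trend: {np.polyfit(years[mask], counts[mask], 1)[0]:+.3f} per yr")
```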

Here we have used the Self-Calibrated PDSI (scPDSI) proposed by Wells et al (2004) as a more appropriate approach to characterize drought conditions in the Mediterranean area. The scPDSI has been shown to perform better (than the original PDSI) when evaluating spatial and temporal drought characteristics for regions outside the USA (Schrier et al, 2005). Seasonal and annual trends for the 1901-2000, 1901-1950 and 1951-2000 periods were computed using the standard Mann-Kendall test for trend significance evaluation. However, statistical significance obtained with this test can be highly misleading because it does not take into account the strong autocorrelation that dominates the seasonal evolution of scPDSI fields. We have now improved these results by employing a modified Mann-Kendall test for auto-correlated series (Hamed and Ramachandra, 1997), such as the scPDSI. This development allowed for a better definition of the Mediterranean areas characterized by significant changes in the scPDSI, namely the largely negative trends that dominate the Mediterranean basin, with the exceptions of parts of eastern Turkey and northwestern Iberia, areas that were initially overestimated. The spatio-temporal variability of these indices was evaluated with an EOF analysis, in order to reduce the large dimensionality of the fields under analysis. Spatial representation of the first EOF patterns shows that EOF1 covers the entire Mediterranean basin (16.4% of explained variance), while EOF2 is dominated by a W-E dipole (10%). The following EOF patterns present smaller-scale features and explain smaller amounts of variance. The EOF patterns have also facilitated the definition of four sub-regions with large socio-economic relevance: 1) Iberia, 2) the Italian Peninsula, 3) the Balkans and 4) Turkey. Afterwards, we perform a comprehensive analysis of the links between the scPDSI and the large-scale atmospheric circulation indices that affect the Mediterranean basin, namely NAO, EA, and SCAND
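
For concreteness, a minimal version of the classical Mann-Kendall statistic is sketched below (our illustration; the modified test cited above additionally corrects the variance term for serial correlation, which is omitted here):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no tie or autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))            # statistic, two-sided p-value

rng = np.random.default_rng(4)
series = -0.01 * np.arange(100) + rng.normal(0, 0.5, 100)  # drying trend + noise
print(mann_kendall(series))
```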

Historical topographic maps are the only systematically collected data resource covering the entire nation that supports long-term landscape change studies over the 20th century in geographical and environmental research. The paper discusses aspects of the historical U.S. Geological Survey topographic maps that present constraints on the design of a database for such studies. Problems involved in this approach include locating the required maps, understanding land feature classification differences between topographic and land use/land cover maps, estimating the error between different map editions of the same area, and identifying true changes on the landscape between time periods. Suggested approaches to these issues are illustrated using an example of such a study by the author.

The absorption curves of the cosmic ray charged component for the solar minima in 1965 and 1975 to 1977 are analyzed on the basis of daily stratospheric measurements in Murmansk, Moscow, Alma-Ata and Mirny (Antarctica). Two distinct features in the energy spectra of galactic cosmic rays are revealed during these periods. At the 20th solar activity minimum there was an additional short-range component of cosmic rays. The additional fluxes in the stratosphere at high latitudes caused by this component are probably protons and He nuclei with energies of 100 to 500 MeV/n. The fluxes are estimated as approximately 300 m-2 s-1 sr-1. At the minimum in 1975 to 1977 the proton intensity in the energy range 1 to 15 GeV is 10 to 15% lower than that in the 1965 solar activity minimum.

The National Society of Black Physicists will hold its Twentieth annual meeting and its XXIIII Day of Scientific Lectures at Lawrence Berkeley National Laboratory on March 27-30, 1997. The meeting provides a major opportunity for African American physicists and students to present their current research and discuss issues germane to the constituency. It is therefore crucial to have the broadest cross-section of the membership at each meeting. The Lawrence Berkeley National Laboratory was chosen as the site of the 20th annual meeting because of its historical significance to physics (being one of the first national laboratories in the United States) and the laboratory's continuing support of the goals and objectives of the society.

A technical symposium, aircraft display dedication, and pilots' panel discussion were held on May 27, 1992, to commemorate the 20th anniversary of the first flights of the F-8 Digital Fly-By-Wire (DFBW) and Supercritical Wing (SCW) research aircraft. The symposium featured technical presentations by former key government and industry participants in the advocacy, design, aircraft modification, and flight research program activities. The DFBW and SCW technical contributions are cited. A dedication ceremony marked the permanent display of both program aircraft. The panel discussion participants included eight of the eighteen research and test pilots who flew these experimental aircraft. Pilots' remarks include descriptions of their most memorable flight experiences. The report also includes a survey of the Gulf Air War and an after-dinner presentation by noted aerospace author and historian Dr. Richard Hallion.

The principal features of the thermodynamic regime of the stratosphere are governed by the development of the winter stratospheric low and the Aleutian and Atlantic highs. These are fed by the influx of eddy energy transported into the stratosphere by planetary waves. The intensity and variability of the planetary waves, and of the vortices associated with them, determine the conditions of low-to-high latitude ozone transport in the winter hemisphere. The ozone distributions are zonally inhomogeneous. The planetary wave dynamics are affected by solar activity variations during solar cycles. The 20th solar cycle maximum was accompanied by decreases of stratospheric planetary wave amplitudes, whereas the 21st cycle was accompanied by increases.

Geodetic leveling observations from Biloxi, MS, to New Orleans, LA, and water level gauge measurements in the New Orleans-Lake Pontchartrain area were analyzed to infer late 20th century vertical motions. These data were used to test the validity of previous subsidence rate measurements and the models that predict the location and causes of subsidence. Water gauges attached to bridge foundations and benchmarks affixed to deep rods that penetrate Holocene strata subsided as much as 0.8 m locally between 1955 and 1995. The observed deep-seated subsidence far exceeds model predictions and demonstrates that shallow processes such as compaction and consolidation of Holocene sediments are inadequate by themselves to explain late 20th century subsidence. Deep-seated subsidence occurring east and north of the normal faults marking the Gulf of Mexico basin margin can be explained by local groundwater withdrawal, and regional tectonic loading of the lithosphere by the modern Mississippi River delta (MRD). Sharp changes in subsidence coincide with strands of the basin margin normal faults. Displacements are consistent with activity and show motions consonant with fault creep. Deep subsidence of the region to the south, including New Orleans, can be explained by a combination of groundwater withdrawal from shallow upper Pleistocene aquifers, the aforementioned lithospheric loading, and perhaps, nongroundwater-related faulting. Subsidence due to groundwater extraction from aquifers ~160 to 200 m deep dominated urbanized areas and is likely responsible for helping to lower local flood protection structures and bridges by as much as ~0.8 m.

The surface solar radiation (SSR) is the fundamental source of energy in the climate system, and consequently the source of life on our planet, due to its central role in the surface energy balance. Therefore, a significant impact on temperatures is expected due to the widespread dimming/brightening phenomenon observed since the second half of the 20th century (Wild, 2009). Previous studies pointed out the effects of SSR trends on temperature series over Europe (Makowski et al., 2009; Philipona et al., 2009), although the lack of long-term SSR series limits these results. This work describes an updated sunshine duration (SS) dataset compiled by the European Climate Assessment and Dataset (ECA&D) project based on around 300 daily time series over Europe covering the 1961-2010 period. The relationship between the SS and temperature series is analysed based on four temperature variables: maximum (TX), minimum (TN) and mean temperature (TG), as well as the diurnal temperature range (DTR). Regional and pan-European mean series of SS and temperatures are constructed. The analyses are performed on annual and seasonal scales, focusing on the interannual and decadal agreement between the variables. The results show strong positive correlations on interannual scales between SS and temperatures over Europe, especially for the DTR and TX during the summer period and for regions in Central Europe. Interestingly, the SS and temperature series show a tendency towards higher correlations in the smoothed series, both for different regions and temperature variables. These results confirm the relationship between temperature and SS trends over Europe since the second half of the 20th century, which has been speculated to partially decrease (increase) temperatures during the dimming (brightening) period (Makowski et al., 2009; Wild, 2009). Further research is needed to confirm this cause-effect relationship, currently found only using correlation analysis.

Summertime anomalous high temperatures, leading to enormous damages to public health and society, have been increasing continuously since the late 20th century. However, the rise of global surface air temperature (SAT) has apparently slowed in the last 15 years. It is not clear why hot summers have become more frequent despite the recent global-warming hiatus. Here we show, using ensembles of an atmospheric general circulation model (AGCM), that the continuous increase of hot summers over the Northern Hemisphere (NH) was largely due to the direct effect of radiative forcing by the increasing concentration of greenhouse gases (GHGs) and to natural decadal climate variability. The 10-member AGCM ensemble, when forced by 63-year histories of observed sea surface temperature (SST), aerosols, land cover and GHGs, reproduced the increasing hot summers well, but another ensemble without changes in anthropogenic forcings and ocean surface warming still showed a small increase. Decadal SST variations in the Pacific and Atlantic Oceans contributed to the recent increase of hot summers over North America through atmospheric teleconnections. Analyses of the two ensembles and an additional one show that local radiative heating over land due to increasing GHGs contributed 39% of the continuous increase over the mid- and high-latitude land areas. The contribution of the direct effect of anthropogenic forcing is independent of the global SST hiatus, suggesting a continuous increase in heat extremes over land due to this effect even if the climate hiatus persists for the coming decades. Reference: Kamae, Y., H. Shiogama, M. Watanabe, and M. Kimoto (2014), Attributing the increase in Northern Hemisphere hot summers since the late 20th century. Geophys. Res. Lett., doi:10.1002/2014GL061062.

In this article, the predictability of the 20th century sea-surface temperature (SST) forced East African short rains variability is analyzed using observational data and ensembles of long atmospheric general circulation model (AGCM) simulations. To our knowledge, such an analysis for the whole 20th century using a series of AGCM ensemble simulations is carried out here for the first time. The physical mechanisms that govern the influence of SST on East African short rains in the model are also investigated. It is found that there is substantial skill in reproducing the East African short rains variability, given that the SSTs are known. Consistent with previous recent studies, it is found that the Indian Ocean, and in particular the western pole of the Indian Ocean dipole (IOD), plays a dominant role for the prediction skill, whereas SSTs outside the Indian Ocean play a minor role. The physical mechanism for the influence of the western Indian Ocean on East African rainfall in the model is consistent with previous findings and consists of a Gill-type response to a warm (cold) anomaly that induces a westerly (easterly) low-level flow anomaly over equatorial Africa and leads to moisture flux convergence (divergence) over East Africa. On the other hand, a positive El Niño-Southern Oscillation (ENSO) anomaly leads to a spatially non-coherent reducing effect over parts of East Africa, but the relationship is not strong enough to provide any predictive skill in our model. The East African short rains prediction skill is also analyzed within a model-derived potential predictability framework, and it is shown that the actual prediction skill is broadly consistent with the model potential prediction skill. Low-frequency variations of the prediction skill are mostly related to SSTs outside the Indian Ocean region and are likely due to an increased interference of ENSO with the Indian Ocean influence on East African short rains after the mid-1970s climate shift.

The paper describes and analyzes original data, extracted from historical documents and scientific surveys, related to Russian fisheries in the southeastern part of the Gulf of Finland and its inflowing rivers during the 15th to early 20th centuries. The data allow tracing key trends in fisheries development and in the abundance of major commercial species. In particular, results showed that, over time, the main fishing areas moved from the middle parts of rivers downstream towards and onto the coastal sea. Changes in fishing patterns were closely interrelated with changes in the abundance of exploited fish. Anadromous species, such as Atlantic sturgeon, Atlantic salmon, brown trout, whitefish, vimba bream, smelt, lamprey, and the catadromous eel, were the most important commercial fish in the area because they were abundant, had high commercial value and were easily available for fishing in rivers. Due to intensive exploitation and other human-induced factors, populations of most of these species had declined notably by the early 20th century and have now lost commercial significance. The last sturgeon was caught in 1996, and today only smelt and lamprey support small commercial fisheries. According to historical sources, catches of freshwater species such as roach, ide, pike, perch, ruffe and burbot regularly occurred, in some areas exceeding half of the total catch, but they were not as important as migrating fish, and no clear trends in their abundance are apparent. As for the documented marine catch, Baltic herring appeared in the 16th century but did not become commercially significant until the 19th century. From then until now, herring have been the dominant catch. PMID:24204735

In the Tuscan Apennine Alps, recent research has shown that similarity in trends of monthly climate variables (i.e., temperature and rainfall) is non-stationary amongst sites during the 20th century, even between sites that differ little in elevation and lie at a relatively short distance from each other (D'Aprile et al., 2010; D'Aprile et al., 2011). Moreover, the level of correlation between series of monthly climate variables varies irregularly from highly positive to negative over time. We hypothesised that these changing climate conditions, even at the local level, could cause different tree-ring growth responses in silver fir amongst sites. The hypothesis was tested by dendroclimatological analysis, applied to study stands in silver fir forests close to the meteorological stations where the climate analysis had been made. Results show that the influences of both monthly mean temperature and monthly rainfall on silver fir growth vary greatly during the 20th century in the Tuscan Apennine Alps, and the ways that they change differ with month and amongst sites. Within sites, differences in the relationships between climate variables and silver fir tree-ring growth appear small in spite of the different elevations of the study stands. These results mark a turning point in forest planning and management, especially in consideration of the need to adapt forest management and interventions to changing climate conditions and to mitigate the impacts on silver fir forests. Moreover, they introduce climate variability as a key parameter in sustainable forest management for biodiversity conservation, socially responsible uses, nature conservation, and the survival of the only conifer tree species typical of mountain mixed forest ecosystems in the Apennine Alps.

ABSTRACT Animal viruses frequently cause zoonotic disease in humans. As these viruses are highly diverse, evaluating the threat that they pose remains a major challenge, and efficient approaches are needed to rapidly predict virus-host compatibility. Here, we develop a combined computational and experimental approach to assess the compatibility of New World arenaviruses, endemic in rodents, with the host TfR1 entry receptors of different potential new host species. Using signatures of positive selection, we identify a small motif on rodent TfR1 that conveys species specificity to the entry of viruses into cells. However, we show that mutations in this region affect the entry of each arenavirus differently. For example, a human single nucleotide polymorphism (SNP) in this region, L212V, makes human TfR1 a weaker receptor for one arenavirus, Machupo virus, but a stronger receptor for two other arenaviruses, Junin and Sabia viruses. Collectively, these findings set the stage for potential evolutionary trade-offs, where natural selection for resistance to one virus may make humans or rodents susceptible to other arenavirus species. Given the complexity of this host-virus interplay, we propose a computational method to predict these interactions, based on homology modeling and computational docking of the virus-receptor protein-protein interaction. We demonstrate the utility of this model for Machupo virus, for which a suitable cocrystal structural template exists. Our model effectively predicts whether the TfR1 receptors of different species will be functional receptors for Machupo virus entry. Approaches such as this could provide a first step toward computationally predicting the “host jumping” potential of a virus into a new host species. IMPORTANCE We demonstrate how evolutionary trade-offs may exist in the dynamic evolutionary interplay between viruses and their hosts, where natural selection for resistance to one virus could make humans or rodents susceptible

For most of the 20th century, high school was enough for a shot at middle-class status and wages. Eventually high school was surpassed by postsecondary education as the preferred route to professional and managerial jobs in the post-World War II era. This article describes how today, young workers rarely go anywhere in the American job market…

Upon rejecting traditional world history curricula, which include all chronological developments from cave dweller to 20th century, a course is suggested to study significant patterns of government, economics, revolution, war, major religions, geography, scientific advancement, education, and rise and fall of civilizations. (Author/AV)

Use of symbolic figures in the college level world history course can provide a vehicle for studying social organization, political movements, and economic institutions of a given period. For example, Zhou Enlai, an activist and leader throughout much of the 20th century, symbolizes the major themes and forces of this era: change,…

The Apollo 11 Mission, which culminated in the first manned lunar landing on July 20, 1969, is recounted. Historical footage of preparation, takeoff, stage separation, the Eagle Lunar Lander, and the moon walk accompanies astronauts Michael Collins, Buzz Aldrin, and Neil Armstrong as they give their recollections of the mission.

The two largest river basins in South America are the Amazon Basin (AMB), in the tropical region, and the La Plata Basin (LPB), in the subtropical and extratropical regions. Extreme droughts have occurred during this decade in the Amazonia region, affecting transportation and fishing activities, with impacts on the local population, and also affecting the forest. Droughts or floods over the LPB have impacts on agriculture, hydroelectric power and social life. Therefore, monthly wet and dry extremes in these two regions have a profound effect on the economy and society. Observed rainfall over the Amazon Basin (AMB) and La Plata Basin (LPB) is analyzed on a monthly timescale using the Standardized Precipitation Index (SPI), from 1979 to 1999. This period is taken to compare GPCP data with HADCM3 simulations (Hadley Centre) of the 20th century and to analyze reanalysis data, which have the contribution of satellite information after 1979. HADCM3 projections using the SRES A2 scenario are analyzed in two periods, 2000 to 2020 and 2079 to 2099, to study the frequency of extremes in the near future and on a longer timescale. Extreme, severe and moderate cases are identified in the northern and southern sectors of the LPB and in the western and eastern sectors of the AMB. The main objective is to analyze changes in the frequency of cases, considering global warming and the associated mechanisms. In the observations for the 20th century, the number of extreme rainy cases is higher than the number of dry cases in both sectors of the LPB and the AMB. The model simulates this variability in the two sectors of the LPB and in the western sector of the AMB. In the near future, 2000 to 2020, the frequency of wet and dry extremes does not change much in the LPB and in the western sector of the AMB, but wet cases increase in the eastern AMB. However, in the period 2079 to 2099 the projections indicate an increase of wet cases in the LPB and an increase of dry cases in the AMB. The influence of large-scale features related to Sea Surface Temperature
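
A minimal sketch of how an SPI value can be computed for one calendar month (simplified: maximum-likelihood gamma fit with no zero-precipitation correction; all data are placeholders):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Map monthly precipitation totals to SPI via a gamma fit."""
    shape, _, scale = stats.gamma.fit(precip, floc=0)   # location fixed at zero
    cdf = stats.gamma.cdf(precip, shape, loc=0, scale=scale)
    return stats.norm.ppf(cdf)                          # standard-normal quantiles

rng = np.random.default_rng(5)
january_totals = rng.gamma(2.0, 50.0, 21)   # e.g. 1979-1999 January totals, mm
print(np.round(spi(january_totals), 2))     # values <= -2 flag extreme drought
```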

Tropical montane forests act as water catchments and hosts of biodiversity in the Southeast Asian monsoon region, and understanding how their hydrological conditions change with global warming is vitally important. Global climate model simulations project an enhanced moisture cycle in the tropics, which would cause stronger summer monsoon precipitation; on the other hand, the adiabatic lapse rate would be shifted towards a moister condition (amplification of warming at high elevation), inhibiting dry-season orographic cloud/fog formation (lifting cloud base hypothesis), enhancing evapotranspiration, and leading to a net moisture loss during the winter dry season. In this study, we have attempted to investigate how the seasonal moisture balance in Southeast Asia has evolved in response to these influences through the 20th century using the oxygen isotopic composition (δ18O) of subannual tree cellulose samples extracted from the annual rings of pine trees that grow on Doi Chiang Dao, a limestone mountain in northern Thailand. At this location the δ18O of cellulose exhibits distinctive annual cycles of up to 12‰, which is primarily a reflection of both the so-called ‘isotope amount effect’ associated with the strong monsoon precipitation during the summer wet season and the moisture availability from different sources during the winter dry season. We have demonstrated that tree cellulose δ18O can be used as a proxy for regional monsoon strength by showing that the annual mean cellulose δ18O correlates significantly with All India Rainfall, the Webster-Yang monsoon index, and both local and regional monsoon precipitation. ENSO is the dominant influence on interannual rainfall variability, and this is well expressed in the interannual cellulose δ18O record. Using a 21-year moving window correlation analysis we find a weakening of the ENSO influence after 1980, coinciding with the most rapid atmospheric warming. We expect to analyze older trees to
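
The 21-year moving-window correlation used to detect the post-1980 weakening of the ENSO influence is straightforward to sketch; the window length comes from the abstract, while the input series themselves are assumed:

```python
import pandas as pd

def moving_corr(proxy, enso_index, window=21):
    """Centred moving-window correlation of two annual series.

    proxy      : annual cellulose d18O values (pandas Series, year-indexed)
    enso_index : an annual ENSO index aligned to the same years
    """
    return proxy.rolling(window, center=True).corr(enso_index)

# A dip in the returned series after 1980 would reproduce the reported
# weakening of the ENSO-d18O relationship.
```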

David Deutsch (Oxford University) is known for ``Deutsch's algorithm"---for going beyond the initial ideas about quantum computing (QC) of Benioff, Bennett and Feynman to describe a quantum Turing machine, and sparking today's widespread experimental research to actually build one. Deutsch does not accept the standard Copenhagen Interpretation (CI) of quantum mechanics (QM); he supports the Many Worlds Interpretation (MWI) and credits it for his insights. In his book, The Fabric of Reality, and numerous articles and talks, he argues that the adoption of the MWI is necessary for making the advances required for quantum computing. He argues that physicists must resolutely take a ``realistic" viewpoint to its logical conclusions, and that the MWI is the realistic theory. In response to ubiquitous assertions by the majority of physicists that ``both systems give the same numbers" and ``all physics does (or can do) is to predict the outcome of experiments," he argues strenuously for the importance of explanations in quantum physics, and to scientific progress in general. Hence, I argue that there are two, reasonably separable, layers here: (i) opposition to positivist and instrumentalist arguments against the validity and/or value of any explanation as such, and (ii) an argument about just what is the correct explanation: the MWI over the CI. While establishing the validity of (i) may possibly undermine CI's spirit, nevertheless (i) can be strongly validated independently of complications arising from an overlap with issues of the interpretation of QM. I develop some simple, historical contradictions regarding point (i) (without passing judgement on Deutsch's MWI or involving the CI). For example, the majority viewpoint identifies its seemingly ``non-philosophical" and non-``metaphysical" mindset as archetypically ``scientific", while seemingly quite unaware of the theoretical difficulties with positivist notions of truth, such as the fact that the foundations

As the world enters the last decade of the 20th century, the environmental problems facing human societies have been given increased emphasis in world policy debates. The goal of the Worldwatch Institute is to help raise the level of public understanding of global environmental threats to the point where the public will support policies needed to…

This junior high school level student manual contains three units on the role of women in world history. The units, designed to supplement what is customarily taught in world history courses at this level, are entitled Women Under Feudalism: Western Europe and China, Women in the Industrial Revolution, and Women in Change: 20th Century Women in…

For many years the term nephritis was used to indicate renal diseases (in the sense of Bright's disease) in a broad sense. This review summarizes the history of the concept of glomerulonephritis from Egyptian medicine up to the post-biopsy era, particularly in Turin and in Italy. The study reports an epidemiological survey of Bright's disease in Italy from 1880 up to 1960. Towards the end of the 19th century Bright's disease accounted for 26 deaths/year/10^5 population (in comparison with more than 200 from tuberculosis) in Italy. At the beginning of the 20th century, Bright's disease was the seventh cause of death in Italy. Moreover, in Italy autopsy studies showed a higher percentage of deaths attributed to Bright's disease (5-7%) in comparison with those obtained from vital statistics. In 1960, just before the beginning of renal replacement therapy, Bright's disease accounted for 15.7 deaths/year/10^5 population. It was probably difficult to determine the real incidence of chronic renal diseases leading to death in the 1960s, and vital statistics could furnish only approximate estimates. Noteworthy, however, is the fact that these values were very close to those estimated as the annual need for renal replacement therapy (10-20/year/10^5 population). PMID:11346720

In 1498, the Portuguese crossed the Cape of Good Hope. It was not until 1633 and 1666, the dates of the founding, respectively, of the Compagnie de l'Orient and the Compagnie des Indes orientales, that the way was definitively opened for trade between France and India. Because so many sailors developed scurvy on voyages that lasted 4 to 5 months, the French settled on Bourbon Island (Réunion) and Ile de France (Mauritius) to provide them with medical care. Created in 1689 by Louis XIV, the Navy Health Service was responsible for health in the colonies until it was replaced in 1890 by the Colonial Health Service. European medicine began its slow diffusion around the Indian Ocean in Pondicherry (India). The naval doctors reported their experiences in the Archives de médecine navale (1864-1889), and the colonial doctors afterwards in the Archives de médecine navale et coloniale (1890-1896). The health system in Madagascar developed strongly during the 19th and 20th centuries, and the subsequent development of health care in the other Indian Ocean islands became closely linked to that of Madagascar. On Bourbon, the two navy hospitals in Saint-Paul and Saint-Denis treated only naval and military personnel. The colony had no hospital providing care for civilians and poor people until three civilian doctors opened a maison de santé (health house) in 1846. PMID:27412971

Hydrocarbon contaminants are ubiquitous in urban aquatic ecosystems, and the ability of some microbial strains to degrade certain polycyclic aromatic hydrocarbons (PAHs) is well established. However, detrimental effects of petroleum hydrocarbon contamination on nondegrader microbial populations and photosynthetic organisms have not often been considered. In the current study, fatty acid methyl ester (FAME) biomarkers in the sediment record were used to assess historical impacts of petroleum contamination on microbial and/or algal biomass in South San Francisco Bay, CA, USA. Profiles of saturated, branched, and monounsaturated fatty acids had similar concentrations and patterns downcore. Total PAHs in a sediment core were on average more than 20× higher above ∼200 cm than below, which corresponds roughly to the year 1900. Isomer ratios were consistent with a predominant petroleum combustion source for PAHs. Several individual PAHs exceeded sediment quality screening values. Negative correlations between petroleum contaminants and microbial and algal biomarkers - along with high trans/cis ratios of unsaturated FA, and principal component analysis of the PAH and fatty acid records - suggest negative impacts of petroleum contamination, appearing early in the 20th century, on microbial and/or algal ecology at the site. PMID:25303655

The Marine Biological Laboratory (MBL) in Woods Hole, MA, provided opportunities for women to conduct research in the late 19th and early 20th century, at a time when many barriers existed to their pursuit of a scientific career. One woman who benefited from the welcoming environment at the MBL was Mary Jane Hogue. Her remarkable career as an experimental biologist spanned over 55 years. Hogue was born into a Quaker family in 1883 and received her undergraduate degree from Goucher College. She went to Germany to obtain an advanced degree, and her research at the University of Würzburg with Theodor Boveri resulted in her Ph.D. (1909). Although her research interests included experimental embryology and the use of tissue culture to study a variety of cell types, she is considered foremost a protozoologist. Her extraordinary demonstration of chromidia (multiple fission) in the life history of a new species of Flabellula associated with diseased oyster beds is as important as it is ignored. We discuss Hogue's career path and her science to highlight the importance of an informal network of teachers, research advisors, and other women scientists at the MBL, all of whom contributed to her success as a woman scientist. PMID:25103622

We use human-skeleton samples to estimate the height of adults living in Anatolia during the Neolithic period. We also report the results of surveys taken in the 20th century on the height of the Turkish population. Neolithic and Chalcolithic (5000-3000 B.C.) male heights are estimated as 170.9 cm and 165.0 cm, respectively. Pronounced increases were observed for both sexes between the Chalcolithic and Iron (1000-580 B.C.) periods, and sharp decreases among both males and females in the Hellenistic-Roman period (333 B.C. to 395 A.D.). Recovery to Iron Age levels was achieved in the Anatolian Medieval period (395-1453 A.D.) for both sexes (169.4 cm for males and 158.0 cm for females). In 1884 the mean height of men was 162.2 cm, and by the beginning of the 1930s it had increased to 166.3 cm. In the first nationwide survey, in 1937, the mean height of males was 165.3 cm and that of females 152.3 cm; today the corresponding heights are 174.0 cm and 158.9 cm, respectively. PMID:21316315

Stratospheric ozone depletion in the Antarctic has profoundly influenced both stratospheric and tropospheric climate, and models lacking stratospheric ozone depletion poorly reproduce past Southern Hemisphere (SH) circulation change. Several studies have shown evidence that prescribing zonal-mean ozone in models leads to underestimation of SH change, and that the zonal asymmetry in ozone depletion has a large impact on the zonal-mean climate response. In this presentation we will show simulations from the NCAR Whole Atmosphere Community Climate Model (WACCM) forced with prescribed ozone fields, revisiting the issue of whether it is possible to accurately reproduce late 20th century SH circulation change. We will demonstrate an enhanced SH circulation response in simulations with prescribed zonal-mean ozone from a new historical database that more accurately captures the depth of ozone depletion than the databases previously used in the SPARC and CMIP5 assessments. We then compare this response from a 2D prescribed ozone field to a simulation using a prescribed 3D ozone field, and assess how well these simulations compare to observed changes seen in the Microwave Sounding Unit satellite temperature datasets and reanalysis.

The solar energetic particle event on January 20th, 2005 was one of the largest ground level events ever observed. Neutron Monitor stations in the Antarctic recorded energetic particle increases of several thousand percent, and it took more than 36 h for count rates to return to background level. Such a huge increase in high-energy solar cosmic radiation at ground level is obviously accompanied by considerable changes in the radiation environment at aviation altitudes. Measurements from 28 Neutron Monitor stations were used in this work to numerically approximate the primary solar proton spectra during the first 12 h of the event by minimizing the differences between measurements and Monte-Carlo-calculated count rate increases. The primary spectrum of solar energetic protons was approximated by a power law in rigidity and a linear angular distribution. The incoming direction of the solar energetic particles was determined and compared to the interplanetary magnetic field direction during the event. The resulting effects on ion pair production during that time will be presented.
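
The spectral fit amounts to a least-squares inversion: choose the power-law parameters that minimize the mismatch between observed count-rate increases and forward-modelled ones. A toy sketch under stated assumptions (the station rigidities, responses and the simplistic forward model below are illustrative; the actual analysis uses Monte-Carlo yield functions and includes the angular-distribution term):

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative station effective rigidities (GV) and observed increases (%).
station_rigidity = np.array([1.0, 1.5, 2.0, 3.0, 4.5])
observed_increase = np.array([3200.0, 1100.0, 520.0, 150.0, 48.0])

def forward(params):
    """Toy forward model: increase proportional to J(R) = J0 * R**(-gamma)."""
    j0, gamma = params
    return j0 * station_rigidity ** (-gamma)

fit = least_squares(lambda p: forward(p) - observed_increase, x0=[3000.0, 3.0])
j0_hat, gamma_hat = fit.x
print(f"J0 ~ {j0_hat:.0f}, spectral index gamma ~ {gamma_hat:.2f}")
```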

On insanity, life crises and the longing for a "right life": a contribution to the discussion of deviant behavior and mental disorders in the psychiatry of the 19th and 20th centuries, using the example of patient stories. The history of psychiatry, understood as social and cultural history, provides the framework for this micro-historical article. Using the example of three patients treated in Wuerttemberg or Baden psychiatric asylums between 1875 and 1912, the article focuses on a critical analysis of the types of asylums, their admission practices, therapies, and the power relations between patients and staff. Ways of thinking and acting, subjective experiences and emotions are exemplified by patient records, personal testimonials and contemporary publications by patients and staff. The article examines the options patients had to influence the daily institutional routine of the asylum against the background of its complexity and dynamics. Boundaries, manipulations, malingering and querulous paranoia are at stake here. Furthermore, the article reflects on various forms of social interaction with power, regulating therapeutic and disciplinary aspects, against the backdrop of the asylum's "canons of rules" as well as the contemporary political and legal framework. PMID:27501548

Ocelots (Leopardus pardalis) in the United States currently exhibit low levels of genetic diversity. One hypothesis for this observation is that habitat fragmentation, resulting from human induced changes in the landscape during the 20th century, created island populations with highly reduced gene flow and increased genetic drift and inbreeding. In an effort to investigate this, we used a portion of the mitochondrial control region and 11 autosomal microsatellite loci to examine historical levels of genetic diversity and infer temporal changes in ocelot populations between 1853 and 2005. Levels of genetic diversity were higher in historical ocelot populations than in extant populations from Texas. The earliest documented loss of mitochondrial haplotype diversity occurred at Laguna Atascosa National Wildlife Refuge. The second extant population inhabiting private lands in Willacy County retained higher levels of genetic diversity through the 1990s, but subsequently lost diversity over the next decade. A similar pattern was observed for autosomal microsatellite loci. This supports the argument that low levels of genetic diversity in Texas are related to human induced population reductions and fragmentation, both of which threaten the remaining ocelots in the United States. At this time, the best means of mitigating the continued erosion of genetic variation are translocation of individuals either from larger populations in Mexico to Texas, or between the Texas populations. PMID:24586737

Height is regarded as one of the indicators of environmental stress at population level, being an excellent barometer of standard of living. The aim of this study was to describe diversity in height among populations living in different regions of the Kingdom of Poland in terms of the economic factors in the second half of the 19th and early 20th century. This study examines the height of adult inhabitants from five guberniyas (provinces) of the Kingdom of Poland (Łomża, Warsaw, Radom, Kalisz and Płock) collected in the years 1897-1914 (N = 732 men, N = 569 women). Differences in average height of male and female inhabitants across the five guberniyas were examined using ANOVA and the Fisher's LSD (Least Significant Difference) test of multiple comparisons. Statistically significant differences in the height between the guberniyas were observed. Diversity in the economic development in the studied guberniyas of the Kingdom of Poland translated into differences in the height of their inhabitants. Moreover, an increase in mean height over time was noted. PMID:24041152

The warming associated with changes in snow cover in northern high-latitude terrestrial regions represents an important energy feedback to the climate system. Here, we simulate snow cover-climate feedbacks (i.e. the effect of changes in snow cover on atmospheric heating) across the Pan-arctic over two distinct warming periods during the 20th century, 1910-1940 and 1970-2000. We offer evidence that increases in snow cover-climate feedbacks during 1970-2000 were nearly three times larger than during 1910-1940 because the recent snow-cover change occurred in spring, when the radiation load is highest, rather than in autumn. Based on linear regression analysis, we also detected a greater sensitivity of snow cover-climate feedbacks to temperature trends during the more recent time period. Pan-arctic vegetation types differed substantially in snow cover-climate feedbacks. Those with a high seasonal contrast in albedo, such as tundra, showed much larger changes in atmospheric heating than did those with a low seasonal contrast in albedo, such as forests, even if the changes in snow-cover duration were similar across the vegetation types. These changes in energy exchange warrant careful consideration in studies of climate change, particularly with respect to associated shifts in vegetation between forests, grasslands, and tundra. © 2007 Blackwell Publishing Ltd.

Several contemporary cross-national and intranational geographic studies have reported positive ecological (group-level) associations between intelligence and suicide mortality. These findings are consistent with facts from suicide research and with an evolutionary view of suicidal behavior. The present research extended these accounts cross-temporally. Analysis of E. L. Thorndike's state-level personal quality scores and standardized birth rates of eminent persons, taken as proxy variables for regional intelligence, along with historical state suicide rates (1913-1924 and 1928-1932), showed that intelligence and suicide mortality across the U.S.A. were already clearly positively related during the early 20th century, suggesting time stability of the effect. Within the U.S.A., the effect is possibly due to state differences in ethnic composition, which correspond to both suicide rates and intelligence proxies. It is argued that the most parsimonious interpretation of these ecological findings remains that they indeed reflect individual-level effects, that a positive link between intelligence and suicide is entirely compatible with positive overall links between intelligence and health and longevity, and that the ultimate explanatory background for the positive link between intelligence and suicide may be provided by the framework of Rushton's differential K theory. PMID:18065061

America experienced a genuinely vast development of biomedical science in the early decades of the twentieth century, which in turn impacted the community of academic psychiatry and changed the way in which clinical and basic research approaches in psychiatry were conceptualized. This development was largely based on the restructuring of research universities in both the USA and Canada following the influential report of Johns Hopkins-trained science administrator and politician Abraham Flexner (1866–1959). Flexner's report, written on commission for the Carnegie Foundation for the Advancement of Teaching in Washington, DC, also had a major influence on complementary and alternative medicine (CAM) in psychiatry throughout the 20th century. This paper explores the lasting impact of Flexner's published report on modern medicine, and particularly on what he interpreted as the various forms of health care and psychiatric treatment that appeared to compete with the paradigm of biomedicine. We particularly draw attention to the serious effects of the closing of so many CAM-oriented hospitals, colleges, and medical teaching programs following the publication of the Flexner Report in 1910. PMID:23346209

Various human activities - including agriculture, water consumption, river damming, and aquaculture - have intensified over the last century. This has had a major impact on nitrogen (N) and phosphorus (P) cycling in global continental waters. In this study, we use a coupled nutrient-input-hydrology-in-stream nutrient retention model to quantitatively track the changes in the global freshwater N and P cycles over the 20th century. Our results suggest that, during this period, the global nutrient delivery to streams increased from 34 to 64 Tg N yr-1 and from 5 to 9 Tg P yr-1. Furthermore, in-stream retention and removal grew from 14 to 27 Tg N yr-1 and from 3 to 5 Tg P yr-1. One of the major causes of increased retention is the growing number of reservoirs, which now account for 24% and 22% of global N and P retention/removal in freshwater systems, respectively. This increase in nutrient retention could not balance the increase in nutrient delivery to rivers, with the consequence that river nutrient transport to the ocean increased from 19 to 37 Tg N yr-1 and from 2 to 4 Tg P yr-1. Human activities have also led to a global increase in the molar N:P ratio in freshwater bodies.
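
As a check on internal consistency, the reported year-2000 fluxes close a simple steady-state budget, export to the ocean = delivery to streams - in-stream retention/removal:

```latex
\mathrm{Export} = \mathrm{Delivery} - \mathrm{Retention}:\qquad
64 - 27 = 37\ \mathrm{Tg\,N\,yr^{-1}},\qquad
9 - 5 = 4\ \mathrm{Tg\,P\,yr^{-1}}
```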

The 122 papers in this collection were presented in 15 sessions of the 20th annual convention of the Association for Educational Data Systems which was held in Orlando, Florida, May 10-14, 1982. Individual papers covered a wide variety of topics, including computer assisted instruction, computer managed instruction, computer literacy,…

predictors of discharge (α < 0.05). The 1930s Dust Bowl drought was one of the most severe droughts in the past 300 years; during 1934-1935, average August discharge was reduced by 25-40% with respect to the anomalously wet early 20th century pluvial. Discharge estimates using reconstructed PDSI values for the 2- and 10-year MCA droughts (PDSI = -6 and -5, respectively) indicate that 60% of stream reaches where beaver were active in the late Holocene became ephemeral in these droughts. This analysis is supported by observations during the extreme drought of the 2000s, when ephemeral flow occurred along streams with known historical beaver activity in northern Yellowstone. Model predictions indicate that by 2030-2039 the GYE will endure persistent severe drought (mean annual PDSI = -4 to -6) (Dai, 2011); thus riparian area is likely to decrease in the coming decades. The early 20th century has been suggested as an ideal reference for riparian habitat restoration, despite its anomalously wet conditions, unlike the current or likely future climate. Future efforts to restore riparian habitat by reducing elk browsing and increasing beaver damming will be hampered by reduced flows on small streams.

Sand landscapes occupy about half of the territory of the Netherlands. Apart from an insignificant amount of Tertiary deposits, these sands are of Pleistocene and Holocene age. They include Saalian push moraines, Weichselian cover sands and Holocene drift sands. To these geological landscapes, cultural variants should be added, such as the essen, i.e. a landscape with plaggen soils, and reclaimed lands (e.g. former moors). Not included are the coastal sands, which we dealt with in an earlier EGU contribution (van den Ancker & Jungerius 2012). Nature and man created a wide variety of sceneries that inspired painters in the 19th and early 20th century (Jungerius et al. 2012). Painter communities on the sandy soils flourished in Oosterbeek/Wolfheze, Laren/Blaricum, Nijkerk, Nunspeet/Elspeet, Hattem and Heeze. Many of the landscape paintings are found in the database of Simonis en Buunk, which can be freely consulted online (http://www.simonis&buunk.com). For this presentation we selected specimens that show geodiversity-biodiversity relationships, some of which have changed since. Painters of push moraines were attracted by the rolling terrain, the dry valleys and occasionally the colourful podzol soil profiles. Popular themes in the cover sands were the undulating relief and heathlands with herds of sheep, sandy footpaths and country roads with erosion phenomena. The dynamics of erosion captivated the painters of the Holocene drift sand scenery, as did the bare fields of cultivated lands. Their paintings show the rural areas that since the beginning of the 20th century have lost their traditional charm in large-scale re-allotment schemes and artificial nature-building projects that changed geodiversity-biodiversity relationships. Changes in the sandy terrains that can be inferred from the paintings are on the landscape scale and the scale of the landform and vegetation type, and are illustrated by changes in colour, pattern, structure and texture. Examples are: · active drift sands

This paper studies how physics affected the development of optometry in the United States, from the perspectives of the formation and academization of optometry. It also shows that the history of optometry is analogous to the history of engineering. Optics in the 19th century was divided into the electromagnetic study of light and visual optics. The development of visual optics promoted the professionalization of ophthalmology, which had already started in the 18th century. Visual optics also stimulated the formation of optometry and of an optometrist body in the late 19th century United States. The American optometrist body originated from opticians who had studied visual optics. The publication of several English-language academic textbooks on visual optics led to the appearance of educated opticians (and jewelers). They acquired the right to perform eye examinations in the early 20th century, after C. F. Prentice's trial in 1897, evolving into optometrists. The opticians can be considered craftsmen, and they were divided into (dispensing) opticians and optometrists. This history of the American optometrist body is analogous to that of the engineering profession, in terms of craftsman origins and separation from craftsmen. Engineers also originated from educated craftsmen, but were separated from craftsmen when engineering was established. The education system and academization of optometry were strongly influenced by physics, too. When college education in optometry started at American universities, it belonged not to the medical school but to the physics department. Physics and optics were of great importance in the curriculum, and the early faculty members were mostly physicists. Optometry was academized in the 1920s through college education, standardization of the curriculum, and the formation of the American Academy of Optometry. This is also analogous to the history of engineering, which was academized by the natural sciences, especially mathematics and physics. The reason why optometry was academized not by

The motion of the Earth's pole in space has been observed with great accuracy for the last 115 years. The angular variations of the pole position away from its mean are well explained at annual and 434-day periods. Variations at annual periods are caused by changes in mass and angular momentum forced by all Earth surface changes that have near-seasonal character. The 434-day period is explained as a resonance between the cumulative driving forces with periods near the Chandler wobble free eigenmode of the Earth and is well understood theoretically. The Earth also has a longer-term drift that is explained primarily as a response to the ice age changes in the moments of inertia of the Earth. However, there has been a long-standing search for the origins of pole variations with a period near 10 years. Using GRACE space gravimetry we show that ice mass losses from Greenland and Antarctica, when combined with changes in continental hydrology, explain almost all the main features of interannual-timescale polar wander. The discovery has broad interdisciplinary implications, as we show that decadal-scale pole variations are directly linked to global changes in continental water. The energy sources for these pole position changes are, therefore, likely to be associated with decadal-scale ocean and atmospheric oscillations that also drove the 20th century on-land wet-dry oscillations observed at decadal scale across the globe. Variability in pole position therefore offers a tool for assessing the past stability of our climate, and for the future, now faced with an increased intensity in the water cycle and greater vulnerability to ice sheet instability.

To understand the strengths and limitations of a low-resolution version of the Flexible Global Ocean-Atmosphere-Land-Sea-ice model (FGOALS-gl) in simulating the climate of the last millennium, the energy balance, climate sensitivity and absorption feedback of the model are analyzed. The simulation of last-millennium climate was carried out by driving the model with natural (solar radiation and volcanic eruptions) and anthropogenic (greenhouse gases and aerosols) forcing agents. The model feedback factors for (i.e., the model's sensitivity to) different forcings were calculated. The results show that the system feedback factor is about 2.5 (W m-2) K-1 in the pre-industrial period, but 1.9 (W m-2) K-1 in the industrial era. Thus, the model's sensitivity to natural forcing is weak, which explains why it reproduces a weak Medieval Warm Period. The relatively reasonable simulation of the Little Ice Age is caused by both the specified radiative forcing and an unforced linear cold drift. The model sensitivity in the industrial era is higher than in the pre-industrial period. A negative net cloud radiative feedback operates throughout the millennial simulation and reduces the model's sensitivity to specified forcing. The negative net cloud radiative feedback under natural forcing in the period prior to 1850 is due to the underestimation (overestimation) of the response of cloudiness (in-cloud water path). In the industrial era, the strong tropospheric temperature response enlarges the effective radius of ice clouds and reduces the fractional ice content within clouds, resulting in a weak negative net cloud feedback in the industrial period. The water vapor feedback in the industrial era is also stronger than in the pre-industrial period. Both favor higher model sensitivity and thus a reasonable simulation of the 20th century global warming.
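
For orientation, a feedback factor λ converts a forcing into an equilibrium temperature response via ΔT_eq = ΔF/λ, so a smaller λ means a larger response. With the values reported above and an illustrative unit forcing of 1 W m-2 (the forcing magnitude is an assumption, not from the abstract):

```latex
\Delta T_{\mathrm{eq}} = \frac{\Delta F}{\lambda}:\qquad
\frac{1\ \mathrm{W\,m^{-2}}}{2.5\ \mathrm{W\,m^{-2}\,K^{-1}}} \approx 0.40\ \mathrm{K}
\ \text{(pre-industrial)},\qquad
\frac{1\ \mathrm{W\,m^{-2}}}{1.9\ \mathrm{W\,m^{-2}\,K^{-1}}} \approx 0.53\ \mathrm{K}
\ \text{(industrial era)}
```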

The position and strength of the Northern Hemisphere polar jet are important modulators of mid-latitude weather extremes and the societal, ecosystem, and economic damage related to them. The position of the North Atlantic jet (NAJ) controls the location of the Atlantic storm track and anomalies in the NAJ position have been related to temperature and precipitation extremes over Europe. In summer, a southern NAJ regime can result in floods in the British Isles (BRIT) and increasing odds of heat waves in the northeastern Mediterranean (NEMED). Variability in the amplitude and speed of the Northern Hemisphere jet stream is hotly debated as a potential mechanism linking recent mid-latitude weather extremes to anthropogenic warming. However, the hypothesis of jet stream variability as a possible mechanism linking Arctic amplification to mid-latitude weather extremes is largely based on data sets with limited temporal extent that do not warrant robust results from a statistical significance perspective. Here, we combined two summer temperature-sensitive tree-ring records from BRIT and NEMED to reconstruct interannual variability in the latitudinal position of the summer NAJ back to 1725. The two well-replicated temperature proxies counter-correlate significantly over the full period and thus illustrate the temperature dipole generated by anomalous NAJ positions. Positive extremes in the NAJ reconstruction correspond to heatwaves recorded in the historical Central England temperature record and negative extremes correspond to reconstructed fire years in Greece. The reconstruction shows a northward shift in the latitudinal NAJ position since the 1930s that is most pronounced in the northern NAJ extremes, suggesting a more frequent occurrence of BRIT hot summers in the 20th century compared to previous centuries.
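
The dipole logic of the reconstruction can be sketched compactly: a northward summer NAJ warms the British Isles and cools the northeastern Mediterranean, so the standardized difference of the two temperature proxies tracks jet latitude. A minimal sketch (the scaling to degrees latitude, which would require calibration against reanalysis, is omitted):

```python
import numpy as np

def jet_position_index(brit_proxy, nemed_proxy):
    """Dimensionless NAJ-latitude index from two counter-correlated proxies.

    brit_proxy, nemed_proxy : annual summer-temperature reconstructions for
    the British Isles and NE Mediterranean (arrays over the same years).
    Positive values indicate an anomalously northward jet.
    """
    z = lambda x: (x - np.mean(x)) / np.std(x)     # standardize each proxy
    return (z(np.asarray(brit_proxy)) - z(np.asarray(nemed_proxy))) / 2.0
```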

Soot black carbon (here expressed as GBC) is present in sediments of Central Park and Prospect Park Lakes, New York City (NYC), and peaks in the middle of the 20th century at the highest values (1-3% dry weight) ever reported in urban lakes. During that period (~1940-1970), the GBC represents up to 28% of the total organic carbon (OC). Radionuclide-normalized whole-core inventories of accumulated GBC are similar in the two lakes, which are separated by ~15 km, suggesting that emissions of fine soot particles may have accumulated homogeneously over at least the urban center of NYC. The distribution of polycyclic aromatic hydrocarbons (PAHs) in the sediments is decoupled from that of GBC. The highest levels of total PAHs correspond to peak coal use for space heating in NYC in the early 1900s. In contrast, GBC concentrations were highest in the mid 1900s, a period when oil combustion dominated local fossil fuel use and incineration of municipal solid waste (MSW) was common practice in NYC. Decreases in GBC levels observed in more recently deposited sediments are consistent with improvements in particle emissions control systems. Non-soot BC (char) was identified by a high carbon to nitrogen (C/N) ratio that persisted after correction for GBC. This likely tracer of MSW incineration was estimated to contribute an additional ~35% of the total organic carbon found in the sediments deposited during the peak period of combustion. The temporal trends of soot BC observed in our lake cores do not agree with published historical reconstructions based on fuel consumption and estimated emission factors.

Proximity to the coast and elevation are important geographical considerations for human settlement. Little is known, however, about how spatial variation in these factors exactly relates to human settlements and activities, and how this has developed over time. Such knowledge is important for identifying vulnerable regions that are at risk from phenomena such as food shortages and water stress. Human activities are a key driving force in global change, and thus detailed information on population distribution is an important input to any research framework on global change. In this paper we assess the global geospatial patterns of the distribution of human population and related factors, with regard to altitude above sea level and proximity to the coast. The investigated factors are physical conditions, urbanisation, agricultural practices, economy, and environmental stress. An important novel element in this study is that we included the temporal evolution in various factors related to human settlements and agricultural practices over the 20th century, and used projections for some of these factors up to the year 2050. We found population pressure in the proximity of the coast to be somewhat greater than was found in other studies. Yet the distribution of population, urbanisation and wealth is evolving to become more evenly spread across the globe than it was in the past. Therefore, the commonly believed tendency of accumulation of people and wealth along coasts is not supported by our results. At the same time, food production is becoming increasingly decoupled from the trends in population density. Croplands are spreading from highly populated coastal zones towards inland zones. Our results thus indicate that even though people and wealth continue to accumulate in proximity to the coast, population densities and economic productivity are becoming less diverse in relation to elevation and distance from the coast.

Objective: To investigate the functional adaptive process of the fetal autonomic nervous system during hypnosis from the 20th week of gestation until term. Are there changes in the power spectrum of fetal heart rate when the mother is undergoing clinical hypnosis versus a control period? Study Design: Forty-nine FHR recordings were analysed. Included recordings were from singleton pregnancies monitored with abdominal fetal ECG. All women were randomised to receive clinical hypnosis followed by a period with no intervention, or vice versa. Statistical analyses were performed with the Wilcoxon signed ranks and Spearman rho correlation tests. Results: A significant difference was found between fetal heart rate at baseline (144.3 ± 6.0) and during hypnosis (142.1 ± 6.4). A difference was also detected in the standard deviation of the heart rate between baseline (6.7 ± 1.9) and hypnosis (6.8 ± 3.5). LFnu was smaller during baseline (80.2 ± 5.3) than during hypnosis (82.1 ± 5.7), whereas HFnu was significantly larger (19.8 ± 5.3 vs. 17.9 ± 5.7). There was no correlation between gestational age and the change in LFnu, HFnu or the LF/HF ratio due to the hypnosis intervention. Conclusion: The functional adaptive process of the fetal autonomic system during hypnosis is reflected by a sympathovagal shift towards increased sympathetic modulation. PMID:25284838
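
The LFnu/HFnu quantities reported above come from power spectral analysis of the beat-to-beat heart rate series. A minimal sketch of one common recipe (a Welch periodogram on an evenly resampled tachogram; the band limits are the conventional HRV bands, assumed here since the abstract does not state them):

```python
import numpy as np
from scipy.signal import welch

def lf_hf(rr_s, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.40)):
    """Normalized LF/HF power from an RR-interval series (seconds)."""
    t = np.cumsum(rr_s)                            # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)        # even 4 Hz grid
    rr_even = np.interp(grid, t, rr_s)             # resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs,
                   nperseg=min(256, len(rr_even)))
    band = lambda lo, hi: np.trapz(pxx[(f >= lo) & (f < hi)],
                                   f[(f >= lo) & (f < hi)])
    p_lf, p_hf = band(*lf), band(*hf)
    return 100 * p_lf / (p_lf + p_hf), 100 * p_hf / (p_lf + p_hf), p_lf / p_hf
```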

Organic matter in general, and humic substances (HS) in particular, are involved in many processes in soils, sediments, rocks and natural waters. These include rock weathering, plant nutrition, pH buffering, trace metal mobility and toxicity, bioavailability, degradation and transport of hydrophobic organic chemicals, formation of disinfection by-products during water treatment, heterotrophic production in blackwater ecosystems and, more generally, the global carbon cycle. Before the 1970s, natural organic matter of different ecosystem pools (i.e., soils, sediments, and natural waters) was often studied in isolation, although many similarities exist between them. This is particularly so for HS. In this historical context, a need appeared at the international level to bring together environmental chemists, soil scientists, hydrologists, and geologists who were interested in HS, to provide a forum for the exchange of ideas, to standardize analytical procedures and to agree on definitions of HS. The International Humic Substances Society (IHSS) was founded in Denver, Colorado (USA) in 1981 with several objectives, among them “to bring together scientists in the coal, soil, and water sciences with interests in humic substances” (home page of the IHSS web site: http://ihss.gatech.edu/ihss2/index.html). This paper presents selected pioneering works on humus in soils and sediments during the 20th century, with a special focus on the links between the studies of soil HS and the formation, during early diagenesis, of the precursors of kerogens. Temporal coverage includes key contributions preceding the founding of the IHSS, and a brief history of the organization is presented.

Terrestrial vegetation currently absorbs approximately a third of anthropogenic CO2 emissions, mitigating the rise of atmospheric CO2. However, terrestrial net primary production is highly sensitive to atmospheric CO2 levels and associated climatic changes. In C3 plants, which dominate terrestrial vegetation, net photosynthesis depends on the ratio between photorespiration and gross photosynthesis. This metabolic flux ratio depends strongly on CO2 levels, but changes in this ratio over the past CO2 rise have not been analyzed experimentally. Combining CO2 manipulation experiments and deuterium NMR, we first establish that the intramolecular deuterium distribution (deuterium isotopomers) of photosynthetic C3 glucose contains a signal of the photorespiration/photosynthesis ratio. By tracing this isotopomer signal in herbarium samples of natural C3 vascular plant species, crops, and a Sphagnum moss species, we detect a consistent reduction in the photorespiration/photosynthesis ratio in response to the ∼100-ppm CO2 increase between ∼1900 and 2013. No difference was detected in the isotopomer trends between beet sugar samples covering the 20th century and CO2 manipulation experiments, suggesting that photosynthetic metabolism in sugar beet has not acclimated to increasing CO2 over >100 y. This provides observational evidence that the reduction of the photorespiration/photosynthesis ratio was ca. 25%. The Sphagnum results are consistent with the observed positive correlations between peat accumulation rates and photosynthetic rates over the Northern Hemisphere. Our results establish that isotopomers of plant archives contain metabolic information covering centuries. Our data provide direct quantitative information on the “CO2 fertilization” effect over decades, thus addressing a major uncertainty in Earth system models. PMID:26644588

This study investigates seasonality of marriages and reproductive isolation in six long-isolated communities in the central Apennines (Italy). It had two objectives: (1) the identification of an Apennine biodemographic model, in comparison with mountain communities of other regions and with non-Apennine communities in Abruzzo, and (2) the identification of the possible effects of the drainage of Lake Fucino (1854-1876) on that area. Marriages in this region show two very stable seasonal patterns: one is typical of sedentary rural societies, with summer migrations and marriages preferentially celebrated in the winter; the other has marriages strongly concentrated in the summer months, i.e. between 75% and 93.5% of marriages were celebrated between June and October in these communities in the 1800s. The latter were traditionally pastoral communities with winter transhumance of the flocks and their shepherds towards the lowlands of southern Italy. In both groups, restrictions imposed by the Catholic Church do not seem to have affected the timing of marriages. Indeed, economic factors related to work activities seem to have had more influence. Concerning reproductive isolation, the results show high rates of endogamy: between 85% and 98% in both the 19th and 20th centuries. Rates of consanguineous marriages were between 5% and 20%, and those of isonymous marriages rarely exceeded 9%. The coefficient of inbreeding α shows that there was a delayed, limited period of increased consanguinity in the few decades around the turn of the century. This is different from the national situation, and thus could be a consequence of the Lake Fucino drainage. PMID:11446403

An array of centennial-scale ice core records of hydrogen peroxide (H2O2) was recently developed using shallow cores drilled at 24 different locations across the West Antarctic Ice Sheet (WAIS). H2O2 is a major atmospheric oxidant that is closely linked to chemical feedback mechanisms controlling the composition of the atmosphere. Ice core records of H2O2 offer the potential to reconstruct past changes in the oxidation capacity of the atmosphere if the processes controlling deposition and long-term preservation are quantitatively understood. Comparison of the 1900-50 and 1950-2000 time periods shows, in all cores, increases of >40% in mean H2O2 during the latter half of the 20th century. The atmospheric concentration, the seasonal timing and rate of snow accumulation, and the site temperature largely determine the amount of H2O2 preserved in an ice core. Sensitivities of the long-term H2O2 record to changes in annual accumulation and temperature, quantified with a semi-empirical deposition model, suggest that interannual variability in H2O2 is dominated by the accumulation signal under the current WAIS temperature regime. However, the observed trends can only be explained in part by changes in accumulation rate and timing. Recent field and model experiments in West Antarctica showed a negative correlation between stratospheric ozone and summer levels of atmospheric H2O2. Using the NASA Goddard Space Flight Center (GSFC) point photochemical model, the magnitude of the atmospheric H2O2 enhancement due to changes in surface UV radiation over the past decades was estimated and compared to the H2O2 residual not accounted for by the deposition model. We suggest that part of the observed H2O2 increase in the core record is due to the occurrence of the springtime ozone hole since the 1970s.

Climate change impact assessment has become one of the most important subjects for the research community because of the recent increase in the frequency of extreme events and changes in the spatiotemporal patterns of climate. This paper analyses the ability of 46 coupled climate models from the Coupled Model Intercomparison Project phases 3 and 5 (CMIP3 and CMIP5). The performance of each climate model was assessed based on its skill in simulating the current seasonal cycles (monthly) of both maximum and minimum temperature (Tmax, Tmin) over India. Performance measures such as the coefficient of correlation (Skill_r), the root mean square error (Skill_rmse), and the skill in simulating the observed probability density function (Skill_s) are employed for evaluation of the simulated monthly seasonal cycle. A new metric called Skill_All, which is the intersection of the above three metrics, is defined for the first time. A notable enhancement of Skill_All for CMIP5 vis-a-vis CMIP3 is observed. Further, the three best CMIP5 models each for Tmax and Tmin were selected. The methodology employed in this study for model assessment is implemented for the first time for India, establishing a robust foundation for climate impact assessment studies. The seasonal trends in Tmax and Tmin were analyzed over all the temperature-homogeneous regions of India for different time slots during the 20th century. Significant trends in Tmin can be seen during most seasons over the entire Indian region during the last four decades. This establishes the signature of climate change over most parts of India.
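
A sketch of the three skills and their intersection, under stated assumptions (the abstract names the metrics but not their exact formulas; in particular, the RMSE-to-skill normalisation, the PDF-overlap construction and the thresholds below are assumptions):

```python
import numpy as np

def skill_scores(model_cycle, obs_cycle, bins=10):
    """Skill of a simulated 12-month seasonal cycle against observations."""
    m, o = np.asarray(model_cycle), np.asarray(obs_cycle)
    skill_r = np.corrcoef(m, o)[0, 1]              # Skill_r: correlation
    rmse = np.sqrt(np.mean((m - o) ** 2))
    skill_rmse = 1.0 / (1.0 + rmse)                # Skill_rmse: assumed mapping
    edges = np.histogram_bin_edges(np.concatenate([m, o]), bins=bins)
    pm = np.histogram(m, bins=edges)[0] / m.size   # model PDF
    po = np.histogram(o, bins=edges)[0] / o.size   # observed PDF
    skill_s = np.minimum(pm, po).sum()             # Skill_s: PDF overlap
    return skill_r, skill_rmse, skill_s

def skill_all(scores, thresholds=(0.8, 0.5, 0.7)):
    """Skill_All as intersection: all three metrics clear their thresholds."""
    return all(s >= t for s, t in zip(scores, thresholds))
```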

The atmospheric evolution of eight non-methane hydrocarbons (ethane, acetylene, propane, n-butane, isobutane, n-pentane, isopentane and benzene) and five alkyl nitrates (2-propyl, 2-butyl, 3-methyl-2-butyl and the sum of 2+3-pentyl nitrates) is reconstructed for the latter half of the 20th century based on Arctic firn air measurements. The reconstructed trends of the non-methane hydrocarbons (NMHCs) show increasing concentrations from 1950 to a maximum in 1980 before declining towards the end of the last century. These observations provide direct evidence that NMHCs in the northern hemisphere declined substantially during the period 1980-2001. Benzene concentrations show a smaller increase between 1950 and 1980 than the other NMHCs, indicating that additional sources of benzene, other than fossil fuel combustion, were likely important contributors to the benzene budget prior to and during this period. The declining benzene concentrations from 1980 to 2001 suggest that biomass burning is unlikely to be important in the benzene budget, as biomass burning emissions were reportedly increasing over the same period. Methyl and ethyl nitrate show growth patterns in the firn that suggest perturbation by in-situ production from an unidentified mechanism. However, the higher alkyl nitrates show evidence of increasing concentrations from 1950 to maxima in the mid 1990s before decreasing slightly toward the end of the last century. The differing atmospheric evolution of the alkyl nitrates relative to their parent hydrocarbons indicates an increase in their production efficiency per hydrocarbon molecule. Using a steady-state analysis of hydrocarbon oxidation and alkyl nitrate production and loss, we show that reactive nitrogen oxide (NOx) concentrations in the northern hemisphere likely increased considerably between 1950 and 2001.

The 20th National Society for the Provision of Education in Rural Australia (SPERA) and Western Australia District High School Administrators' Association (WADHSAA) joint conference proceedings, based on the theme "Working Together, Staying Vital," was held in Fremantle, Perth, Western Australia, in June 2004. The proceedings contain 13 keynote…

The aim of this contribution is the evaluation of volumetric and surface variations of the Brenva Glacier (Mont Blanc, Italian Alps) during the second half of the 20th century, by GIS-based processing of maps and aerial photogrammetry. The Brenva Glacier is a typical debris-covered glacier, located in a valley on the SE side of Mont Blanc. The glacier covers a surface of 7 km2 and has a maximum length of 7.6 km. The glacier snout reaches 1415 m a.s.l., the lowest glacier terminus in the Italian Alps. To evaluate glacier variations, different historical maps were used: 1) the 1959 map, at the scale 1:5,000, by EIRA (Ente Italiano Rilievi Aerofotogrammetrici, Firenze), from a terrestrial photogrammetric survey, published in the Bollettino del Comitato Glaciologico Italiano, 2, n. 19, 1971; 2) the 1971 map, at the scale 1:5,000, from aerial photogrammetry (Alifoto, Torino), published in the Bollettino del Comitato Glaciologico Italiano, 2, n. 20, 1972; 3) the 1988 map, at the scale 1:10,000 (Region Aosta Valley, Regional Technical Map), from a 1983 aerial photogrammetric survey; 4) the 1999 map, at the scale 1:10,000 (Region Aosta Valley, Regional Technical Map), from a 1991 aerial photogrammetric survey. For the same purpose the following aerial photographs were used: 1) the 1975 image, CGR (Italian General Company of Aerial Surveys) flight RAVDA (Administrative Autonomous Region Aosta Valley), at the scale 1:17,000; 2) the 1991 image, CGR flight RAVDA, at the scale 1:17,000. The aerial imagery thus spans the period from 1975 to 1991. The black-and-white images were scanned at a resolution suitable for the imagery scale, and several models representing the glacier tongue area, oriented using the inner and outer orientation parameters delivered with the images, were produced. The digital photogrammetric system, after orientation and matching, produces
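
Once the maps and photogrammetric models are co-registered as elevation grids in a GIS, the volumetric variation between two epochs reduces to differencing the grids. A minimal sketch, assuming two co-registered DEMs with NaN outside the glacier outline (the inputs are hypothetical):

```python
import numpy as np

def glacier_volume_change(dem_old, dem_new, cell_size_m):
    """Volume change (m^3) and mean elevation change (m) between two DEMs."""
    dz = dem_new - dem_old                     # per-cell elevation change (m)
    valid = ~np.isnan(dz)                      # cells inside the glacier mask
    cell_area = cell_size_m ** 2
    volume = np.nansum(dz) * cell_area         # total volumetric variation
    return volume, volume / (valid.sum() * cell_area)
```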

Background Men tend to have more upper body mass and fat than women, a physical characteristic that may predispose them to severe motor vehicle crash (MVC) injuries, particularly in certain body regions. This study examined MVC-related regional body injury and its association with the presence of driver obesity, using both real-world data and computer crash simulation. Methods and Findings Real-world data were from the 2001 to 2005 National Automotive Sampling System Crashworthiness Data System. A total of 10,941 drivers aged 18 years or older involved in frontal collision crashes were eligible for the study. Sex-specific logistic regression models were developed to analyze the associations between MVC injury and the presence of driver obesity. In order to confirm the findings from the real-world data, computer models of obese subjects were constructed and crash simulations were performed. According to the real-world data, obese men had a substantially higher risk of injury, especially serious injury, to the upper body regions, including the head, face, thorax, and spine, than normal-weight men (all p<0.05). A U-shaped relation was found between body mass index (BMI) and serious injury in the abdominal region for both men and women (p<0.05 for both BMI and BMI²). In the high-BMI range, men were more likely to be seriously injured than women for all body regions except the extremities and the abdominal region (all p<0.05 for the interaction between BMI and sex). The findings from the computer simulation were generally consistent with the real-world results. Conclusions Obese men face a much higher risk of injury to upper body regions during MVCs. This higher risk may be attributed to differences in body shape, fat distribution, and center of gravity between obese and normal-weight subjects, and between men and women. PMID:20361024
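
The sex-specific models with linear and quadratic BMI terms (the abstract's "BMI and BMI²") can be sketched as follows; the column names and the omission of crash covariates are assumptions for illustration:

```python
import pandas as pd
import statsmodels.api as sm

def fit_injury_model(df: pd.DataFrame, sex: str):
    """Logistic model of serious injury on BMI and BMI^2, fit within one sex."""
    d = df[df["sex"] == sex]                       # sex-specific subsample
    X = sm.add_constant(pd.DataFrame({"bmi": d["bmi"],
                                      "bmi2": d["bmi"] ** 2}))
    return sm.Logit(d["serious_injury"], X).fit(disp=False)

# A negative 'bmi' coefficient with a positive 'bmi2' coefficient (both
# p < 0.05) would reproduce the U-shaped abdominal-injury risk reported above.
```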

The main goal of this work is to investigate the main factors determining interannual sea level variability in the Baltic, Adriatic and Black seas, and the extent to which the sea level of these three basins can deviate from the global mean. The three basins are semi-enclosed marginal seas connected with the adjacent seas by narrow straits. Thirteen sea level timeseries in the Baltic Sea, 7 in the Adriatic Sea and 5 in the Black Sea, provided by PSMSL, allowed us to compute a single seamless sea level timeseries representative of each basin from 1900 and for the entire 20th century, using statistical tools (PCA and the least squares method). Comparison with satellite data in the period 1993-2009 confirms that the timeseries so computed are good representations of the observed sea level, with correlation values of 0.97, 0.87 and 0.72 for the Baltic, Adriatic and Black Sea, respectively. At basin scale the sea level has been decomposed into various contributions that have been analyzed separately: the local effect of pressure, the steric effect due to temperature and salinity variations, boundary forcing, wind effect and river discharge. The annual cycles and their variability show that the largest contribution is due to the wind for both the Adriatic Sea and the Baltic Sea. In these two basins the inverted barometer effect plays a minor role and the steric factor is almost negligible. The wind seems to play a negligible role in the Black Sea, where the Danube river discharge plays an important role. A linear regression model, built using the large-scale sea level pressure distribution as predictor, is capable of explaining a further portion of sea level variability (about 20%) left after subtracting all the factors considered above. Sea levels of the Baltic and Black Sea show a significant positive correlation (about 0.3), revealing the likely influence of a common external forcing. Past sea level variability shows no strong evidence of large deviations from the global mean sea level
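
A sketch of how several gauge records can be merged into one basin series with PCA plus least squares, under stated assumptions (gap handling, detrending and the exact least-squares target used in the study are not reproduced here):

```python
import numpy as np

def basin_series(gauges):
    """Single basin sea level series from a (time x gauge) record matrix."""
    X = gauges - np.nanmean(gauges, axis=0)        # gauge anomalies
    u, s, vt = np.linalg.svd(np.nan_to_num(X), full_matrices=False)
    pc1 = u[:, 0] * s[0]                           # leading principal component
    target = np.nanmean(X, axis=1)                 # gauge-average anomaly
    beta = pc1 @ target / (pc1 @ pc1)              # least-squares rescaling
    return beta * pc1                              # seamless basin anomaly
```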

Socioeconomic developments increasingly put pressure on global fresh water resources. Over the past century, increasing extents of land were converted into (irrigated) agricultural production areas, whilst dams and reservoirs were built to gain control over the timing and availability of fresh water. Although often targeted at local, regional, or national needs, such human interventions affect terrestrial water fluxes on larger scales. Many of these interventions have been studied intensively at global and regional scales, but the impact of land use and land cover change has often been omitted, and an assessment of how land conversions affect water resources availability and water scarcity conditions had not been carried out before, despite its importance for developing sound integrated river basin water management plans. To address this issue, in this contribution we evaluate how land use and land cover change affected water resources and water scarcity conditions in the 20th century, using a multi-model, multi-forcing framework. A novelty of this research is that the impact models applied use the dynamic HYDE 3.1 - MIRCA dataset to cover the historical (1971-2010) changes in land use and land cover. Preliminary results show that more than 60% of the global population, predominantly living in downstream areas, is adversely affected by the impacts of land use and land cover change on water resources and water scarcity conditions. Whilst incoming discharge generally tends to decrease due to upstream land conversions (in 97% of the global land area), we found at the same time increases in local runoff levels for a significant share (27%) of the global land area. Which effect eventually dominates, and whether it causes water scarcity conditions, is determined by a region's dependency on water resources originating in upstream areas, and by the increasing rates with which the (locally generated) stream flow is used to fulfil (non
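A toy sketch of the bookkeeping implied above: a region's availability is its local runoff plus the discharge arriving from upstream, and a region counts as adversely affected when availability under the land-conversion scenario drops relative to the baseline. All numbers are invented placeholders to illustrate the accounting, not results from the study's models.

```python
# Toy illustration: upstream-dependent regions lose availability even
# when their own runoff rises. All values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # hypothetical regions

local_runoff_base = rng.gamma(2.0, 50.0, n)   # km^3/yr, baseline land cover
local_runoff_luc = local_runoff_base * 1.05   # local runoff tends to rise
upstream_in_base = rng.gamma(2.0, 80.0, n)
upstream_in_luc = upstream_in_base * 0.90     # incoming discharge tends to fall

avail_base = local_runoff_base + upstream_in_base
avail_luc = local_runoff_luc + upstream_in_luc

# Regions that depend strongly on upstream inflow lose availability overall.
upstream_dependency = upstream_in_base / avail_base
affected = avail_luc < avail_base
print(f"{affected.mean():.0%} of regions lose availability; "
      f"mean upstream dependency there: {upstream_dependency[affected].mean():.2f}")
```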

Recent climate change literature has been dominated by studies showing that the equilibrium climate sensitivity is better constrained than in the latest estimates from the Intergovernmental Panel on Climate Change (IPCC) and the U.S. National Climate Assessment (NCA), and that the best estimate of the climate sensitivity is considerably lower than the climate model ensemble average. From the recent literature, the central estimate of the equilibrium climate sensitivity is ~2°C, while the climate model average is ~3.2°C; that is, an equilibrium climate sensitivity some 40% lower than the model average. To the extent that the recent literature produces a more accurate estimate of the equilibrium climate sensitivity than the climate model average does, the projections of future climate change given by both the IPCC and NCA are, by default, some 40% too large (too rapid), and the associated (and described) impacts are gross overestimates. A quantitative test of climate model performance can be made by comparing the range of model projections against observations of the evolution of the global average surface temperature since the mid-20th century. Here, we perform such a comparison on a collection of 108 model runs comprising the ensemble used in the IPCC's Fifth Assessment Report and find that the observed global average temperature trend for all trend lengths (with one exception) since 1986 is less than 97.5% of the corresponding model trends, meaning that the observed trends are significantly different from the average trend simulated by climate models. For periods approaching 40 years in length, the observed trend lies outside of (below) the range that includes 95% of all climate model simulations. We conclude that at the global scale, this suite of climate models has failed. Treating them as mathematical hypotheses, which they are, means that it is the duty of scientists to, unfortunately, reject their predictions in favor of those based on a lower climate sensitivity.
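A hedged sketch of the trend comparison described above: for a given trend length, fit a least-squares trend to the observed series and to each of the 108 model runs, then ask what fraction of model trends lie below the observed one. The array shapes and the helper names are assumptions; the abstract's criterion corresponds to that fraction falling below 0.025.

```python
# Sketch of the observation-vs-ensemble trend test. Inputs assumed:
# obs: shape (n_years,); models: shape (108, n_years); years: shape (n_years,).
import numpy as np

def trend(series):
    """OLS slope in degrees C per year."""
    t = np.arange(len(series))
    return np.polyfit(t, series, 1)[0]

def fraction_of_models_below_obs(obs, models, years, end_year, length):
    sel = (years > end_year - length) & (years <= end_year)
    obs_trend = trend(obs[sel])
    model_trends = np.array([trend(run[sel]) for run in models])
    return (model_trends < obs_trend).mean()

# Example: frac = fraction_of_models_below_obs(obs, models, years, 2014, 40)
# frac < 0.025 would mean the observed trend is smaller than 97.5% of the
# model trends, the condition the abstract reports for most trend lengths.
```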

Growth anomalies of trees have been detected in some regions under the current episode of rapid warming. This raises a dilemma for temperature reconstructions based on tree-ring width, which is believed to be the most important proxy for inter-annual temperature reconstruction over the past millennia. Here we employed tree-ring δ13C to reconstruct temperature variations, in order to explore its potential for capturing signals of rapid warming and to test how it differs from the tree-ring-width-based reconstruction. In this study the mean May-July temperature (TM-J) was reconstructed over the past century from the tree-ring δ13C of Chinese pine trees growing in the Nanwutai region. The explained variance of the reconstruction was 43.3% (42.1% after adjusting for the degrees of freedom). Compared to a ring-width temperature reconstruction (May-July) from the same site, the tree-ring δ13C-based temperature reconstruction offered two distinct advantages: 1) it captured a wider range of temperature variability, i.e., at least May-July and possibly a longer part of the year, January-September; and 2) it preserved more low-frequency climate information than the ring-width reconstruction did. The 20th century warming was well represented in the Nanwutai tree-ring δ13C temperature reconstruction, which implies that stable carbon isotopes of tree rings can potentially represent temperature variations during historical episodes of rapid warming. A spatial correlation analysis showed that our temperature reconstruction represents climate variations over the entire Loess Plateau in north-central China. Significant positive correlations (p < 0.1) were found between the temperature reconstruction and ENSO, as well as SSTs in the Pacific and Indian Oceans. The reconstruction showed periodicities of 22.78-, 4.16-, 3.45-3.97-, and 2.04-2.83-year quasi-cycles at the 95% confidence level. Our results suggest that temperature variability in the Nanwutai region may be linked to Pacific and Indian Ocean SSTs.
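A minimal sketch, under assumed file and column names, of a calibration like the one described above: regress instrumental May-July mean temperature on the tree-ring δ13C series over the overlap period, report the explained variance (R² and its degrees-of-freedom-adjusted counterpart), and apply the fitted relation to the full δ13C record.

```python
# Illustrative calibration/reconstruction sketch; not the study's code.
import pandas as pd
import statsmodels.api as sm

calib = pd.read_csv("nanwutai_calibration.csv")  # hypothetical overlap-period table
X = sm.add_constant(calib["d13C"])
fit = sm.OLS(calib["t_may_jul"], X).fit()
print(f"R^2 = {fit.rsquared:.3f}, adjusted R^2 = {fit.rsquared_adj:.3f}")

# Apply the calibration to the full d13C record to reconstruct temperature.
full = pd.read_csv("nanwutai_d13c_full.csv")
recon = fit.params["const"] + fit.params["d13C"] * full["d13C"]
```

Explained variances of 43.3% and 42.1%, as quoted above, correspond to `fit.rsquared` and `fit.rsquared_adj` in this setup.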

Lake trout embryos and sac fry are very sensitive to toxicity associated with maternal exposures to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and structurally related chemicals that act through a common aryl hydrocarbon receptor (AHR)-mediated mechanism of action. The loading of large amounts of these chemicals into Lake Ontario during the middle of the 20th century coincided with a population decline that culminated in extirpation of this species around 1960. Prediction of past TCDD toxicity equivalence concentrations in lake trout eggs (TECegg) relative to recent conditions required fine resolution of radionuclide-dated contaminant profiles in two sediment cores; reference core-specific biota-sediment accumulation factors (BSAFs) for TCDD-like chemicals in lake trout eggs; adjustment of the BSAFs for the effect of temporal changes in the chemical distributions between water and sediments; and toxicity equivalence factors based on trout early life stage mortality. When compared to the dose-response relationship for overt early life stage toxicity of TCDD to lake trout, the resulting TECegg values predict an extended period during which lake trout sac fry survival was negligible. By 1940, following more than a decade of population decline attributable to reduced fry stocking and loss of adult lake trout to commercial fishing, the predicted sac fry mortality due to AHR-mediated toxicity alone explains the subsequent loss of the species. Reduced fry survival, associated with lethal and sublethal adverse effects and possibly complicated by other environmental factors, occurred after 1980 and contributed to a lack of reproductive success of stocked trout despite gradually declining TECegg values. Present exposures are close to the most probable no observable adverse effect level (NOAEL TECegg = 5 pg TCDD toxicity equivalence/g egg). The toxicity predictions are very consistent with the available historical data for lake trout population levels in Lake Ontario, stocking
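A hedged sketch of the TECegg bookkeeping described above: each congener's organic-carbon-normalized sediment concentration is mapped to an egg concentration through its BSAF and the egg lipid fraction, then weighted by its toxicity equivalence factor (TEF) and summed into TCDD equivalents. All numbers below are invented placeholders, not the study's values.

```python
# Illustrative TEC_egg computation from sediment data via BSAFs and TEFs.
# Per the BSAF definition, lipid-normalized egg concentration =
# BSAF * OC-normalized sediment concentration. Values are placeholders.
CONGENERS = {
    # name: (C_sed_oc [pg/g OC], BSAF [-], TEF [-])
    "TCDD": (120.0, 0.05, 1.0),
    "PCB126": (900.0, 0.10, 0.005),
}
F_LIPID_EGG = 0.08  # assumed egg lipid fraction

def tec_egg(congeners, f_lipid):
    total = 0.0
    for name, (c_sed_oc, bsaf, tef) in congeners.items():
        c_egg_lipid = bsaf * c_sed_oc   # pg/g lipid
        c_egg = c_egg_lipid * f_lipid   # pg/g wet egg
        total += c_egg * tef            # TCDD toxicity equivalents
    return total

print(f"TEC_egg = {tec_egg(CONGENERS, F_LIPID_EGG):.1f} pg TEQ/g egg")
# Compare against the NOAEL of ~5 pg TEQ/g egg cited above.
```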