Sample records for future technical challenges

The book introduces readers to the interrelated global problems of population dynamics, energy supply, imminent climate catastrophe, environmental pollution, finite resources, and the conflict between North and South. It encourages probing more deeply into the technical challenges of the future. The author demonstrates why economic and technical issues will soon be outstripped by questions of the environmental, human, and social compatibility of new technologies. (orig./UA)

The colloquium organised by the Association of Energy Economists dealing with the theme 'Technical progress faced with the challenges of the energy sector in the future' takes place against a backdrop of ever-increasing initiatives in this field, for example at the World Energy Council or the International Energy Agency. Faith in technical progress is widespread but should be supported by studies free of preconceived ideas. Research and development efforts must be fully supported, and in a climate of opening markets and liberalization the public authorities have a major role to play. Historically, the markets have always been able to meet new needs thanks to technology, but the ambitious targets that the international community has set itself regarding the emission of greenhouse gases imply technical improvements and major investments. (authors)

issue is compounded by the fact that some of the future GEN IV reactor designs involve fast neutron spectra, and all involve increases in temperature to the range 500 °C to 1250 °C. Comparatively little is known of the effect of, for instance, creep-fatigue interactions in high irradiation fluxes on the structural integrity of the potential materials of construction. In spite of these technical concerns there is the business management expectation that all of these reactors will experience very few materials degradation problems that might affect the economics of operation. The paper starts with a review of our present capability to predict the materials degradation modes encountered in the current BWR and PWR reactor designs. This capability is the basis for any analysis of the future degradation problems (and their mitigation) in the current reactors and in the evolutionary water-cooled reactor designs. This section concludes with an overview of assessments of future materials degradation issues that might be expected in these water-cooled reactors. These preliminary discussions are then broadened to cover some of the more obvious technical problems likely to be encountered with the more advanced GEN IV designs, such as the Very High Temperature Reactor (VHTR) and the Super Critical Water Cooled Reactor (SCWR). The article concludes with a brief discussion of some of the challenges facing the technical management/leadership, with some suggestions on how to overcome them. These challenges may become especially severe given the fact that the technical problems must be overcome in a time frame that is short compared with that taken to resolve the issues that have faced us over the last 30 years. Some specific management challenges include: The decrease in the number of experienced experimentalists and analysts over the last 10 years; The decrease in 'institutional' memory as it relates to the operation of the current reactors, and the design and construction of

Three-dimensional (3D) bioprinting is a revolutionary technology in building living tissues and organs with precise anatomic control and cellular composition. Despite the great progress in bioprinting research, there has yet to be any clinical translation due to current limitations in building human-scale constructs, which are vascularized and readily implantable. In this article, we review the current limitations and challenges in 3D bioprinting, including in situ techniques, which are one of several clinical translational models to facilitate the application of this technology from bench to bedside. A detailed discussion is made on the technical barriers in the fabrication of scalable constructs that are vascularized, autologous, functional, implantable, cost-effective, and ethically feasible. Clinical considerations for implantable bioprinted tissues are further expounded toward the correction of end-stage organ dysfunction and composite tissue deficits.

The telemonitoring of vital signs from the home is an essential element of telehealth services for the management of patients with chronic conditions, such as congestive heart failure (CHF), chronic obstructive pulmonary disease (COPD), diabetes, or poorly controlled hypertension. Telehealth is now being deployed widely in both rural and urban settings, and in this paper, we discuss the contribution made by biomedical instrumentation, user interfaces, and automated risk stratification algorithms in developing a clinical-diagnostic-quality longitudinal health record at home. We identify technical challenges in the acquisition of high-quality biometric signals from unsupervised patients at home, identify new technical solutions and user interfaces, and propose new measurement modalities and signal processing techniques for increasing the quality and value of vital signs monitoring at home. We also discuss the use of vital signs data for the automated risk stratification of patients, so that clinical resources can be targeted to those most at risk of unscheduled admission to hospital. New research is also proposed to integrate primary care, hospital, personal genomic, and telehealth electronic health records, and apply predictive analytics and data mining for enhancing clinical decision support.
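As a hedged illustration of the kind of automated risk stratification discussed above, the sketch below combines one day's home vital-sign readings into a crude escalation score. All thresholds, weights, and the escalation rule are invented for illustration; they are not the paper's algorithms.

```python
# Hypothetical daily risk score for a CHF telemonitoring service.
# Thresholds below are illustrative only, not clinical guidance.

def chf_risk_score(weight_gain_kg, spo2_pct, heart_rate_bpm):
    """Return a crude 0-3 risk score from one day's home readings."""
    score = 0
    if weight_gain_kg >= 2.0:    # rapid weight gain suggests fluid retention
        score += 1
    if spo2_pct < 92.0:          # low oxygen saturation
        score += 1
    if heart_rate_bpm > 100.0:   # resting tachycardia
        score += 1
    return score

# A service might escalate any patient scoring 2 or more for review.
print(chf_risk_score(2.5, 90.0, 88.0))  # 2
```

A deployed system would of course replace this rule-based sketch with validated, personalised models trained on longitudinal data, which is precisely the predictive-analytics research the abstract proposes.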

In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

Before the production of all the LMJ (MEGAJOULE laser) optics, the CEA had to proceed with the fabrication of about 300 large optics for the LIL (laser integration line) laser. Thanks to a fruitful collaboration with high-tech optics companies in Europe, this challenge has been successfully met. In order to achieve the very tight requirements for cleanliness, laser damage threshold and all the other highly demanding fabrication specifications, it has been necessary to develop and put into operation completely new fabrication processes and to build special outsize fabrication equipment. Through a couple of examples, this paper gives an overview of the work which has been done and shows some of the results which have been obtained: continuous laser glass melting, fabrication of the laser slabs, rapid-growth KDP (potassium dihydrogen phosphate) technology, large diffractive transmission gratings engraving and characterization. (authors)

In December 1996, the NEA Committee on Nuclear Regulatory Activities concluded that changes resulting from economic deregulation and other recent developments affecting nuclear power programmes have consequences both for licensees and regulatory authorities. A number of potential problems and issues which will present a challenge to nuclear regulatory bodies over the next ten years have been identified in a report just released. (author)

The threat of nuclear terrorism is real and poses a significant challenge to both U.S. and global security. For terrorists, the challenge is not so much the actual design of an improvised nuclear device (IND) but more the acquisition of the special nuclear material (SNM), either highly enriched uranium (HEU) or plutonium, to make the fission weapon. This paper provides two examples of technical solutions that were developed in support of the nonproliferation objective of reducing the opportunity for acquisition of HEU. The first example reviews technologies used to monitor centrifuge enrichment plants to determine if there is any diversion of uranium materials or misuse of facilities to produce undeclared product. The discussion begins with a brief overview of the basics of uranium processing and enrichment. The role of the International Atomic Energy Agency (IAEA), its safeguard objectives and how the technology evolved to meet those objectives will be described. The second example focuses on technologies developed and deployed to monitor the blend down of 500 metric tons of HEU from Russia's dismantled nuclear weapons to reactor fuel or low enriched uranium (LEU) under the U.S.-Russia HEU Purchase Agreement. This reactor fuel was then purchased by U.S. fuel fabricators and provided about half the fuel for the domestic power reactors. The Department of Energy established the HEU Transparency Program to provide confidence that weapons usable HEU was being blended down and thus removed from any potential theft scenario. Two measurement technologies, an enrichment meter and a flow monitor, were combined into an automated blend down monitoring system (BDMS) and were deployed to four sites in Russia to provide 24/7 monitoring of the blend down. Data was downloaded and analyzed periodically by inspectors to provide the assurances required.
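As a hedged sketch of one of the two measurement technologies named above, the snippet below illustrates the "infinite-thickness" enrichment-meter principle: for a uranium sample thick enough to be opaque to its own 185.7 keV gamma rays, the net count rate in that energy window is proportional to the U-235 enrichment. The calibration constant and count rates are invented for illustration and are not actual BDMS parameters.

```python
def enrichment_percent(net_count_rate, k_cal):
    """U-235 enrichment (wt%) inferred from a net 186 keV count rate.

    Assumes an 'infinitely thick' sample, for which the 185.7 keV
    gamma count rate is proportional to enrichment.
    """
    return k_cal * net_count_rate

# Hypothetical calibration: a 4.95% LEU standard gives 1000 counts/s.
K = 4.95 / 1000.0

# A blend-down monitor would confirm the product stays at LEU levels.
e = enrichment_percent(990.0, K)   # hypothetical downblended product
print(round(e, 2))  # 4.9
```

In the actual BDMS, such an enrichment measurement was paired with a flow monitor so that both the enrichment and the quantity of material passing through the blend point could be continuously verified.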

Structural dynamics and its auxiliary fields are among the most progressive and challenging areas that space system engineering design and operations face. Aerospace systems depend on structural dynamicists for their success. Past experience (history) is colored with many dynamic issues, some producing ground or flight test failures. The innovation and creativity that were brought to these issues and problems are the aura from the past that lights the path to the future. Using this illumination to guide understanding of the dynamic phenomena and designing for their potential occurrence are the keys to successful space systems. Our great paradox, or challenge, is how we remain in-depth specialists, yet become generalists to the degree that we make good team members and set the right priorities. This paper will deal with how we performed with acclaim in the past, the basic characteristics of structural dynamics (the loads cycle, for example), and the challenges of the future.

The title "Challenges for the Future" implies the challenge to summarize a very complex meeting. Of necessity, I will present a personal impression. My interest is in risk assessment, which I define as a process for summarizing science in support of decision making. Risk assessment is sometimes regarded as arcane numerology, a rigid process of computing risk numbers in which much available science is unused. I am a strong advocate for the broader definition of risk assessment. It is encouraging to learn how much science is becoming available for use in risk assessment for gasoline, its components, and alternative fuels.

This article reports on a 4D-treatment planning workshop (4DTPW), held on 7-8 December 2009 at the Paul Scherrer Institut (PSI) in Villigen, Switzerland. The participants were all members of institutions actively involved in particle therapy delivery and research. The purpose of the 4DTPW was to discuss current approaches, challenges, and future research directions in 4D-treatment planning in the context of actively scanned particle radiotherapy. Key aspects were addressed in plenary sessions, in which leaders of the field summarized the state-of-the-art. Each plenary session was followed by an extensive discussion. As a result, this article presents a summary of recommendations for the treatment of mobile targets (intrafractional changes) with actively scanned particles and a list of requirements to elaborate and apply these guidelines clinically.

The increase of power consumption, the development of renewable energy sources and the emergence of new usages like the electric-powered car are among the challenges that put the reliability and the reactivity of our power grids to the test. These grids have to change to become 'intelligent' thanks to the integration of new information and communication technologies over the overall supply chain, from energy generation to its end use by consumers. For the first time in France, the actors of this change explain their opinion about this revolution and put it in perspective in its full extent and complexity. Changing power grids to make them intelligent is first of all a technical challenge but also a societal challenge: the consumer will become an actor involved in the mastery of his energy demand and a renewable energy producer capable of interacting with the grid in an increasing manner. This worldwide change, which we are about to witness, comes up against numerous obstacles. The aim of this book is to examine the determining factors of the success of this large-scale change through its technical, economic and social dimensions. It shows that the emergence of such an advanced power system will not be possible without reconciling some contradictory goals, nor without a strong coordination between the actors. Content: Part 1 - intelligent power networks to answer the challenges of the 21st century: 1 - the European and French dimension of the electric power sector; 2 - towards a carbon-free economy; 3 - a power grid facing new challenges; 4 - the pre-figuration of intelligent power grids; 5 - the deployment of intelligent (smart) grids; Part 2 - perspectives of smart grids development: 1 - the future of power networks; 2 - a new industrial era; Part 3 - the consumer's position in the deployment of future grids: 1 - changing behaviours; 2 - making the consumer a 'consum'actor'. Synthesis and conclusion. (J.S.)

Discussion of employment challenges in Europe and a brief description of the Danish flexicurity system.

The sustainable development of nuclear energy can only be achieved by establishing the global nuclear safety regime. The effective management of technical knowledge will become one of the issues and challenges in establishing global nuclear safety. We have to develop measures to identify the nature and scope of the associated problems and to explore cooperative international actions to resolve them. The future of its effective management will depend on how to optimize the transfer and deployment of the knowledge as well as how to maintain the knowledge base. In this presentation, two specific topics are discussed: sharing of the knowledge and preservation of the workforce. In sharing the knowledge, the topics are the assurance of a free flow of safety-related R and D information from developed to developing countries and the potential imposition of a strong trade agreement between nuclear exporting and importing countries to ensure safety. In preserving the workforce, the topics are the development of a knowledge transfer system from this generation to the next, like the IYNC forum; the enforcement of regional and international educational systems like ANENT and WNU for workforce development; and the exploration of an optimal mechanism for using the retired workforce. The publication of a world-wide directory of nuclear professionals, the aggressive implementation of a youth internship program and the introduction of an international professional certification program are also discussed. The reformation of the CNS as a more enforcing and binding agreement for keeping safety, along with the introduction of a 'Global Nuclear Safety Treaty', could be an excellent mechanism for achieving effective knowledge management and eventually enforcing the global safety regime. The IAEA has always been the corroborator of maintaining high levels of nuclear safety through close international nuclear cooperation. These important roles of the IAEA should continue to be emphasized more than ever in order to secure the

In the last 2000 years the world's population and the worldwide total energy consumption have been continuously increasing, at a rate even greater than exponential. By now a situation has been reached in which energy resources, which for a long time were treated as though they were almost inexhaustible, are running short. The ongoing growth of the world's population and a growing hunger for energy in underdeveloped and emerging countries imply that the yearly overall energy consumption will continue to grow, by about 1.6 percent every year, so that it will have doubled by 2050. This massive energy consumption has led and is progressively leading to severe changes in our environment and is threatening a climatic state that, for the last 10 000 years, has been unusually benign. The coincidence of the shortage of conventional energy resources with the hazards of an impending climate change is a dangerous threat to the well-being of all, but it is also a challenging opportunity for improvements in our energy usage. On a global scale, conventional methods such as the burning of coal, gas and oil or the use of nuclear fission will still dominate for some time. In their case, the challenge consists in making them more efficient and environmentally benign, and using them only where and when it is unavoidable. Alternative energies must be expanded and economically improved. Among these, promising techniques such as solar thermal and geothermal energy production should be promoted from a shadow existence and further advanced. New technologies, for instance nuclear fusion or the transmutation of radioactive nuclear waste, are also quite promising. Finally, a careful analysis of the national and global energy flow systems and intelligent energy management, with emphasis on efficiency, overall effectiveness and sustainability, will acquire increasing importance. Thereby, economic viability, political and legal issues as well as moral aspects such as fairness to disadvantaged
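The 1.6 percent annual growth and the doubling by 2050 quoted above are mutually consistent, which a one-line calculation confirms (assuming constant compound growth):

```python
import math

# Doubling time for consumption growing at a constant 1.6% per year:
# solve (1 + g)**t = 2 for t.
growth = 0.016
doubling_time = math.log(2) / math.log(1 + growth)
print(round(doubling_time, 1))  # about 43.7 years, i.e. a doubling by ~2050
```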

The NASA Johnson Space Center Space Life Sciences Directorate (SLSD) and Wyle Integrated Science and Engineering (Wyle) will conduct a one-day business cluster at the 62nd IAC so that IAC attendees will understand the benefits of open innovation (crowdsourcing), review successful results of conducting technical challenges in various open innovation projects, and learn how an organization can effectively deploy these new problem solving tools to innovate more efficiently and effectively. Results from both the SLSD open innovation pilot program and the open innovation workshop conducted by the NASA Human Health and Performance Center will be discussed. NHHPC members will be recruited to participate in the business cluster (see membership http://nhhpc.nasa.gov) and as IAF members. Crowdsourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by the organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the SLSD, with the support of Wyle, established and implemented pilot projects in open innovation (crowdsourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, called Challenges by some open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (11 Field Centers and NASA HQ) using an open innovation service provider crowdsourcing platform to post NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external

This article attempts to identify some of the technology and research challenges facing the digital media industry in the future. We first discuss several trends in the industry, such as the rapid growth of broadband Internet networks and the emergence of networking and media-capable devices in the home. Next, we present technical challenges that result from these trends, such as effective media interoperability in devices, and provide a brief overview of Windows Media, which is one of the technologies in the market attempting to address these challenges. Finally, given these trends and the state of the art, we argue that further research on data compression, encoder optimization, and multi-format transcoding can potentially make a significant technical and business impact in digital media. We also explore the reasons that research on related techniques such as wavelets or scalable video coding is having a relatively minor impact in today's practical digital media systems.

Radiation detection technologies are an important tool in the prevention of proliferation. A variety of new developments have enabled enhanced performance in terms of energy resolution, spatial resolution, predictive modeling and simulation, active interrogation, and ease of operation and deployment in the field. For example, various gamma ray imaging approaches are being explored to combine spatial resolution with background suppression in order to enhance sensitivity at reasonable standoff distances and acquisition times. New materials and approaches are being developed in order to provide adequate energy resolution in field use without the necessity for liquid nitrogen. Finally, different detectors combined into distributed networks offer promise for detection and tracking of radioactive materials. As the world moves into the 21st century, the possibility of greater reliance on nuclear energy will impose additional technical requirements to prevent proliferation. In addition to proliferation resistant reactors, a careful examination of the various possible fuel cycles from cradle to grave will provide additional technical and nonproliferation challenges in the areas of conversion, enrichment, transportation, recycling and waste disposal. Radiation detection technology and information management have a prominent role in any future global regime for nonproliferation beyond the current Advanced Protocol. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48. (author)

The training of technical writers and the objectives of such education are discussed. Special emphasis is placed on communication between technical personnel and non-technical personnel. The liabilities that affect technical writers are also discussed.

verifying and assuring, in accordance with the statute of the Agency and the IAEA safeguards system, compliance with its safeguards agreements with States parties undertaken in the fulfilment of their obligations under article III, paragraph 1, of the Treaty, with a view to preventing the diversion of nuclear energy from peaceful uses to nuclear weapons or other nuclear explosive devices. Major non-proliferation challenges were associated with the universalization and strengthening of the IAEA safeguards system. In this regard the important work of the IAEA in implementing safeguards to verify compliance with the non-proliferation obligations of the Treaty was stressed. It was stressed that States parties must have both a comprehensive safeguards agreement and an Additional Protocol in place for the IAEA to be able to provide credible assurances of both the non-diversion of declared material and the absence of undeclared nuclear material or activities in the States concerned. The importance of the contribution of nuclear-weapon-free zones was stressed. The continued verification by the IAEA of the non-diversion of declared nuclear material in the Islamic Republic of Iran and the IAEA's inability to verify the absence of undeclared nuclear material and activities in that country were mentioned. There was concern about the nuclear activities of the Democratic People's Republic of Korea, about reports of alleged clandestine nuclear activities by the Syrian Arab Republic and about the nuclear capabilities of Israel. The new proliferation threat posed by the clandestine activities and networks for the supply of nuclear goods and technologies was noted. It was emphasized that only through proactive and full cooperation and assistance to the Agency could such proliferation threats be addressed.
States parties noted the importance of enhancing cooperation among themselves and with international organizations, in particular IAEA, to prevent, detect and respond to suspected proliferation

The move to Internet Protocol Television (IPTV) has challenged the traditional television industry by opening the Internet to high-quality real-time television content delivery. Thus it has provided an enabling set of key technologies to understand and foster further innovations in the

The global magnitude of degraded and deforested areas is best approached by restoring landscapes. Heightened international perception of the importance of forests and trees outside forests (e.g., woodlands, on farms) demands new approaches to future landscapes. The current need for forest restoration is two billion ha; most opportunities are mosaic restoration in the...

HEP is only one of many sciences with sharply increasing compute requirements that cannot be met by profiting from Moore's law alone. Commercial clouds potentially allow for realising larger economies of scale. While some small-scale experience requiring dedicated effort has been collected, public cloud resources have not yet been integrated with the standard workflows of science organisations in their private data centres; in addition, European science has not yet ramped up to significant scale. The HELIX NEBULA Science Cloud project - HNSciCloud, partly funded by the European Commission, addresses these points. Ten organisations under CERN's leadership, covering particle physics, bioinformatics, photon science and other sciences, have joined to procure public cloud resources as well as dedicated development efforts towards this integration. The HNSciCloud project faces the challenge of accelerating developments performed by the selected commercial providers. In order to guarantee cost-efficient usage of IaaS resources across a wide range of scientific communities, the technical requirements had to be carefully constructed. With respect to current IaaS offerings, data-intensive science is the biggest challenge; other points that need to be addressed concern identity federations, network connectivity and how to match the business practices of large IaaS providers with those of public research organisations. The first section of this paper gives an overview of the project and explains the findings so far. The last section explains the key points of the technical requirements and presents first results of the procurers' experience with the services in comparison to their 'on-premise' infrastructure.

Health Impact Assessment has made impressive progress over the past 10 years, achieving greater clarity over the nature of HIA, understanding that different methods are appropriate for different contexts and accepting that a variety of types of evidence are needed. However, areas remain where further progress is needed. Much progress has been made on how HIA informs decision makers, but HIA practitioners still need greater understanding of decision-making processes and how HIA should relate to them. Predicting the future consequences of following different options is a key element of HIA, but there is still a need for more robust methods of prediction and, in particular, better prediction of the magnitude of impacts. Few HIA reports adequately describe the distribution of impacts between different groups, and this is another area where improvement is needed. Considerable progress has been made in clarifying the role of participation in HIA, but the practice has often been less impressive than the rhetoric. HIA practitioners also need to become more critical in evaluating their activities. In the future it is likely that commercial organisations and EIA practitioners will become more involved in HIA, and quality control of HIA practice will become even more important.

The next generation of high energy e+e- colliders is likely to be built with colliding linear accelerators. A lot of research and development is needed before such a machine can be practically built. Some of the problems and recent progress made toward their solution are described here. Quantum corrections to beamstrahlung, the production of low-emittance beams and strong focusing techniques are covered.

technical support organization called the 'Centre for Nuclear Safety' within PNRA to meet the future regulatory challenges. This paper discusses the objectives and organization of the Centre for Nuclear Safety and the challenges that the centre is facing, as well as the strategy being devised to meet those challenges. The paper includes a brief description of co-operation under an IAEA TC project titled 'Further Improvement of Nuclear Regulatory Infrastructure in Pakistan (PAK/9/28)', which is an example of international co-operation in the establishment of a technical support organization to enhance the regulatory effectiveness of the national regulator in Pakistan. (author)

The recently published Washington International Energy Group's 1993 Electric Utility Outlook states that nearly one-third (31 percent) of U.S. utility executives expect reliability to decrease in the near future. Electric power system stability is crucial to reliability. Stability analysis determines whether a system will stay intact under normal operating conditions, during minor disturbances such as load fluctuations, and during major disturbances when one or more parts of the system fail. All system elements contribute to reliability or the lack of it; however, this report centers on the transmission segment of the electric system. The North American Electric Reliability Council (NERC) says the transmission systems as planned will be adequate over the next 10 years. However, delays in building new lines and increasing demands for transmission services are serious concerns. Reliability concerns exist in the Mid-Continent Area Power Pool and the Mid-America Interconnected Network regions, where transmission facilities have not been allowed to be constructed as planned. Portions of the transmission systems in other regions are loaded at or near their limits. NERC further states that utilities must be allowed to complete planned generation and transmission as scheduled. A reliable supply of electricity also depends on adhering to established operating criteria. Factors that could complicate operations include more interchange schedules resulting from increased transmission services, increased line loadings in portions of the transmission systems, and the proliferation of non-utility generators.
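The stability question raised above, whether a system stays intact through a major disturbance, is classically studied with the swing equation for a single machine against an infinite bus. The following is a minimal illustrative sketch of that technique; the model choice (a bolted fault that drops electrical output to zero) and all parameter values are assumptions for the example, not figures from the report:

```python
import math

def survives_fault(p_mech=0.8, p_max=2.0, inertia=0.1, damping=0.05,
                   fault_start=1.0, fault_clear=1.15, dt=0.001, t_end=10.0):
    """Semi-implicit Euler integration of the classical swing equation
    M * delta'' = Pm - Pmax * sin(delta) - D * delta'.
    During the fault the electrical output is assumed to drop to zero
    (a bolted three-phase fault). Returns True if the rotor angle stays
    bounded, i.e. the machine keeps synchronism after the disturbance."""
    delta = math.asin(p_mech / p_max)   # pre-fault equilibrium angle (rad)
    omega = 0.0                         # rotor speed deviation (rad/s)
    steps = int(t_end / dt)
    for n in range(steps):
        t = n * dt
        p_elec = 0.0 if fault_start <= t < fault_clear else p_max * math.sin(delta)
        accel = (p_mech - p_elec - damping * omega) / inertia
        omega += accel * dt
        delta += omega * dt
        if abs(delta) > math.pi:        # pole slip: loss of synchronism
            return False
    return True
```

With these illustrative numbers, a fault cleared after 0.15 s is survivable, while holding the same fault well past the critical clearing time drives the angle past pi and the machine loses synchronism.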

Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)

The technical problems to be solved in future underground engineering experiments are of two kinds. One concerns adequate description of the variation of nuclear explosion effects with the physical and chemical properties of the explosion site. The other concerns engineering of the explosive detonation system to provide adequate safety and security, concurrently with minimum total costs per explosion. The semiempirical equations for explosion effects can be trusted only in the range of explosive energy, depth of burst, and rock type for which there is prior experience. Effects calculations based on the principles of continuum mechanics and measurable geophysical properties appear to work in the few test cases, such as Gasbuggy, to which they have been applied. These calculational methods must be tested in a variety of situations. The relevance of dynamic and static measurements on Dragon Trail, Bronco, Rulison, Stoop, Ketch, and Pinedale to proving the methods is discussed in this paper. The traditional methods of assembling and fielding nuclear explosives have evolved from practice at the Nevada Test Site. These provide great flexibility and assure maximum recovery of all data from each test, thus minimizing the time required to achieve desired results. Timing and firing, radiation monitoring, explosives assembly and emplacement, explosive performance, weather monitoring, and dynamic measurements of earth and building motion have all been handled traditionally as independent functions. To achieve lower costs in underground engineering experiments and projects, one prototype system combining all electronic, measurement, and communication functions is being built. Much further work will be required to complete this effort, including, especially, an examination of safety criteria and means for assuring operational and public safety at reduced costs. (author)

In publishing this issue we want to announce two major events. The first is that, from September 1, Sportis Magazine will become an official scientific journal of the University of A Coruña; this integration aims to improve the journal's dissemination in the most prestigious international databases and to contribute to the visibility of a university committed to research in school sport, psychomotricity and Physical Education. The second is that the V World Congress of Sports, Physical Education and Psychomotor School will be held on November 5-7 at the Faculty of Education of the University of A Coruña. You can participate by presenting a communication, poster or virtual poster; the best papers will be invited for publication as research articles in following issues of the magazine. More information is available at www.sportis.es. In this third and final issue of the year we set a record number of articles, due in part to the increase in submissions: a total of 10 works by experts and professionals from different countries will be published. Among the topics addressed are first-aid teacher training, attitudes toward students with disabilities, futsal, the use of accelerometers, bodyboard training in the young, innovative proposals such as a teaching unit based on the hypopressive method, aquatic motor skills, the study of sports organizations, studies involving experts from different countries, and the analysis of skills in physical education. Undoubtedly an interesting issue to enjoy with a good coffee or tea while gaining greater technical and scientific knowledge. The whole Sportis team hopes you enjoy this new publication and invites you to submit your work for future issues. Receive a warm greeting, Prof. Dr. Víctor Arufe Giráldez, Chief Editor, Sportis. International Journal of School Sports, Physical Education and Psychomotor

Please visit the Southern Forest Futures Project website for more information. The Southern Forest Futures Project provides a science-based 'futuring' analysis of the forests of the 13 states of the Southeastern United States. With findings...

Rib fracture repair has been performed at selected centers around the world for more than 50 years; however, the operative indications have not been established and are considered controversial. The outcome of a strictly nonoperative approach may not be optimal. Potential indications for rib fracture repair include flail chest; painful, movable rib fractures refractory to conventional pain management; chest wall deformity/defect; rib fracture nonunion; and during thoracotomy for other traumatic indications. Rib fracture repair is technically challenging secondary to the human rib's relatively thin cortex and its tendency to fracture obliquely. Nonetheless, several effective repair systems have been developed. Future directions for progress on this important surgical problem include the development of minimally invasive techniques and the conduct of multicenter, randomized trials.

The challenges facing Ontario Hydro Nuclear (OHN), as understood at the time of the conference, are discussed. OHN had many strengths to build on in preparing for the competition ahead, including: extremely competitive production costs, strong technical capabilities, advantages of multiple units, environmental advantages favoring nuclear, strong public support, and improving station performance. Even with these advantages, OHN faced the difficult challenge of improving overall performance in the face of a large debt burden, coupled with the reinvestment demands of aging units at Pickering A and Bruce A. At the time of the conference, Bruce 2 had already been shut down, because the cost of retubing it and replacing its boilers could not be justified. The ''drive to nuclear excellence'' involves the simultaneous achievement of top performance in safety, reliability and cost; and to this end, changes were being made to reverse the trends indicated by disappointing ''peer reviews''

The following article is reproduced from 'The Hydrogen Economy: Opportunities and Challenges', edited by Michael Ball and Martin Wietschel, to be published by Cambridge University Press in June 2009. In the light of ever-increasing global energy use, the increasing cost of energy services, concerns over energy supply security, climate change and local air pollution, this book centres around the question of how growing energy demand for transport can be met in the long term. Given the sustained interest in and controversial discussion of the prospects of hydrogen, the authors highlight the opportunities and the challenges of introducing hydrogen as alternative fuel in the transport sector from an economic, technical and environmental point of view. Through its multi-disciplinary approach the book provides a broad range of researchers, decision makers and policy makers with a solid and wide-ranging knowledge base concerning the hydrogen economy. (author)

Technical) degree programme rated the organization and quality of the industrial attachment component of the programme. Challenges students faced during industrial attachment were also examined. A case study design was used. Twenty-five final ...

This study aimed at examining the challenges in implementing the Technical and Vocational Education and Training (TVET) curriculum in technical colleges in Southern Nigeria. It sought to respond to three research questions and was guided by one research hypothesis. A questionnaire with 14 items was used to collect ...

A study was conducted to develop a technical skills assessment tool for the training and development of challenge course facilitators. Researchers accessed two professional on-line listserves to collect a sample size of twenty-seven currently used technical skills assessment tools. The assessment tools were critically analysed by three independent…

Federal law requires states to develop performance measures and data-collection systems for secondary and postsecondary technical-skill attainment. This poses many challenges, such as defining technical skills, measurement, and when to assess students. In this article, the author outlines various assessment models and looks at the challenges…

Dec 1, 2015 ... challenges remain to be resolved, in designing robust wireless networks that can deliver the performance ... demonstrated the first radio transmission from the Isle of ... distances with better quality, less power, and smaller ...

Microbiology: Challenges and Future Directions D. Chandramohan Biological Oceanography Division, National Institute of Oceanography, Goa, India Introduction The planet earth is believed to have formed about 4600 million years ago and life forms originated... and all-important tasks that include everything from pathogenesis to fixing atmospheric nitrogen in the soil. An interesting question to be asked, therefore, is: has there been any realistic estimate of these bacteria on Earth? Now, for the first time, a...

Project management increasingly shapes workplace communication, especially when technical communicators participate in cross-disciplinary development teams. This paper looks at the future of project management in technical communication and argues for a communicative approach to project management for technical communication students. The Project Management course in the International Bachelor Program of Marketing and Management Communication at the Aarhus School of Business is described, and the implications for technical communication curricula are discussed.

Will library technical services exist thirty years from now? If so, what do leading experts see as the direction of the field? In this visionary look at the future of technical services, Mary Beth Weber has compiled a veritable who's who of the field to answer just these questions.

This paper reviews some of the key technical problems that remain to be solved in nuclear cratering technology. These include: (1) developing a broader understanding of the effects that material properties and water content of the earth materials around the shot have on cratering behavior, (2) extending the experimental investigation of retarc formation to include intermediate yields and various materials, and (3) improving our ability to predict the escape of radioactive material to the atmosphere to form the cloud source responsible for fallout. The formation processes of ejecta craters, retarcs, and subsidence craters are described in the light of our present understanding, and the major gaps in our understanding are indicated. Methods of calculating crater and retarc formation are discussed, with particular reference to the input information needed. Methods for calculating fallout are presented, and their shortcomings are discussed. A preliminary analysis of the safety factors associated with the presently proposed nuclear excavation concepts is presented. (author)

Among the new energy sectors in development, biogas has many benefits: several valorization possibilities (bio-methane, electricity and heat), continuous production, and easy storage. In Europe, and particularly in France, the bio-methane market will in the coming years be a driver for the improvement of the economic, environmental and social performance of the actors in the biogas value chain. ENEA releases a report on the current state of the bio-methane market in Europe. This publication mainly describes: an outlook of the market evolution and the corresponding stakes for the actors of this sector; the technical and economic characteristics, maturity level and specificities of each biogas upgrading process; and an analysis of the French regulatory framework for bio-methane injection into the grid.

Meningiomas are the most common intracranial primary neoplasm in adults. Over recent years, interest in this clinically diverse group of tumors has intensified, bringing new questions and challenges to the fore, particularly in the fields of epidemiology, radiology, pathology, genetics, and treatment. Interest in modern meningioma research has been stimulated by the high tumor prevalence and the advances in technology. The incidence of meningiomas is climbing, and may indicate increased exposure to environmental risk factors or more sensitive diagnostic modalities. Technological advances have dramatically improved radiologic imaging and radiotherapy treatments, and further refinements are under investigation. Furthermore, the current era of tumor genetics and molecular biology is challenging translational researchers to discover new, targeted, therapeutic agents. This review is an update on the recent advances in the understanding of meningiomas and their management, and highlights pertinent research questions to be addressed in the future.

Newfoundland and Labrador is on the brink of two extraordinary energy achievements: 1) becoming one of the world's only jurisdictions to phase out thermal generation almost entirely; and 2) making a huge contribution of renewable energy to North America. These achievements require the development of the 3,000 MW Lower Churchill Hydroelectric Project; however, the Project will not be developed without a business case to support it. This paper will highlight how the province, through its Energy Plan, has set the path forward for the future development of its renewable resources, including how it plans to overcome some of the challenges ahead.

A rehabilitation center is another form of health care organization that specializes in providing care for particular conditions of patients. Patients admitted in rehab centers range from being accident victims to those suffering with a specific illness. These organizations are becoming extremely valuable in providing patient care services. However, they have not marketed themselves as aggressively as other health care organizations. This article provides an insight regarding rehab centers and examines marketing issues using a SWOT (strengths, weaknesses, opportunities, and threats) analysis. It further provides some future prospects and challenges for marketers of these organizations.

Many factors come into play, most of which are discovered and resolved only during full-scale solidification testing of each of the media commonly used in nuclear power plants. Each waste stream is unique and must be addressed accordingly. This testing process is so difficult that Diversified's Vinyl Ester Styrene and Advanced Polymer Solidification are the only two approved processes in the United States today. This paper summarizes a few of the key obstacles that must be overcome to achieve a reliable, repeatable process for producing an approved Stable Class B and C waste form. Before other solidification and encapsulation technologies can be considered compliant with the requirements of a Stable waste form, the tests, calculations and reporting discussed above must be conducted for both the waste form and the solidification process used to produce it. Diversified's VERI TM and APS TM processes have gained acceptance in the UK. These processes have also been approved and gained acceptance in the U.S. because we have consistently overcome technical hurdles to produce a compliant product. Diversified Technologies' processes are protected intellectual property. In specific instances, we have patents pending on key parts of our process technology

In recent years, learning analytics (LA) has attracted a great deal of attention in technology-enhanced learning (TEL) research, as practitioners, institutions, and researchers increasingly see the potential that LA has to shape the future TEL landscape. Generally, LA deals with the development of methods that harness educational data sets to support the learning process. This paper provides a foundation for future research in LA. It provides a systematic overview of this emerging field and its key concepts through a reference model for LA based on four dimensions, namely data, environments, and context (what?), stakeholders (who?), objectives (why?), and methods (how?). It further identifies various challenges and research opportunities in the area of LA in relation to each dimension.
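The four-dimension reference model (what, who, why, how) can be pictured as a simple record describing any LA case study. This is only an illustrative sketch; the field contents and example values below are assumptions, not examples taken from the paper:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class LAReferenceModel:
    # One field per dimension of the reference model.
    what: Dict[str, str]   # data, environments, context
    who: List[str]         # stakeholders
    why: List[str]         # objectives
    how: List[str]         # methods

# Hypothetical case study expressed in the model's terms.
example = LAReferenceModel(
    what={"data": "LMS clickstream logs", "environment": "virtual",
          "context": "first-year course"},
    who=["learners", "teachers", "institution"],
    why=["predict drop-out risk", "personalise feedback"],
    how=["classification", "sequence mining"],
)
```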

Our future deep-space exploration faces many daunting challenges, but three of them loom high above the rest: physiological debilitation, radiation sickness and psychological stress. Many measures are presently being developed to reduce these difficulties. However, in the long run, two important new developments are required: abundant supply of power, and advanced space propulsion. The future looks bright, however. While the road is a long one, it is now well defined and many exciting explorations are within near-term reach. Biography: Dr. Chang-Diaz graduated from MIT in the field of applied plasma physics and fusion research. He has been a NASA space shuttle astronaut on seven missions between 1986 and 2002. As director of the ASP Laboratory in Houston, he continues research on plasma rockets. For more details, see www.jsc.nasa.gov/Bios/htmlbios/chang.html. Note: Tea and coffee will be served at 16:00 hrs.

There are three types of future neutrino facilities currently under study: one based on decays of stored beta-unstable ion beams (Beta Beams), one based on decays of stored muon beams (Neutrino Factory), and one based on the decays of an intense pion beam (Superbeam). In this paper we discuss the challenges each design team must face and the R&D being carried out to turn those challenges into technical opportunities. A new program, the Muon Accelerator Program, has begun in the U.S. to carry out the R&D for muon-based facilities, including both the Neutrino Factory and, as its ultimate goal, a Muon Collider. The goals of this program will be briefly described.

Future challenges are considered that may arise from technical, socio-economic and political issues; organizational, management and human aspects; and international issues. The perceived challenges have been grouped into four categories, each covered by a chapter. First, technical issues are addressed that may present regulatory challenges in the future, notably ageing nuclear power plants. External changes to the industry that have an effect on regulators are considered next: privatization, the consequences of cost reduction, commercialization, etc. This is followed by the impacts of internal changes: organizational, managerial, human resources, licensing, staff training, etc. Finally, international issues with potential regulatory impact are discussed. (R.P.)

Ascent of magma through Earth's crust is normally associated with, among other effects, ground deformation and gravity changes. Geodesy is thus a valuable tool for monitoring and hazards assessment during volcanic unrest, and it provides valuable data for exploring the geometry and volume of magma plumbing systems. Recent decades have seen an explosion in the quality and quantity of volcano geodetic data. New datasets (some made possible by regional and global scientific initiatives), as well as new analysis methods and modeling practices, have resulted in important changes to our understanding of the geodetic characteristics of active volcanism and magmatic processes, from the scale of individual eruptive vents to global compilations of volcano deformation. Here, we describe some of the recent developments in volcano geodesy, both in terms of data and interpretive tools, and discuss the role of international initiatives in meeting future challenges for the field.

The Chairman of the Senate Commission for Industry, Tourism and Commerce emphasises the legitimate ambition felt by the citizens of all democratic States with respect to safety and underlines his trust in the regulatory body. He analyses the interesting example of the Spanish Nuclear Safety Council, a true reflection of the democracy achieved in the country, from the standpoint of its history, the functions that have been added to its realm of competence in recent years (environmental radiological surveillance, intervention in emergencies and activities at non-regulated facilities), and the need for it to adapt, within a framework of overall consensus, to improve its response to future challenges and to the goal of promoting credibility and forging a closer relationship with the public. (Author)

Pratibha Singhi, Arushi Gahlot Saini, Department of Pediatrics, Pediatric Neurology and Neurodevelopment Unit, Advanced Pediatrics Centre, Post Graduate Institute of Medical Education and Research, Chandigarh, India. Abstract: Neurocysticercosis (NCC) is an acquired infection of the nervous system caused by encysted larvae of Taenia solium. It is a major cause of epilepsy in the tropics and the commonest cause of focal seizures in North Indian children. T. solium teniasis-cysticercosis is considered a parasitic 'Neglected Tropical Disease' endemic throughout Southeast Asia. NCC in children has pleomorphic manifestations depending on the location, number and viability of the cysts, and the host response. Even with advancing knowledge of the disease manifestations, many aspects related to diagnosis and treatment, particularly in children, still remain controversial and pose challenges to clinical practice. There is no gold standard test to diagnose NCC, and the management recommendations are still emerging. This review provides an overview of the diagnosis of NCC in children and its management, with special focus on current challenges and future prospects. Keywords: neurocysticercosis, children, epilepsy, ring enhancing lesions, pigs

While countries vary significantly in the financing of dental care, they are much more alike in the delivery of dentistry. Dental care is principally provided in dental offices and clinics that are independent business entities whose business leaders are most often the dentists themselves. However, society expects from dentists a level of professionalism (i.e. habitually acting ethically, both in terms of competence and conduct) in contrast to the methods and motivations of the marketplace. This is why the single most important challenge of dental professional ethics continues to be giving proper priority to patients' well-being and building ethically correct decision-making relationships with patients while, at the same time, trying to maintain a successful business operation. If we look into dentistry's future, the centrality of this aspect of professional ethics is not likely to change, although the ways in which dentists might violate this trust will probably multiply as funding mechanisms become increasingly complex. It is important that dentists reflect with fresh eyes on their ethical commitments. One challenge is the increased availability of oral health information to the public and the fact that so many people are uncritical of the accuracy of information in the media and on the web. A second is the increase in the amount of health care advertising in many societies. A third is the growth of aesthetic dentistry, which differs from standard oral health care in important and ethically significant ways. The fourth is insurance, which frequently complicates the explanation of a patient's treatment alternatives and often brings a third party into the treatment decision relationship. The ethical challenges of each of these factors will be considered and ultimately tied to the central theme of dental professionalism.

Atmospheric OSSEs have had a much longer history of applications than OSSEs (and OSEs) in oceanography. Long-standing challenges include the presence of coastlines and steep bathymetric changes, which require the superposition of a wide variety of space and time scales, leading to difficulties in ocean observation and prediction. For instance, remote sensing is critical for providing a quasi-synoptic oceanographic view, but the coverage is limited to the ocean surface. Conversely, in situ measurements are capable of monitoring the entire water column, but at a single location and usually for a specific, short time. Despite these challenges, substantial progress has been made in recent years, and international initiatives have provided successful OSSE/OSE examples and formed appropriate forums that helped define the future roadmap. These will be discussed, together with various challenges that require a community effort. Examples include: integrated (remote and in situ) observing system requirements for monitoring large-scale and climatic changes, vs. short-term variability that is particularly important on the regional and coastal spatial scales; satisfying the needs of both global and regional/coastal nature runs, from development to rigorous evaluation and under a clear definition of metrics; data assimilation in the presence of tides; and estimation of real-time river discharges for Earth system modeling. An overview of oceanographic efforts that complement the standard OSSE methodology will also be given. These include ocean array design methods, such as representer-based analysis and adaptive sampling. Exciting new opportunities for both global and regional ocean OSSE/OSE studies have recently become possible with targeted periods of comprehensive data sets, such as the existing Gulf of Mexico observations from multiple sources in the aftermath of the DeepWater Horizon incident and the upcoming airborne AirSWOT, in preparation for the SWOT (Surface Water and Ocean

Volume seven of the Review will mark the tenth anniversary of the Canadian HIV/AIDS Legal Network with a series of articles that describe past developments and future directions in several areas of policy and law related to HIV/AIDS. The following article is the first of these, discussing current challenges and future directions in the development of and access to HIV vaccines. It argues that governments are under public health, ethical, and legal obligations to develop and provide access to HIV vaccines. It further explains what is required for governments to fulfill their obligations: additional commitment and resources for HIV vaccine development in the context of increased global research and development regarding diseases of the poor; increased support and advocacy for partnerships to develop HIV vaccines; enhanced regulatory capacity in every country to review, approve, and monitor HIV vaccines; and assurance of global supply of, procurement of, delivery of, and access to vaccines in the context of efforts to increase global access to public health measures and technologies.

The safeguards system is experiencing what has been seen as a revolution and, in doing so, it is confronting a series of challenges. These can be grouped into three areas. Drawing and maintaining safeguards conclusions - The process by which the safeguards conclusions are derived is based upon the analysis, evaluation and review of all the information available to the Agency. This process is on-going, but the State Evaluation Reports are compiled and reviewed periodically. For States with an additional protocol in force, the absence of indicators of the presence of undeclared nuclear material or activities provides the basis for the safeguards conclusion. Future challenges center on States' expectations of, and reactions to, the results of the evaluation and review process. Designing and implementing integrated safeguards - The conceptual framework of integrated safeguards is being actively pursued. Basic principles have been defined and integrated safeguards approaches have been developed for various types of facilities. Work is also progressing on the design of integrated safeguards approaches for specific States. Complementary access is being successfully implemented, and procedures for the use of unannounced inspections are being developed with the prospect of cost-effectiveness gains. Cost neutrality vs. quality and credibility - The Department faces serious staff and financial challenges. It has succeeded so far in 'doing more' and 'doing better' within a zero-real-growth budget, but the scope for further significant efficiency gains is exhausted. There is no capacity to absorb new or unexpected tasks. Difficulties in recruiting and retaining qualified and experienced staff exacerbate the problems and add to costs. The Director General of the IAEA has referred to the need for new initiatives to bridge the budgetary gap; a possible measure is proposed. The tasks of meeting the challenges and demands of strengthened safeguards have been added to

The increase in power consumption, the development of renewable energy sources and the emergence of new uses such as the electric car are all challenges that put the reliability and responsiveness of our power grids to the test. These grids must evolve to become 'intelligent' through the integration of new information and communication technologies across the whole supply chain, from energy generation to its end use by consumers. For the first time in France, the actors of this change give their views on this revolution and put its full extent and complexity into perspective. Making power grids intelligent is first of all a technical challenge, but also a societal one: the consumer will become an actor involved in managing his own energy demand and a renewable energy producer able to interact with the grid to an increasing degree. This worldwide change, which we are about to witness, faces numerous obstacles. The aim of this book is to examine the factors determining the success of this large-scale change through its technical, economic and social dimensions. It shows that the emergence of such an advanced power system is not possible without reconciling some contradictory goals, nor without strong coordination between the actors. Content: Part 1 - intelligent power networks to answer the 21st century challenges: 1 - the European and French dimension of the electric power sector; 2 - towards a carbon-free economy; 3 - a power grid facing new challenges; 4 - the pre-figuration of intelligent power grids; 5 - the deployment of intelligent (smart) grids; Part 2 - perspectives of smart grids development: 1 - the future of power networks; 2 - a new industrial era; Part 3 - the consumer's position in the deployment of future grids: 1 - changing behaviours; 2 - making the consumer a 'consum'actor'. Synthesis and conclusion. (J.S.)

Full Text Available Wireless capsule endoscopy (WCE) offers a feasible noninvasive way to examine the whole gastrointestinal (GI) tract and revolutionizes the diagnosis technology. However, compared with wired endoscopies, the limited working time, the low frame rate, and the low image resolution limit the wider application. The progress of this new technology is reviewed in this paper, and the evolution tendencies are analyzed to be high image resolution, high frame rate, and long working time. Unfortunately, the power supply of the capsule endoscope (CE) is the bottleneck. Wireless power transmission (WPT) is the promising solution to this problem, but it is also a technical challenge. The active CE is another tendency and will be the next generation of the WCE. Nevertheless, it will not come true soon unless a practical locomotion mechanism for the active CE in the GI tract is achieved. The locomotion mechanism is the other technical challenge, besides the challenge of WPT. The progress on WPT and active capsule technology is reviewed.
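
The WPT bottleneck described above can be made concrete with the standard figure-of-merit result for a resonant inductive link, whose maximum achievable efficiency is eta_max = x / (1 + sqrt(1 + x))^2 with x = k^2 * Q_tx * Q_rx. A minimal sketch; the coupling coefficient and quality factors below are illustrative assumptions for a capsule-scale coil pair, not values from the paper:

```python
import math

def max_link_efficiency(k: float, q_tx: float, q_rx: float) -> float:
    """Maximum power-transfer efficiency of a resonant inductive link.

    Standard figure-of-merit result: eta_max = x / (1 + sqrt(1 + x))^2,
    where x = k^2 * Q_tx * Q_rx.
    """
    x = k * k * q_tx * q_rx
    return x / (1.0 + math.sqrt(1.0 + x)) ** 2

# Illustrative (assumed) values: weak coupling k ~ 0.01 through tissue,
# transmit and receive coil quality factors ~ 100.
eta = max_link_efficiency(0.01, 100.0, 100.0)  # roughly 0.17
```

Even with optimistic coil quality factors, the weak coupling through the body caps the efficiency well below unity, which is why WPT remains a challenge rather than a drop-in solution.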

The International Atomic Energy Agency (IAEA) Regulations for the Safe Transport of Radioactive Materials, TS-R-1, set the standards for the packages used in the transport of radioactive materials under both normal and accident conditions. Transport organisations are also required to implement Radiation Protection Programmes to control radiation dose exposure to both workers and the public. The industry has now operated under this regulatory regime safely and efficiently for nearly 50 years. It is vital that this record be maintained in the future when the demands on the transport industry are increasing. Nuclear power is being called upon more and more to satisfy the world's growing need for sustainable, clean and affordable electricity and there will be a corresponding demand for nuclear fuel cycle services. There will also be a growing need for other radioactive materials, notably large sources such as Cobalt-60 sources for a range of important medical and industrial uses, as well as radio-pharmaceuticals. A reliable transport infrastructure is essential to support all these industry sectors and the challenge will be to ensure that this can be maintained safely and securely in a changing world where public and political concerns are increasing. This paper will discuss the main issues which need to be addressed. The demand for uranium has led to increased exploration and the development of mines in new locations far removed from the demand centres. This inevitably leads to more transport, sometimes from areas potentially lacking in transport infrastructure, service providers, and experience. The demand for sources for medical applications will also increase, particularly from the rapidly developing regions and this will also involve new transport routes and increased traffic. This raises a variety of issues concerning the ability of the transport infrastructure to meet the future challenge, particularly in an environment where there already exists reluctance on

The goal of cardiac rehabilitation is to support heart patients using a multidisciplinary team in order to obtain the best possible physical and mental health and achieve long-term social reintegration. In addition to improving physical fitness, cardiac rehabilitation restores self-confidence, thus better equipping patients to deal with mental illness and improving their social reintegration ("participation"). Once the causes of disease have been identified and treated as effectively as possible, drug and lifestyle changes form the focus of cardiac rehabilitation measures. For particular diseases, rehabilitation offers the opportunity for targeted measures, such as educational courses for diabetics, drug dose escalation, and special training for heart failure patients. A nationwide network of outpatient heart groups is available for targeted follow-up. Cardiac patients undergoing follow-up rehabilitation today are older and have greater morbidity than in the past; moreover, they generally leave acute clinical care earlier and are discharged from hospital more quickly. The proportion of severely ill and multimorbid patients presents a diagnostic and therapeutic challenge in cardiac rehabilitation, although cardiac rehabilitation was not initially conceived for this patient group. The benefit of cardiac rehabilitation is a well-documented reduction in morbidity and mortality. However, hurdles remain, partly due to the patients themselves, partly due to the health insurers. Some insurance providers still refuse rehabilitation for non-ST-segment elevation infarction. In principle rehabilitation can be carried out in an inpatient or an outpatient setting. Specific allocation criteria have not yet been established, but the structure and process quality of outpatient rehabilitation should correspond to that of the inpatient setting. The choice between the two settings should be based on pragmatic criteria. Both settings should be possible for an individual

Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency wide (10 Field Centers and NASA HQ) using an open innovation service provider crowd sourcing platform to post NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit. The Yet2.com challenges yielded many new industry and academic contacts in bone

For future and advanced launch vehicles emphasis is focused on single-stage-to-orbit (SSTO) concepts and on completely reusable versions with the goal to reduce the recurrent launch cost, to improve the mission success probability and also safety for the space transportation of economically attractive payloads into Low Earth Orbit. Both issues, the SSTO launcher and the low cost reusability, are extremely challenging and cannot be proven by studies and on-ground tests alone. In-flight demonstration tests are required to verify the assumptions and the new technologies, and to justify the new launcher and operations concepts. Because a number of SSTO launch vehicles are currently under discussion in terms of configurations and concepts such as winged vehicles for vertical or horizontal launch and landing (from ground or a flying platform), or wingless vehicles for vertical take-off and landing, and also in terms of propulsion (pure rockets or a combination of air breathing and rocket engines), an experimental demonstrator vehicle appears necessary in order to serve as a pathfinder in this area of multiple challenges. A suborbital Reusable Rocket Launcher Demonstrator (RRLD) has been studied recently by a European industrial team for ESA. This is a multipurpose, evolutionary demonstrator, conceived around a modular approach of incremental improvements of subsystems and materials, to achieve a better propellant mass fraction i.e. a better performance, and specifically for the accomplishment of an incremental flight test programme. While the RRLD basic test programme will acquire knowledge about hypersonic flight, re-entry and landing of a cryogenic rocket propelled launcher — and the low cost reusability (short turnaround on ground) in the utilization programme beyond basic testing, the RRLD will serve as a test bed for generic testing of technologies required for the realization of an SSTO launcher. This paper will present the results of the European RRLD study which

Over the past 20 years, altimetry calibration has evolved from an engineering-oriented exercise to a multidisciplinary endeavor driving the state of the art. This evolution has been spurred by the developing promise of altimetry to capture the large-scale, but small-amplitude, changes of the ocean surface containing the expression of climate change. The scope of altimeter calibration/validation programs has expanded commensurately. Early efforts focused on determining a constant range bias and verifying basic compliance of the data products with mission requirements. Contemporary investigations capture, with increasing accuracies, the spatial and temporal characteristics of errors in all elements of the measurement system. Dedicated calibration sites still provide the fundamental service of estimating absolute bias, but also enable long-term monitoring of the sea-surface height and constituent measurements. The use of a network of island and coastal tide gauges has provided the best perspective on the measurement stability, and revealed temporal variations of altimeter measurement system drift. The cross-calibration between successive missions provided fundamentally new information on the performance of altimetry systems. Spatially and temporally correlated errors pose challenges for future missions, underscoring the importance of cross-calibration of new measurements against the established record.
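
In the simplest case, the constant range bias and drift described above reduce to an ordinary least-squares line fitted to altimeter-minus-tide-gauge height differences collected at a calibration site. A minimal sketch with synthetic numbers (the function name and data are illustrative, not from any mission record):

```python
def bias_and_drift(times, diffs):
    """Ordinary least-squares fit of diffs ~ bias + drift * times.

    times : measurement epochs (e.g. years since launch)
    diffs : altimeter minus tide-gauge sea-surface heights (e.g. mm)
    Returns (bias at t=0, drift per unit time).
    """
    n = len(times)
    t_mean = sum(times) / n
    d_mean = sum(diffs) / n
    cov = sum((t - t_mean) * (d - d_mean) for t, d in zip(times, diffs))
    var = sum((t - t_mean) ** 2 for t in times)
    drift = cov / var
    bias = d_mean - drift * t_mean
    return bias, drift

# Synthetic series: a 5 mm constant bias plus a 1 mm/yr drift.
years = [0.0, 1.0, 2.0, 3.0, 4.0]
diffs = [5.0, 6.0, 7.0, 8.0, 9.0]
bias, drift = bias_and_drift(years, diffs)  # bias = 5.0 mm, drift = 1.0 mm/yr
```

Real calibration programs estimate many more error terms (spatially and temporally correlated), but this captures the core idea behind the early constant-bias efforts and the later drift monitoring with tide-gauge networks.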

An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight to future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
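
OpenSearch, one of the protocols named above, advertises query endpoints as URL templates with `{name}` placeholders (optional ones marked `{name?}`). A minimal sketch of filling such a template for programmatic access; the endpoint URL below is hypothetical, not a real broker service:

```python
import re
from urllib.parse import quote

def fill_opensearch_template(template: str, **params) -> str:
    """Substitute OpenSearch URL-template parameters ({name} placeholders).

    Values are percent-encoded; any optional placeholders ({name?}) left
    unfilled are replaced with an empty string, per the OpenSearch
    URL-template convention.
    """
    url = template
    for name, value in params.items():
        encoded = quote(str(value))
        url = url.replace("{" + name + "}", encoded)
        url = url.replace("{" + name + "?}", encoded)
    # Drop any remaining unfilled optional placeholders.
    return re.sub(r"\{[^{}]+\?\}", "", url)

# Hypothetical broker endpoint (illustrative only):
template = ("https://broker.example.org/search"
            "?q={searchTerms}&start={startIndex?}&count={count?}")
url = fill_opensearch_template(template, searchTerms="sea surface height",
                               count=10)
```

The same filled URL serves both a human pasting it into a browser and a system issuing it programmatically, which is the dual-access point made above.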

What is technical democracy? And why does it matter for urban studies? As an introduction to this special feature, we address these questions by reflecting on To Our Friends, the 2014 manifesto of the Invisible Committee. We engage in particular its provocative diagnosis of the current situation: power no longer resides in the modern institutions of representative democracy and the market economy; instead, power has become a matter of logistics, infrastructures and expertise. This diagnosis, we suggest, brings into view the challenge of technical democracy, that is, the democratization of techno-scientific expertise and the instauration of forms of lasting collaboration among experts and laypeople. Urban politics, we claim, increasingly turns around socio-technical controversies and it is in terms of the politics of expertise that we should analyse and engage it. Building on Science and Technology Studies...

Two decades of extensive investment in nanomaterials, nanofabrication and nanometrology have provided the global engineering community a vast array of new technologies. These technologies not only promise radical change to traditional industries, such as transportation, information and aerospace, but may create whole new industries, such as personalized medicine and personalized energy harvesting and storage. The challenge today for the defense aerospace community is determining how to accelerate the conversion of these technical opportunities into concrete benefits with quantifiable impact, in conjunction with identifying the most important outstanding scientific questions that are limiting their utilization. For example, nanomaterial fabrication delivers substantial tailorability beyond a traditional material data sheet. How can we integrate this tailorability into agile manufacturing and design methods to further optimize the performance, cost and durability of future resilient aerospace systems? The intersection of nano-based metamaterials and nanostructured devices with biotechnology epitomizes the technological promise of autonomous systems and enhanced human-machine interfaces. What then are the key materials and processes challenges that are inhibiting current lab-scale innovation from being integrated into functioning systems to increase effectiveness and productivity of our human resources? Where innovation is global, accelerating the use of breakthroughs, both for commercial and defense, is essential. Exploitation of these opportunities and finding solutions to the associated challenges for defense aerospace will rely on highly effective partnerships between commercial development, scientific innovation, systems engineering, design and manufacturing.

Integrated positron emission tomography (PET)/magnetic resonance imaging (MRI), which can provide complementary functional and anatomical information about a specific organ or body system at the molecular level, has become a powerful imaging modality to understand the molecular biology details, disease mechanisms, and pharmacokinetics in animals and humans. Although the first experiment on the PET/MRI was performed in the early 1990s, its clinical application was accomplished in recent years because there were various technical challenges in integrating PET and MRI in a single system with minimum mutual interference between PET and MRI. This paper presents the technical challenges and recent advances in combining PET and MRI along with several approaches for improving PET image quality of the PET/MRI hybrid imaging system.

The funds allocated through the Wind Powering America (WPA) grant were utilized by the State of Montana to support broad outreach activities communicating the benefits and opportunities of increased wind energy and transmission development. The challenges to increased wind development were also clearly communicated with the understanding that a clearer comprehension of the challenges would be beneficial in overcoming the obstacles to further development. The ultimate purpose of these activities was to foster the increased development of Montana's rich wind resources through increased public acceptance and wider dissemination of technical resources.

This report recaps the "Science Drivers and Technical Challenges for Advanced Magnetic Resonance" workshop, held in late 2011. This exploratory workshop's goal was to discuss and address challenges for the next generation of magnetic resonance experimentation. During the workshop, participants from throughout the world outlined the science drivers and instrumentation demands for high-field dynamic nuclear polarization (DNP) and associated magnetic resonance techniques, discussed barriers to their advancement, and deliberated the path forward for significant and impactful advances in the field.

The paper describes some of the most recent activities in Germany in the technical assessment of future European launcher architecture. In focus is a joint effort of DLR-SART with German launcher industry in the definition of a next generation upper-medium class expendable TSTO with an initial operational capability after 2020. Involved companies are EADS astrium and MT Aerospace. This DLR-agency funded study WOTAN investigates fully cryogenic launchers as well as those with a com...

Approximately two years after promulgation of the Energy Employees Occupational Illness Compensation Program Act, the National Institute for Occupational Safety and Health Office of Compensation and Analysis Support selected a contractor team to perform many aspects of the radiation dose reconstruction process. The project scope and schedule necessitated the development of an organization involving a comparatively large number of health physicists. From the initial stages, there were many technical and managerial challenges that required continuous planning, integration, and conflict resolution. This paper identifies those challenges and describes the resolutions and lessons learned. These insights are hopefully useful to managers of similar scientific projects, especially those requiring significant data, technical methods, and calculations. The most complex challenge has been to complete defensible, individualized dose reconstructions that support timely compensation decisions at an acceptable production level. Adherence to applying claimant-favorable and transparent science consistent with the requirements of the Act has been the key to establishing credibility, which is essential to this large and complex project involving tens of thousands of individual stakeholders. The initial challenges included garnering sufficient and capable scientific staff, developing an effective infrastructure, establishing necessary methods and procedures, and integrating activities to ensure consistent, quality products. The continuing challenges include maintaining the project focus on recommending a compensation determination (rather than generating an accurate dose reconstruction), managing the associated very large data and information management challenges, and ensuring quality control and assurance in the presence of an evolving infrastructure. The lessons learned concern project credibility, claimant favorability, project priorities, quality and consistency, and critical

Forecasts of space power needs are presented. The needs fall into three broad categories: survival, self-sufficiency, and industrialization. The cost of delivering payloads to orbital locations and from Low Earth Orbit (LEO) to Mars are determined. Future launch cost reductions are predicted. From these projections the performances necessary for future solar and nuclear space power options are identified. The availability of plentiful cost effective electric power and of low cost access to space are identified as crucial factors in the future extension of human presence in space
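
Launch cost reductions of the kind projected above are often modeled with an experience curve (Wright's law), under which each doubling of cumulative units multiplies unit cost by a fixed learning rate. A minimal sketch; the first-unit cost and the 85% learning rate are illustrative assumptions, not figures from the study:

```python
import math

def projected_unit_cost(first_unit_cost: float, unit_number: int,
                        learning_rate: float = 0.85) -> float:
    """Experience-curve (Wright's law) cost projection.

    Each doubling of cumulative production multiplies unit cost by
    `learning_rate`: cost(n) = c1 * n ** log2(learning_rate).
    """
    exponent = math.log(learning_rate, 2)
    return first_unit_cost * unit_number ** exponent

# Illustrative: $10,000/kg first-unit cost; the 8th unit sits three
# doublings down the curve, so its cost is 10_000 * 0.85**3.
c8 = projected_unit_cost(10_000.0, 8)
```

Under these assumptions the eighth unit comes in at about $6,141/kg, showing how modest per-doubling gains compound into the large reductions that make the survival/self-sufficiency/industrialization scenarios affordable.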

Full Text Available Preimplantation genetic diagnosis (PGD) is a clinically feasible technology to prevent the transmission of monogenic inherited disorders in families afflicted with the diseases to their future offspring. The major technical hurdle is that there is no general formula for all mutations; each gene locus needs an individualized, customized design to make the diagnosis accurate enough to be applied in PGD, in which the quantity of DNA is scarce, whereas a timely result is sometimes requested if fresh embryo transfer is desired. On the other hand, preimplantation genetic screening (PGS) screens embryos for aneuploidy and is also known as PGD-A (A denotes aneuploidy), with the aim of enhancing implantation rates as well as live-birth rates. In contrast to PGD, PGS is still under ferocious debate, especially since recent reports found that euploid babies were born after transferring embryos diagnosed as aneuploid by PGS back to the womb, and only very few randomized trials of PGS are available in the literature. We have been doing PGD and/or PGS for more than 10 years as one of the core PGD/PGS laboratories in Taiwan. Here we provide a concise review of PGD/PGS regarding its current status, both domestically and globally, as well as its future challenges.

Full Text Available Vehicular ad hoc networks (VANETs) have been studied intensively due to their wide variety of applications and services, such as passenger safety, enhanced traffic efficiency, and infotainment. With the evolution of technology and sudden growth in the number of smart vehicles, traditional VANETs face several technical challenges in deployment and management due to less flexibility, scalability, poor connectivity, and inadequate intelligence. Cloud computing is considered a way to satisfy these requirements in VANETs. However, next-generation VANETs will have special requirements of autonomous vehicles with high mobility, low latency, real-time applications, and connectivity, which may not be resolved by conventional cloud computing. Hence, merging of fog computing with the conventional cloud for VANETs is discussed as a potential solution for several issues in current and future VANETs. In addition, fog computing can be enhanced by integrating Software-Defined Networking (SDN), which provides flexibility, programmability, and global knowledge of the network. We present two example scenarios for timely dissemination of safety messages in future VANETs based on fog and a combination of fog and SDN. We also explain the issues that need to be resolved for the deployment of three different cloud-based approaches.
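
The latency argument for fog over a remote cloud can be illustrated with a toy end-to-end delay model: traversal hops plus processing time. A minimal sketch; the hop counts and per-hop delays below are assumed round numbers for illustration, not measurements from the paper:

```python
def dissemination_delay(hops: int, per_hop_ms: float,
                        processing_ms: float) -> float:
    """Toy end-to-end delay: network traversal plus node processing."""
    return hops * per_hop_ms + processing_ms

# Illustrative (assumed) numbers: a roadside fog node is one wireless hop
# from the vehicle, while the remote cloud sits behind a multi-hop backhaul.
fog_ms = dissemination_delay(hops=1, per_hop_ms=5.0, processing_ms=2.0)
cloud_ms = dissemination_delay(hops=8, per_hop_ms=5.0, processing_ms=2.0)
```

Even with identical processing cost at both tiers, the extra backhaul hops dominate, which is why safety messages with hard deadlines favor the fog tier while the cloud retains the global, non-real-time workloads.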

The future appears rich in missions that will extend the frontiers of knowledge, human presence in space, and opportunities for profitable commerce. The key to success of these ventures is the availability of plentiful, cost effective electric power and assured, low cost access to space. While forecasts of space power needs are problematic, an assessment of future needs based on terrestrial experience was made. These needs fall into three broad categories-survival, self sufficiency and industrialization. The cost of delivering payloads to orbital locations from low earth orbit (LEO) to Mars was determined and future launch cost reductions projected. From these factors, then, projections of the performance necessary for future solar and nuclear space power options were made. These goals are largely dependent upon orbital location and energy storage needs

Biomicrofluidics is an active area of research at present, exploring the synergy of microfluidics with cellular and molecular biology ... Future directions of research on these topics are also discussed.

The IAEA safeguards system is considered the cornerstone of the international nuclear non-proliferation regime. Effective implementation of this legal instrument enables the IAEA to draw a conclusion with a high degree of confidence on the peaceful use of nuclear material and activities in the State. This paper aims to provide an opportunity to address the various challenges encountered by the IAEA. Strengthening the safeguards system for verification is one of the most urgent challenges facing the IAEA. The IAEA should be able to provide credible assurance not only about the declared use of nuclear material and facilities but also about the absence of undeclared material and activities. Implementation of IAEA safeguards continues to play a vital role within the nuclear non-proliferation regime. The IAEA must move towards a more enhanced safeguards system driven by the full use of all available safeguards-relevant information. The safeguards system must be responsive to evolving challenges and must continue to innovate through the efficient implementation of more effective safeguards.

This slide presentation reviews the use of traditional and innovative techniques to solve technical challenges in food storage technology. Planning for a mission to Mars is underway, and food storage technology requires improvement: current food storage technology is inadequate, refrigerators or freezers are not available for food preservation, and a shelf life of 5 years is expected. A 10-year effort to improve food packaging technology has not significantly enhanced food packaging capabilities. Two innovation techniques, InnoCentive and Yet2.com, were attempted and have provided good results; they are still under due diligence for solver verification.
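
The 5-year shelf-life requirement above is commonly reasoned about with the Q10 rule of thumb from accelerated shelf-life testing: shelf life shrinks by a fixed factor for every 10 °C rise in storage temperature. A minimal sketch; the reference temperature and the Q10 value of 2 are assumptions for illustration, not values from the presentation:

```python
def shelf_life_at(temp_c: float, ref_life_years: float,
                  ref_temp_c: float = 21.0, q10: float = 2.0) -> float:
    """Q10 accelerated shelf-life rule of thumb.

    Shelf life changes by a factor of `q10` per 10 deg C:
    life(T) = ref_life * q10 ** ((ref_temp - T) / 10).
    """
    return ref_life_years * q10 ** ((ref_temp_c - temp_c) / 10.0)

# Illustrative: a food item rated for 5 years at 21 deg C.
life_31c = shelf_life_at(31.0, 5.0)  # 10 deg C warmer -> 2.5 years
life_11c = shelf_life_at(11.0, 5.0)  # 10 deg C cooler -> 10.0 years
```

The same rule explains why the absence of refrigeration is so damaging to the 5-year target: without cold storage, the packaging and formulation alone must compensate for the temperature-driven loss of shelf life.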

'Technical progress safeguards future', the guiding theme of the 1985 conference of German engineers, calls for discussion. In five lectures representatives of the subdivisions of 'VDI' issued their statements from the viewpoints of their special fields. These lectures were completed by reports on the part of the remaining VDI subdivisions, which are published together with the lectures in this volume. The complex guiding theme is meant to stimulate discussion, which should be conducted also with representatives of other sciences and the public. The volume contains, among others, contributions regarding future prospects, given certain modifications in construction engineering and user behaviour in the sector heating and air-conditioning, regarding the development of new construction techniques to protect the environment, and regarding clean air as an international concern of engineers. For these three contributions separate entries were made. Other presentations relate to: automobile production technology; energy supply as an engineering task; information, invention, innovation as stages of technical progress; progress in materials technology; noise of motor vehicles - current state and future prospects. (orig./HSCH).

This study provides basic background information about the establishment of the IAEA, its mission, major activities, General Conference, and Board of Governors, and the structure and functions of the Secretariat. The IAEA Mid-term Plan, to be implemented in the years 1998 - 2003, which includes the enhancement of functional effectiveness, analysis of changing developments, adjustment of priorities, and evaluation of programmes, is described in full detail. This plan is divided into 6 major areas: nuclear power and the fuel cycle; nuclear applications; nuclear, radiation and radwaste safety; verification and security of nuclear material; management of technical cooperation for development; and policy making, coordination and support. It is also expected that the IAEA plan provides an opportunity to understand the future directions of IAEA programmes and its operational philosophy, thus greatly contributing to Korea's establishment of its own future directions for expanded cooperation with the IAEA and urging the development of effective domestic strategies. This plan will also contribute to the evaluation of Korea's responsibility as a member of the Board of Governors as well as enhance Korea's role as an Advisory Group member. It is expected that this study will be useful for nuclear-related organizations wishing to establish basic directions for the efficient implementation of IAEA technical cooperation programmes in the future. (author). 16 refs., 6 tabs., 16 figs

Full Text Available Over the past two years, George Washington University Libraries developed Social Feed Manager (SFM), a Python and Django-based application for collecting social media data from Twitter. Expanding the project from a research prototype to a more widely useful application has presented a number of technical challenges, including changes in the Twitter API, supervision of simultaneous streaming processes, management, storage, and organization of collected data, meeting researcher needs for groups or sets of data, and improving documentation to facilitate other institutions' installation and use of SFM. This article will describe how the Social Feed Manager project addressed these issues, the use of supervisord to manage processes, and other technical decisions made in the course of this project through late summer 2014. This article is targeted towards librarians and archivists who are interested in building collections around web archives and social media data, and have a particular interest in the technical work involved in applying software to the problem of building a sustainable collection management program around these sources.

This is the second half of a two paper series covering aspects of the no fault found (NFF) phenomenon, which is highly challenging and is becoming even more important due to increasing complexity and criticality of technical systems. Part 1 introduced the fundamental concept of unknown failures from an organizational, behavioral and cultural stand point. It also reported an industrial outlook to the problem, recent procedural standards, whilst discussing the financial implications and safety concerns. In this issue, the authors examine the technical aspects, reviewing the common causes of NFF failures in electronic, software and mechanical systems. This is followed by a survey on technological techniques actively being used to reduce the consequence of such instances. After discussing improvements in testability, the article identifies gaps in literature and points out the core areas that should be focused in the future. Special attention is paid to the recent trends on knowledge sharing and troubleshooting tools; with potential research on technical diagnosis being enumerated

This paper presents an overview of the applications of ultrasound for the treatment of an ever-growing range of medical conditions. After presenting a brief history of the development of therapeutic ultrasound, the different mechanisms by which beneficial bio-effects are triggered will be discussed. This will be followed by a discussion of some of the more promising applications, some of which have already been licensed and introduced into the clinic. The case of liver tumour ablation will be discussed to demonstrate some of the engineering challenges that still need to be overcome before this technology finds wider uptake in the medical world.

Pertussis, or whooping cough, has recently reemerged as a major public health threat despite high levels of vaccination against the etiological agent, Bordetella pertussis. In this Review, we describe the pathogenesis of this disease, with a focus on recent mechanistic insights into virulence factor function. We also discuss the changing epidemiology of pertussis and the challenges of vaccine development. Despite decades of research, many aspects of B. pertussis physiology and pathogenesis remain poorly understood. We highlight knowledge gaps that must be addressed to develop improved vaccines and therapeutic strategies. PMID:24608338

The Royal Netherlands Meteorological Institute (KNMI) has over 150 years of knowledge and gathered information related to weather, climate and seismology. A huge part of this information comes from numerical models, in situ sensor networks and remote sensing satellites. This digital collection is becoming more and more available through the newly developed KNMI Data Centre, which has now been operational for 2 years. The KNMI Data Centre project has a user-driven development approach, with SCRUM chosen to get maximum user involvement in a relatively short timeframe. The system is built on open standards and proven open-source technology (which includes in-house developed software like ADAGUC WMS and Portal). The presentation will focus on the development of the initial KNMI Data Centre, the operational use of the last 2 years, and how a major release for the coming year will be realized. The new release will focus on a better user experience and on extending the technical data interfaces to the data centre. Keywords: Agile, Usage Statistics, Open Data, Inspire, DOI, WMS, WCS, OPeNDAP

At the present time, there are a number of future linear collider designs with a center-of-mass energy of 500 GeV or more and with luminosities in excess of 10^34 cm^-2 s^-1. Many of these designs are at an advanced state of development. However, to attain the high luminosity, the colliders require very small beam emittances, strong focusing, and very good stability. In this paper, some of the outstanding issues related to producing and maintaining the small beam sizes are discussed. Although the different designs are based on very different rf technologies, many of these problems are common.

Due to variations of the radiocarbon content in the biosphere over time, radiocarbon determinations need to be calibrated to obtain calendar years. Over the past decade a series of researchers have investigated the possibility of using Bayesian statistics to calibrate radiocarbon determinations, the main feature being the inclusion of contextual information in the calibration process. This allows for a coherent calibration of groups of determinations arising from related contexts (stratigraphical layers, peat cores, cultural events, etc.). Moreover, the 'related contexts' are also dated, and not only the radiocarbon-dated material itself. We review Bayesian calibration and state some of its current challenges, such as software development, prior specification and robustness. (author). 14 refs., 4 figs
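The core of Bayesian calibration of a single determination can be sketched as a grid posterior over calendar years. This is a minimal illustration assuming a uniform prior and a synthetic linear "calibration curve" (real work would use a published IntCal curve); the function name and all numerical values are hypothetical.

```python
import numpy as np

def calibrate(y, sigma, theta, mu_curve, sigma_curve, prior=None):
    """Grid-based Bayesian calibration of one radiocarbon determination.

    y, sigma    : determination and its lab error (14C yr BP)
    theta       : grid of candidate calendar years (cal BP)
    mu_curve    : calibration-curve mean at each theta (14C yr BP)
    sigma_curve : calibration-curve error at each theta
    prior       : optional prior over theta (defaults to uniform)
    Returns the normalised posterior over theta.
    """
    var = sigma**2 + sigma_curve**2           # combine lab and curve errors
    like = np.exp(-0.5 * (y - mu_curve)**2 / var) / np.sqrt(var)
    post = like if prior is None else like * prior
    return post / post.sum()

# Illustrative only: a synthetic linear 'calibration curve', not IntCal.
theta = np.arange(2000, 3001)                 # calendar years BP
mu = 0.95 * theta + 50.0                      # hypothetical curve values
sig = np.full_like(mu, 20.0)
post = calibrate(2500.0, 30.0, theta, mu, sig)
print(theta[np.argmax(post)])                 # -> 2579
```

The contextual information the review emphasises enters through the `prior` argument, e.g. ordering constraints between layers, which is what makes the calibration of a group of determinations coherent.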

Nowadays, computer facial animation is used in a multitude of fields, driving the growth of computer games, film and interactive multimedia, and attracting study from both human and social perspectives. Authoring computer facial animation with complex and subtle expressions is challenging and fraught with problems. As a result, most facial animation is currently authored using general-purpose computer animation techniques, which often limits the quality and quantity of production. Despite growing computer power, better facial understanding, increasing software sophistication and newly emerging face-centric methods, the field remains immature. This paper therefore surveys facial animation experts in order to define and categorize the current state of the field, the observed bottlenecks and the developing techniques. The paper further presents a real-time simulation model of human worry and howling, with a detailed discussion of the perception of astonishment, sorrow, annoyance and panic.

Neutrons are among the fundamental building blocks of matter. Some of the processes in which they are involved are responsible for energy generation in nuclear power plants. In this context, CERN’s n_TOF and other facilities participating in the ERINDA EU-funded programme help the community integrate all the scientific efforts needed to produce high-quality nuclear data for future nuclear technologies. The 4π calorimeter inside the n_TOF experimental area. Image courtesy of the n_TOF Collaboration. Accurate measurements of the interactions between neutrons and each of the elements present in nuclear reactors are vital tools enabling scientists to explore solutions – other than simple protected storage – for the treatment of radioactive waste deriving from a number of applications, ranging from energy production to the medical field. Particularly valuable is the contribution provided by the 13 accelerator-based neutron sources, which the ERINDA EU-funded...

The purpose of the symposium is to foster dialogue and information exchange involving Member States, the nuclear industry and members of the broader nuclear non-proliferation community to prepare for future verification challenges. Topics addressed during the 2010 symposium include the following: - Supporting the global nuclear non-proliferation regime: Building support for strengthening international safeguards; Enhancing confidence in compliance with safeguards obligations; Legal authority as a means to enhance effectiveness and efficiency; Verification roles in support of arms control and disarmament. - Building collaboration and partnerships with other international forums: Other verification and non-proliferation regimes; Synergies between safety, security and safeguards regimes. - Improving cooperation between IAEA and States for safeguards implementation: Strengthening State systems for meeting safeguards obligations; Enhancing safeguards effectiveness and efficiency through greater cooperation; Lessons learned: recommendations for enhancing integrated safeguards implementation. - Addressing safeguards challenges in an increasingly interconnected world: Non-State actors and covert trade networks; Globalization of nuclear information and technology. - Preparing for the global nuclear expansion and increasing safeguards workload: Furthering implementation of the State-level concept and integrated safeguards; Information-driven safeguards; Remote data-driven safeguards inspections; Safeguards in States without comprehensive safeguards agreements. - Safeguarding advanced nuclear facilities and innovative fuel cycles: Proliferation resistance; Safeguards by design; Safeguards approaches for advanced facilities. - Advanced technologies and methodologies: For verifying nuclear material and activities; For detecting undeclared nuclear material and activities; For information collection, analysis and integration. - Enhancing the development and use of safeguards

The preservation of ultra-low emittances in the main linac and Beam Delivery System area is one of the main challenges for linear colliders. This requires alignment tolerances never achieved before at that scale, down to the micrometre level. As a matter of fact, in the LHC, the goal for the smoothing of the components was to obtain a 1σ deviation with respect to a smooth curve of 0.15 mm over a 150 m long sliding window, while for the CLIC project, for example, it corresponds to 10 μm over a sliding window of 200 m in the Beam Delivery System area. Two complementary strategies are being studied to fulfil these requirements: the development and validation of long-range alignment systems over a few hundred metres, and of short-range alignment systems over a few metres. The studies undertaken, with their associated test setups and latest results, will be detailed, as well as their application to the alignment of both the CLIC and ILC colliders.
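The sliding-window smoothing criterion described above can be checked numerically. The sketch below assumes the "smooth curve" is a local cubic polynomial fit within each window, which is one possible choice rather than the definition used by the alignment teams, and the survey data are synthetic.

```python
import numpy as np

def smoothing_residual_sigma(s, x, window, deg=3):
    """Worst 1-sigma deviation of measured offsets x(s) from a smooth
    curve, evaluated over a sliding window along the beamline coordinate s.
    'deg' is the polynomial degree used as the local smooth curve
    (an assumption of this sketch).
    """
    worst = 0.0
    for start in np.arange(s.min(), s.max() - window + 1e-9, window / 4):
        m = (s >= start) & (s <= start + window)
        if m.sum() < deg + 2:
            continue
        coef = np.polyfit(s[m], x[m], deg)       # local smooth curve
        resid = x[m] - np.polyval(coef, s[m])    # deviation from it
        worst = max(worst, resid.std())
    return worst

# Synthetic survey: a slow 1 mm bow plus 5 um of random measurement scatter.
rng = np.random.default_rng(0)
s = np.linspace(0.0, 600.0, 1201)                          # metres
x = 1e-3 * np.sin(s / 300.0) + rng.normal(0.0, 5e-6, s.size)
print(smoothing_residual_sigma(s, x, window=200.0) < 10e-6)  # CLIC-like 10 um goal
```

The slow bow is absorbed by the polynomial fit, so only the short-wavelength scatter counts against the tolerance, which mirrors the intent of a smoothing (rather than absolute) alignment specification.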

The paper surveys the major challenges to stabilizing the atmospheric CO2 concentration. Climate change, and policies to deal with it, is viewed as an energy problem. The energy problem stems from the fact that no combination of carbon-free energies is currently capable of displacing fossil fuels as the main sources of the world's base load energy requirements. The paper provides rough estimates of the amount of carbon-free energy required to stabilize climate, the potential contribution of 'conventional' carbon-free energies, the contribution of renewable energies, and the size of an 'advanced energy technology gap'. The findings indicate that stabilizing CO2 concentration will require a long-term commitment to research, develop, and eventually deploy new energy sources and technologies including hydrogen. The paper suggests that the role of technology is what makes stabilizing CO2 concentration economically feasible. In this respect energy technology and economics are complementary, with advances in the former requiring something more than a reliance on market-based instruments, such as carbon taxes and emission permits. The analysis has implications for the credibility of commitments to target climate change-related factors such as CO2 emissions. (author)

Previously used mirror technologies are not able to fulfil the requirements of future X-ray telescopes, given the demanding requests from the scientific community. Consequently, new technical approaches for X-ray mirror production are under development. In Europe, the technical baseline for the planned X-ray observatory ATHENA is the radically new approach of silicon pore optics. NASA's recently launched NuSTAR mission uses segmented mirror shells made from thin bent glasses, successfully demonstrating the feasibility of the glass-forming technology for X-ray mirrors. For risk mitigation, the hot slumping of thin glasses is also being developed in Europe as an alternative technology for lightweight X-ray telescopes. The high-precision mirror manufacturing requires challenging technical developments; several design trades and trend-setting decisions need to be made and are discussed within this paper. Some new technical and economic aspects of the intended glass mirror serial production are also studied within the recently started interdisciplinary project INTRAAST, an acronym for "industry transfer of astronomical mirror technologies". The goal of the project, embedded in a cooperation between the Max Planck Institute for Extraterrestrial Physics and the University of Applied Sciences Aschaffenburg, is to master the challenge of producing thin mirror shells for future X-ray telescopes. As a first project task, the development of low-stress coatings for thin glass mirror substrates has been started; the corresponding technical approach and first results are presented.

Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

The International Laser Ranging Service (ILRS) is expanding its ground tracking capability with new stations and upgrades to current stations. Our Russian colleagues have installed new stations in Brasilia and South Africa, and have several other sites in process or in planning. The NASA Space Geodesy Program is preparing equipment for U.S. sites (McDonald and Haleakala) and, with the Norwegian National Mapping Agency, in Ny Ålesund; further deployments are planned. Upgrades continue at sites in China, and new sites are underway or planned in Europe and India. Stations are moving to higher repetition rates and more efficient detection to enhance satellite interleaving capability; some stations have already implemented automated processes that could lead to around-the-clock operation to increase temporal coverage and to make more efficient use of personnel. The ILRS roster of supported satellites continues to grow with the addition of the LARES satellite to augment tracking for the improvement of the ITRF. New GNSS constellations and geosynchronous satellites now bring the total roster to over 80 satellites, so much so that new tracking strategies and time and location multiplexing are under consideration. There continues to be strong interest in Lunar Ranging. New applications of one-way and two-way laser ranging include ps-accurate time transfer, laser transponders for interplanetary ranging, and tracking of space debris. New laser ranging data products are being developed, including satellite orbit products, satellite orientation, gravity field products, and products to characterize the quality of data and station performance. This talk will give a brief summary of recent progress, current challenges and a view of the path ahead.

To face the very broad range of technical matters on which the regulatory and licensing activity is based, and the related research and development activity, the Nuclear Safety Authorities (NSA) may need to rely upon external technical and scientific support. In providing technical support to NSAs, experience shows, on one side, the importance of having technical support organizations (TSO) with recognized competence, independence and an appropriate regulatory view, and on the other side, the importance of having within the NSAs a well-developed management and technical capability to address, coordinate and use the results of the external technical support, with the NSA retaining full responsibility for the final decision. The conditions and modus operandi under which the external support should be provided, in order to comply with the requirements of being independent, competent and timely while fulfilling the administrative procedures, are a central consideration for the TSO function today. The Italian regulatory body is currently being institutionally re-established under a new law approved in 2009 /1/, and it needs to be resourced and fully organized with the necessary capacities in the near future. The perspective of a new nuclear program, recently launched by the government, with significant incoming tasks for regulation and licensing, set against the existing limited resources, suggests a substantial potential need for technical support and advice. ITER-Consult (Ltd), created in 2003 in Italy, has well-developed capabilities to provide independent technical evaluation and support to NSAs, to maintain safety culture and updated knowledge, to transfer know-how and to establish international cooperation and networking. This mission is guided by the values of independence, professional competence, transparency, credibility and the establishment of a respectful relationship with partners. Challenges exist for funding and operational

A future energy system that includes a high proportion of renewable energy will be expected to meet the same requirements for security of supply and economic efficiency as the energy systems of today, while delivering better environmental performance, especially with regard to CO2 emissions. Security of supply refers to the long-term reliability of fuel supply; especially in power systems, it also covers short-term requirements for system stability and adequacy. Economic efficiency is concerned with getting the best from the significant amounts of money, human capital and natural resources involved in an energy system. Integral to economic efficiency in energy systems is the presence of well-functioning markets for energy services. The variability and reduced predictability of a number of renewable energy sources, notably wind power, create specific challenges for future energy systems compared to those of today. Power transmission will also become an issue, as the areas with good potential for wind power and wave energy are often located some distance from the centres of power consumption. This chapter describes the challenges involved, and possible solutions to these, with a focus on power systems. The chapter is divided into two sections reflecting the fact that some challenges relate to managing the power system in its normal operation mode, whereas others are specific to fault conditions. (au)

It has long been known that human pluripotent cells can be isolated from the blastocyst stage of embryos; these are called human embryonic stem cells (ESCs). These cells can be adapted and propagated indefinitely in culture in an undifferentiated manner, as well as differentiated into cells representing the three major germ layers: endoderm, mesoderm, and ectoderm. However, the derivation of human pluripotent cells from donated embryos is limited and restricted by ethical concerns. Therefore, various other approaches have been explored and have proved successful. Human pluripotent cells can also be derived experimentally by the nuclear reprogramming of somatic cells. These techniques include somatic cell nuclear transfer (SCNT), cell fusion and the overexpression of pluripotency genes. In this paper, we discuss the technical challenges of these approaches to nuclear reprogramming, including their advantages and limitations. We also highlight the possible applications of these techniques in the study of stem cell biology.

One of the challenges in writing an article reviewing the current state of cyber education and workforce development is that there is a paucity of quantitative assessment regarding the cognitive aptitudes, work roles, or team organization required by cybersecurity professionals to be successful. In this review, we argue that the people who operate within the cyber domain need a combination of technical skills, domain-specific knowledge, and social intelligence to be successful. They, like the networks they operate, must also be reliable, trustworthy, and resilient. Defining the knowledge, skills, attributes, and other characteristics is not as simple as defining a group of technical skills that people can be trained on; the complexity of the cyber domain makes this a unique challenge. There has been little research devoted to exactly what attributes individuals in the cyber domain need. What research does exist places an emphasis on technical and engineering skills while discounting the important social and organizational influences that dictate success or failure in everyday settings. This paper reviews the literature on cyber expertise and cyber workforce development to identify gaps, and then argues for the important contribution of social fit in the highly complex and heterogeneous cyber workforce. We then identify six assumptions for the future of cybersecurity workforce development, including the requirement for systemic thinkers, team players, a love of continued learning, strong communication ability, a sense of civic duty, and a blend of technical and social skill. Finally, we make recommendations for social and cognitive metrics which may be indicative of future performance in cyber work roles, to provide a roadmap for future scholars.

Key regulatory challenges for future nuclear power plants are concerned with fuel and cladding materials taken to higher burnup and operated at higher temperatures. Particular challenges are related to reduction in waste toxicity, understanding and control of coolant corrosion, qualification of fuel particles, new maintenance practices

Yousif E Himeidan, Vector Control Unit, Africa Technical Research Centre, Vector Health International, Arusha, Tanzania. Abstract: Rift Valley fever (RVF) is a zoonotic, mosquito-borne viral disease that affects human health and causes significant losses in the livestock industry. Recent outbreaks have led to severe human infections with high mortality rates. There are many challenges to applying effective preventive and control measures, including the weak infrastructure of health facilities, a lack of capacity and support systems for field logistics and communication, limited access to global expert organizations, and insufficient information on the epidemiological and reservoir status of the RVF virus. The health systems in East African countries are underdeveloped, with gaps in adaptability to new, more accurate and rapid techniques and in well-trained staff, which affect their capacity to monitor and evaluate the disease. Surveillance and response systems are inadequate at providing accurate and timely information for decision making, which is needed to interrupt disease transmission by applying mass animal vaccination and other preventive measures at the early stage of an outbreak. The historical vaccines are unsuitable for use in newborn and gestating livestock, and the recent ones require a booster and annual revaccination. Future live-attenuated RVF vaccines should raise fewer safety concerns regardless of the physiologic state of the animal, and provide rapid and long-term immunity after a single dose of vaccination. In the absence of an effective vaccination program, prevention and control measures must be undertaken immediately after an alert is generated. These measures include enforcing and adapting standard protocols for animal trade and movement, extensive vector control, safe disposal of infected animals, and modification of human-animal contact behavior. Directing control efforts on farmers and workers who deal with

serious economical, ecological, and social problems. Trying to evaluate these problems, an attempt was made to foresee trends of further economic development and energy demand for the next 20 years. Econometric models, comparative analysis and analytical expert-evaluation methods were used. Three different scenarios of future energy consumption growth were analysed. Results of the analysis show that in the year 2020 energy demand should be about 15.3-22.1 TWh in the slow and very fast growth scenarios (3197 MW and 4484 MW respectively, taking into account necessary reserves). This leads to a shortage of Lithuania's power generating capacity already by 2010. In 2020 this shortage could increase to 556 MW in the slow growth scenario and 1843 MW in the very fast growth scenario. Three possible ways to compensate for this shortage are analysed: increasing the share of power plants using fossil fuel, broadening the exploitation of renewable energy resources, and the nuclear option. Both economic and ecological problems, including the price dynamics of the main imported energy resources, especially oil and natural gas, are taken into account. It is pointed out that, according to the Energy Strategy of Russia, the average contract price of gas could reach 119-138 USD/10^3 m^3 in 2020 (a rise to 138-160% of the price of 86 USD/10^3 m^3 in the year 2000). The unreliability of fuel supply from a single supplier (Russia) is emphasized. Analysis and assessment of the positive and negative aspects of different energy generation means shows that perhaps the best solution in perspective for Lithuania is the nuclear option. It can be realised by the following means: a) extension of the exploitation of the second unit of Ignalina NPP after the year 2010, b) replacement of the existing RBMK-1500 reactors by modern BWR or PWR reactors, using the existing turbines and infrastructure, and c) construction of a new nuclear power unit or plant. Results of this study illustrate that all nuclear
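The demand figures quoted in this record can be sanity-checked with simple arithmetic: an annual energy demand in TWh implies an average load in MW. The quoted 3197 MW and 4484 MW capacities exceed these averages because they include peak demand and reserves; the conversion below is a back-of-the-envelope sketch, not the study's econometric model.

```python
HOURS_PER_YEAR = 8760

def avg_load_mw(annual_twh):
    """Average load (MW) implied by an annual energy demand (TWh)."""
    return annual_twh * 1e6 / HOURS_PER_YEAR   # TWh -> MWh, divided by hours

# Slow- and very-fast-growth 2020 demand scenarios quoted in the record.
slow = avg_load_mw(15.3)
fast = avg_load_mw(22.1)
print(round(slow), round(fast))                # -> 1747 2523
```

The margin between these average loads and the quoted installed capacities (roughly a factor of 1.8) is consistent with the record's note that the capacity figures take necessary reserves into account.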

This article offers a theoretical discussion of the current problems and future challenges of school capacity building in early childhood education (ECE), aiming to highlight some key areas for future research. In recent years, there has been a notable policy shift from monitoring quality through inspection to improving quality through school…

This paper investigates CSCW aspects of large-scale technical projects based on a case study of a specific Danish engineering company, and uncovers challenges to CSCW applications in this setting. The company is responsible for the management and supervision of one of the world's largest tunnel… The initial qualitative analysis identified a number of bottlenecks in daily work where support for cooperation is needed. Examples of bottlenecks are: sharing materials, issuing tasks, and keeping track of task status. Grounded in the analysis, cooperative design workshops based on scenarios of future work…

There has been no large-scale naval combat in the last 30 years. With the rapid development of battleships, weapons manufacturing and electronic technology, naval combat will present some new characteristics. Additionally, naval combat is facing unprecedented challenges. In this paper, we discuss the topic of medical rescue at sea: what challenges we face and what we could do. The topics discussed in this paper include battlefield self-aid and buddy care, clinical skills, organized health services, medical training and future medical research programs. We also discuss the characteristics of modern naval combat, medical rescue challenges, medical treatment highlights and future developments of medical rescue at sea.

Over the past twenty-five years, new businesses based on innovative technology have been the driving force for the US economy. Due to the abundance of early-stage capital, each year, thousands of scientists and engineers receive support to start new, technology-based businesses. However, the transition from technologist to entrepreneur is often difficult. It requires a shift in emphasis from a technology focus to a market focus. We shall discuss the challenges facing the technical entrepreneur in launching a new enterprise, and a variety of resources that are available to help the entrepreneur succeed. Many technologists fall victim to the myth that if you ``build a better mousetrap, the world will beat a path to your door." To be sure, it is important to base your business on sound technology that offers a clear advantage over current practice, and, if possible, to secure title to the technology, either by obtaining patent protection, or securing an exclusive license. Once that is done, however, the principal concern of the fledgling enterprise is building a business and obtaining the financial resources to enable it to grow. The entrepreneur must develop a clear and compelling business model, that can be communicated to a non-technical investor in a few minutes. This requires a mode of thinking and expression quite different from that commonly used in engineering or scientific discussions. Fortunately, abundant resources are available to help the technologist become a successful entrepreneur. We shall discuss the kinds of assistance that are generally available through local and national programs, and give specific examples based on the activities of the Bay Area Regional Technology Alliance in northern California.

Drawing on the importance of future orientation for adolescent development this analysis presents a model describing how future orientation is affected by high challenge (or resilience) in the face of political violence. The analysis consists of three parts. The first two present future orientation conceptualization and the psychological processes…

Qualitative Risk Assessments (QRAs) have been selected as the method for providing risk-driver indications for interim, remedial, and cleanup actions in the Hanford Site operable units' ecological risk assessments. This expedited response action path has been developed for the Hanford Site to facilitate time-critical decisions and generate immediate emergency cleanup actions. Tight budgets and aggressive time schedules were a major factor in the development of the QRA process. The QRA is a quick way to find immediate threats and a good precursor to a full risk assessment. However, numerous technical challenges have been identified with the QRA approach. The QRA approach differs from a baseline risk assessment in several ways. The main differences involve the use of data that have previously been gathered from the site, and the development of a ''bias-for-action'' document that would reveal qualitative risks from the contaminants identified at the operable units. Technical challenges concerning the ecological portion of these QRAs have raised questions about using the QRA for decision-making and may have weakened the validity of its use in the established procedural framework. Challenges involving such issues as the extrapolation of the contaminant data, data validation and screening techniques, receptor selection, and the final risk characterization outcome threaten the feasibility of the QRA as a decision-making tool. This discussion provides insight into resolving these technical challenges and may serve as a ''lessons-learned'' device for those interested in the QRA approach. Ultimately, these challenges are proving to be learning tools for scientists, regulators, and ecologists, and are identifying the data gaps and research direction for future ecological baseline risk assessments.

The 2012 Technical Documentation workshop addressed both problems and solutions associated with technical documentation for maintenance. These issues are known to cause errors, rework, maintenance delays, other safety hazards, and FAA administrat...

Approximately 120 new chemicals are created each year due to ever-improving industry and technology markets. Releases of new contaminants into the environment can occur during the production, use and disposal of these chemicals, thereby leading to potential contamination of water supply sources. Very few emerging contaminants are regulated. In addition, knowledge gaps regarding emerging contaminants include a lack of data on health effects, on occurrence (either because these compounds are not measured or because concentrations are below the detection limits of readily available analytical techniques) and on fate and transport in the environment, especially with regard to mobility and persistence. The sources of these compounds are numerous. One source is treated wastewater, which is re-injected into groundwater aquifers for indirect potable reuse purposes. Emerging compounds of concern can be classified in various classes. This presentation will focus on contaminants which have emerged in the last 10 years, including pharmaceuticals (antibiotics/drugs), personal care products (polycyclic musks), pesticides/herbicides, industrial solvents (1,4-dioxane), gasoline additives (MTBE), disinfection byproducts such as NDMA (N-nitrosodimethylamine), and inorganic compounds such as perchlorate and arsenic. This presentation will present the technical, legal and legislative challenges posed by the presence of these contaminants in water. Background information including each chemical's history of use, sources in the environment, nationwide occurrence, physical and chemical properties, behavior in the environment and technologies for removal from soil and water will be presented. In addition, case studies on MTBE, pharmaceuticals and personal care products, 1,4-dioxane, arsenic and NDMA will be discussed.

In spite of considerable spending on research and technical development, the management of nuclear wastes continues to be a difficult issue in public decision making. The nuclear industry says that it has safe solutions for the ultimate disposal of nuclear wastes, but the message has not really got through to the public at large. Although communications problems reflect the general stigmatization of nuclear power, there are obvious issues in the safety and performance assessment of nuclear waste disposal which evade scientific resolution. Any scientist concerned for his personal credibility must respect the rules and limits of scientific practice, but the intriguing question is whether he would not do better to address the layman's worries about radioactive substances? The discussion in this paper points out the intricacies of the distinction between scientific proof and judgement, with emphasis on safety assessment for nuclear waste disposal. Who are the final arbiters? In a democratic society it is probably those who vote. Building confidence in expert judgements is a challenge for waste managers and scientists. The media may create their own 'experts', whose only necessary credential is the trust of their audience, but scientific judgements must stand the test of time. 'Confidence building' is currently a key word on the whole nuclear waste management scene, and confidence in science and scientists is certainly needed for any progress towards practical implementation of plans. The means for building confidence in the decision-making process are probably different from those applied for science and scientists. (author)

During the last 5 years, digital techniques have become extremely important in the graphic arts industry. All sections of the production flow for producing multicolor printed products (prepress, printing and postpress) are influenced by digitalization, in both an evolutionary and a revolutionary way. New equipment and network techniques bring all the sections closer together. The focus is put on high-quality multicolor printing, together with high productivity. Conventional offset printing technology is compared with the leading nonimpact printing (NIP) technologies. Computer-to-press is contrasted with computer-to-print techniques. The newest available digital multicolor presses are described: the direct-imaging offset printing press from HEIDELBERG with its new laser imaging technique, as well as the INDIGO and XEIKON presses based on electrophotography. Regarding technical specifications, economic calculations and print quality, the analysis shows that each technique has its own market segments. An outlook is given for future computer-to-press techniques and the potential of nonimpact printing technologies for advanced high-speed multicolor computer-to-print equipment. Synergy effects from the NIP technologies to the conventional printing technologies, and vice versa, make it possible to build up innovative new products, for example hybrid printing systems. It is also shown that there is potential for improving the print quality by using NIP technologies, based on special screening algorithms and a higher number of grey levels per pixel. As an intermediate step in the digitalization of the production flow, but also as an economical solution in itself, computer-to-plate equipment is described. Producing printed products in a totally digital way requires digital color proofing as well as color management systems. The newest high-tech equipment using NIP technologies for producing proofs is explained. All in all it is shown that the state of the art in digital multicolor printing has reached

Career and technical education (CTE) concurrent enrollment may pose unique challenges in programming and enrollment for program administrators, and this chapter describes the experiences and challenges of a CTE concurrent enrollment administrator.

The document reproduces the text of the conference given by the Director General of the IAEA at the Swedish Institute of International Affairs in Stockholm on 23 April 1998. After a short presentation of the Agency's current verification activities, particularly in Iraq and the Democratic People's Republic of Korea, the Director General focuses on the present and future role of the IAEA in the control of nuclear proliferation through its strengthened safeguards system, in the prevention of nuclear terrorism, and on the future challenges of controlling nuclear proliferation from both political and technical points of view.

Libya is a vast country situated in North Africa, having a relatively well-functioning economy and a scanty population. This article is the first known attempt to review the current state of oral health care in Libya and to explore the present trends and future challenges. Libyan health system, oral health care, and human ...

Remembering our past is an essential first step into the future. Building upon that philosophy, our objective is to summarize two presentations from a 2012 Soil Science Society of America (SSSA) symposium focused on soil management challenges in response to climate change in order to examine: (1) ho...

Living the future now: `Race' and challenges of transformation in higher education. ZE Erasmus. Abstract. Drawing on research among medical students at the University of Cape Town's Faculty of Health Sciences, this article explores two questions: How do students and staff work with `race' in their relations to one another?

Highlights: • 5, 10 and 20 MW wind turbines are developed using multidisciplinary design optimization. • Technical feasibility and economy of large wind turbines are investigated. • Critical upscaling trends of existing wind turbines are presented up to 20 MW. • Design challenges of large wind turbines are identified, and design solutions proposed. • With no design innovation, upscaling of existing turbines will increase the costs. - Abstract: Wind energy has experienced a continuous cost reduction in the last decades. A popular cost reduction technique is to increase the rated power of the wind turbine by making it larger. However, it is not clear whether further upscaling of the existing wind turbines beyond the 5–7 MW range is technically feasible and economically attractive. To address this question, this study uses 5, 10, and 20 MW wind turbines that are developed using multidisciplinary design optimization as upscaling data points. These wind turbines are upwind, 3-bladed, pitch-regulated, variable-speed machines with a tubular tower. Based on the design data and properties of these wind turbines, scaling trends such as loading, mass, and cost are developed. These trends are used to study the technical and economic aspects of upscaling and its impact on the design and cost. The results of this research show that upscaling of existing wind turbines up to 20 MW is technically feasible, but the design of such an upscaled machine is cost prohibitive. The mass increase of the rotor is identified as a main design challenge to overcome. The results of this research support the development of alternative lightweight materials and design concepts, such as a two-bladed downwind design, for upscaling to remain a cost-effective solution for future wind turbines.
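
The upscaling trends the abstract describes follow the classical "square-cube" pattern: rated power grows roughly with the rotor's swept area (R²), while blade mass grows roughly with its volume (R³), so mass per unit power rises with size. The sketch below is a minimal illustration of that trend only; the reference radius, power, and blade mass are hypothetical round numbers, not the study's optimized design data.

```python
# Illustrative square-cube upscaling sketch. Reference values (63 m rotor
# radius, 5 MW, 18 t per blade) are hypothetical round numbers chosen only
# to show the trend, not data from the cited study.

def upscale(radius_m, ref_radius_m=63.0, ref_power_mw=5.0, ref_blade_mass_t=18.0):
    """Scale a reference turbine to a new rotor radius under similarity rules."""
    s = radius_m / ref_radius_m             # linear scale factor
    power_mw = ref_power_mw * s**2          # power ~ swept area ~ R^2
    blade_mass_t = ref_blade_mass_t * s**3  # mass ~ volume ~ R^3
    return power_mw, blade_mass_t

for r in (63.0, 89.0, 126.0):               # roughly 5, 10 and 20 MW rotors
    p, m = upscale(r)
    print(f"R = {r:6.1f} m -> {p:5.1f} MW, blade mass {m:6.1f} t, "
          f"{m / (p * 1000):.4f} t/kW")
```

Under these assumptions, doubling the rotor radius quadruples the power but multiplies blade mass by eight, so specific mass (t/kW) grows linearly with size; this is exactly the rotor-mass challenge the abstract identifies as the main obstacle to cost-effective upscaling.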

Technical education contributes a major share to the overall education system and plays a vital role in the social and economic development of our nation. Since independence, the technical education system in our country has grown into a fairly large-sized system, offering opportunities for education and training in a wide variety of trades and…

Education Institutions (SALEIE), an EU-supported project, gathers together a global team aiming to provide higher education models in the EIE disciplines that can respond to the key global technical challenges. This paper deals with findings within the SALEIE project's work package WP3 (Global Challenges…), namely: the state of the art in the implementation of the Bologna recommendations for Bachelor and Master degrees, the technical challenges that EIE higher education faces nowadays, and existing models in EIE higher education and their degree of response to key global technical challenges…

Most of the studies on the Indian energy sector focus on possible future scenarios of Indian energy system development without considering the management dimension of the problem: how to ensure a smooth transition to reach the desired future state. The purpose of this paper is to highlight some sector management concerns on the way to a sustainable energy future in the country. The paper follows a deductive approach and reviews the present status and possible future energy outlooks from the existing literature. This is followed by a strategy outline to achieve long-term energy sustainability. Management challenges on the way to such a sustainable future are finally presented. The paper finds that the aspiration of becoming an economic powerhouse and the need to eradicate poverty will necessarily mean an increase in energy consumption unless a decoupling of energy and GDP growth is achieved. Consequently, the energy future of the country is eminently unsustainable. A strategy focussing on demand reduction, enhanced access, use of local resources and better management practices is proposed here. However, a sustainable path faces a number of challenges from the management and policy perspectives.

This Springer Brief provides a comprehensive overview of the background and recent developments of big data. The value chain of big data is divided into four phases: data generation, data acquisition, data storage and data analysis. For each phase, the book introduces the general background, discusses technical challenges and reviews the latest advances. Technologies under discussion include cloud computing, Internet of Things, data centers, Hadoop and more. The authors also explore several representative applications of big data such as enterprise management, online social networks, healthcare…

Simulation-based training (SBT) has become a standard component of modern surgical education, yet successful implementation of evidence-based training programs remains challenging. In this narrative review, we use Kern's framework for curriculum development to describe where we are now and what lies ahead for SBT within surgery with a focus on technical skills in operative procedures. Despite principles for optimal SBT (proficiency-based, distributed, and deliberate practice) having been identified, massed training with fixed time intervals or a fixed number of repetitions is still being extensively used, and simulators are generally underutilized. SBT should be part of surgical training curricula, including theoretical, technical, and non-technical skills, and be based on relevant needs assessments. Furthermore, training should follow evidence-based theoretical principles for optimal training, and the effect of training needs to be evaluated using relevant outcomes. There is a larger, still unrealized potential of surgical SBT, which may be realized in the near future as simulator technologies evolve, more evidence-based training programs are implemented, and cost-effectiveness and impact on patient safety is clearly demonstrated.

The first electrically powered lamp operated some 150 years ago; since then, the evolution of light sources has been astonishing. Today, more than 10% of the electric power produced worldwide serves light production from several billion lamps. Over the last three decades, incandescent lamps have gradually been replaced by more energy-efficient discharge lamps. In parallel, a new generation of light-emitting diodes, producing bright colours (including white) with a luminous efficacy challenging even discharge lamps, has appeared in recent years. The objective of this paper is to focus on the state of the art in the domain of light sources and discuss the challenges for the near future. (author)

Increasing attention is directed towards thermonuclear fusion as a possible future energy source. Major advantages of this energy conversion technology are the almost inexhaustible resources and the option to produce energy without CO2 emissions. However, in the most advanced field of magnetic plasma confinement a number of technological challenges have to be met. In particular, high-temperature-resistant and plasma-compatible materials have to be developed and qualified which are able to withstand the extreme environments in a commercial thermonuclear power reactor. The plasma-facing materials (PFMs) and components (PFCs) in such fusion devices, i.e. the first wall (FW), the limiters and the divertor, are strongly affected by the plasma-wall interaction processes and the intense thermal loads applied during plasma operation. On the one hand, these mechanisms have a strong influence on the plasma performance; on the other hand, they have a major impact on the lifetime of the plasma-facing armour. In present-day and next-step devices the resulting steady-state heat loads to the first wall remain below 1 MW/m²; the limiters and the divertor are expected to be exposed to power densities at least one order of magnitude above the FW level, i.e. up to 20 MW/m² for next-step tokamaks such as ITER or DEMO. These requirements place high demands on the selection of qualified PFMs and heat-sink materials as well as on reliable fabrication processes for actively cooled plasma-facing components. The technical solutions considered today are mainly based on the PFMs beryllium, carbon or tungsten joined to copper alloys or stainless steel heat sinks. In addition to the above-mentioned quasi-stationary heat loads, short transient thermal pulses with deposited energy densities up to several tens of MJ/m² are a serious concern for next-step tokamak devices. The most frequent events are so-called Edge Localized Modes (type I ELMs) and plasma disruptions

improved economy when compared to the currently existing plants. The APR 1400 has been developed since 1991, and it is expected that its first commercial operation will be in 2012. In the short term, by 2011 the APR-1400 design will be improved from the viewpoints of safety, economics and performance. We are also developing SMART, a promising advanced small integral reactor in the small and medium-size power category. It is an integral-type reactor with a sensible mixture of new innovative design features and proven technologies, aimed at achieving highly enhanced safety and improved economics. SMART is intended for dual applications such as seawater desalination and electricity generation. Since the SMART technology is technically sound and sufficiently economical, the SMART desalination plant has good prospects of being deployed as a nuclear desalination plant. We are also actively participating in the GEN IV collaboration (GIF: GEN IV International Forum) for VHTR and SFR technology development. Through close collaboration with GIF, a proliferation-resistant SFR technology will be developed based on KALIMER for effective uranium utilization and waste minimization. Also, a high-temperature reactor is currently under development to demonstrate a nuclear-based hydrogen production technology. Korea is looking ahead by developing a new generation of advanced nuclear reactor systems for sustainable development, economic benefits, a clean environment and public confidence. In this paper, the Korean nuclear reactor technology development program is described, together with lessons learned from self-reliance in nuclear reactor technology. In addition, this paper presents the status of the next-generation reactor system development program and the future reactor system development program for addressing these challenges.

Speakers Maylath, Mousten and Vandepitte, co-authors of two chapters on what they call the Trans-Atlantic Project, will describe the programmatic framework for establishing the collaborative partnerships in which students studying technical writing in the U.S. work with students studying… help achieve common program objectives, particularly in regard to intercultural negotiation and mediation processes. In addition, they will describe how they met course-specific objectives. For the technical writing course, such objectives included broadening students' awareness of the needs of readers… translation in Europe to create procedural documents in Danish, Dutch, English, French, German and/or Italian. They will provide guidelines for how international partnerships of this kind can be established between technical communication programs and translation programs anywhere, even in the absence…

Hemoglobinopathies constitute the most common severe monogenic disorders worldwide, with an increasing global burden each year. The benefit of applying programmes for preconception carrier screening, with the option of prenatal diagnosis, to minimize the incidence of new cases is recognized in many countries. Areas covered: The challenges associated with identifying carrier couples using hematology-based screening, along with DNA diagnosis and prenatal diagnosis, were addressed based on a literature search and the authors' expertise. Expert commentary: The hemoglobinopathies are extremely heterogeneous at the haematological, molecular and clinical level, requiring appropriately equipped and staffed laboratories with the experience to support comprehensive screening and diagnosis. However, complete services with adequate infrastructure to address the associated technical challenges do not exist widely, especially in low-income countries that, coincidentally, are often those with the highest frequency of hemoglobinopathies in their population. Additionally, overcoming limited public awareness and education and the absence of systematic dissemination of information also constitutes a challenge. This article aims to highlight these challenges and to evaluate potential future developments that may address at least some of them, focusing mainly on the technical challenges related to molecular diagnostics.

ABSTRACT If the world population continues to increase exponentially, wealth and education inequalities might become more pronounced in the developing world. Thus, offering affordable, high-quality protein food to people will become more important and daunting than ever. Past and future challenges will increasingly demand quicker and more innovative and efficient solutions. Animal scientists around the globe currently face many challenging issues: from ensuring food security to prevent excess...

Sleep medicine is a relatively new specialty in the medical community. The practice of sleep medicine in Saudi Arabia (KSA) began in the mid to late nineties. Since its inception, the specialty has grown, and the number of specialists has increased. Nevertheless, sleep medicine is still underdeveloped in the KSA, particularly in the areas of clinical service, education, training and research. Based on available data, it appears that sleep disorders are prevalent among Saudis, and the demand for sleep medicine service is expected to rise significantly in the near future. A number of obstacles have been defined that hinder the progress of the specialty, including a lack of trained technicians, specialists and funding. Awareness about sleep disorders and their serious consequences is low among health care workers, health care authorities, insurance companies and the general public. A major challenge for the future is penetrating the educational system at all levels to demonstrate the high prevalence and serious consequences of sleep disorders. To attain adequate numbers of staff and facilities, the education and training of health care professionals at the level of sleep medicine specialists and sleep technologists is another important challenge that faces the specialty. This review discusses the current position of sleep medicine as a specialty in the KSA and the expected challenges of the future. In addition, it will guide clinicians interested in setting up new sleep medicine services in the KSA or other developing countries through the potential obstacles that may face them in this endeavor. PMID:21264164

Most research on online learning in higher education has been focused on general education at four-year institutions. There is a need for more research that focuses on online and hybrid education at community colleges in technical education fields. This issue includes articles from eight National Science Foundation funded projects doing innovative…

Qualitative research has had a significant impact within rehabilitation science over time. During the past 20 years the number of qualitative studies published per year in Disability and Rehabilitation has markedly increased (from 1 to 54). In addition, during this period there have been significant changes in how qualitative research is conceptualized, conducted, and utilized to advance the field of rehabilitation. The purpose of this article is to reflect upon the progress of qualitative research within rehabilitation to date, to explicate current opportunities and challenges, and to suggest future directions to continue to strengthen the contribution of qualitative research in this field. Relevant literature searches were conducted in electronic databases and reference lists. Pertinent literature was examined to identify current opportunities and challenges for qualitative research use in rehabilitation and to identify future directions. Six key areas of opportunity and challenge were identified: (a) paradigm shifts, (b) advancements in methodology, (c) emerging technology, (d) advances in quality evaluation, (e) increasing popularity of mixed methods approaches, and (f) evolving approaches to knowledge translation. Two important future directions for rehabilitation are posited: (1) advanced training in qualitative methods and (2) engaging qualitative communities of research. Qualitative research is well established in rehabilitation and has an important place in the continued growth of this field. Ongoing development of qualitative researchers and methods is essential. Implications for Rehabilitation: Qualitative research has the potential to improve rehabilitation practice by addressing some of the most pervasive concerns in the field, such as practitioner-client interaction, the subjective and lived experience of disability, and clinical reasoning and decision making. This will serve to better inform those providing rehabilitation services, thereby benefiting

This publication relates to the environmental challenges of the energy sector and options for future action. The following themes are discussed: globalisation of the energy sector; environmental challenges; the challenge of climate change; and options for future action.

Can the field of futures research help advance participatory management of river basins? The present study addresses this question; this paper mainly covers the theoretical and conceptual point of view. The 2000 EU Framework Directive on water emphasises at least two aspects that will mark the future management of river basins: the need for long-term planning, and a demand for participation. Neither the former nor the latter is a new concept as such, but their combination is in some sense revolutionary. Can long-term plans be made (and implemented) in a participative way, what tools could be useful in this respect, and does this lead to a satisfactory situation in terms of both reaching physical targets and enhancing social-institutional manageability? A possibly rich way to enter the discussion is to challenge futures research as a concept and a practice for enabling multiple stakeholders to design appropriate policies. Futures research is the overall field in which several methods and techniques (like scenario analysis) are mobilised to systematically think through and/or design the future. As such, these have proven to be rich exercises to trigger ideas, stimulate debate and design desirable futures (and how to get there). More importantly, these exercises have the capability to reconstitute actor relations, and by nature go beyond institutional boundaries. Arguably, the relation between futures research and the planning process is rather distant. Understandably, commitments on the direct implementation of the results are hardly ever made, but the impact on changes in the capabilities of the network of actors involved may be large. As a hypothesis we consider that the distant link between an image of the future and its implementation in policy creates sufficient distance for actors to participate (in terms of responsibilities, legal constraints, etc.) and generate potentials, and enough degrees of freedom needed for a successful

To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.
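
The distributed processing model described above can be illustrated with a minimal map-reduce style sketch: each node summarizes its own shard of the data, and only the small partial results are combined centrally. This is an illustrative toy with partitions simulated in plain Python, not a recipe for any particular cluster framework; the function names are hypothetical.

```python
# Map-reduce style sketch of distributing a computation over data shards.
# A global mean is computed from per-partition (sum, count) summaries, so
# no node ever needs the whole dataset.

def map_partition(part):
    """Mapper: each node summarizes its shard as (sum, count)."""
    return sum(part), len(part)

def reduce_partials(partials):
    """Reducer: combine the shard summaries into a global mean."""
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

partitions = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]     # shards on different nodes
partials = [map_partition(p) for p in partitions]  # would run in parallel
print(reduce_partials(partials))                   # mean of 1..9 -> 5.0
```

The design point is that only statistics whose partial results combine exactly (sums, counts, min/max) distribute this cleanly; order statistics such as the median require more elaborate distributed algorithms, which is part of the methodological challenge the review describes.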

Management of pelvi-ureteric junction obstruction (PUJO) in a duplex system is technically challenging, as dissection at the pelvis may jeopardize the vascularity of the normal-moiety ureter. Anastomosing the pelvis to the one single ureter carries a risk of future stricture development, which would then put both moieties at risk. Robotic assistance enables appropriate tissue dissection, minimal handling of the normal ureter and precision in suturing, overcoming the potential challenges involved in the minimally invasive management of such complex cases. We report the feasibility and efficacy of robot-assisted laparoscopic pyeloplasty in such a case.

The need and prospects for international technical cooperation in the petroleum industry are reviewed. Since it directly affects the survival of the inhabitants of the planet, environmental protection is a field that could enjoy maximum international cooperation; oil spills, product environmental hazards, waste minimization and disposal and oil field fires are the main areas identified. Technical cooperation in other areas of the industry, namely exploration, production, oil field development, refining and petrochemicals, may involve some controversy. Attention is drawn to the conflicting interests of multinational companies, who almost completely control the technology of these activities, and host developing countries. It is advocated that arrangements involving technology transfer should make provision for the growth of indigenous technology. (UK)

To survey the state of the art in plasma-based materials, processes and products, it was necessary to assemble the economic, political and managerial variables that will affect the absorption and implantation of this technology in Brazil. Through a survey of industries, universities, research centers, energy agencies, and financing and fostering institutions, it was possible to build a framework that provides the conditions to forecast and suggest some measures for the plasma sector. (author) [pt

An increased flexibility of the electricity demand side through demand response (DR) is an opportunity to support the integration of renewable energies. By optimising the use of the generation, transmission and distribution infrastructure, DR reduces the need for costly investments and contributes to system security. There is a significant technical DR potential for load reduction from industrial production processes in Germany, as well as from cross-cutting technologies in industry and the t...

Sewage sludge is a serious problem due to the high treatment costs and the risks to environment and human health. Future sludge treatment will be progressively focused on an improved efficiency and environmental sustainability of the process. In this context a survey is given of most relevant sludge treatment options and separate treatment steps. Special attention is paid to those processes that are simultaneously focused on the elimination of the risks for environment and human health and on the recovery or beneficial use of the valuable compounds in the sludge such as organic carbon compounds, inorganic non-toxic substances, phosphorous and nitrogen containing compounds. Also, a brief assessment is given of the specific future technological developments regarding the various treatment steps. Furthermore, it is discussed how to assess the various pathways which can lead to the required developments. In such an assessment the technical and economic feasibility, the environmental sustainability, the societal acceptance and the implementation route are important factors. The optimal approach also strongly depends on the local and regional situation of concern and the relevant current and future boundary conditions. (author)

Water distinguishes our planet compared to all the others we know about. While the global supply of available freshwater is more than adequate to meet all current and foreseeable water demands, its spatial and temporal distributions are not. There are many regions where our freshwater resources are inadequate to meet domestic, economic development and environmental needs. In such regions, the lack of adequate clean water to meet human drinking water and sanitation needs is indeed a constraint on human health and productivity and hence on economic development, as well as on the maintenance of a clean environment and healthy ecosystems. All of us involved in research must find ways to remove these constraints. We face multiple challenges in doing that, especially given a changing and uncertain future climate, and a rapidly growing population that is driving increased social and economic development, globalization, and urbanization. How best to meet these challenges requires research in all aspects of water management. Since 1965, the journal Water Resources Research has played an important role in reporting and disseminating current research related to managing the quantity, quality and cost of this resource. This paper identifies the issues facing water managers today and future research needed to better inform those who strive to create a more sustainable and desirable future.

Full Text Available Marine environments have undergone large-scale changes in recent decades as a result of multiple anthropogenic pressures, such as overfishing, eutrophication, habitat fragmentation, etc., often causing nonlinear ecosystem responses. At the same time, management institutions lack the appropriate measures to address these abrupt transformations. We focus on existing examples from social-ecological systems of European seas that can be used to inform and advise future management. Examples from the Black Sea and the Baltic Sea on long-term ecosystem changes caused by eutrophication and fisheries, as well as changes in management institutions, illustrate nonlinear dynamics in social-ecological systems. Furthermore, we present two major future challenges, i.e., climate change and energy intensification, that could further increase the potential for nonlinear changes in the near future. Practical tools to address these challenges are presented, such as ensuring learning, flexibility, and networking in decision-making processes across sectors and scales. A combination of risk analysis with a scenario-planning approach might help to identify the risks of ecosystem changes early on and may frame societal changes to inform decision-making structures to proactively prevent drastic surprises in European seas.

There is a vast body of literature on deliberative, participative, or engaged democracy. In the area of health care there is a rapidly expanding literature on deliberative democracy as embodied in various notions of public engagement, shared decision-making (SDM), patient-centered care, and patient/care provider autonomy over the past few decades. It is useful to review such literature to get a sense of the challenges and prospects of introducing deliberative democracy in health care. This paper reviews the key literature on deliberative democracy and SDM in health care settings with a focus on identifying the main challenges of promoting this approach in health care, and recognizing its progress so far for mapping out its future prospects in the context of advanced countries. Several databases were searched to identify the literature pertinent to the subject of this study. A total of 56 key studies in English were identified and reviewed carefully for indications and evidence of challenges and/or promising avenues of promoting deliberative democracy in health care. Time pressure, lack of financial motivation, entrenched professional interests, informational imbalance, practical feasibility, cost, diversity of decisions, and contextual factors are noted as the main challenges. As for the prospects, greater clarity on conception of public engagement and policy objectives, real commitment of the authorities to public input, documenting evidence of the effectiveness of public involvement, development of patient decision supports, training of health professionals in SDM, and use of multiple and flexible methods of engagement leadership suited to specific contexts are the main findings in the reviewed literature. Seeking deliberative democracy in health care is both challenging and rewarding. The challenges have been more or less identified. However, its prospects are potentially significant. Such prospects are more likely to materialize if deliberative democracy is

In South Africa, significant changes in Academic Health have taken place since the first democratic elections in 1994. Academic Health moved from a separate academic hospital, a departmental-based curriculum, research focussed on achievement, and an abundance of money, to a position of integrated service delivery with specific reference to primary health care, separation of service levels, a new integrated curriculum, research focussed according to need and contract research, and financial constraints with limited budgets. The management of this change is a task challenging the manager in all fields of Academic Health. Leaders need to know their environment and organisation to be able to manage change. Academic Health centres are experiencing major changes as a result of the effects of managed care, reduced rates and growing expenditure on health services. In addition to restructuring of the clinical services, Academic Health centres are being challenged to sustain their academic mission and priorities in the face of resource constraints. In order to tackle these challenges, institutions need physicians in administrative positions at all levels who can provide leadership and thoughtful managerial initiatives. The future challenge for managers focuses on service delivery, research, health education and training, Academic Health management, professionalism and financial management.

Full Text Available There are twenty-five known inherited cardiac arrhythmia susceptibility genes, all of which encode either ion channel pore-forming subunits or proteins that regulate aspects of ion channel biology such as function, trafficking and localization. The human KCNE gene family comprises five potassium channel regulatory subunits, sequence variants in each of which are associated with cardiac arrhythmias. KCNE gene products exhibit promiscuous partnering and in some cases ubiquitous expression, hampering efforts to unequivocally correlate each gene to specific native potassium currents. Likewise, deducing the molecular etiology of cardiac arrhythmias in individuals harboring rare KCNE gene variants, or more common KCNE polymorphisms, can be challenging. In this review we provide an update on putative arrhythmia-causing KCNE gene variants, and discuss current thinking and future challenges in the study of molecular mechanisms of KCNE-associated cardiac rhythm disturbances.

For nearly two decades, the International Atomic Energy Agency (IAEA) has been transforming its safeguards system to address the challenges posed by undeclared nuclear programs, the associated revelation of an extensive non-State nuclear procurement network and other issues, including past limits to its verification mandate and the burden of noncompliance issues. Implementing the new measures, including those in the Additional Protocol, and integrating new and old safeguards measures, remains a work in progress. Implementation is complicated by factors including the limited technological tools that are available to address such issues as safeguarding bulk handling facilities, detection of undeclared facilities/activities, especially related to enrichment, etc. As this process continues, new challenges are arising, including the demands of expanding nuclear power production worldwide, so-called safeguards by design for a new generation of facilities, the possible IAEA role in a fissile material cutoff treaty and other elements of the arms control and disarmament agenda, the possible role in 'rollback' cases, etc. There is no doubt safeguards will need to evolve in the future, as they have over the last decades. In order for the evolutionary path to proceed, there will inter alia be a need to identify technological gaps, especially with respect to undeclared facilities, and ensure they are filled by adapting old safeguards technologies, by developing and introducing new and novel safeguards technologies and/or by developing new procedures and protocols. Safeguards will also need to respond to anticipated emerging threats and to future, unanticipated threats. This will require strategic planning and cooperation among Member States and with the Agency. This paper will address challenges to IAEA safeguards and the technological possibilities and R and D strategies needed to meet those challenges in the context of the forty-year evolution of safeguards, including the

The main concept currently in use in wind energy involves horizontal-axis wind turbines with blades of fiber composite materials. This turbine concept is expected to remain as the major provider of wind power in the foreseeable future. However, turbine sizes are increasing, and installation......, preventing buckling failure, ensuring adequate fatigue life under variable wind loading combined with gravitational loading, and minimizing the occurrence and consequences of production defects. A major challenge is to develop cost-effective ways to ensure that production defects do not cause unacceptable...

In two-ring facilities operating with a crossing-angle collision scheme, luminosity can be limited due to an incomplete overlap of the colliding bunches. Crab cavities are then introduced to restore head-on collisions by providing the required opposite deflection to the head and tail of the bunch. An increase in luminosity was demonstrated at KEKB with global crab crossing, while the Large Hadron Collider (LHC) at CERN is currently designing local crab crossing for the Hi-Lumi upgrade. Future colliders may investigate both approaches. In this paper, we review the challenges in the technology and implementation of crab cavities, while discussing experience in earlier colliders, ongoing R&D, and proposed implementations for future facilities, such as HiLumi-LHC, CERN's compact linear collider (CLIC), the international linear collider (ILC), and the electron-ion collider under design at BNL (eRHIC).
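The head-tail deflection scheme described above can be illustrated with a toy model (not from the paper; all numbers are arbitrary): a linearized crab kick gives each particle a transverse angle proportional to its longitudinal position z, and the subsequent drift to the interaction point tilts the bunch so that it overlaps the opposing beam despite the crossing angle.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bunch: longitudinal position z and transverse offset x (arbitrary units)
z = rng.normal(0.0, 1.0, 10_000)   # head of the bunch has z > 0, tail z < 0
x = rng.normal(0.0, 0.1, 10_000)

# Linearized crab kick: the RF cavity deflects head and tail in opposite
# directions, so the transverse angle x' is proportional to z.
k = 0.05            # kick strength, illustrative value
xp = k * z          # x' just after the cavity

# Drift of length L to the interaction point: x -> x + L * x'
L = 10.0
x_ip = x + L * xp

# The bunch is now tilted: x at the IP correlates linearly with z, which is
# what restores effective head-on overlap under a crossing angle.
tilt = np.polyfit(z, x_ip, 1)[0]
print(f"induced tilt dx/dz at IP: {tilt:.3f} (expected ~ k*L = {k * L:.3f})")
```

This is only the kinematic idea; real designs must also cancel the tilt after the IP (local scheme) or close it around the ring (global scheme).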

Interactive artificial intelligence systems employ preferences in both their reasoning and their interaction with the user. This survey considers preference handling in applications such as recommender systems, personal assistant agents, and personalized user interfaces. We survey the major questions and approaches, present illustrative examples, and give an outlook on potential benefits and challenges.

This review outlines recent epidemiologic data regarding hypertension in developing countries, distinguishes differences from developed countries, and identifies challenges in management and future perspectives. Increased sugar intake, air and noise pollution, and low birth weight are emerging hypertension risk factors. The major challenges in management are difficulties in accurate diagnosis of hypertension and adequate blood pressure control. In contrast to developed countries, hypertension prevalence rates are on the rise in developing countries with no improvement in awareness or control rates. The increasing burden of hypertension is largely attributable to behavioral factors, urbanization, unhealthy diet, obesity, social stress, and inactivity. Health authorities, medical societies, and the drug industry can collaborate to improve hypertension control through education programs, public awareness campaigns, legislation to limit salt intake, encouraging generic drugs, development and dissemination of national guidelines, and involving nurses and pharmacists in hypertension management. More epidemiologic data are needed in the future to identify reasons behind increased prevalence and poor blood pressure control and to examine trends in prevalence, awareness, treatment, and control. National programs for better hypertension control based on local culture, economic characteristics, and available resources in the population are needed. The role of new tools for hypertension management should be tested in the developing world.

In the field of genetically modified organism (GMO) diagnostics, real-time PCR has been the method of choice for target detection and quantification in most laboratories. Despite its numerous advantages, however, the lack of a true multiplexing option may render real-time PCR less practical in the face of future GMO detection challenges such as the multiplicity and increasing complexity of new transgenic events, as well as the repeated occurrence of unauthorized GMOs on the market. In this context, we recently reported the development of a novel multiplex quantitative DNA-based target amplification method, named NASBA implemented microarray analysis (NAIMA), which is suitable for sensitive, specific and quantitative detection of GMOs on a microarray. In this article, the performance of NAIMA is compared with that of real-time PCR, the focus being their performances in view of the upcoming challenge to detect/quantify an increasing number of possible GMOs at a sustainable cost and affordable staff effort. Finally, we present our conclusions concerning the applicability of NAIMA for future use in GMO diagnostics.
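For context on the real-time PCR quantification the abstract refers to, one common relative-quantification approach is the ΔΔCq method; a minimal sketch follows. The Cq values are made-up illustrations, the function name is ours, and note that routine GMO quantification often uses standard curves rather than ΔΔCq.

```python
# ΔΔCq relative quantification, assuming ~100% amplification efficiency
# (fold change = 2 ** -ΔΔCq). All Cq values below are illustrative only.

def ddcq_ratio(cq_target_sample, cq_ref_sample, cq_target_cal, cq_ref_cal):
    """Relative amount of target vs. a reference gene, normalized to a calibrator."""
    d_sample = cq_target_sample - cq_ref_sample   # ΔCq in the test sample
    d_cal = cq_target_cal - cq_ref_cal            # ΔCq in the calibrator
    return 2.0 ** -(d_sample - d_cal)             # 2 ** -ΔΔCq

# Example: transgene-specific assay vs. a taxon-specific reference gene
ratio = ddcq_ratio(cq_target_sample=24.0, cq_ref_sample=20.0,
                   cq_target_cal=26.0, cq_ref_cal=20.0)
print(f"relative GMO content vs. calibrator: {ratio:.2f}")
```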

Full Text Available Pathology undergoes presently changes due to new developments in diagnostic opportunities and cost saving efforts in health care. Out of the wide field of telepathology the paper selects three prototype applications: telepathology in teleeducation, expert advice for preselected details of a slide and finally telepathology for remote diagnosis. The most challenging field for remote diagnosis is the application in the frozen section scenario. The paper starts with the mental experiment to map conventional procedures to counterparts in telepathology.

Maintaining a healthy ecosystem is essential for maximizing sustainable ecological services of the best quality to human beings. Ecological and conservation research has provided a strong scientific background on identifying ecological health indicators and correspondingly making effective conservation plans. At the same time, ecologists have asserted a strong need for spatially explicit and temporally effective ecosystem health assessments based on remote sensing data. Currently, remote sensing of ecosystem health is only based on one ecosystem attribute: vigor, organization, or resilience. However, an effective ecosystem health assessment should be a comprehensive and dynamic measurement of the three attributes. This paper reviews opportunities of remote sensing, including optical, radar, and LiDAR, for directly estimating indicators of the three ecosystem attributes, discusses the main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system, and provides some future perspectives. The main challenges to develop a remote sensing-based spatially-explicit comprehensive ecosystem health system are: (1) scale issue; (2) transportability issue; (3) data availability; and (4) uncertainties in health indicators estimated from remote sensing data. However, the Radarsat-2 constellation, upcoming new optical sensors on Worldview-3 and Sentinel-2 satellites, and improved technologies for the acquisition and processing of hyperspectral, multi-angle optical, radar, and LiDAR data and multi-sensoral data fusion may partly address the current challenges.
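In the simplest case, a comprehensive score over the three attributes could be a weighted combination of one normalized indicator per attribute. The sketch below is a toy illustration only: the indicator choices, value ranges, and equal weights are our assumptions, not the paper's method.

```python
def minmax(v, lo, hi):
    """Normalize an indicator to [0, 1] given its expected value range."""
    return max(0.0, min(1.0, (v - lo) / (hi - lo)))

def health_index(vigor_ndvi, organization_shannon, resilience_recovery,
                 weights=(1 / 3, 1 / 3, 1 / 3)):
    """Composite ecosystem health score in [0, 1] from one indicator per attribute."""
    scores = (
        minmax(vigor_ndvi, -1.0, 1.0),            # vigor: e.g. NDVI from optical data
        minmax(organization_shannon, 0.0, 3.0),   # organization: e.g. landscape diversity
        minmax(resilience_recovery, 0.0, 1.0),    # resilience: e.g. post-disturbance recovery rate
    )
    return sum(w * s for w, s in zip(weights, scores))

h = health_index(vigor_ndvi=0.6, organization_shannon=1.8, resilience_recovery=0.7)
print(f"composite health index: {h:.2f}")
```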

Thursday, 19 January 2012, at 14:15 - IT Auditorium (bldg. 31 3-004) Future neutrino facilities: the neutrino factory, by Gersende Prior / University of Geneva and CERN EN/MEF. The neutrino factory is one of the proposed designs for a future intense neutrino beam facility. In its current layout, a high-power proton beam impinges on an Hg jet target producing pions, decaying in turn into muons. In order to reduce the particle beam emittance, the muon transverse momentum is reduced through ionization cooling by a technically demanding set-up made of closely-packed RF cavities alternating with absorbers. In this talk I will present the motivation for building an intense neutrino beam and some of the proposed neutrino facility designs. I will discuss the challenges inherent to the cooling of muons, possible optimization of the current baseline and the on-going R&D. ________________ ATS Seminars Organisers: H. Burkhardt (BE), S. Sgobba (EN), G. deRijk (TE)

Full Text Available Jalil Safaei, Department of Economics, University of Northern British Columbia, Prince George, BC, Canada. Background: There is a vast body of literature on deliberative, participative, or engaged democracy. In the area of health care there is a rapidly expanding literature on deliberative democracy as embodied in various notions of public engagement, shared decision-making (SDM), patient-centered care, and patient/care provider autonomy over the past few decades. It is useful to review such literature to get a sense of the challenges and prospects of introducing deliberative democracy in health care. Objective: This paper reviews the key literature on deliberative democracy and SDM in health care settings with a focus on identifying the main challenges of promoting this approach in health care, and recognizing its progress so far for mapping out its future prospects in the context of advanced countries. Method: Several databases were searched to identify the literature pertinent to the subject of this study. A total of 56 key studies in English were identified and reviewed carefully for indications and evidence of challenges and/or promising avenues of promoting deliberative democracy in health care. Results: Time pressure, lack of financial motivation, entrenched professional interests, informational imbalance, practical feasibility, cost, diversity of decisions, and contextual factors are noted as the main challenges. As for the prospects, greater clarity on conception of public engagement and policy objectives, real commitment of the authorities to public input, documenting evidence of the effectiveness of public involvement, development of patient decision supports, training of health professionals in SDM, and use of multiple and flexible methods of engagement leadership suited to specific contexts are the main findings in the reviewed literature. Conclusion: Seeking deliberative democracy in health care is both challenging and rewarding. The

Abstract Background Skeletal muscle fibres represent one of the most abundant cell types in mammals. Their highly specialised contractile and metabolic functions depend on a large number of membrane-associated proteins with very high molecular masses, proteins with extensive posttranslational modifications and components that exist in highly complex supramolecular structures. This makes it extremely difficult to perform conventional biochemical studies of potential changes in protein clusters during physiological adaptations or pathological processes. Results Skeletal muscle proteomics attempts to establish the global identification and biochemical characterisation of all members of the muscle-associated protein complement. A considerable number of proteomic studies have employed large-scale separation techniques, such as high-resolution two-dimensional gel electrophoresis or liquid chromatography, and combined them with mass spectrometry as the method of choice for high-throughput protein identification. Muscle proteomics has been applied to the comprehensive biochemical profiling of developing, maturing and aging muscle, as well as the analysis of contractile tissues undergoing physiological adaptations seen in disuse atrophy, physical exercise and chronic muscle transformation. Biomedical investigations into proteome-wide alterations in skeletal muscle tissues were also used to establish novel biomarker signatures of neuromuscular disorders. Importantly, mass spectrometric studies have confirmed the enormous complexity of posttranslational modifications in skeletal muscle proteins. Conclusions This review critically examines the scientific impact of modern muscle proteomics and discusses its successful application for a better understanding of muscle biology, but also outlines its technical limitations and emerging techniques to establish new biomarker candidates.

Full Text Available Cochlear implants (CIs) often work very well for many children and adults with profound sensorineural hearing loss (SNHL). Unfortunately, while many CI patients display substantial benefits in recognizing speech and understanding spoken language following cochlear implantation, a large number of patients achieve poor outcomes. Understanding and explaining the reasons for poor outcomes following implantation is a very challenging research problem that has received little attention despite its pressing clinical significance. In this paper, we discuss three challenges for future research on CIs. First, we consider the issue of individual differences and variability in outcomes following implantation. At the present time, we still do not have a complete and satisfactory account of the underlying causal factors that are responsible for the enormous individual differences and variability in outcomes. Second, we discuss issues related to the lack of preimplant predictors of outcomes. Very little prospective research has been carried out on the development of preimplant predictors that can be used to reliably identify CI candidates who may be at high risk for a poor outcome following implantation. Other than conventional demographics and hearing history, there are no prognostic tools available to predict speech recognition outcomes after implantation. Finally, we discuss the third challenge: what to do with a CI user who has a poor outcome. We suggest that new research efforts need to be devoted to studying this neglected clinical population in greater depth to find out why they are doing poorly with their CI and what novel interventions and treatments can be developed to improve their speech recognition outcomes. Using these three challenges as objectives for future research on CIs, we suggest that the field needs to adopt a new narrative grounded in theory and methods from Cognitive Hearing Science and information processing theory. Without knowing

Missions for future orbit transfer vehicles (1995-2010) are identified and the technology, operations and vehicle concepts that satisfy the transportation requirements are defined. A comparison of reusable space-based and ground-based LO2/LH2 OTVs was made. Both vehicles used advanced space engines and aero-assist capability. The SB OTV provided advantages in life cycle cost, performance and potential for improvement. A comparison of an all LO2/LH2 OTV fleet with a fleet of LO2/LH2 OTVs and electric OTVs was also made. The normal growth technology electric OTV used silicon cells with heavy shielding and argon ion thrusters. This provided a 23% advantage in total transportation cost. The impact of accelerated technology was considered in terms of improvements in performance and cost effectiveness. The accelerated technology electric vehicle used GaAs cells and annealing but did not result in the mixed fleet being any cheaper than an all LO2/LH2 OTV fleet. It is concluded that reusable LO2/LH2 OTVs can serve all general purpose cargo roles between LEO and GEO for the foreseeable future. The most significant technologies for the second generation vehicle would be space debris protection, on-orbit propellant storage and transfer, and on-orbit maintenance capability.

We focused on several technical approaches to flexible liquid crystal (LC) display in this report. We have been developing flexible displays using plastic film substrates based on polymer-dispersed LC technology with molecular alignment control. In our representative devices, molecular-aligned polymer walls keep the plastic-substrate gap constant without LC alignment disorder, and aligned polymer networks create monostable switching of fast-response ferroelectric LC (FLC) for grayscale capability. In the fabrication process, a high-viscosity FLC/monomer solution was printed, sandwiched and pressed between plastic substrates. Then the polymer walls and networks were sequentially formed based on photo-polymerization-induced phase separation in the nematic phase by two exposure processes of patterned and uniform ultraviolet light. Two flexible backlight films of direct-illumination and light-guide methods using small three-primary-color light-emitting diodes were fabricated to obtain high-visibility display images. The fabricated flexible FLC panels were driven by external transistor arrays, internal organic thin film transistor (TFT) arrays, and poly-Si TFT arrays. We achieved full-color moving-image displays using the flexible FLC panel and the flexible backlight film based on a field-sequential-color driving technique. In addition, for backlight-free flexible LC displays, flexible reflective devices of twisted guest-host nematic LC and cholesteric LC were discussed with molecular-aligned polymer walls. A single-substrate device structure and fabrication method using self-standing polymer-stabilized nematic LC film and a polymer ceiling layer were also proposed for obtaining LC devices with excellent flexibility.

Pulmonary emphysema is characterized by irreversible destruction of lung parenchyma. Emphysema is a major contributor to chronic obstructive pulmonary disease (COPD), which by itself is a major cause of morbidity and mortality in the western world. Computed tomography (CT) is an established method for the in-vivo analysis of emphysema. This review first details the pathological basis of emphysema and shows how the subtypes of emphysema can be characterized by CT. The review then shows how CT is used to quantify emphysema, and describes the requirements and foundations for quantification to be accurate. Finally, the review discusses new challenges and their potential solution, notably focused on multi-detector-row CT, and emphasizes the open questions that future research on CT of pulmonary emphysema will have to address. (orig.)
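One widely used CT quantification approach the review covers is densitometry: reporting the percentage of lung voxels below a low-attenuation threshold such as -950 HU (%LAA-950). A minimal sketch on synthetic data follows; the toy volume and mask are illustrative assumptions, not material from the review:

```python
import numpy as np

def emphysema_index(hu_volume, lung_mask, threshold=-950):
    """Percentage of lung voxels below the attenuation threshold (%LAA)."""
    lung_voxels = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung_voxels < threshold) / lung_voxels.size

# Synthetic volume: mostly normal lung tissue around -850 HU, plus one
# low-attenuation (emphysema-like) patch around -980 HU.
rng = np.random.default_rng(0)
volume = rng.normal(-850.0, 30.0, size=(4, 64, 64))
volume[0, :16, :16] = -980.0
mask = np.ones(volume.shape, dtype=bool)
idx = emphysema_index(volume, mask)
print(f"emphysema index: {idx:.1f}% of lung volume below -950 HU")
```

The accuracy requirements discussed in the review matter precisely here: reconstruction kernel, slice thickness and inspiration level all shift the HU histogram and hence the reported index.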

In the era of information technology, the elderly and disabled can be monitored with numerous intelligent devices. Sensors can be installed in their homes for continuous mobility assistance and non-obtrusive disease prevention. Modern sensor-embedded houses, or smart houses, can not only assist people with reduced physical functions but also help resolve the social isolation they face. They are capable of providing assistance without limiting or disturbing the resident's daily routine, giving him or her greater comfort, pleasure, and well-being. This article presents an international selection of leading smart home projects, as well as the associated technologies of wearable/implantable monitoring systems and assistive robotics. The latter are often designed as components of the larger smart home environment. The paper concludes by discussing future challenges of the domain.

In the last decade there has been a significant increase in the interest of the educational and scientific community in cyberbullying, a new form of peer abuse and intimidation. Despite the widespread proliferation of studies and assessment tools on the phenomenon, there are still major conceptual and methodological gaps. This paper offers a comprehensive and updated review of the results of research on the definition of the construct, its prevalence and its impact on the people involved. Finally, it focuses specifically on the assessment of the construct and provides a brief review of the general and psychometric characteristics of the instruments used in some of the most relevant national and international studies conducted on the subject. This work places special emphasis on present and future challenges and concludes with a number of general recommendations intended to guide the correct selection and/or construction of assessment instruments in this field of study.

The EC FARMING network (Food and Agriculture Restoration Management Involving Networked Groups) was set up to bring together the many and diverse stakeholders who would be involved in intervention following wide-scale radioactive contamination of the food chain, so that acceptable strategies can be developed for maintaining agricultural production and safe food supply. The network comprises stakeholder panels in the UK, Finland, Belgium, France and Greece that have met regularly since 2001 to debate, discuss and exchange opinion on the acceptability, constraints and impact of various countermeasure options and strategies. The objectives of this paper are to consolidate the main achievements of the FARMING project over the period 2000-2004, to highlight the various difficulties that were encountered, and to discuss the challenges for engaging stakeholders in off-site emergency management and long-term rehabilitation in the future.

Tissue engineering has been a promising field of research, offering hope for bridging the gap between organ shortage and transplantation needs. However, building three-dimensional (3-D) vascularized organs remains the main technological barrier to be overcome. Organ printing, which is defined as computer-aided additive biofabrication of 3-D cellular tissue constructs, has shed light on advancing this field into a new era. Organ printing takes advantage of rapid prototyping (RP) technology to print cells, biomaterials, and cell-laden biomaterials individually or in tandem, layer by layer, directly creating 3-D tissue-like structures. Here, we overview RP-based bioprinting approaches and discuss the current challenges and trends toward fabricating living organs for transplant in the near future.

The growing importance of quality of life in diverse domains, such as health, school performance and social participation, has led to the development of new conceptualisations and assessments of the construct. This diversity of perspectives brings about many benefits, but it also creates an obstacle to the formulation of a single unifying definition of the construct and, therefore, an agreed instrument or assessment framework. The aim of this study is to discuss the current methodological challenges in the measurement of quality of life. Firstly, we provide a brief description of the construct as defined in various areas; then we examine the new methodological developments and different applications. We also present an overview of the different possibilities for future developments in defining and measuring quality of life in national and international studies.

Research in the deep terrestrial biosphere is driven by interest in novel biodiversity and metabolisms, biogeochemical cycling, and the impact of human activities on this ecosystem. As this interest continues to grow, it is important to ensure that when subsurface investigations are proposed, materials recovered from the subsurface are sampled and preserved in an appropriate manner to limit contamination and ensure preservation of accurate microbial, geochemical, and mineralogical signatures. On February 20th, 2014, a workshop on "Trends and Future Challenges in Sampling The Deep Subsurface" was coordinated in Columbus, Ohio by The Ohio State University and West Virginia University faculty, and sponsored by The Ohio State University and the Sloan Foundation's Deep Carbon Observatory. The workshop aims were to identify and develop best practices for the collection, preservation, and analysis of terrestrial deep rock samples. This document summarizes the information shared during this workshop.

ITER will be the first experimental fusion facility which brings together the key physical, material and technological issues related to the development of fusion reactors. The design of ITER is complete and construction will start soon. This paper discusses the main directions of the project-oriented materials activity and the main challenges related to the selection of materials for the ITER components. For each application in ITER the main materials issues were identified, and these issues were addressed in the dedicated ITER R and D program. The justification of materials performance was fully documented, which allows traceability and reliability of design data. Several examples are given to illustrate the main achievements and recommendations from the recently updated ITER Materials Properties Handbook. The main ongoing and future materials activities are described.

The economic analysis of typical agri-food products requires a focus on the following issues: (i) the specific features of the offering system; (ii) the technical restrictions established by the EU regulations on Protected Designation of Origin (PDO) and Protected Geographical Indication (PGI); and (iii) the strategies aimed at product differentiation and value creation for the consumer. Considering this last aspect, it is important to note that the specificity of the agricultural raw materials, the use of traditional production techniques rooted in the tradition of the place, and certification represent only a prerequisite for the differentiation of the product on the market against standard products. The problem is that the specificity of a local product comes from attributes of quality (tangible and intangible) which are not directly accessible, nor verifiable, by the consumer when making purchasing choices. This situation persists despite the greater propensity of the modern consumer to invest in information and his or her greater attention and larger background in recognising different quality-based offers. This paper develops an analysis, on a theoretical and operative basis, of open strategies that can be implemented at the level of the enterprise, the agro-food chain and the territorial system in order to promote the quality of products to consumers. In particular, the work addresses the problems connected to the establishment of competitive advantages for Protected Designation of Origin (PDO) and Protected Geographical Indication (PGI) products, highlighting that in order to achieve those advantages, firms offering typical products need to differentiate their offering on both material and immaterial grounds, acting on intrinsic and extrinsic attributes of product quality, on specific features (natural, historical, cultural, etc.) of the territory, on the efficiency of the offering's organizational structure, and finally on the

The goal of the European Fusion Roadmap is to deliver fusion electricity to the grid early in the second half of this century. It breaks the quest for fusion energy into eight missions, and for each of them it describes a research and development programme to address all the open technical gaps in physics and technology and estimates the required resources. It points out the need to intensify industrial involvement and to seek all opportunities for collaboration outside Europe. The roadmap covers three periods: the short term, which runs parallel to the European Research Framework Programme Horizon 2020, the medium term and the long term. ITER is the key facility of the roadmap, as it is expected to achieve most of the important milestones on the path to fusion power. Thus, the vast majority of present resources are dedicated to ITER and its accompanying experiments. The medium term is focussed on taking ITER into operation and bringing it to full power, as well as on preparing the construction of a demonstration power plant, DEMO, which will for the first time deliver fusion electricity to the grid around the middle of this century. Building and operating DEMO is the subject of the last roadmap phase: the long term. Clearly, the Fusion Roadmap is tightly connected to the ITER schedule. Three key milestones are the first operation of ITER, the start of DT operation in ITER, and reaching the full performance at which the thermal fusion power is 10 times the power put into the plasma. The Engineering Design Activity of DEMO needs to start a few years after the first ITER plasma, while the start of the construction phase will be a few years after ITER reaches full performance. In this way ITER can give viable input to the design and development of DEMO. Because the neutron fluence in DEMO will be much higher than in ITER, it is important to develop and validate materials that can handle these very high neutron loads. For the testing of the materials, a

Intrabody communication (IBC) is a wireless communication technology that uses the human body to develop body area networks (BANs) for remote and ubiquitous monitoring. IBC uses living tissues as a transmission medium, achieving power-saving and miniaturized transceivers and making communications more robust against external interference and attacks on the privacy of transmitted data. Due to these advantages, IBC has been included as a third physical layer in the IEEE 802.15.6 standard for wireless body area networks (WBANs), designated as Human Body Communication (HBC). Further research is needed to compare both methods depending on the characteristics of the IBC application. Challenges remain for an optimal deployment of IBC technology, such as the effect of long-term use on the human body, communication optimization through more realistic models, the influence of both anthropometric characteristics and the subject's movement on transmission performance, standardization of communications, and development of small-size and energy-efficient prototypes with increased data rates. The purpose of this work is to provide an in-depth overview of recent advances and future challenges in human body/intrabody communication for wireless communications and mobile computing.

During the last decades huge amounts of data have been collected in clinical databases representing patients' health states (e.g., as laboratory results, treatment plans, medical reports). Hence, digital information available for patient-oriented decision making has increased drastically but is often scattered across different sites. As a solution, personal health record systems (PHRS) are meant to centralize an individual's health data and to allow access for the owner as well as for authorized health professionals. Yet, expert-oriented language, complex interrelations of medical facts and information overload in general pose major obstacles for patients trying to understand their own record and to draw adequate conclusions. In this context, recommender systems may supply patients with additional layman-friendly information, helping them better comprehend their health status as represented by their record. However, such systems must be adapted to cope with the specific requirements of the health domain in order to deliver highly relevant information for patients. They are referred to as health recommender systems (HRS). In this article we give an introduction to health recommender systems and explain why they are a useful enhancement to PHR solutions. Basic concepts and scenarios are discussed and a first implementation is presented. In addition, we outline an evaluation approach for such a system, which is supported by medical experts. The construction of a test collection for case-related recommendations is described. Finally, challenges and open issues are discussed.
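To make the recommender idea concrete, here is a minimal content-based sketch: terms from a patient's record are matched against a small library of lay-language articles by cosine similarity of term counts. The record, article names and texts are entirely hypothetical, and a real HRS would need far richer medical-concept matching than raw term overlap:

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two term-count vectors (Counters)."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(record_text, documents, k=2):
    """Rank lay-language documents by term overlap with the patient's record."""
    profile = Counter(record_text.lower().split())
    scored = {name: cosine(profile, Counter(text.lower().split()))
              for name, text in documents.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]

# Hypothetical record entries and consumer-health articles (illustrative only).
record = "elevated fasting glucose hba1c type 2 diabetes metformin"
library = {
    "understanding-diabetes": "what type 2 diabetes means and how glucose and hba1c are measured",
    "metformin-basics": "metformin side effects and how it lowers glucose in diabetes",
    "healthy-sleep": "tips for better sleep hygiene and rest",
}
top = recommend(record, library)
print(top)  # most relevant articles first
```

The evaluation approach the article outlines corresponds to judging, case by case, whether such a ranked list actually helps the patient understand the record.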

Gastric cancer still represents one of the major causes of cancer mortality worldwide. Patient survival is mainly related to stage, with a high proportion of patients presenting with metastatic disease. Thus, the cure rate largely depends upon surgical resection. Although the additional, albeit small, benefit of adjuvant chemotherapy has been clearly demonstrated, no general consensus has been reached on the best treatment option. Moreover, the narrow therapeutic index of adjuvant chemotherapy (i.e., limited survival benefit with considerable toxicity) requires a careful assessment of expected risks and benefits for individual patients. Treatment choices vary widely across geographic areas, with chemotherapy alone more often preferred in Europe or Asia and chemoradiotherapy in the United States. In the present review we discuss the current evidence and future challenges regarding adjuvant chemotherapy in curatively resected gastric cancer, with particular emphasis on the recently completed landmark studies and meta-analyses. The most recent patient-level meta-analysis demonstrated the benefit of adjuvant chemotherapy over curative surgery alone; the same authors also showed that disease-free survival may be used as a surrogate end-point for overall survival. We finally discuss future research issues such as the need for economic evaluations, the development of prognostic or predictive biomarkers, and the unmet clinical need for trials comparing perioperative chemotherapy with adjuvant treatment. PMID:24782604

Meeting the educational needs of students currently requires moving toward collaborative electronic and mobile learning systems that parallel the vision of Web 2.0. However, factors such as data freedom, brokerage, interconnectivity and the Internet of Things add to a vision for Web 3.0 that will require consideration in the development of future campus-based, distance and vocational study. So, education can, in future, be expected to require deeper technological connections between students and learning environments, based on significant use of sensors, mobile devices, cloud computing and rich-media visualization. Therefore, we discuss challenges associated with such a futuristic campus context, including how learning materials and environments may be enriched by it. As an additional novel element, the potential for much of that enrichment to be realized through development by students, within the curriculum, is also considered. We conclude that much of the technology required to embrace the vision of Web 3.0 in education already exists, but that further research in key areas is required for the concept to achieve its full potential.

Background: Worksite health promotion (WHP) addresses diverse individual and work-related health determinants. Thus, multiple, non-standardized interventions as well as company outcomes other than health have to be considered in WHP research.

Methods: The article builds primarily on published research reviews in WHP and related fields. It discusses key practical and research challenges of the workplace setting. The evidence available on the effectiveness of WHP is summarised and conclusions are drawn for future WHP practice and research.

Results: WHP research on health-oriented, behavioural interventions shows that the level of evidence ranges from suggestive to acceptable for key prevention areas such as physical activity, nutrition, fitness, smoking, alcohol and stress. Such interventions are effective if key conditions are met. Future research is needed on long-term effects, on multi-component programs and on programs that also address environmental determinants of health behaviour. Research on work-related determinants of health shows the economic and public health relevance of WHP interventions. Reviews of work-oriented, organisational interventions show that they produce a range of individual and organisational outcomes. However, due to the complexity of the organisational context, the generalisability and predictability of such outcomes remain limited.

Conclusions: WHP research shows success factors of WHP and provides evidence of its effectiveness. In future, the evidence base should be expanded by developing adaptive, company-driven intervention approaches which allow for continuous optimisation of companies from a health perspective. Also, approaches for active dissemination of such a systemic-salutogenic occupational health management approach should be developed to increase the public health impact of WHP.

The mission of the Office of Nuclear Energy's Fuel Cycle Technologies office (FCT program) is to provide options for possible future changes in national nuclear energy programs. While the recent draft report of the Blue Ribbon Commission on America's Nuclear Future stressed the need for organizational changes, interim waste storage and the establishment of a permanent repository for nuclear waste management, it also recognized the potential value of alternate fuel cycles and recommended continued research and development in that area. With constrained budgets and great expectations, the current challenges are significant. The FCT program now performs R and D covering the entire fuel cycle. This broad R and D scope is a result of the assignment of new research and development (R and D) responsibilities to the Office of Nuclear Energy (NE), as well as reorganization within NE. The scope includes uranium extraction from seawater and uranium enrichment R and D, used nuclear fuel recycling technology, advanced fuel development, and a fresh look at a range of disposal geologies. Additionally, the FCT program performs the necessary systems analysis and screening of fuel cycle alternatives that will identify the most promising approaches and areas of technology gaps. Finally, the FCT program is responsible for a focused effort to consider features of fuel cycle technology in a way that promotes nonproliferation and security, such as Safeguards and Security by Design, and advanced monitoring and predictive modeling capabilities. This paper and presentation will provide an overview of the FCT program R and D scope and discuss plans to analyze fuel cycle options and support identified R and D priorities into the future. The FCT program is making progress in implementing a science-based, engineering-driven research and development program that is evaluating options for a sustainable fuel cycle in the U.S. Responding to the BRC recommendations, any resulting legislative

One of the first fruits of cooperation with LBL was the use of the MBone (Multicast Backbone) to broadcast the Conference on the Future of Mathematical Communication, held at MSRI November 30--December 3, 1994. Late last fall, MSRI brought together more than 150 mathematicians, librarians, software developers, representatives of scholarly societies, and both commercial and not-for-profit publishers to discuss the revolution in scholarly communication brought about by digital technology. The conference was funded by the Department of Energy, the National Science Foundation, and the Paul and Gabriella Rosenbaum Foundation. It focused on the impact of the technological revolution on mathematics, but necessarily included issues of a much wider scope. There were talks on electronic publishing, collaboration across the Internet, economic and intellectual property issues, and various new technologies which promise to carry the revolution forward. There were panel discussions of electronic documents in mathematics, the unique nature of electronic journals, technological tools, and the role of scholarly societies. There were focus groups on Developing Countries, K-12 Education, Libraries, and TeX. The meeting also embodied the promises of the revolution; it was multicast over the MBone channel of the Internet to hundreds of sites around the world, and much information on the conference will be available on their World Wide Web server at the URL http://www.msri.org/fmc. The authors have received many comments about the meeting indicating that it has had a profound impact on how the community thinks about how scientists can communicate and make their work public.

The International Maritime Organization (IMO) has revised the air pollution regulations in MARPOL Annex VI. In 2012 Emission Control Areas (ECAs) will limit fuel sulphur content to 1%, and from 2015 to 0.1%. NOx emission limits based on ship engine speed are also reduced for new vessels (2012 & 2016). Facing this legislation, ship owners have the alternative either to operate ships with costly low-sulphur fuels, or to keep using HFO together with gas cleaning equipment at the ship's stack in order to reduce the amount of SO2 gas rejected into the atmosphere. To meet this requirement, research and development organizations have proposed a solution that uses a device for cleaning the exhaust gas of marine diesel engines. The paper presents a short communication about the DEECON project, whose aim is to create a novel on-board after-treatment unit more advanced than any currently available. Each sub-unit of the system will be optimized to remove a specific primary pollutant. In particular, the technology within the DEECON system is based on novel or improved abatement techniques for reducing SOx, NOx, particulate matter (PM), CO and volatile organic compounds (VOC). Some of these technologies are completely new to the maritime sector and will represent a breakthrough in the reduction of the atmospheric emissions of ships, moving forward the performance of exhaust gas cleaning systems and fostering and anticipating the adoption of future, tighter regulatory requirements. In addition, an after-treatment strategy enables the possible adoption of alternative fuels, which often have their own emissions characteristics.

Several publications have identified technical challenges facing Uganda's National Transmission Backbone Infrastructure project. This research addresses the technical limitations of the National Transmission Backbone Infrastructure project, evaluates the goals of the project, and compares the results against the technical capability of the backbone. The findings of the study indicate a bandwidth deficit, which could be addressed by using dense wavelength-division multiplexing repeaters and by leasing bandwidth from private companies. Microwave links for redundancy, a Network Operation Center for operation and maintenance, and deployment of Worldwide Interoperability for Microwave Access (WiMAX) as a last-mile solution are also suggested.

The paper approaches the specific case of technical discourse in the context of a modern world which facilitates and promotes an ever more refined diversification of specialized texts. Created, imposed, promoted and sustained by economic reasons, the translation of technical texts finds new challenges as it is confronted with the opportunities offered by cyberspace. While quick, available and free, online instant translation services prove to be essentially inappropriate for the translation of technical texts, where accuracy is a prerequisite.

Sustainable drive systems and innovative safety technologies are the mainstays of Daimler's vision of mobility for the future. Vehicles with hydrogen-powered fuel cells and battery-powered drivetrains provide ideal conditions for environmentally friendly mobility that saves natural resources. Already several years ago Daimler launched a fleet of 100 smart electric drive vehicles that are operated by customers in the London metropolitan area. A key enabler for this powertrain technology is the high-voltage battery. The customer feedback on the smart electric drive vehicles proves that battery electric vehicles are a successful answer to zero-emission mobility in urban areas. As the pioneer of fuel cell technology, Daimler presented the first vehicle with this highly efficient and environmentally friendly drive concept as early as 1994. With more than 100 test vehicles that have altogether covered more than four million kilometres, Daimler has the most experience in fuel cell vehicles worldwide - from compact A-Class passenger cars to Sprinter vans and large Citaro fuel cell buses. The Mercedes-Benz B-Class F-CELL is the first series-produced vehicle with a zero-emission fuel-cell drive. Small-series production of the passenger car started in late 2009. A new generation of fuel-cell drive is used to power this innovative vehicle. The fuel cell system is much more compact while at the same time offering higher performance. It is also completely suitable for everyday use. The fuel cell system used in the Mercedes-Benz B-Class F-CELL is also demonstrating its suitability for heavy-duty operation in commercial vehicles. By combining two B-Class systems with an energy storage unit, a highly powerful aggregate is created for application in the new FuelCELL-Hybrid bus. (orig.)

In a world with a rapidly increasing population and rapid technological development, new space-based remote sensing tools have allowed new discoveries and the production of water, energy and mineral resources, including minerals, soils and construction materials. This has an impact on politics and socio-economic development and thus calls for a strong involvement of the geosciences, because one of humanity's biggest challenges will be to raise living standards, particularly in less developed countries. Any growth will lead to an increase in demand for natural resources. But especially for readily available mineral resources, supply appears to be limited. Demand for so-called high-tech commodities - platinum group or rare earth elements - has increased particularly, often faster than new discoveries were made, while the areas available for exploration decreased as the need for urban and agricultural use increased. Despite strong efforts to increase the efficiency of recycling, shortages in some commodities have to be expected. A major concern is that resources are not distributed evenly on our planet; supplies therefore depend on political stability, socio-economic standards and pricing. In the light of these statements, IUGS is scoping a new initiative, Resourcing Future Generations (RFG), which is predicated on the fact that mining will continue to be an essential activity to meet the needs of future generations. RFG is aimed at identifying and addressing key challenges involved in securing natural resources to meet global needs post-2030. We consider that mineral resources should be the initial focus, but energy, soils, water resources and land use should also be covered. Addressing the multi-generational needs for mineral and other natural resources requires data, research and actions under four general themes: 1. Comprehensive evaluation and quantification of 21st century supply and demand. 2. Enhanced understanding of the subsurface as it relates to mineral (energy and groundwater

Drought prediction is of critical importance to early warning for drought management. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecasts from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction, with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead times and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecasting to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.
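The statistical approach described above - regressing a drought index on a suite of potential predictors - can be sketched in a few lines. The example below is a minimal illustration with synthetic data; the predictor names (an ENSO-type climate index and antecedent soil moisture) are assumptions for demonstration, not the review's actual model.

```python
import numpy as np

# Statistical drought prediction sketch: fit a drought index (e.g. SPI)
# against large-scale climate predictors by ordinary least squares.
# All data below are synthetic; coefficients 0.6 and 0.3 are invented.
rng = np.random.default_rng(0)
n = 120                             # months of record
enso = rng.normal(size=n)           # large-scale climate index (assumed)
soil_moisture = rng.normal(size=n)  # land initial condition (assumed)
spi = 0.6 * enso + 0.3 * soil_moisture + 0.1 * rng.normal(size=n)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n), enso, soil_moisture])
coef, *_ = np.linalg.lstsq(X, spi, rcond=None)

# Forecast next month's SPI from new predictor values.
x_new = np.array([1.0, 1.2, -0.5])
spi_forecast = x_new @ coef
```

A hybrid scheme of the kind the review mentions would merge such a statistical forecast with a GCM-driven dynamical one, e.g. by skill-weighted averaging.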

Scabies is a contagious skin disease caused by the mite Sarcoptes scabiei. It is found worldwide, particularly in regions marked by poverty, remoteness, and poor sanitation and nutritional status, in both humans and animals. Scabies is transmitted by direct contact. The clinical signs are intense pruritus (itchiness), erythema, papules and vesicles. Infestation with S. scabiei damages the skin and can raise animal mortality to 50-100%, while 300 million people per year are reported to suffer from scabies. Diagnosis of scabies is based on clinical signs and confirmed by gently scraping the skin over a burrow (looking for eggs, faeces and mites). The diagnosis can also be obtained by an ink test, mineral oil, or a fluorescent tetracycline test. ELISA methods for detecting human scabies still have the disadvantage of cross-reactions between host skin antigens and antigens of S. scabiei varieties. The development of a scabies vaccine also faces many problems. Some human scabies cases are suspected to originate from livestock or pet animals. Good, synergistic collaboration between health and livestock agencies - involving human and veterinary medical staff, investigators, quarantine personnel and researchers - is therefore required. These factors constitute a present and future challenge: to prevent the spread of scabies to larger areas and to minimize scabies cases in both humans and animals, particularly in endemic areas.

The Universal Declaration of Human Rights recognizes the family as the basic cell of society, highlighting its importance and the need to protect and promote it as the natural and fundamental group unit of society. To reflect on the effects that contemporary culture has on the family, it is important to start from the situation as it now presents itself, and then to consider the changes deemed necessary to secure the family's own future and that of its habitat, the Earth. To accomplish the first task, some results of the World Family Map 2015 study are reviewed. Reflections follow on the pairing of environmental ecology and human ecology, where adequate anthropological concepts are essential to succeed in each of these fields. The current decline of culture directly affects the family by undermining one of its main foundations, human love, which is a real threat. Some challenges for the family are proposed, and strategies to address them are outlined. The conclusion is that, if the activities endangering the human species and destroying the planet are not to continue, the current social structure and culture must change, beginning with the family; the family must be helped to become aware of its problems and limitations, so that truly human solutions can then be given to these problems, which are almost always produced by man himself.

Pulmonary diseases and infections are among the top contributors to human morbidity and mortality worldwide, and despite the successful history of vaccines and antimicrobial therapeutics, infectious disease still presents a significant threat to human health. Effective vaccines are frequently unavailable in developing countries, and successful vaccines have yet to be developed for major global maladies, such as tuberculosis. Furthermore, antibiotic resistance poses a growing threat to human health. The “Challenges and Future in Vaccines, Drug Development, and Immunomodulatory Therapy” session of the 2013 Pittsburgh International Lung Conference highlighted several recent and current studies related to treatment and prevention of antibiotic-resistant bacterial infections, highly pathogenic influenza, respiratory syncytial virus, and tuberculosis. Research presented here focused on novel antimicrobial therapies, new vaccines that are either in development or currently in clinical trials, and the potential for immunomodulatory therapies. These studies are making important contributions to the areas of microbiology, virology, and immunology related to pulmonary diseases and infections and are paving the way for improvements in the efficacy of vaccines and antimicrobials. PMID:25148426

Mental health literacy (MHL) is one increasingly researched factor thought to influence mental health behaviors. Researchers have argued for expanding the definition of MHL to include additional constructs, but no consensus has yet been reached on what constructs should be included as part of MHL. The purpose of this paper is to (i) elucidate how the expansion of the MHL construct has impeded the growth of MHL research and (ii) through the lens of construct and theory development, highlight how these challenges might be remedied. An inclusive search of the literature was undertaken to identify MHL studies. The principles of construct and theory development guided a critical analysis of MHL. The review of the literature found that MHL violates many principles of what constitutes an acceptable construct definition. To address these concerns, we proposed conceptualizing MHL as a theory and recommended principles of theory development that should be taken into consideration. A theory of MHL can guide future researchers to clearly delineate important constructs and their interrelationships. For practitioners, a theory of MHL can help inform how to improve MHL at both the individual and community level.

This white paper explores the technical challenges and solutions for acquiring (capturing) and managing enterprise images, particularly those involving visible-light applications. The types of acquisition devices used for general-purpose photography and for specialized applications, including dermatology, endoscopy, and anatomic pathology, are reviewed. The formats and standards used, and the associated metadata requirements and communication protocols for transfer and workflow, are consider...

The challenge study is a project-based learning curriculum at Technical High School aimed at the construction of a wireless communication system. The first period covered engineering issues in the construction of an artificial satellite, and the second a positional locating system based on a general-purpose wireless device, the ZigBee device.…

A large set of 5350 trend-following technical trading rules is applied to LIFFE and CSCE cocoa futures prices, and to the Pound-Dollar exchange rate, in the period 1983:1-1997:6. We find that 72% of the trading rules generate positive profits, even when correcting for transaction and borrowing

A large set of 5350 trend-following technical trading rules is applied to LIFFE and CSCE cocoa futures prices, and to the Pound-Dollar exchange rate, in the period 1983:1-1997:6. We find that 58% of the trading rules generate a strictly positive excess return, even when correcting for transaction
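A typical member of the trend-following rule class studied above is the moving-average crossover: go long when a short moving average is above a long one, short otherwise. The sketch below illustrates that idea only; the prices are synthetic, the (5, 20) windows are arbitrary, and the paper's 5350-rule universe and cost corrections are not reproduced.

```python
import numpy as np

def sma(x, w):
    """Simple moving average with window w (valid region only)."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

# Synthetic geometric-random-walk price series (illustration only).
rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(0.001 + 0.01 * rng.normal(size=500)))

short_w, long_w = 5, 20
ma_s = sma(prices, short_w)[long_w - short_w:]  # align with the long MA
ma_l = sma(prices, long_w)

# +1 = long, -1 = short; the signal at time t trades the t -> t+1 return,
# so there is no look-ahead.
signal = np.where(ma_s > ma_l, 1.0, -1.0)
rets = np.diff(np.log(prices))[long_w - 1:]
strategy_rets = signal[:-1] * rets
total_return = strategy_rets.sum()
```

A study like the one above would sweep many window pairs (and filter/stop variants) over real futures prices and count how many rules remain profitable after transaction costs.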

Inspired by the recommendations of the NSF report “Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge” (NSF08204), the NSF National STEM Digital Learning program funded the “Planning for the Future of Geocybereducation” Workshop, which sought to bring together leaders from the geoscience education community, from major geoscience research initiatives, and from the growing public- and private-sector geoscience information community. The objectives of the workshop were to begin conversations aimed at identifying best practices and tools for geoscience cyber-education, in the context of both the changing nature of learners and rapidly evolving geo-information platforms, and to provide guidance to the NSF on necessary future directions and funding needs. Sixty-five participants met and interacted live for the two-day workshop, with ongoing post-meeting virtual interactions via a collaborative workspace (www.geocybered.ning.com). Topics addressed included the rapidly changing character of learners, the growing capabilities of geoscience information systems and their affiliated tools, and effective models for collaboration among educators, researchers and geoinformation specialists. Discussions at the meeting focused on the implications of changing learners for the educational process, the challenges for teachers and administrators in keeping pace, and the challenges of communication among these divergent professional communities. Ongoing virtual discussions and collaborations have produced a draft workshop document, and the workshop conveners are maintaining the workshop site as a venue for ongoing discussion and interaction. Several key challenges were evident from the workshop discussions and subsequent interactions: a) most of the large geoinformatics and geoscience research efforts were not pursued with education as a significant objective, resulting in limited financial support for such activities after the

Viruses cause epidemics in all major crops of agronomic importance, representing a serious threat to global food security. As strict intracellular pathogens, they cannot be controlled chemically, and prophylactic measures consist mainly of destroying infected plants and applying large amounts of pesticide to limit the population of vector organisms. A powerful alternative frequently employed in agriculture relies on the use of crop genetic resistances, an approach that depends on the mechanisms governing plant-virus interactions. Hence, knowledge of the molecular bases of viral infections and crop resistances is key to facing viral attacks in the field. Over the past 80 years, great advances have been made in our understanding of plant immunity against viruses. Although most of the known natural resistance genes have long been dominant R genes (encoding NBS-LRR proteins), a vast number of crop recessive resistance genes were cloned in the last decade, emphasizing another evolutionary strategy to block viruses. In addition, the discovery of RNA interference pathways highlighted a very efficient antiviral system targeting the infectious agent at the nucleic acid level. Insidiously, plant viruses evolve and often acquire the ability to overcome the resistances employed by breeders. The development of efficient and durable resistances able to withstand the extreme genetic plasticity of viruses therefore represents a major challenge for the coming years. This review describes some of the most devastating viral diseases of crops and summarizes current knowledge about plant-virus interactions, focusing on resistance mechanisms that prevent or limit viral infection in plants. In addition, I discuss the current outcomes of the actions employed to control viral diseases in the field and the future investigations needed to develop sustainable, broad-spectrum crop resistances against viruses.

NASA has been actively involved in studying the planet Earth and its changing environment for well over thirty years. Within the last decade, NASA's Earth Science Enterprise has become a major observational and scientific element of the U.S. Global Change Research Program. NASA's Earth Science Enterprise management has developed a comprehensive observation-based research program addressing all the critical science questions that will take us into the next century. Furthermore, the entire program is being mapped to answer five Science Themes: (1) land-cover and land-use change research, (2) seasonal-to-interannual climate variability and prediction, (3) natural hazards research and applications, (4) long-term climate-natural variability and change research, and (5) atmospheric ozone research. Now the emergence of newer technologies on the horizon and, at the same time, a continuously declining budget environment have led to an effort to refocus the Earth Science Enterprise activities. The intent is not to compromise the overall scientific goals, but rather to strengthen them by enabling challenging detection, computational and space flight technologies that have not been practically feasible to date. NASA is planning faster, cost-effective and relatively smaller missions to continue the science observations from space for the next decade. At the same time, there is growing interest worldwide in remote sensing, which will allow NASA to build strong coalitions with a number of international partners. The focus of this presentation is to provide a comprehensive look at NASA's Earth Science Enterprise in terms of its brief history, scientific objectives, organization, activities and future direction.

The industrialization that Latin America has experienced during the past 50 years, the increase in population, and the growth of chemical-related industries have generated a variety of environmental problems that must be addressed. Given these profound changes, greater emphasis should be placed on the study of environmental health and toxicology. Latin American countries face many problems that are common to other developing nations. There is therefore a demand for safety assessment and regulatory control of chemicals, which creates a need for increasing numbers of toxicologists. To meet this demand, educational programs in toxicology have to be designed. This paper utilizes a consultation questionnaire that includes toxicology-network members, scientists, and educational institutions where toxicology is taught. The information collected is analyzed, with an emphasis on what we currently lack and on future challenges for toxicology professionals. Although the response rate from the study institutions was 65% (13 countries out of 20), the paper aims to assess the present situation of toxicology. The desirability of certification/recognition for toxicologists is also evaluated. Action needs to be taken to promote scientific development based on region-specific needs, which requires increasing the number of toxicology programs and promoting cooperation between academics and researchers. Among the limitations are the variability of curricula, objectives, and priorities. The increasing globalization of markets and regulations requires the harmonization of graduate/postgraduate programs to ensure that risk assessment and management are dealt with uniformly. Cooperation among our countries and international assistance should play a more prominent role in promoting regional integration and making more efficient use of international experience in defining educational policies

This is an invited paper for a special issue on international perspectives on training and practice in clinical neuropsychology. We provide a review of the status of clinical neuropsychology in Israel, including the history of the field and the educational and accreditation requirements to become a clinical neuropsychologist and to practice clinical neuropsychology. The information is based primarily on the personal knowledge of the authors, who have been practicing clinical neuropsychology for over three decades and hold various administrative and academic positions in this field. Second, we conducted three ad hoc surveys among clinical and rehabilitation psychologists; heads of academic programs for rehabilitation and neuropsychology; and heads of accredited service providers. Third, we present a literature review of publications by clinical neuropsychologists in Israel. Most clinical neuropsychologists are graduates of either rehabilitation or clinical training programs, and the vast majority are affiliated with rehabilitation psychology. The training programs (2-3 years of graduate school) provide students with solid therapeutic and diagnostic skills. Seventy-five percent of the participants in this survey are employed at least part-time by public or state-funded institutions. Israeli neuropsychologists are heavily involved in case management, including vocational counseling and rehabilitation psychotherapy. Conclusions and future goals: Although clinical neuropsychologists in Israel are well educated and valued by all health professionals, there are still several challenges that must be addressed in order to further advance the field and the profession. These include the need for standardized and normed Hebrew-language neuropsychological tests and the application of evidence-based interventions in neuropsychological rehabilitation.

Society greatly benefits from space-based infrastructure and technology. For example, signals from Global Navigation Satellite Systems (GNSS) are used across a wide range of industrial sectors; including aviation, mining, agriculture and finance. Current trends indicate that the use of these space-based technologies is likely to increase over the coming decades as the global economy becomes more technology-dependent. Space weather represents a key vulnerability to space-based technology, both in terms of the space environment effects on satellite infrastructure and the influence of the ionosphere on the radio signals used for satellite communications. In recent decades, the impact of the ionosphere on GNSS signals has re-ignited research interest into the equatorial ionosphere, particularly towards understanding Equatorial Plasma Bubbles (EPBs). EPBs are a dominant source of nighttime plasma irregularities in the low-latitude ionosphere, which can cause severe scintillation on GNSS signals and subsequent degradation on GNSS product quality. Currently, ionospheric scintillation event forecasts are not being routinely released by any space weather prediction agency around the world, but this is likely to change in the near future. In this contribution, an overview of recent efforts to develop a global ionospheric scintillation prediction capability within Australia will be given. The challenges in understanding user requirements for ionospheric scintillation predictions will be discussed. Next, the use of ground- and space-based datasets for the purpose of near-real time ionospheric scintillation monitoring will be explored. Finally, some modeling that has shown significant promise in transitioning towards an operational ionospheric scintillation forecasting system will be discussed.

The Technical Committee Meeting on Approaches to Safety of Future Nuclear Power Plants in Different Countries, held from 29 May to 2 June 1995, contributed to this process. Experts from 14 different countries and two international organizations participated in the meeting, which provided the opportunity to exchange information and to review the answers developed to date to these issues (primarily from the IAEA's technical document ''Development of Safety Principles for the Design of Future Nuclear Power Plants'', IAEA-TECDOC-801) and the report of the International Nuclear Safety Advisory Group ''Basic Safety Principles for Nuclear Power Plants'' (INSAG-3). These references were then used as a starting point for answering the question: ''To what degree does general agreement (or harmonization) exist on these desired safety approaches for future reactors, and what opportunities remain for further harmonization?'' 11 refs, 1 tab

This document provides a summary of technical information on the synthetic radioisotope 233U. It is one of a series of four reports that map out a national strategy for the future use and disposition of 233U. The technical information on 233U in this document falls into two main areas. First, material characteristics are presented, along with the contrasts between 233U and the better-known strategic fissile materials, 235U and plutonium (Pu). Second, information derived from these characteristics, such as safeguards, waste classifications, material form, and packaging, is presented. Throughout, the effects of isotopically diluting 233U with nonfissile, depleted uranium (DU) are examined
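The isotopic dilution the report examines reduces, at first order, to a simple mass balance on the 233U weight fraction of the blend. The sketch below illustrates that arithmetic only; the 12 wt% target is an illustrative figure, not a criterion taken from the report, and minor isotopes in DU are neglected.

```python
# Mass balance for isotopic dilution of 233U with depleted uranium (DU).
# Assumption: the DU contributes effectively no 233U, so the blended
# weight fraction is just m_233 / (m_233 + m_DU).

def diluted_fraction(m_u233, m_du):
    """233U weight fraction after blending m_u233 kg of 233U with m_du kg DU."""
    return m_u233 / (m_u233 + m_du)

def du_needed(m_u233, target_fraction):
    """Mass of DU required to dilute m_u233 kg of 233U to a target fraction."""
    return m_u233 * (1.0 - target_fraction) / target_fraction

m = 1.0                  # kg of 233U (illustrative)
du = du_needed(m, 0.12)  # DU mass to reach 12 wt% 233U (illustrative target)
frac = diluted_fraction(m, du)
```

For 1 kg of 233U and a 12 wt% target, the mass balance calls for roughly 7.3 kg of DU; the actual dilution levels relevant to safeguards or waste classification are those discussed in the report itself.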

In May 2011, a workshop was held to develop broader awareness of the technical and operational challenges that could be used to enhance effective transparency and/or verification in the medium to long term. Building confidence in a broader multilateral engagement scenario poses even greater challenges than the traditional bilateral approaches. The multi-disciplinary group that attended included decision-makers needing to understand present and possible future technical capabilities, and the technical community needing a clearer definition of possible requirements and operational constraints. In addition to traditional presentations, the group conducted an exercise to stimulate new perspectives on verification requirements for a scenario based on nuclear arms reductions to very low numbers of nuclear weapons. The workshop participants were divided into two groups and asked to explore the political and technical requirements needed for States to move towards significant arms reductions. Using a technique called 'back-casting', participants were asked to imagine a world without nuclear weapons and describe what would be needed to reach levels of one thousand, one hundred, ten, and ultimately zero weapons in the world. Most participants agreed that a strong political commitment will be necessary and that complete disarmament will only be possible if states are convinced that nuclear weapons serve no purpose. Both groups believed that a period of greater instability would be encountered when moving from 1000 to 100 nuclear weapons and that it would be imperative to move quickly through this period. The group discussed the need for an international body to monitor the disarmament process and maintain legitimacy for the international community. One possibility could be the development of an intergovernmental panel on verification and disarmament to monitor and facilitate disarmament. The groups recognized the problem of fissile material disposition after

Nothing brings out the best in eighth-grade physical science students quite like an engineering challenge. The wind turbine design challenge described in this article has proved to be a favorite among students with its focus on teamwork and creativity and its (almost) sneaky reinforcement of numerous physics concepts. For this activity, pairs of…

This article takes stock of public service motivation research to identify achievements, challenges, and an agenda for research to build on progress made since 1990. After enumerating achievements and challenges, the authors take stock of progress on extant proposals to strengthen research. In

There are huge challenges facing the library and information science profession. Librarians need to be 'blended professionals' who can take their professional skills and experience, and adapt them to different business models and strategic challenges. This work intends to stimulate strategic and innovative thinking and questions the status quo.

The overall purpose of the Technical Meeting was to recognize and analyse the implications of the accident that occurred at the Fukushima Dai-ichi Nuclear Power Station for the design and operation of current and future fast neutron systems. The aim was to provide a global forum for discussing the principal lessons learned from this event, and thus to review the safety principles and characteristics of existing and future fast neutron concepts, especially in relation to extreme natural events that may lead to severe accident scenarios. The participants also presented and discussed innovative technical solutions, design features and countermeasures for design extension conditions - including earthquakes, tsunami and other extreme natural hazards - which can enhance the safety level of existing and future fast neutron systems. Furthermore, the meeting gave the opportunity to present advanced methods for evaluating the robustness of plants against design extension conditions. Another important goal of this TM was to discuss how to harmonize safety approaches and goals for next-generation fast reactors. Finally, the meeting was intended to identify areas where further research and development in nuclear safety, technology and engineering are needed in the light of the Fukushima accident. Within the framework of its Nuclear Safety Action Plan, endorsed by all Member States, the IAEA will consider these areas as potential technical topics for new Coordinated Research Projects, to be launched in the near future

During the early development of remote systems for processing and examining fuel and materials from nuclear reactors, the facility designer and operator worked closely together to meet the challenges of this new field. Numerous challenges still face the nuclear remote systems engineer, e.g., the development of systems that reduce the exposure of workers, the need for advances in basic technology, and the development of cost-effective facilities. The solution to these and other challenges can be accelerated by an expanded program of information exchange, an aggressive development program, and improved project management procedures

The future appears rich in missions that will extend the frontiers of knowledge, human presence in space, and opportunities for profitable commerce. Key to the success of these ventures is the availability of plentiful, cost-effective electric power and assured, low-cost access to space. While forecasts of space power needs are problematic, an assessment of future needs based on terrestrial experience has been made. These needs fall into three broad categories: survival, self-sufficiency, and industrialization. The cost of delivering payloads to orbital locations from LEO to Mars has been determined and future launch cost reductions projected. From these factors, projections of the performance necessary for future solar and nuclear space power options have been made. These goals depend largely on orbital location and energy storage needs. Finally, the cost of present space power systems has been determined and projections made for future systems.

Future exploration missions will require NASA to integrate more automation and robotics in order to accomplish mission objectives. This presentation will describe the future challenges facing the human operators (astronauts, ground controllers) as we increase the amount of automation and robotics in spaceflight operations. It will describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. The presentation will outline future human-automation-robotic integration challenges.

There has been no large-scale naval combat in the last 30 years. With the rapid development of warships, weapons manufacturing and electronic technology, naval combat will present some new characteristics. Additionally, naval combat is facing unprecedented challenges. In this paper, we discuss the topic of medical rescue at sea: what challenges we face and what we could do. The contents discussed in this paper include battlefield self-aid and buddy care, clinical skills, organized health servi...

The development of massive open online courses (MOOCs) has launched an era of large-scale interactive participation in education. While massive open enrolment and the advances of learning technology are creating exciting potentials for lifelong learning in formal and informal ways, the implementation of efficient and effective assessment is still problematic. To ensure that genuine learning occurs, both assessments for learning (formative assessments), which evaluate students' current progress, and assessments of learning (summative assessments), which record students' cumulative progress, are needed. Providers' more recent shift towards the granting of certificates and digital badges for course accomplishments also indicates the need for proper, secure and accurate assessment results to ensure accountability. This article examines possible assessment approaches that fit open online education from formative and summative assessment perspectives. The authors discuss the importance of, and challenges to, implementing assessments of MOOC learners' progress for both purposes. Various formative and summative assessment approaches are then identified. The authors examine and analyse their respective advantages and disadvantages. They conclude that peer assessment is quite possibly the only universally applicable approach in massive open online education. They discuss the promises, practical and technical challenges, current developments in and recommendations for implementing peer assessment.
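One of the practical questions in the peer assessment the article discusses is how to aggregate several peer scores into one grade at MOOC scale. The sketch below shows one common, simple choice - the median, which resists a single careless or adversarial grader. The submissions, scores and 0-10 scale are invented for illustration; real systems often add grader calibration on top of this.

```python
import statistics

# Peer-grade aggregation sketch: each submission receives several peer
# scores; the final grade is the median, so one outlier grader cannot
# move the result much. Data below are invented.
peer_scores = {
    "submission_a": [8, 9, 7, 8, 2],   # includes one outlier grader
    "submission_b": [5, 6, 6, 5, 6],
}

final_grades = {sub: statistics.median(scores)
                for sub, scores in peer_scores.items()}
# submission_a's outlier score of 2 barely affects its median grade.
```

A mean-based aggregate would drag submission_a's grade down to 6.8; the median keeps it at 8, which is why robust aggregation matters for accountability in certificate-granting courses.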

eliminating the source of pollution, but also on blocking the pathways from contaminants to receptors or reducing the exposure to contaminants. Future challenge: the integration of sustainability into remediation decision-making. Soil is not a waste! There is a growing interest in clean-up approaches that maintain soil quality after remediation treatments. This issue is of great importance in the U.S.A., where since 2009 the EPA has been promoting innovative clean-up strategies (Green Remediation). Green remediation is defined as the practice of considering all environmental effects of a remedy and incorporating options to maximize the environmental benefit of clean-up actions. These remediation strategies restore contaminated sites to productive use with great attention to global environmental quality, including the preservation of soil functionality, according to the following principles: use minimally invasive technologies; use passive energy technologies such as bioremediation and phytoremediation as primary remedies or finishing steps where possible and effective; minimize soil and habitat disturbance; minimize the bioavailability of contaminants through adequate contaminant source and plume control. If we move away from the current definition of remedial targets based on total concentrations, technologies with a low impact on the environment can be utilized, avoiding the wrong choice of disposing of soil in landfills and thereby quickly destroying a non-renewable, essential resource.

'Full text:' The Fukushima Daiichi accident raises a fundamental question: Can science and technology prevent the inevitability of serious accidents, especially those with low probabilities and high consequences? This question reminds us of a longstanding issue with the trans-sciences, originally addressed by a great physicist, Alvin Weinberg, well before the Three Mile Island and Chernobyl accidents. A most robust nuclear energy future would not come to pass without answering it. With the human-behavior-linked nature of the Fukushima Daiichi accident in mind, an innovative way to assure safety must be developed not merely from the standpoint of technical measures but rather from the point of view of the inter-related institutional structure between the technology and the social system. In other words, social and institutional arrangements to strengthen self-efforts for continuous improvement of safety are to be made specifically dedicated to pursuing procedural rationality between the inner and outer environments, in which communicative action with transparency and a self-regulating system are especially useful. This is the challenge facing the global nuclear future, as well as Japan, to be tackled in the wake of the accident. (author)

Plant infection is a complicated process. On encountering a plant, pathogenic microorganisms must first adapt to life on the epiphytic surface, and survive long enough to initiate an infection. Responsiveness to the environment is critical throughout infection, with intracellular and community-level signal transduction pathways integrating environmental signals and triggering appropriate responses in the bacterial population. Ultimately, phytopathogens must migrate from the epiphytic surface into the plant tissue using motility and chemotaxis pathways. This migration is coupled with overcoming the physical and chemical barriers to entry into the plant apoplast. Once inside the plant, bacteria use an array of secretion systems to release phytotoxins and protein effectors that fulfil diverse pathogenic functions (Fig. ) (Melotto and Kunkel, ; Phan Tran et al., ). As our understanding of the pathways and mechanisms underpinning plant pathogenicity increases, a number of central research challenges are emerging that will profoundly shape the direction of research in the future. We need to understand the bacterial phenotypes that promote epiphytic survival and surface adaptation in pathogenic bacteria. How do these pathways function in the context of the plant-associated microbiome, and what impact does this complex microbial community have on the onset and severity of plant infections? The huge importance of bacterial signal transduction to every stage of plant infection is becoming increasingly clear. However, there is a great deal to learn about how these signalling pathways function in phytopathogenic bacteria, and the contribution they make to various aspects of plant pathogenicity. We are increasingly able to explore the structural and functional diversity of small-molecule natural products from plant pathogens. We need to acquire a much better understanding of the production, deployment, functional redundancy and physiological roles of these molecules. Type III

The Institute for Energy Process Engineering (IWV-3) is one of three departments of the Institute for Materials and Processes in Energy Systems at Research Centre Juelich. The research tasks of IWV-3 are oriented, on the one hand, to the design and construction of polymer and high-temperature fuel cells and stacks for stationary, portable or mobile applications, extending up to complete systems with fuel cells. On the other hand, process- and system-engineering developments comprise the provision of apparatus for fuel processing. These activities are accompanied by basic physico-chemical studies and systems analyses of energy process engineering. The present report aims to provide an insight into the diversified aspects of scientific and technical work at IWV-3. Examples of success demonstrate the implementation of the claim to provide top-class results of social, ecological and economic relevance in an international comparison. The Institute contributes to education and further training in cooperation with universities, universities of applied sciences and training workshops. The description of the fields of activity and of relevant work results illustrates the connection of basic research with technical development work in priority topics. It thus becomes apparent that the Institute's scientific and technical work is oriented to the transformation of research results into innovative products, methods and processes. The presentation of selected R and D projects documents the significance and the role of international cooperation with partners from research and industry. Explanations concerning priority activities and the work approach, as well as the allocation of special departments, competence fields and R and D goals, round off the report. (orig.)

This whitepaper describes the major IT challenges in scientific research at CERN and several other European and international research laboratories and projects. Each challenge is exemplified through a set of concrete use cases drawn from the requirements of large-scale scientific programs. The paper is based on contributions from many researchers and IT experts of the participating laboratories and also input from the existing CERN openlab industrial sponsors. The views expressed in this document are those of the individual contributors and do not necessarily reflect the view of their organisations and/or affiliates.

are met in conjunction with situations where the esthetical design issues are addressed. Finally, our study also points to the necessity of finding trans-disciplinary cooperation across sectors to respond more effectively to the climate change challenge when designing low-carbon technologies...

This year has been proclaimed the International Year of Chemistry by the United Nations. This year-long celebration allows chemists to highlight the rich history and successes of their scientific discipline and to explain how chemistry can help to solve the global challenges that mankind faces today.

Purpose -- The purpose of this paper is to undertake a survey of the external and internal forces changing the nature of business schools and business education. It aims to investigate how management education responds to increasing productivity, innovation and capability challenges, examine how MBA programs currently meet these demands, and how…

among other things at ... in the planning of the new extensions to Korle Bu ... one ward (ward M) housed all the paediatric pa- ... Health Care Delivery to children by the Women Medi- .... Street, London to try to learn some paediatrics. ... career. Times indeed have changed, but there will al- ways be personal challenges for ...

This paper explores the implications of moderately expanding plutonium "pit" production capability within the strong R&D culture of Los Alamos National Laboratory, especially in terms of the lab's current capacity or "fitness for the future", in which institutional stewardship of the nation's nuclear deterrent capability becomes a primary objective. The institutional properties needed to assure "future fitness" include the organizational requisites of highly reliable operations and sustained institutional constancy, in a manner that evokes deep public trust and confidence. Estimates are made of the degree to which the key Division and the most relevant Program office in this evolution already exhibit these properties.

Teacher educators need to take the responsibility for providing future teachers with long-term evaluative skills necessary to select good literature. Educators must also take responsibility for modeling the powerful notion that books or literature aid in everyone's personal search for meaning. The process of analyzing literature is helpful in…

The global food and agribusiness industry is in the midst of major changes, and the pace of change seems to be increasing. These changes suggest three fundamental critical future issues for the sector: 1) decisions must be made in an environment of increasing risk and uncertainty, 2) developing and

This article summarises the most important topics that were discussed during the Archaeology session of the "Tree Rings and People" conference in Davos, Switzerland. Main topics are the relation between archaeologists and dendrochronologists and the future perspectives of dendrochronology in the

To enter a future that waits to be born, educational leaders must continually assess their own ethical stance as well as that of the organizations they serve. Three frames form a model for examining the ethics of both individual and organization, with internal monologue and engaged conversation as the means for reflection and action.

Technology foresight on environment sector was carried out under the supervision of Pakistan Technology Board on the theme “Environment 2025: Our future, our choices”. Social, technological, environmental, economical, political and values (STEEPV) is an internationally recognized tool for brainstorming used in ...

A look at pharmaceutical care needs in the future is the basis for discussion of the educational needs of clinical pharmacists. Issues discussed include the appropriate degree (bachelor's vs. doctoral), costs of instruction, faculty/student ratios, the pharmacy practice faculty as role models, and computer-assisted instruction. (MSE)

ZERO TO THREE's executive director and members of the board offer their thoughts on working to support the healthy development of infants, toddlers, and their families over the past 30 years, and where we need to go as we look to the future.

The increasing complexity of the conventional grid, driven by population growth and advances in technology and infrastructure, contributes immensely to instability, insecurity and inefficiency; together with concerns about environmental energy sustainability, this calls for the use of renewable energy to sustain the power supply. The intermittency and fluctuation of renewable energy pose a great challenge to the smart grid. This paper reveals the potential challenges of renewable energy on the smart grid and proffers solutions through the application of high-voltage DC (HVDC) and Flexible AC Transmission System (FACTS) devices. The functions and advantages of FACTS devices are presented in this paper. Voltage control and stability control with FACTS applications are also discussed. This is achievable because FACTS devices have fast controllability and the capability to exchange active and reactive power independently.
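The reactive-power control that FACTS devices provide can be motivated with a textbook steady-state power-factor calculation. The sketch below is a simplified estimate, not a model of any particular FACTS controller: it computes the reactive power a shunt compensator would need to inject to raise a load's power factor, using Qc = P(tan φ1 - tan φ2).

```python
import math

def shunt_compensation_kvar(p_kw, pf_initial, pf_target):
    """Reactive power (kvar) a shunt compensator must inject so that a load
    of p_kw at pf_initial reaches pf_target: Qc = P * (tan(phi1) - tan(phi2))."""
    phi1 = math.acos(pf_initial)  # load angle before compensation
    phi2 = math.acos(pf_target)   # desired load angle after compensation
    return p_kw * (math.tan(phi1) - math.tan(phi2))

# Raising a 100 kW load from 0.80 to 0.95 power factor:
print(round(shunt_compensation_kvar(100.0, 0.80, 0.95), 1))  # -> 42.1
```

A static VAR compensator or STATCOM performs essentially this correction continuously and fast enough to damp voltage fluctuations from intermittent renewable sources.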

Seafood is the primary source of animal protein for more than one billion people. Many economies and communities, in particular those in developing nations and coastal regions, depend on fisheries. Whereas the dire effects of overfishing on open-access ocean fisheries are already recognized, impacts of catches on freshwater systems are still underestimated. IIASA’s fisheries research elucidates how to secure and expand aquatic food resources, emphasizing three topical challenges. First, impro...

Neuroscience is advancing at a rapid pace, with new technologies and approaches that are creating ethical challenges not easily addressed by current ethical frameworks and guidelines. One fascinating technology is neuroimaging, especially functional Magnetic Resonance Imaging (fMRI). Although still in its infancy, fMRI is breaking new ground in neuroscience, potentially offering increased understanding of brain function. Different populations and faith traditions will likely have different reactions to these new technologies and the ethical challenges they bring with them. Muslims are approximately one-fifth of world population and they have a specific and highly regulated ethical and moral code, which helps them deal with scientific advances and decision making processes in an Islamically ethical manner. From this ethical perspective, in light of the relevant tenets of Islam, neuroimaging poses various challenges. The privacy of spirituality and the thought process, the requirement to put community interest before individual interest, and emphasis on conscious confession in legal situations are Islamic concepts that can pose a challenge for the use of something intrusive such as an fMRI. Muslim moral concepts such as "There shall be no harm inflicted or reciprocated in Islam" and "Necessities overrule prohibitions" are some of the criteria that might appropriately be used to guide advancing neuroscience. Neuroscientists should be particularly prudent and well prepared in implementing neuroscience advances that are breaking new scientific and ethical ground. Neuroscientists should also be prepared to assist in setting the ethical frameworks in place in advance of what might be perceived as runaway applications of technology.

Neurosurgery in awake patients incorporates newer technologies that require the anesthesiologists to update their skills and evolve their methodologies. They need effective communication skills and knowledge of selecting the right anesthetic drugs to ensure adequate analgesia, akinesia, along with patient satisfaction with the anesthetic conduct throughout the procedure. The challenge of providing adequate anesthetic care to an awake patient for intracranial surgery requires more than routine vigilance about anesthetic management. PMID:25422613

In the last three decades a profound transformation of the medical profession has taken place. The modern clinician is required to consume vast amounts of information from clinical studies, critically reviewing evidence that may or may not lead to changes in clinical practice. The present article presents some challenges that this era of information poses to clinicians and patients. Georg Thieme Verlag KG Stuttgart · New York.

In the past decade, significant progress has been made in the imaging of tumors, three dimensional (3D) treatment planning, and radiation treatment delivery. At this time one of the greatest challenges for conformal radiation therapy is the accurate delineation of tumor and target volumes. The physician encounters many uncertainties in the process of defining both tumor and target. The sources of these uncertainties are discussed, as well as the issues requiring study to reduce these uncertainties

The need to develop new large energy resources is discussed. One of three inexhaustible energy resource possibilities is fusion energy, whose history and scientific goals are described. The current world-wide research and development program for fusion is outlined. As an example of today's perception of what fusion energy will be like, a commercial tokamak fusion electric powerplant is described. Special attention is devoted to some of the challenging material problems that face fusion power development. (Author)

Liquidity risk management ranks among the key concepts applied in finance. Liquidity is defined as the capacity to obtain funding when needed, while liquidity risk is the threat to this capacity to generate cash at fair cost. In the paper we present challenges of liquidity risk management resulting from the 2007-2009 global financial upheaval. We see five main regulatory liquidity risk management issues requiring revision in coming years: liquidity measurement, intra-day and...
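One concrete product of the post-2007-2009 revision of liquidity measurement is Basel III's Liquidity Coverage Ratio (LCR): high-quality liquid assets (HQLA) divided by net cash outflows over a 30-day stress horizon, with expected inflows capped at 75% of gross outflows. The sketch below is stylized; real LCR calculations apply detailed run-off rates and haircuts per asset and funding class.

```python
def liquidity_coverage_ratio(hqla, gross_outflows_30d, gross_inflows_30d):
    """Stylized Basel III LCR: HQLA / net 30-day outflows, where inflows
    may offset at most 75% of gross outflows. A ratio >= 1.0 meets the
    regulatory minimum."""
    capped_inflows = min(gross_inflows_30d, 0.75 * gross_outflows_30d)
    net_outflows = gross_outflows_30d - capped_inflows
    return hqla / net_outflows

# 120 of HQLA against 100 of stressed outflows and 30 of expected inflows:
print(round(liquidity_coverage_ratio(120.0, 100.0, 30.0), 3))  # -> 1.714
```

The 75% inflow cap is the reason a bank cannot rely purely on expected repayments: at least 25% of stressed outflows must always be backed by an HQLA buffer.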

Every day we witness an uncertain future. Modern sociologists describe today's social situation as a "liquid modern period", an "end of certainty", a "world without ligatures". As this is a time of social and economic bifurcation, it is also a time of opportunities. The article outlines the social background of the modern and postmodern categories, with consequences that are visible in the educational structure of schools and in the psychology of the modern human person. In the discrepancy between the dialectic of narrative knowledge and didactics as the pragmatics of scientific knowledge, which is legitimised through an institutional network of language games, lies the cause of the stormy, chaotic economic and social conditions and the breakage of human moral values. Following a holistic strategy, which primarily understands man as an integrated human being, we set up lifelong education and learning as a corrective model of the current educational system and a vision for the future.

In the last ten years there has been much speculation about the role of e-books and e-book readers. This paper will look at the impact of e-book readers on publishing and reading, the types of e-book readers, their advantages and disadvantages. Some ideas for future e-books and e-book readers and their use in the library and classroom will be…

Oral rehydration therapy (ORT) is a cheap and simple intervention aimed at preventing the mortality and morbidity associated with dehydration due to diarrhoea. ORT promotion strategies through programme communication, social mobilisation and social marketing, and advocacy efforts have yielded substantial improvements. However, the experience has also taught us lessons and suggested changes in communication strategies to make promotion efforts more effective in the future.

Water shortage is at present one of the primary world issues, and according to climate change projections it will become more critical in the future. Since water availability and accessibility are the most significant constraining factors for crop production, addressing this issue is indispensable for areas affected by water scarcity. Current and future issues related to “water scarcity” are reviewed in this paper so as to highlight the necessity of a more sustainable approach to water resource management. As a consequence of increasing water scarcity and drought resulting from climate change, considerable water use for irrigation is expected to occur in the context of tough competition between agribusiness and other sectors of the economy. In addition, the estimated increase in the global population growth rate points to an inevitable increase in food demand in the future, with an immediate impact on farming water use. Since a noteworthy relationship exists between the water resources of a country and its capacity for food production, assessing irrigation needs is indispensable for water resource planning in order to meet food needs and avoid excessive water consumption.
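Assessing irrigation needs commonly starts from the FAO-56 crop-coefficient approach: crop evapotranspiration is ETc = Kc × ET0, and the net irrigation requirement is whatever effective rainfall does not cover. The sketch below uses daily figures and a single illustrative Kc value; real planning uses growth-stage Kc curves and a full soil water balance.

```python
def net_irrigation_mm_day(et0, kc, effective_rain=0.0):
    """Net irrigation requirement (mm/day): crop evapotranspiration
    ETc = Kc * ET0 minus effective rainfall, floored at zero."""
    etc = kc * et0  # FAO-56 single-crop-coefficient estimate
    return max(etc - effective_rain, 0.0)

# Illustrative mid-season crop (Kc ~ 1.15) on a day with
# ET0 = 5 mm and 2 mm of effective rain:
print(round(net_irrigation_mm_day(5.0, 1.15, 2.0), 2))  # -> 3.75
```

Summed over a growing season and divided by irrigation efficiency, this kind of estimate links a region's water resources directly to its food production capacity, which is the trade-off the paragraph describes.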

The main purpose of this paper is to analyse the extent to which science and technology (S and T) policy has been integrated with patent-based innovation within the context of national development objectives. Could the institutionalisation of patent-based innovation in Malaysia be effectively implemented? The main argument is that patent-based innovation must be integrated into the national S and T policy as a dynamic and proactive system. This paper argues that this dependency has contributed to the relationship imbalance between Malaysia and the advanced nations that act as the main technology suppliers within the international business economy. This paper adopts an interdisciplinary social-science approach using system-oriented analysis. The objective of the S and T policy is to enhance the capacity of national S and T resources as well as to develop the local capacity to select, negotiate, adopt, modify and improve imported technology. Nevertheless, even though the S and T policy was launched in 1986, the impetus of government commitment towards the realisation of the S and T policy became more pronounced only in the middle of the 90s. The Ministry of Science, Technology and the Environment recognises the importance of creating a climate of science and technology as a critical prerequisite for national development. Various activities at the state and national levels have been organised to promote awareness and adoption of technology in the community. The Ministry, which acts as the secretariat to the National Science and Development Council, aims to reinforce the foundation of S and T activities. Malaysia should also encourage activities that generate technical innovations to be integrated into patent-based innovation as one of the components of the national innovation system. While this is so, an effective assimilation of technical innovation developed within the framework of the national innovation system is yet to be realised. Patent-based innovation as a

Demands for water resources are diverse and are increasing as human populations grow and become more concentrated in urban areas and as economies develop. Water is essential for many uses including the basic human needs of food and the maintenance of good health, for many industries and the creation of electrical energy and as vital for the sustenance of the natural ecosystems on which all life is dependent. At the same time threats from water - floods, droughts - are increasing with these extreme events becoming more common and more intense in many regions of the world and as more people locate in flood- and drought-prone regions. In general, the challenges for water managers are thus becoming greater; managers not only are having to make increasingly difficult decisions regarding allocation of water resources between competing uses as demand outstrips supply, but they also have to take measures to protect societies from the ravages of extreme events. The intensity of the challenges facing water managers is not uniform throughout the world - many nations in the less developed world experiencing far greater problems than most highly developed nations - but the trend towards greater challenges is clear. Decision-makers, whether at the international, national, provincial or local level benefit from reliable information on water resources. They need information on the availability in quantity and quality of water from a variety of sources - surface waters, aquifers or from artificial sources such as re-cycling of wastewater and desalination techniques. Managers also need reliable predictions on water availability for the various uses to which water is put - such predictions are needed on time scales from weeks to decades to inform decision-making. Predictions are also needed on the probabilities of occurrence of extreme events. Thus hydrological scientists developing predictive models and working within a fast-changing world have much to contribute to the needs of

The new Next Generation Science Standards (NGSS), which spell out a set of K-12 performance expectations for life science, physical science, and Earth and space science (ESS), pose a variety of opportunities and challenges for geoscience education. The changes recommended by the NGSS include establishing ESS on an equal footing with both the life sciences and the physical sciences, at the full K-12 level. This represents a departure from the traditional high school curriculum in most states. In addition, ESS is presented as a complex, integrated, interdisciplinary, quantitative, Earth Systems-oriented set of sciences that includes complex and politically controversial topics such as climate change and human impacts. The geoscience communities will need to mobilize in order to assist in the full implementation of the ESS aspects of the NGSS in as many states as possible. In this context, the NGSS highlight Earth and space science to an unprecedented degree. If the NGSS are implemented in an optimal manner, a year of ESS will be taught in both middle and high school. In addition, because of the complexity and interconnectedness of the ESS content (with material such as climate change and human sustainability), it is recommended (Appendix K of the NGSS release) that much of it be taught following physics, chemistry, and biology. However, there are considerable challenges to a full adoption of the NGSS. A sufficient workforce of high school geoscientists qualified in modern Earth Systems Science does not exist and will need to be trained. Many colleges do not credit high school geoscience as a lab science with respect to college admission. The NGSS demand curricular practices that include analyzing and interpreting real geoscience data, and these curricular modules do not yet exist. However, a concerted effort on the part of geoscience research and education organizations can help resolve these challenges.

Multimodal Learning Analytics (MMLA) captures, integrates and analyzes learning traces from different sources in order to obtain a more holistic understanding of the learning process, wherever it happens. MMLA leverages the increasingly widespread availability of diverse sensors, high-frequency data collection technologies and sophisticated machine learning and artificial intelligence techniques. The aim of this workshop is twofold: first, to expose participants to, and develop, different multimodal datasets that reflect how MMLA can bring new insights and opportunities to investigate complex learning processes and environments; second, to collaboratively identify a set of grand challenges for further MMLA research, built upon the foundations of previous workshops on the topic.

One of the major recent developments in the optics industry is the commercial manufacturing of freeform surfaces for mid- and high-performance optical systems. Freedom from the constraint of rotational symmetry enables completely new optical design solutions, but it also poses completely new challenges for the manufacturer. Adapting serial production from rotationally symmetric to freeform optics cannot be done merely by extending machine capabilities and software for every process step. New solutions for conventional optics production, or completely new process chains, are necessary.

Tuberculosis (TB) continues to be a global health threat. BCG was developed as an attenuated live vaccine for tuberculosis control nearly a century ago. Despite being the most widely used vaccine in human history, BCG is not an ideal vaccine and has two major limitations: its poor efficacy against adult pulmonary TB and its safety concerns in immunocompromised individuals. A safer and more effective TB vaccine is urgently needed. This review article discusses current strategies to develop the next generation of TB vaccines to replace BCG. While some progress has been made in the past decade, significant challenges lie ahead.

The North American demand for electric power and natural gas by sector was described, and a comparison was made between the number of FERC-certified electric power marketers and natural gas marketing companies between 1986 and 1997 to illustrate the extent of the changes that occurred during the decade. Regional opportunities for energy marketers were reviewed. As current challenges, the author identified (1) regulatory impediments, (2) divestiture of assets, (3) creation of an effective ISO, (4) establishment of effective pricing mechanisms, (5) customer systems and infrastructure, (6) forcing legislative reform, and (7) stranded cost recovery as the most important.

This paper presents a state-of-the-art overview of journalism and its opportunities and challenges in virtual reality. First we take a look at what kind of real-life journalistic experiments have been made in this field so far; then we analyze the research literature on journalistic VR. The paper proceeds to discuss the emergence of virtual reality and immersive journalism as explored in the latest reports in the fields of HCI and VR design. In order to analyse VR-journalism...

With Publication 91 on the impact of ionizing radiation on non-human species, the International Commission on Radiological Protection (ICRP) has taken a major step towards the integration of environmental issues into radiological protection. The conceptual framework has developed in response to public demand and concern for environmental issues, and is underpinned by technical development undertaken by several organisations at both national and international levels. The EC-funded FASSET project (Framework for Assessment of Environmental Impact), completed in 2004, has developed an assessment framework that includes: source characterisation and initial hazard analysis; ecosystem description and selection of reference organisms; exposure analysis including conversion to dose rates; effects analysis; and guidance for interpretation. On the basis of experience from FASSET and other recent developments, it can be concluded that (i) there is substantial agreement in terms of conceptual approaches between the different frameworks currently in use or being proposed, and that (ii) differences in technical approaches can be largely attributed to differences in the ecosystems of concern or in national regulatory requirements. A major future challenge is the development of an integrated approach where decision-making can be guided by sound scientific judgements. This requires, inter alia: filling gaps in basic knowledge of relevance to assessment and protection, through targeted experimental, theoretical (including expert judgements) and real case studies; development of risk characterisation methodologies; development of screening standards, where appropriate; development of user-friendly assessment tools; and stakeholder involvement, including the development of supporting communication strategies. These issues will be addressed in the ERICA project (Environmental Risks from Ionizing Contaminants Assessment and Management) launched under the EC 6th Framework Programme during the

In the 50 years since they opened, the IAEA's laboratories in Seibersdorf have improved the lives of millions of people through work using sophisticated scientific techniques, IAEA Director General Yukiya Amano said today at a ceremony to mark the anniversary. Work at the labs has made a difference in controlling animal diseases in more than 30 countries in Africa and Asia, and contributed to the development of hardier and more nutritious crops such as barley that can grow in the High Andes of Peru. Scientists at the labs have helped communities identify the best sources of underground water and ensure that this scarce resource is used effectively. They have worked on safe ways to preserve food, and provided vital technical support for cancer treatment and other medical uses of nuclear technology. New challenges abound in the present and the future, Director General Amano said. "Member States want us to do more in almost all areas of nuclear applications. This includes climate-smart agriculture, with priority on helping countries to adapt to climate change while improving food security. It includes improving preparedness for responding to nuclear emergencies and especially for dealing with radiological contamination in food and agriculture." The Director General also said the IAEA would contribute more to controlling mosquitoes that transmit malaria by using techniques that, together with pest control programmes, have helped control other insects. IAEA scientists at the eight nuclear applications laboratories and the safeguards laboratories carry out research and development and provide technical services to the IAEA's 158 Member States. The labs also regularly host fellows and scientific visitors, with more than 2 000 benefiting from this opportunity to learn in the past 50 years. (IAEA)

This talk will serve as the keynote address for a research symposium being held at Washington State University. The purpose of the talk is to provide researchers and students at WSU with an overview of what it is like to sleep in space. Dr. Flynn-Evans will begin by highlighting how sleep is different in movies and science fiction compared to real life. She will next cover basic information about sleep and circadian rhythms, including how sleep works on Earth. She will explain how people have circadian rhythms of different lengths and how the circadian clock has to be re-set each day. She will also describe how jet lag works as an example of what happens during circadian misalignment. Dr. Flynn-Evans will also describe how sleep is different in space and will highlight the challenges that astronauts face in low-Earth orbit. She will discuss how astronauts have a shorter sleep duration in space than on the ground and how their schedules can shift due to operational constraints. She will also describe how these issues affect alertness and performance. She will then discuss how sleep and scheduling may be different on a long-duration mission to Mars. She will discuss the differences in light and day length on Earth and Mars and illustrate how those differences pose significant challenges to sleep and circadian rhythms.

Innate lymphoid cells (ILCs) are innate immune cells that are ubiquitously distributed in lymphoid and nonlymphoid tissues and enriched at mucosal and barrier surfaces. Three major ILC subsets are recognized in mice and humans. Each of these subsets interacts with innate and adaptive immune cells and integrates cues from the epithelium, the microbiota, and pathogens to regulate inflammation, immunity, tissue repair, and metabolic homeostasis. Although intense study has elucidated many aspects of ILC development, phenotype, and function, numerous challenges remain in the field of ILC biology. In particular, recent work has highlighted key new questions regarding how these cells communicate with their environment and other cell types during health and disease. This review summarizes new findings in this rapidly developing field that showcase the critical role ILCs play in directing immune responses through their ability to interact with a variety of hematopoietic and nonhematopoietic cells. In addition, we define remaining challenges and emerging questions facing the field. Finally, this review discusses the potential application of basic studies of ILC biology to the development of new treatments for human patients with inflammatory and infectious diseases in which ILCs play a role. PMID:27811053

Full Text Available The information society stays at the core of the Lisbon Strategy, despite the dot-com crisis and the still hidden macroeconomic impact of information and communication technology (ICT). Thus, i2010 has been the first concrete initiative of the revised Lisbon Strategy in 2005, while ICT represents by far the field with the largest budget in the 7th Framework Programme (FP7). On the industry side, the stakes are still high in the global competition, where Europe hopes for a place at least for communication technologies and services. However, the extreme dynamics of technology, with its sometimes breathtaking promises, pose new challenges for e-inclusion. Firstly, the accelerating pace of innovation maintains a generation-type digital divide between countries with different levels of development. Secondly, the changing nature of the network (e.g. web 2.0 with virtual communities; web 3.0 with location-based interaction; the semantic web; ambient intelligence and “the internet of things”) blurs the very distinction between inside and outside the information space. The paper explores these challenges and the associated policy options.

Despite increased availability of online promotional tools for prescription drug marketers, evidence on online prescription drug promotion is far from settled or conclusive. We highlight ways in which online prescription drug promotion is similar to conventional broadcast and print advertising and ways in which it differs. We also highlight five key areas for future research: branded drug website influence on consumer knowledge and behavior, interactive features on branded drug websites, mobile viewing of branded websites and mobile advertisements, online promotion and non-US audiences, and social media and medication decisions. PMID:26927597

Full Text Available Vietnam plays an important role in Russian policy in the Asia-Pacific region, and military-technical cooperation holds a special position in Russian-Vietnamese relations. The aim of the article is to identify the special features of military-technical cooperation between Russia and Vietnam, as well as the challenges and opportunities it presents for Russian policy. After the collapse of the USSR, defense interaction between Russia and Vietnam was placed on a commercial footing. Vietnam needed new Russian weapons to protect its interests, first of all in the South China Sea; for Moscow, military-technical cooperation with Vietnam acquired economic significance. Later, however, the political dimension of cooperation in this sphere grew, influenced by several external factors. The 2000s and 2010s were marked by growth in arms sales from Russia to Vietnam, caused mostly by the escalation of the South China Sea conflict, in which US-Chinese contradictions began to play an increasing role. Military-technical cooperation with Vietnam influenced some aspects of Russia's policy in the region: there was an increase in Russia's indirect involvement in the South China Sea conflict, and Russian arms sales to Vietnam became one of the problems in Russian-Chinese relations. But Russia and China were able to manage these disputes, partly because of the enlargement of their interaction in international relations, including the demonstration of similar positions on some aspects of the South China Sea conflict. In the framework of its developing defense cooperation with Vietnam, Russia gained special conditions of access to the facilities of Cam Ranh Bay, which strengthened its strategic position in the region. Russian cooperation with Hanoi in the military-technical field, and the general reinforcement of Russian positions in Vietnam, might also become a source of contradictions with the US.

In 2008, the electricity access rate in rural Mali was below 11%. In view of the challenges of electrification and development of rural areas in Mali, solar energy is seen as a strategic technology. SIDI asked ENEA to work on the technical and organisational conditions for ensuring sustainable access to, and the spread of, photovoltaic systems in rural Mali. In this report, ENEA improves knowledge of the sector's private actors, proposes support architectures addressing this problem, and highlights critical points by market segment.

This white paper explores the technical challenges and solutions for acquiring (capturing) and managing enterprise images, particularly those involving visible light applications. The types of acquisition devices used for various general-purpose photography and specialized applications including dermatology, endoscopy, and anatomic pathology are reviewed. The formats and standards used, and the associated metadata requirements and communication protocols for transfer and workflow, are considered. Particular emphasis is placed on the importance of metadata capture in both order- and encounter-based workflow. The benefits of using DICOM to provide a standard means of recording and accessing both metadata and image and video data are considered, as is the role of IHE and FHIR.

Full Text Available Defining and measuring organizational culture (OC) is of paramount importance to organizations because a strong culture could potentially increase service quality and yield sustainable competitive advantages. However, such a process can be challenging to managers because the scope of OC has been defined differently across disciplines and industries, which has led to the development of various scales for measuring OC. In addition, previously developed OC scales may not be fully applicable in the hospitality and tourism context. Therefore, by highlighting the key factors affecting the business environment and the unique characteristics of the hospitality industry, this paper aims to align the scope of OC closely with the industry and to put forth the need for a new OC scale that accurately responds to the context of the hospitality industry.

INTRODUCTION: The Danish Fracture Database (DFDB) was established in 2011 to establish nationwide prospective quality assessment of all fracture-related surgery. In this paper, we describe the DFDB's setup, present preliminary data from the first annual report and discuss its future potential...... are registered. Indication for reoperation is also recorded. The reoperation rate and the one-year mortality are the primary indicators of quality. RESULTS: Approximately 10,000 fracture-related surgical procedures were registered in the database at the time of presentation of the first annual DFDB report...... of osteosynthesis were the three most common indications for reoperation and accounted for 34%, 14% and 13%, respectively. CONCLUSION: The DFDB is an online database for registration of fracture-related surgery that allows for basic quality assessment of surgical fracture treatment and large-scale observational...

There is currently a focus on radioactivity in the Arctic region. The reason for this is the high number of nuclear sources in parts of the Arctic and the vulnerability of Arctic systems to radioactive contamination. The Arctic environment is also perceived as a wilderness, and the need for the protection of this wilderness against contamination is great. In 1991, the International Arctic Environmental Protection Strategy (IAEPS) was launched and the Arctic Monitoring and Assessment Programme (AMAP) established. AMAP is undertaking an assessment of the radioactive contamination of the Arctic and its radiological consequences. This paper summarises some of the current knowledge about sources of radioactive contamination, vulnerability, exposure of man, and potential sources of radioactive contamination within the Arctic, along with some views on future needs for work concerning radioactivity in the Arctic. (author)

Single event effects (SEE) have become a substantial Achilles' heel for the reliability of space-based advanced CMOS technologies as feature sizes are downscaled. Future space and defense systems require the identification and understanding of single event effects in order to develop hardening approaches for advanced technologies, including how changes in device geometry and materials affect energy deposition, charge collection, circuit upset, and parametric degradation of devices. Topics covered include the impact of technology scaling on radiation response, including single event transients in high-speed digital circuits, evidence for single event effects caused by proton direct ionization, and the impact of SEU induced by particle energy effects and indirect ionization. Single event effects in CMOS replacement technologies are introduced briefly. (authors)

Access to clean, affordable and reliable energy has been a cornerstone of the world's increasing prosperity and economic growth since the beginning of the industrial revolution. Our use of energy in the twenty-first century must also be sustainable. Solar and water-based energy generation, and engineering of microbes to produce biofuels are a few examples of the alternatives. This Perspective puts these opportunities into a larger context by relating them to a number of aspects in the transportation and electricity generation sectors. It also provides a snapshot of the current energy landscape and discusses several research and development opportunities and pathways that could lead to a prosperous, sustainable and secure energy future for the world.

Full Text Available The term, inclusion, particularly in the educational setting, is still based on a deficit view. Perceptions of ‘dis’-ability create barriers to true inclusion and are often reinforced through higher education training programs. To promote inclusive values, acceptance of individual and cultural differences must be included in all curricula, not solely within special education. The future of a truly inclusive education relies on a cultural shift that supports and nurtures differences, and views success through a lens focused not on standardization but on diversity. The Index for Inclusion (the Index) has been utilized worldwide to support schools, to remove perceived barriers, and to establish increasingly inclusive school cultures and practices. The Index aids in the creation of a culture that is dedicated to identifying and reducing barriers to inclusion and increases the learning and participation of all students.

The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic by clinicians. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool for accurately modeling patient-specific hemodynamics.

There are major opportunities for big, important questions to drive biogeochemical research in the future. Some suggestions are presented, such as: what are the controls on N loss and retention in watershed-ecosystems; what are the rates and controls on biological N fixation and denitrification in diverse ecosystems; how does scale (temporal and spatial) control biogeochemical flux and cycling; what controls the apparent and actual weathering rates in terrestrial ecosystems, and what is the fate of the weathered products; how can biogeochemical function best be integrated on regional to global scales; and what are the quantitative interrelationships between hydrologic cycles and biogeochemical cycles? Some brief examples and approaches to address such questions, for example, the value of multidisciplinary teams for addressing complicated questions, and the use of sophisticated tools (e.g., stable isotopes, spatial statistics, remote sensing), are presented.

This is a report of the above-named solar car race from Darwin to Adelaide, Australia. On January 7, 1983, the Australian adventurer Mr. Hans Tholstrup succeeded in travelling from the Australian west coast to Sydney in a car driven solely by solar energy. The journey took 20 days, at an average speed of 23 km per hour. The technology has made remarkable advances since his success and, in the World Solar Challenge 1993 held in November 1993, a Honda team crossed the Australian continent at an average speed of 85 km per hour. Technical challenges included the development of maximum-output solar cell panels, a car designed to make full use of such power, and a run at the maximum possible speed, all dependent solely on the sun as the energy source. This report, Part I, contains the details of the race, analysis, aerodynamics, car body structure, manufacture, materials, and so forth. (NEDO)

The Soil Education and Public Awareness technical commission of the Brazilian Soil Science Society was created in 1987, at that time as the Soil Science Teaching commission. In the 1990s the commission was very active and held three national symposia between 1994 and 1996: in Viçosa, Minas Gerais; Santa Maria, Rio Grande do Sul; and Pato Branco, Paraná. The following symposium, scheduled for Brasilia in 1997, could not be held, and there followed a weakening and shrinking of the group involved. Those three symposia focused on aspects of soil science taught at the university level, mainly in the agrarian sciences; concern about what was going on in basic education, and about the perception of soils by society, was not much present. The commission was revitalized in 2005 and in 2007 held its first meeting at the Brazilian Congress of Soil Science in Gramado, Rio Grande do Sul. At that meeting there was already an urge to adopt the approach of soil education instead of soil science teaching, within a broader concern with how society regards soils. This was accepted and fit the structural reorganization undergone by the national society following the IUSS main lines. The commission was renamed and gained two new companion commissions in the newly created Division IV, Soils, Environment and Society, of the Brazilian Soil Science Society: Soils and Food Safety, and History, Epistemology and Sociology of Soil Science. The national symposia were relaunched to be held biennially. An inventory of soil education experiences around the country was started, and the geographic distribution of the future symposia was intended to rescue and bring together experiences in different parts of the country that would not be known by other means. Three symposia have already been held: Piracicaba, São Paulo, 2008 (southeast); Curitiba, Paraná, 2010 (south); and Sobral, Ceará, 2012 (northeast). The next is planned for Recife, Pernambuco, in April 2014. The scope of the

Data are the currency of science and assure the integrity of published research. As the ability to collect, analyze, and visualize data has grown beyond what could be included in a publication, and as the value of the data became more clear (or the lack of availability of data was criticized), publishers and the scientific community developed several solutions to enhance access to underlying data. Most leading journals now require authors to agree, as a condition of submission, that underlying data will be included or made available; indeed, publication is the key leverage point in exposing much scholarly data. Most journals allow PDF or other supplements and links to data sets hosted by authors or labs, or better, data repositories such as Dryad, and some have banned "data not shown" or any reference to unpublished work. Many of these solutions have proven problematic, and recent studies have found that a large fraction of data are undiscoverable even a few years after publication. The best solution has been dedicated domain repositories, collectively supported by publishers, funders, and the scientific community, where deposition is required before or at the time of publication. These provide quality control and curation and facilitate reuse. However, expanding these beyond a few key repositories and developing standardized workflows and functionality among repositories, and between them and publishers, has been problematic. Addressing these and other data challenges requires collaborative efforts among funders, publishers, repositories, societies, and researchers. One example is the Coalition for Publishing Data in the Earth and Space Sciences, where most major publishers and repositories have signed a joint statement of commitment (COPDESS.org) and are starting work to direct and link published data to domain repositories. Much work remains to be done. Major challenges include establishing data curation practices in the workflow of science from data collection

concepts and technologies that are being developed today which may be used to solve manufacturing challenges in the future, such as: (self) reconfigurable manufacturing systems, (focused) flexible manufacturing systems, and AI-inspired manufacturing. The paper will try to offer a critical point of view......This paper presents an examination of Western European manufacturers' future challenges as they can be predicted today. Some of the challenges analyzed in the paper are: globalization, individualism and customization, and agility challenges. Hereafter, the paper presents a broad analysis of manufacturing...

Nanotechnology offers an exceptional and unique opportunity for developing a new generation of tools addressing persistent challenges to progress in cancer research and clinical care. The National Cancer Institute (NCI) recognizes this potential, which is why it invests roughly $150 M per year in nanobiotechnology training, research and development. By exploiting the various capacities of nanomaterials, the range of nanoscale vectors and probes potentially available suggests much is possible for precisely investigating, manipulating, and targeting the mechanisms of cancer across the full spectrum of research and clinical care. NCI has played a key role among federal R&D agencies in recognizing early the value of nanobiotechnology in medicine and committing to its development as well as providing training support for new investigators in the field. These investments have allowed many in the research community to pursue breakthrough capabilities that have already yielded broad benefits. Presented here is an overview of how NCI has made these investments with some consideration of how it will continue to work with this research community to pursue paradigm-changing innovations that offer relief from the burdens of cancer. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

Full Text Available Cascading events can be described as multidimensional disasters, in which a primary trigger generates a nonlinear series of secondary emergencies that escalate in time, eventually becoming the priority to tackle. In this process, critical infrastructure can be treated as a root of vulnerabilities, because it concentrates both physical attributes and functional nodes. When compromised, it produces widespread breakdowns of society, but also orients emergency responses and long-term recovery. Although floods have been widely associated with the failure of vulnerable equipment or with the disruption of strategic sectors such as energy, communication and transportation, their integration with the emerging concept of cascading has been limited. This open topic presents many challenges for scholars, researchers and practitioners, in particular when the implementation of the EU Floods Directive is considered. The paper presents an overview of the Floods Directive and its relation to cascading events, using case studies and examples from the existing literature to point out missing links and gaps in the legislation. The conclusions argue that the Directive considers only local geographical scales and limited temporal horizons, which can prove inadequate for limiting the escalation of events.

Historically, progress in high-energy physics has largely been determined by the development of more capable particle accelerators. This trend continues today with the imminent commissioning of the Large Hadron Collider at CERN, and the worldwide development effort toward the International Linear Collider. Looking ahead, there are two scientific areas ripe for further exploration--the energy frontier and the precision frontier. To explore the energy frontier, two approaches toward multi-TeV beams are being studied: an electron-positron linear collider based on a novel two-beam powering system (CLIC), and a Muon Collider. Work on the precision frontier involves accelerators with very high intensity, including a Super-B Factory and a muon-based Neutrino Factory. Without question, one of the most promising approaches is the development of muon-beam accelerators. Such machines have very high scientific potential, and would substantially advance the state-of-the-art in accelerator design. The challenges of the new generation of accelerators, and how these can be accommodated in the accelerator design, are described. To reap their scientific benefits, all of these frontier accelerators will require sophisticated instrumentation to characterize the beam and control it with unprecedented precision.

Composite materials have emerged as the materials of choice for increasing the performance and reducing the weight and cost of military, general aviation, and transport aircraft and space launch vehicles. Major advancements have been made in the ability to design, fabricate, and analyze large complex aerospace structures. The recent efforts by Boeing and Airbus to incorporate composites into the primary load-carrying structures of large commercial transports, and to certify the airworthiness of these structures, are evidence of the significant advancements made in the understanding and use of these materials in real-world aircraft. NASA has been engaged in research on composites since the late 1960s and has worked to address many development issues with these materials in an effort to ensure safety, improve performance, and improve the affordability of air travel for the public good. This research has ranged from the synthesis of advanced resin chemistries to the development of mathematical analysis tools to reliably predict the response of built-up structures under combined load conditions. The lessons learned from this research are highlighted with specific examples to illustrate the problems encountered and their solutions. Examples include specific technologies related to environmental effects, processing science, fabrication technologies, nondestructive inspection, damage tolerance, micromechanics, structural mechanics, and residual life prediction. The current state of the technology is reviewed and key issues requiring additional research are identified. Also, grand challenges to be solved for expanded use of composites in aerostructures are identified.

Uterine sarcomas are a group of rare tumours that provide considerable challenges in their treatment. Radiological diagnosis prior to hysterectomy is difficult, with the diagnosis frequently made post-operatively. Current staging systems have been unsatisfactory, although a new FIGO staging system specifically for uterine sarcomas has now been introduced, and may allow better grouping of patients according to expected prognosis. While the mainstay of treatment of early disease is a total abdominal hysterectomy, it is less clear whether routine oophorectomy or lymphadenectomy is necessary. Adjuvant pelvic radiotherapy may improve local tumour control in high risk patients, but is not associated with an overall survival benefit. Similarly there is no good evidence for the routine use of adjuvant chemotherapy. For advanced leiomyosarcoma, newer chemotherapy agents including gemcitabine and docetaxel, and trabectedin, offer some promise, while hormonal therapies appear to be more useful in endometrial stromal sarcoma. Novel targeted agents are now being introduced for sarcomas, and uterine sarcomas, and show some indications of activity. Non-pharmacological treatments, including surgical metastatectomy, radiofrequency ablation, and CyberKnife radiotherapy, are important additions to systemic therapy for advanced metastatic disease.

Brazilian agriculture covers about one-third of the land area and is expected to expand further. We assessed the compliance of present Brazilian agriculture with environmental legislation and identified challenges for agricultural development connected to this legislation. We found (i) minor illegal land use in protected areas under public administration, (ii) a large deficit in legal reserves and protected riparian zones on private farmland, and (iii) large areas of unprotected natural vegetation in regions experiencing agriculture expansion. Achieving full compliance with the environmental laws as they presently stand would require drastic changes in agricultural land use, where large agricultural areas are taken out of production and converted back to natural vegetation. The outcome of a full compliance with environmental legislation might not be satisfactory due to leakage, where pristine unprotected areas become converted to compensate for lost production as current agricultural areas are reconverted to protected natural vegetation. Realizing the desired protection of biodiversity and natural vegetation, while expanding agriculture to meet food and biofuel demand, may require a new approach to environmental protection. New legal and regulatory instruments and the establishment of alternative development models should be considered.

Extensive need for electricity and a lack of sufficient governmental resources for the development of related infrastructure forced the Iranian Government to invite private investors and to sign Energy Conversion Agreements (ECAs) in the form of build-operate-transfer (BOT) and build-operate-own (BOO) contracts with them. Accordingly, electricity purchase would be based on a guaranteed price. Changes in some laws in 2007 caused the management of the ECAs, and electricity purchase based on a guaranteed price, to face challenges. Shortening the commercial operation period of the earlier ECAs and signing some new short-term ECAs were the steps taken by the authorities to resolve the problems. Shortening the ECAs' commercial operation period is likely to cause serious problems concerning the payments to the project companies, because of shortages in the government's financial resources. The findings of the present viewpoint suggest that signing new long-term contracts (20 years) in the form of a combined agreement, buying the produced electricity at a guaranteed price (in the first 5 years) and supplying it in the competitive power market (in the following years), would be a better way to reduce the problems.

Earth's atmosphere contains a multitude of organic compounds, which differ by orders of magnitude regarding fundamental properties such as volatility, reactivity, and propensity to form cloud droplets, affecting their impact on global climate and human health. Despite recent major research efforts and advances, there are still substantial gaps in understanding of atmospheric organic chemistry, hampering efforts to understand, model, and mitigate environmental problems such as aerosol formation in both polluted urban and more pristine regions. The analytical toolbox available for chemists to study atmospheric organic components has expanded considerably during the past decade, opening new windows into speciation, time resolution and detection of reactive and semivolatile compounds at low concentrations. This has provided unprecedented opportunities, but also unveiled new scientific challenges. Specific groundbreaking examples include the role of epoxides in aerosol formation especially from isoprene, the importance of highly oxidized, reactive organics in air-surface processes (whether atmosphere-biosphere exchange or aerosols), as well as the extent of interactions of anthropogenic and biogenic emissions and the resulting impact on atmospheric organic chemistry.

The software that the utility industry currently uses may be insufficient to analyze the distribution grid as it rapidly modernizes to include active resources such as distributed generation, switch and voltage control, automation, and increasingly complex loads. Although planners and operators have traditionally viewed the distribution grid as a passive load, utilities and consultants increasingly need enhanced analysis that incorporates active distribution grid loads in order to ensure grid reliability. Numerous commercial and open-source tools are available for analyzing distribution grid systems. These tools vary in complexity from providing basic load-flow and capacity analysis under steady-state conditions to time-series analysis and even geographical representations of dynamic and transient events. The need for each type of analysis is not well understood in the industry, nor are the reasons that distribution analysis requires different techniques and tools both from those now available and from those used for transmission analysis. In addition, there is limited understanding of the basic capability of the tools and how they should be practically applied to the evolving distribution system. The study reviews the features and state-of-the-art capability of current tools, including usability and visualization, basic analysis functionality, advanced analysis including inverters, and renewable generation and load modeling. We also discuss the need for each type of distribution grid system analysis. In addition to reviewing the basic functionality of current models, we discuss dynamic and transient simulation in detail and draw conclusions about existing software's ability to address the needs of the future distribution grid, as well as the barriers to modernization of the distribution grid that are posed by the current state of software and model development. Among our conclusions are that accuracy, data transfer, and data processing abilities are key to future
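The simplest end of the tool spectrum described above, steady-state load-flow analysis, can be illustrated with a minimal sketch. The example below is not from the study; the three-bus network, line susceptances and injections are invented for illustration. It solves a lossless DC power flow (angles from the reduced susceptance matrix, then line flows) in pure Python:

```python
# Minimal DC power-flow sketch for a 3-bus network (illustrative data only).
# DC approximation: line flow f_ij = b_ij * (theta_i - theta_j); bus 0 is slack.

# Line susceptances (per unit); keys are (from_bus, to_bus)
b = {(0, 1): 10.0, (0, 2): 8.0, (1, 2): 5.0}

# Net injections at non-slack buses (per unit); negative = load
p = {1: -0.6, 2: -0.4}

# Reduced susceptance matrix B' over buses 1 and 2:
# diagonal = sum of susceptances of lines at the bus, off-diagonal = -b_ij
B11 = b[(0, 1)] + b[(1, 2)]          # 15.0
B22 = b[(0, 2)] + b[(1, 2)]          # 13.0
B12 = -b[(1, 2)]                     # -5.0

# Solve the 2x2 system B' * theta = p by Cramer's rule
det = B11 * B22 - B12 * B12
theta = {0: 0.0,
         1: (p[1] * B22 - B12 * p[2]) / det,
         2: (B11 * p[2] - B12 * p[1]) / det}

# Line flows and the slack-bus injection balancing the system
flows = {ij: b_ij * (theta[ij[0]] - theta[ij[1]]) for ij, b_ij in b.items()}
slack = flows[(0, 1)] + flows[(0, 2)]

print(flows)
print(slack)  # equals total load 1.0 p.u. in this lossless model
```

Time-series, dynamic, and transient analyses mentioned in the study each add layers (repeated solves, machine dynamics, electromagnetic transients) on top of this basic formulation.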

Nanotechnology is a paradigm for emerging technologies and a much-discussed area of science. It is a technology of the future and has revolutionized the fields of medicine, agriculture, the environment and electronics by providing abilities that could never previously have been dreamt of. It is a unique platform of multidisciplinary approaches integrating diverse fields of engineering, biology, physics and chemistry. In recent years, nanotechnology has advanced at a rapid pace in all aspects of its synthesis methodologies and found wide application across medicine, agriculture, the environment, and electronics. It is through the impact of nanotechnology approaches that new fields such as nanomedicine, cancer nanotechnology, nanorobotics and nanoelectronics have emerged and are flourishing with the advances in this expanding field. Nanotechnology holds the potential for pervasive and promising applications and is attracting significant attention and funding. Although there are different definitions of nanotechnology, in a broad perspective it can be described as designing or exploiting materials at nanometer dimensions (i.e., with at least one dimension below 100 nanometers). At the nanoscale, substances have a larger surface-area-to-volume ratio than conventional materials, which is the prime reason for their increased reactivity, their improved and size-tunable magnetic, optical and electrical properties, and also their greater toxicity.

Effect-directed analysis (EDA) has become useful for the identification of toxicants that occur in mixtures in the environment, especially those that are causative agents of specific adverse effects. Here, we summarize and review EDA methodology, including preparation of samples, biological analyses, fractionations, and instrumental analyses, highlighting key scientific advancements. A total of 63 documents since 1999 (Scopus search), including 46 research articles, 13 review papers, and 4 project descriptions, have been collected and reviewed in this study. At the early stage (1999-2010), most studies that applied EDA focused on organic extracts of freshwater and coastal contaminated sediments and wastewater. Toxic effects were often measured using cell-based bioassays (in vitro) and the causative chemicals were identified by use of low-resolution gas chromatography with a mass-selective detector (GCMSD). More recently (2010-present), EDA has been extended to various matrices such as biota, soil, crude oil, and suspended solids, and techniques have been improved to include determination of bioavailability in vivo. In particular, methods for non-target screening of organic chemicals in environmental samples using cutting-edge instrumentation such as time-of-flight mass spectrometry (ToF-MS), Fourier transform ion cyclotron resonance (FT-ICR), and Orbitrap mass spectrometers have been developed. This overview provides descriptions of recent improvements of EDA and suggests future research directions based on current understandings and limitations.

China is one of the largest countries in the world with nearly 20% of the world's population. There have been significant improvements in economy, education and technology over the last three decades. Due to substantial investments from all levels of government, the public health system in China has been improved since the 2003 severe acute respiratory syndrome (SARS) outbreak. However, infectious diseases still remain a major population health issue and this may be exacerbated by rapid urbanization and unprecedented impacts of climate change. This commentary aims to explore China's current capacity to manage infectious diseases which impair population health. It discusses the existing disease surveillance system and underscores the critical importance of strengthening the system. It also explores how the growing migrant population, dramatic changes in the natural landscape following rapid urbanization, and changing climatic conditions can contribute to the emergence and re-emergence of infectious disease. Continuing research on infectious diseases, urbanization and climate change may inform the country's capacity to deal with emerging and re-emerging infectious diseases in the future.

Digitalization within the graphic arts industry is described, and it is explained how it is improving and changing print production strategies and which new kinds of print production systems are being developed or can be expected. The relationship between printed media and electronic media is analyzed and a positioning for the next century is given. The state of the art of conventional printing technologies, especially those using direct imaging techniques, and their position within the digital workflow are briefly described. Non-impact multicolor printing systems are explained, based on general design criteria and linked to existing and newly announced equipment. The use of high-tech components for building successful systems with high reliability, high quality and low production costs is covered with some examples. Digital printing systems open many opportunities in print production: distributed printing, personalization, and print and book on demand are explained as examples. The overview of the several printing technologies and their positioning regarding quality and productivity leads to a scenario about the important position of printed media, also in the distant future.

There is currently a focus on radioactivity and the Arctic region. The reason for this is probably the high number of nuclear sources in parts of the Arctic and the vulnerability of Arctic systems to radioactive contamination. The Arctic environment is also perceived as a wilderness, and the need to protect this wilderness against contamination is great. In the last decade, information has also been released concerning the nuclear situation which has caused concern in many countries. Due to such concerns, the International Arctic Environmental Protection Strategy (IAEPS) was launched in 1991 and the Arctic Monitoring and Assessment Programme (AMAP) was established. AMAP is undertaking an assessment of the radioactive contamination of the Arctic and its radiological consequences. In 1996 IAEPS became part of the Arctic Council. AMAP presented one main report in 1997 and another in 1998. There are also several other national, bilateral and international programmes in existence which deal with this issue. This paper summarises some of the current knowledge about sources of radioactive contamination, vulnerability, exposure of man, and potential sources of radioactive contamination within the Arctic, together with some views on future needs for work concerning radioactivity in the Arctic. (au)

Diabetic retinal complications, including diabetic macular edema (DME) and proliferative diabetic retinopathy (PDR), are the leading cause of new cases of blindness among adults aged 20–74. Chronic hyperglycemia, considered the underlying cause of diabetic retinopathy, is thought to act first through disruption of pericyte-endothelial coupling. Disruption of microvascular integrity leads to pathologic consequences including a hypoxia-induced imbalance in vascular endothelial growth factor (VEGF) signaling. Several anti-VEGF medications are in clinical trials for use in arresting retinal angiogenesis arising from DME and PDR. Although a review of current clinical trials shows promising results, the lack of large prospective studies, head-to-head therapeutic comparisons, and potential long-term and systemic adverse events give cause for cautious optimism. Alternative therapies, including targeting pathogenesis-specific angiogenesis and mural-cell-based therapeutics, may offer innovative solutions for currently intractable clinical problems. This paper describes the mechanisms behind diabetic retinal complications, current research supporting anti-VEGF medications, and future therapeutic directions.

Following the 2013 update of the European Strategy for Particle Physics, the international Future Circular Collider (FCC) study has been launched by CERN as host institute, to design an energy-frontier hadron collider (FCC-hh) in a new 80-100 km tunnel with a centre-of-mass energy of about 100 TeV, an order of magnitude above the LHC's, as a long-term goal. The FCC study also includes the design of a 90-350 GeV high-luminosity lepton collider (FCC-ee) fitting the same tunnel, serving as a Higgs, top and Z factory, as a potential intermediate step, as well as an electron-proton collider option (FCC-he). The physics cases for such machines will be assessed, concepts for experiments worked out, and complete accelerator designs developed in time for the next update of the European Strategy for Particle Physics by the end of 2018. Besides superconductor improvements and high-field magnet prototyping, the FCC R&D program includes the advancement of SRF cavities based on thin-film coating, the development of ...

New lines of translational, interdisciplinary research are emerging among different fields of the neurosciences, which often point to clinical neuropsychology as the hinge discipline capable of linking basic findings with their clinical implications and thereby endowing them with meaning for phenomenological experience. The aim is to establish the main lines of progress made in the fields of neuroscience and neuropsychology in recent years, so as to foresee the strategic lines and priorities of neuroscience in the near future. To achieve this aim, the first step is to identify the changes of paradigm that have taken place in the areas of neuroscience and psychology in the last two decades. The next step is to propose new topics and fields of application that these changes in paradigm offer and demand of neuroscience. The false dichotomies of genes versus environment, mind versus brain, and reason versus emotion are considered, as are the new applications of neuropsychology to the understanding of psychopathological disorders, from neurodegenerative to neurodevelopmental disorders, and from 'dirty' drugs to cognitive and affective enhancers.

Research purpose: To provide an overview of the JD–R model, which incorporates many possible working conditions and focuses on both negative and positive indicators of employee well-being. Moreover, the studies of the special issue are introduced. Research design: Qualitative and quantitative studies on the JD–R model were reviewed to elucidate the health and motivational processes suggested by the model. Main findings: Next to the confirmation of the two suggested processes of the JD–R model, the studies of the special issue showed that the model can be used to predict workplace bullying, the incidence of upper respiratory tract infections, work-based identity, and early retirement intentions. Moreover, whilst psychological safety climate could be considered a hypothetical precursor of job demands and resources, compassion satisfaction moderated the health process of the model. Contribution/value-add: The findings of previous studies and the studies of the special issue were integrated into the JD–R model, which can be used to predict well-being and performance at work. New avenues for future research are suggested. Practical/managerial implications: The JD–R model is a framework that organisations can use to improve employee health and motivation, whilst simultaneously improving various organisational outcomes.

Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large source of experimental data provided by the -omics techniques and the advance and application of computational and data-management tools have been fundamental, for instance, in the understanding of the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication processes has been compiled from recent experimental reports and reviews. Furthermore, target problems, limitations and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects to take into account in the development of a computational model of the brain, and topics that could be approached from a systems biology perspective in future research, are highlighted.

This article addresses the state of the art of bioleaching research published in South Korean journals. Our research team reviewed the available articles registered in the Korean Citation Index (KCI, Korean Journal Database) addressing the relevant aspects of bioleaching. We systematically categorized the target metal sources as follows: mine tailings, electronic waste, mineral ores and metal concentrates, spent catalysts, contaminated soil, and other materials. Molecular studies were also addressed in this review. The classification provided in the present manuscript details information about microbial species, parameters of operation (e.g., temperature, particle size, pH, and process length), and target metals to compare recoveries among the bioleaching processes. The findings show an increasing interest in the technology from research institutes and mineral processing-related companies over the last decade. The current research trends demonstrate that investigations are mainly focused on determining the optimum parameters of operation for different techniques, with minor applications at the industrial scale, which opens the opportunity for greater technological developments. An overview of bioleaching of each metal substrate and opportunities for future research development are also included.

As cancer treatment improves, more young men and women survive, but they suffer from infertility as a major sequela of cancer treatment. Gamete and embryo cryopreservation are the only options available to these patients for preserving their fertility. Although cryopreservation of spermatozoa and embryos is already established, oocyte banking is still experimental. The advent of testicular tissue cryopreservation and spermatogonial stem cell transplantation in men, and of ovarian tissue cryopreservation and in-vitro follicular maturation in women, has started a frenzy of experiments worldwide trying to demonstrate their potential use in fertility preservation. Although major improvements have been made in tissue cryobanking in the past decade, there are still many unresolved technical issues related to these procedures. Furthermore, the intersection of cancer and fertility preservation in young patients raises ethical, legal and policy issues for oncologists and cancer survivors. Informed consent of minor patients, legal parentage and medical negligence claims are some of the potential legal challenges faced by society and healthcare providers. This review summarizes the technical and ethical challenges of gamete cryopreservation in young cancer patients.

The TESLA (TeV Energy Superconducting Linear Accelerator) Collaboration is an international R&D effort towards the development of an e+e- linear collider with 500 GeV center-of-mass energy by means of 20 km of active superconducting accelerating structures operating at a frequency of 1.3 GHz. The ultimate challenges faced by the TESLA project are (1) to raise operational accelerating gradients to 25 MV/m from the current world level of 5-10 MV/m, and (2) to reduce construction costs (cryomodules, klystrons, etc.) from about $40,000/MV now down to $2,000/MV. The TESLA Collaboration is building a prototype TESLA test facility (TTF), a 500 MeV superconducting linear accelerator, to establish the technical basis. TTF is presently under construction and will be commissioned at DESY in 1997, through the joint efforts of 24 laboratories from 8 countries. Significant progress has been made in reaching the high accelerating gradient of 25 MV/m in superconducting cavities, developing cryomodules, constructing the TTF infrastructure, etc. This paper will briefly discuss the challenges being faced and review the progress achieved in the technical areas of superconductivity and cryogenics by the TESLA Collaboration.

The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas, which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low-fidelity computational tool development, high-fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable the establishment of a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

Technical trading rules have been widely used by practitioners in financial markets for a long time. Their profitability remains controversial, and few studies consider the stationarity of the technical indicators used in trading rules. We convert MA, KDJ and Bollinger bands into stationary processes and investigate the profitability of these trading rules using three sets of high-frequency data (15 s, 30 s and 60 s) on CSI300 Stock Index Futures from January 4th 2012 to December 31st 2016. Several performance and risk measures are adopted to assess the practical value of all trading rules directly, while the ADF test is used to verify stationarity and the SPA test to check whether trading rules perform well due to intrinsic superiority or pure luck. The results show that there are several significant combinations of parameters for each indicator when transaction costs are not taken into consideration. Once transaction costs are included, trading profits are eliminated completely. We also propose a method to reduce the risk of technical trading rules.
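As a concrete illustration of the simplest indicator studied, the sketch below implements a minimal moving-average crossover rule on synthetic prices. The parameters (fast=3, slow=5) and the price series are illustrative only, not those selected in the study; note also that a raw MA inherits the non-stationarity of the price, which is why stationary transforms such as the price-to-MA ratio are used before testing:

```python
def moving_average(prices, n):
    """Simple n-period moving average; None until n observations exist."""
    out = []
    for i in range(len(prices)):
        if i + 1 < n:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - n:i + 1]) / n)
    return out

def ma_crossover_signals(prices, fast=3, slow=5):
    """+1 = long while fast MA > slow MA, -1 = short, 0 = no position yet."""
    f, s = moving_average(prices, fast), moving_average(prices, slow)
    signals = []
    for fi, si in zip(f, s):
        if fi is None or si is None:
            signals.append(0)
        else:
            signals.append(1 if fi > si else -1)
    return signals

def price_to_ma(prices, n):
    """Stationary transform: price divided by its own n-period MA."""
    ma = moving_average(prices, n)
    return [None if m is None else p / m for p, m in zip(prices, ma)]

# Synthetic prices: a rise followed by a fall
prices = [100, 101, 103, 106, 110, 109, 107, 104, 100, 97]
print(ma_crossover_signals(prices))  # -> [0, 0, 0, 0, 1, 1, 1, -1, -1, -1]
```

In a study like the one above, the `price_to_ma` series (rather than the MA itself) would be checked for stationarity with the ADF test before profitability is assessed.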

Current state-of-the-art knowledge concludes that greenhouse gas (GHG) emissions must be controlled and reduced within the next 30-40 years. The transport sector contributes almost a fifth of current global emissions, and its share is likely to increase in the future. The US and a number of European countries have therefore introduced various support schemes for research and development (R&D) of low-emission fuels that can potentially replace the current fossil fuels. One such alternative is biofuels. The advantage of biofuels is that they are easy to introduce into the transport sector. On the other hand, recent research papers question whether the supply of feedstock is sufficient, and to what extent biofuels lead to GHG emission reductions. This report reviews the current status of second-generation biofuels. Second-generation biofuels are made from cellulose, which, according to our survey of the literature, is in more abundant supply than the feedstocks of first-generation biofuels. Furthermore, it seems to have the potential to reduce GHG emissions from the transport sector without leading to devastating land use changes, which recent critique has held against first-generation biofuels. Given that governments have decided to support R&D of low-emission fuels, we ask the following questions: Should second-generation biofuels receive R&D support to the same extent as other low-emission fuels like hydrogen? How should support schemes for second-generation biofuels be designed? Second-generation biofuels can be divided according to the production process into thermo-chemical and bio-chemical. With respect to the thermo-chemical process, the potential for cost reductions seems to be low. On the other hand, ethanol made from cellulose using the biochemical conversion process is far from a ripe technology. Expert reports point to several potential technological breakthroughs which may reduce costs substantially. Hence, cellulosic ethanol should receive direct

Although chemicals have reportedly been used as weapons for thousands of years, it was not until 1915 at Ypres, France, that an industrial chemical, chlorine, was used in World War I as an offensive weapon in significant quantity, causing mass casualties. From that point until today, the development, detection and production of, and protection from, chemical weapons has been an organized endeavor of many of the world's armed forces and, in more recent times, non-governmental terrorist organizations. The number of Chemical Warfare Agents (CWAs) has steadily increased as research into more toxic substances continued for most of the 20th century. Today there are over 70 substances, including harassing agents like tear gas, incapacitating agents, and lethal agents like blister, blood, choking, and nerve agents. The requirements for detecting chemical weapons vary depending on the context in which they are encountered and the concept of operation of the organization deploying the detection equipment. The US DoD, for example, requires that US forces be able to continue their mission even in the event of a chemical attack. This places stringent requirements on detection equipment. It must be lightweight (...); many technologies have been developed for this application, including, but not limited to: mass spectrometry, IR spectroscopy, Raman spectroscopy, MEMS micro-cantilever sensors, surface acoustic wave sensors, differential mobility spectrometry, and amplifying fluorescent polymers. In the future the requirements for detection equipment will become even more stringent. The continuing increase in the sheer number of threats that will need to be detected, the development of binary agents requiring that even the precursor chemicals be detected, the development of new types of agents unlike any of the current chemistries, and the expansion of the list of toxic industrial chemicals will require new techniques with higher specificity and more sensitivity.

Functional gastrointestinal disorders (FGIDs) are a common problem in children. These disorders in children are classified into the following categories according to the Rome III classification: functional dyspepsia, irritable bowel syndrome (IBS), abdominal migraines, childhood functional abdominal pain (FAP), childhood functional abdominal pain syndrome, and functional constipation. FGIDs are diagnosed based on history and a normal physical examination, provided that there is no evidence of underlying disease such as anatomical abnormalities, infections, inflammation or malignancies. This group of poorly defined diseases represents a huge treatment challenge to the specialist because, until now, no therapy has been effective in improving the symptoms. FGIDs also cause deep family problems, as the disease interrupts the family's routine and a positive response to treatment is rarely seen. Moreover, there is no objective documentation of the disease, whether endoscopic, radiologic or pathologic. Therapeutic strategies for FGIDs are: education and parents' reassurance, detection and modification of physical and psychological stress, dietary intervention, pharmacological treatment, psychotherapy and other complementary medical treatments. Some foods may trigger the illness, such as coffee, fatty foods and spicy foods; therefore they should be avoided. A lactose-free diet cannot improve symptoms of FGIDs, except in children with lactose intolerance. The beneficial effect of fiber supplements in children with FGIDs remains unknown, but they have been useful in adults with IBS. Probiotics have potential efficacy in treating IBS, but their efficacy in children with FGIDs remains uncertain and needs to be further studied. In patients with severe symptoms, pharmacological agents can be effective. These drugs include antacids, prokinetics, anticholinergics, tricyclic antidepressants (TCAs) and serotonergic agents (agonists and antagonists). Psychotherapy in FAP and IBS is

, entrepreneurial life with training courses and education programs. At the University of Debrecen, the Institute of Accounting and Finance faces these educational challenges, such as the introduction of SAP and IFRS into the BA and MA curricula, and the use of these systems to support the managerial decision-making process and management accounting. The purpose of this research is to examine how education can respond to these worldwide changes, and what the mainstream in accounting education is nowadays. So that business leaders can make quick and appropriate economic decisions, it is essential in this intensively changing world that an enterprise have a well-functioning accounting system based on up-to-date information, and that the university teach new, practical knowledge.

Since its breakthrough, liquid crystal technology has continued to gain momentum, and the LCD is today the dominant display type used in desktop monitors, television sets, mobile phones and other mobile devices. To improve production efficiency and enable larger screen sizes, the LCD industry has step by step increased the size of the mother glass used in the LCD manufacturing process. Initially the mother glass was only around 0.1 m² in area, but with each generation the size has increased, and with generation 10 the area reaches close to 10 m². The increase in mother glass size has in turn led to an increase in the size of the photomasks used; currently the largest masks are around 1.6 × 1.8 meters. A key mask performance criterion is the absence of "mura": small systematic errors captured only by the very sensitive human eye. To eliminate such systematic errors, special techniques have been developed by Micronic Mydata. Some mura-suppressing techniques are described in this paper. Today, the race towards larger glass sizes has come to a halt and a new race, towards higher resolution and better image quality, is ongoing. The display mask is therefore going through a change that resembles what the semiconductor mask went through some time ago: OPC features are introduced, CD requirements are increasing sharply, and multi-tone masks (MTMs) are widely used. Supporting this development, Micronic Mydata has introduced a number of compensation methods in the writer, such as Z-correction, CD map and distortion control. In addition, the Micronic Mydata MMS15000, the world's most precise large-area metrology tool, has played an important role in improving mask placement quality and is briefly described in this paper. Furthermore, proposed specifications and a system architecture concept for a new generation of mask writers, able to fulfill future image quality requirements, are presented in this paper. This new system would use an AOD/AOM writing engine and be

There are several unsettled technical and licensing issues in the areas of instrumentation and control (I and C), human factors, and updated control room designs that need coordinated, proactive industry attention. Some of these issues are already causing protracted regulatory reviews for existing plants and, if left untreated, may cause substantial delays and increased costs for new plant combined construction and operating license approvals. Both industry and the NRC will have roles in resolving the key issues and addressing them in future design efforts and regulatory reviews. Where action is needed, the industry will want to minimize costs and risks by defining industry consensus solutions with corresponding technical bases. NEI has formed a working group to coordinate industry efforts and communications with the NRC staff. The working group will also help determine priorities and coordinate both new and existing plant resources. EPRI will provide technical input and guidance for the working group. In order to conduct reviews in a timely fashion, the NRC will likely need to enhance and expand staff resources as existing plants are upgraded and new plant reviews become more active. The industry initiative began with a workshop sponsored by EPRI and NEI on March 28-29, 2006, which led to the creation of the NEI working group. The working group has now identified and prioritized important generic issues, established resolution paths and schedules, and identified the roles of various stakeholders, including utility companies, EPRI, NEI, vendors and the NRC. Over the course of this initiative, I and C issues for both existing and new plants are being addressed. This paper describes the key I and C related technical and regulatory issues and their implications for new and operating plants, and provides a status report on the efforts to resolve them. (authors)

In addition to the prolonged economic recession and global financial crisis, the Great East Japan Earthquake of March 2011 has caused great fear and devastation in Japan. In the midst of these events, Japanese people have felt that they are losing the traditional values and common sense they used to share, and it has become necessary to build a new consciousness. Engaged in psychiatry and psychiatric care under these circumstances, we have to analyze the challenges we face and to brainstorm on appropriate prescriptions that can be applied to solve the problems. Five points in particular were brought up: [1] the persistently high number of suicides; [2] the increase in depression and the overflowing numbers of patients visiting clinics and hospital outpatient departments; [3] the absolute shortage of child psychiatrists; [4] little progress with the transition from hospitalization-centered to community-centered medical care; [5] the disappearance of beds for psychiatry patients from general hospitals. The situations surrounding these five issues were briefly analyzed and problems were pointed out. The following are five problems that psychiatry is facing: 1) a lack of large clinical trials compared to the rest of the world; 2) the drug lag and the handling of global trials; 3) the lack of staff involved in education and research (in the field of psychiatry); 4) following the DSM diagnostic criteria dogmatically, without differentiating therapeutics; 5) other medical departments, the industry, patients, and their families demanding objective diagnostic techniques. After analyzing the problems, and discussing to some extent what kind of prescription may be considered to solve them, I gave my opinion. (1) The first problem is the deep-rooted prejudice and discrimination against psychiatric disorders that continue to be present among Japanese people. The second problem is the government's policy of low remuneration (fees) for psychiatric services. The third problem, symbolic of the

Whole system models for the GB electricity system suggest that distributed electricity storage has the potential to significantly reduce the system integration cost for future system scenarios. From a policy perspective, this poses the question why this value should not be realised within existing market structures. Opinion among stakeholders is divided. Some believe that storage deployment constitutes a ‘special case’ in need of policy support. Others insist that markets can provide the necessary platform to negotiate contracts, which reward storage operators for the range of services they could provide. This paper seeks to inform this debate with a process of stakeholder engagement using a perspective informed by socio-technical transition literatures. This approach allows the identification of tensions among actors in the electricity system and of possibilities for co-evolution in the deployment of storage technologies during a transition towards a low carbon electricity system. It also draws attention to policy-related challenges of technology lock-in and path dependency resulting from poor alignment of incumbent regimes with the requirements for distributed electricity storage. - Highlights: ► Electricity storage is poorly aligned with existing regimes in the electricity system. ► Stakeholders perceive electricity storage as “somebody else's problem”. ► Combining stakeholder views and transition theory provides new insight. ► Transition from network to operational benefits poses regulatory challenge. ► Value aggregation made difficult due to institutional barriers.

In addition to difficulties with anesthetic and medical management, transsphenoidal operations in patients with longstanding acromegaly are associated with inherent intraoperative challenges because of anatomical variations that occur frequently in these patients. The object of this study was to review the overall safety profile and the anatomical/technical challenges associated with transsphenoidal surgery in patients with acromegaly. The authors performed a retrospective analysis of 169 patients who underwent endoscopic transsphenoidal operations for growth hormone-secreting adenomas to assess the incidence of surgical complications. A review of frequently occurring anatomical challenges and the operative strategies employed during each phase of the operation to address these particular issues was performed. Of the 169 cases reviewed, there was no perioperative mortality. Internal carotid artery injury occurred in 1 patient (0.6%) with complex sinus anatomy, who remained neurologically intact following endovascular unilateral carotid artery occlusion. Other complications included: significant postoperative epistaxis (5 patients [3%]), transient diabetes insipidus (5 patients [3%]), delayed symptomatic hyponatremia (4 patients [2%]), CSF leak (2 patients [1%]), and pancreatitis (1 patient [0.6%]). Preoperative considerations in patients with acromegaly should include a cardiopulmonary evaluation and planning regarding intubation and other aspects of the anesthetic technique. During the nasal phase of the transsphenoidal operation, the primary challenges include maintaining adequate visualization and hemostasis, which is frequently compromised by redundant, edematous nasal mucosa and bony hypertrophy of the septum and the nasal turbinates. During the sphenoid phase, adequate bony removal, optimization of working space, and correlation of imaging studies with intraoperative anatomy are major priorities. The sellar phase is frequently challenged by increased sellar floor thickness

Via a consultation process, the PEROSH members identified which occupational safety and health topics the European institutes specialised in, and what they see as the major trends and future challenges in the world of work and their impact on OSH. A second part of the consultation analysed future

This presentation will provide an overview of current human spaceflight operations. It will also describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. Additionally, there are many implications regarding advanced automation and robotics, and this presentation will outline future human-automation-robotic integration challenges.

In future mobile networks, aggregation at different levels is necessary, but at the same time it imposes challenges that mandate looking into new architectures. This paper presents the design considerations for a C-RAN based mobile aggregation network used in the EU HARP project. With this architecture, fronthaul aggregation is performed, which might be an option for future generations of mobile networks.

Anomalies of the coronary arteries are reported in 1-2% of patients undergoing diagnostic angiography. Ectopic origin of the right coronary artery (RCA) from the opposite sinus is one of the most common anomalies; these are mainly benign, but at times may be malignant. We report the case of a 69-year-old male who underwent early invasive percutaneous coronary intervention for non-ST-segment elevation myocardial infarction (NSTEMI), in which an RCA arising from the left sinus at the root of the left main artery was the culprit vessel, and various technical challenges were encountered during the intervention, from cannulation to tracking of hardware. The RCA was cannulated with a floating-wire technique using a hockey stick guide catheter and revascularized by deployment of a 3.5 × 38 mm Promus Premier everolimus-eluting stent (Boston Scientific, USA). To the best of our knowledge, this is the first report of an ectopic RCA being revascularized using a hockey stick catheter.

The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities of overcoming the technical issues related to food industry legal guidelines, there is a lack of reviews of current trials to overcome technological limitations related to sample preparation and microbial detection via nano- and microtechnologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

The SEOM Future Plan is aimed at identifying the main challenges, trends and needs of the medical oncology specialty over the coming years, including potential oncologist workforce shortages, and proposing recommendations to overcome them. The estimates of the required medical oncologist workforce are based on an updated Medical Oncologist Register in Spain, Medical Oncology Department activity data, dedication times and projected cancer incidence. Challenges, needs and future recommendations were drawn from an opinion survey and an advisory board. A shortage of 211 FTE medical oncology specialists has been established. To maintain an optimal ratio of 158 new cases/FTE, the medical oncology workforce should reach 1881 FTE by 2035. The main recommendations to face the growing demand and complexity of oncology services include a yearly growth of 2.5% in the medical oncologist workforce until 2035, and the development and application of more accurate quality indicators for cancer care and health outcome measures.
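The projection arithmetic in the abstract above (2.5% yearly growth compounding toward the 1881 FTE target, at an optimal ratio of 158 new cases per FTE) can be sketched as follows. This is an illustrative back-of-the-envelope model only; the 2020 baseline figure is an assumed value chosen so the compounded result lands near the stated target, and is not taken from the SEOM register.

```python
# Illustrative sketch of the SEOM workforce projection arithmetic.
# The figures 1881, 2.5% and 158 come from the abstract; the 2020
# baseline below is a hypothetical, assumed value.

TARGET_FTE_2035 = 1881   # FTE medical oncologists needed by 2035
ANNUAL_GROWTH = 0.025    # recommended yearly workforce growth
CASES_PER_FTE = 158      # optimal ratio of new cancer cases per FTE

def project_workforce(start_fte: float, years: int,
                      growth: float = ANNUAL_GROWTH) -> float:
    """Compound the workforce forward by `growth` per year."""
    return start_fte * (1 + growth) ** years

def cases_covered(fte: float, ratio: float = CASES_PER_FTE) -> float:
    """Annual new cases manageable at the optimal cases/FTE ratio."""
    return fte * ratio

baseline_2020 = 1300  # assumed baseline, for illustration only
print(round(project_workforce(baseline_2020, 15)))  # -> 1883, near the 1881 target
print(round(cases_covered(TARGET_FTE_2035)))        # -> 297198 new cases/year
```

At the stated ratio, the 2035 target workforce corresponds to roughly 297,000 new cases per year, which is the kind of capacity check such projections are meant to support.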

Over the past two decades, the IAEA has conducted a series of major conferences that have addressed topical issues and strategies critical to nuclear safety for consideration by the world's nuclear regulators. More recently, the IAEA organized the International Conference on Effective Nuclear Regulatory Systems - Facing Safety and Security Challenges, held in Moscow in 2006. The Moscow conference was the first of its kind, because it brought together senior regulators of nuclear safety, radiation safety and security from around the world to discuss how to improve regulatory effectiveness with the objective of improving the protection of the public and the users of nuclear and radioactive material. The International Conference on Challenges Faced by Technical and Scientific Support Organizations in Enhancing Nuclear Safety was held in Aix-en-Provence, France, from 23 to 27 April 2007. This conference, again, was the first of its kind, because it was the first to address technical and scientific support organizations (TSOs), the role they play in supporting either the national regulatory bodies or the industry in making optimum safety decisions and the challenges they face in providing this support. This conference provided a forum for the TSOs to discuss these and other issues with the organizations to which they provide this support - that is, the regulators and the operators/industry - as well as with other stakeholders such as research organizations and public authorities. This conference can also be considered to have a link to the Moscow conference. The Moscow conference concluded that effective regulation of nuclear safety is vital for the safe use of nuclear energy and associated technologies, both now and in the future, and is an essential prerequisite for establishing an effective Global Nuclear Safety Regime. The Moscow conference also highlighted the importance of continued and improved international cooperation in the area of nuclear safety. These

This paper highlights the needs for the establishment of a technical support organization (TSO) in Pakistan Nuclear Regulatory Authority (PNRA), challenges faced during its development, application of training need assessment required for the competency development of its technical manpower and difficulties encountered after its evolution. Key issues addressed include recruitment of technical manpower and enhancing their competencies, acquisition of proper tools required for safety review and assessment, development of a sustainable education and training program consistent with the best international practices and taking the measures to get confidence of the regulatory body. (author)

Changes in the vocational and technical education (VTE) opportunities available for women in Organisation for Economic Cooperation Development (OECD) member countries in response to changing labor markets and job skill requirements were discussed at a June 1994 meeting of experts. The discussions focused on the following issues: gender differences…

InSOTEC is a social sciences research project which aims to generate a better understanding of the complex interplay between the technical and the social in radioactive waste management and, in particular, in the design and implementation of geological disposal. It currently investigates and analyses the most striking socio-technical challenges to implementing geological disposal of radioactive waste in 14 national programs. A focus is put on situations and issues where the relationship between the technical and social components is still unstable, ambiguous and controversial, and where negotiations are taking place in terms of problem definitions and preferred solutions. Such negotiations can vary from relatively minor contestation, over mild commotion, to strong and open conflicts. Concrete examples of socio-technical challenges are: the question of siting, introducing the notion of reversibility/retrievability into the concept of geological disposal, or monitoring for confidence building. In a second stage, the InSOTEC partners aim to develop a fine-grained understanding of how the technical and the social influence, shape, and build upon each other in the case of radioactive waste management and the design and implementation of geological disposal. How are socio-technical combinations in this field translated and materialized into the solutions finally adopted? With what kinds of tools and instruments are they being integrated? Complementary to providing better theoretical insight into these socio-technical challenges and combinations, InSOTEC aims to provide concrete suggestions on how to address them within national and international contexts. To this end, InSOTEC will deliver insights into how mechanisms for interaction between the technical community and a broad range of socio-political actors could be developed. (authors)

Developing Strategies for Islamic Banks to Face the Future Challenges of Financial Globalization. Ahmed Al-Ajlouni. Abstract: This study aims at forming a strategic response to assess the ability of Islamic banks to benefit from the opportunities that may be provided by financial globalization and to limit its threats, by assessing the capability of Islamic banks to meet the requirements and challenges of financial globalization, and then suggests the suitable strategies that may be ...

This paper is based on a series of interviews with nine leading researchers conducted during the Future-Oriented Technology Analysis International Conference held in Seville on 16–17 October 2008. Analysis of these interviews paints a picture of FTA as an increasingly important approach being adopted in many countries to address the many challenges which are emerging at this time in human history. From this are drawn implications for the community of FTA practitioners. The biggest challenge i...

Plant diseases caused by bacterial pathogens place major constraints on crop production and cause significant annual losses on a global scale. The attainment of consistent effective management of these diseases can be extremely difficult, and management potential is often affected by grower reliance on highly disease-susceptible cultivars because of consumer preferences, and by environmental conditions favouring pathogen development. New and emerging bacterial disease problems (e.g. zebra chip of potato) and established problems in new geographical regions (e.g. bacterial canker of kiwifruit in New Zealand) grab the headlines, but the list of bacterial disease problems with few effective management options is long. The ever-increasing global human population requires the continued stable production of a safe food supply with greater yields because of the shrinking areas of arable land. One major facet in the maintenance of the sustainability of crop production systems with predictable yields involves the identification and deployment of sustainable disease management solutions for bacterial diseases. In addition, the identification of novel management tactics has also come to the fore because of the increasing evolution of resistance to existing bactericides. A number of central research foci, involving basic research to identify critical pathogen targets for control, novel methodologies and methods of delivery, are emerging that will provide a strong basis for bacterial disease management into the future. Near-term solutions are desperately needed. Are there replacement materials for existing bactericides that can provide effective disease management under field conditions? Experience should inform the future. With prior knowledge of bactericide resistance issues evolving in pathogens, how will this affect the deployment of newer compounds and biological controls? Knowledge is critical. A comprehensive understanding of bacterial pathosystems is required to not

Technical education, as enshrined in the Nigerian national policy on education, is concerned with qualitative technological human resources development directed towards a national pool of skilled and self reliant craftsmen, technicians and technologists in technical and vocational education fields. In Nigeria, the training of technical personnel…

Geographic Information Systems (GIS) continue to play a very important role in improving the spatial thinking skills of graduates from higher educational institutions. However, teaching and learning of GIS at the technical university level in Ghana remains very limited due to implementation challenges. This paper reviews the implementation of GIS in…

The United States and China are committed to cooperation to address the challenges of the next century. Technical cooperation, building on a long tradition of technical exchange between the two countries, can play an important role. This paper focuses on technical cooperation between the United States and China in the areas of nonproliferation, arms control and other nuclear security topics. It reviews cooperation during the 1990s on nonproliferation and arms control under the U.S.-China Arms Control Exchange, discusses examples of ongoing activities under the Peaceful Uses of Technology Agreement to enhance security of nuclear and radiological material, and suggests opportunities for expanding technical cooperation between the defense nuclear laboratories of both countries to address a broader range of nuclear security topics.

Ceramic integration technologies enable hierarchical design and manufacturing of intricate ceramic and composite parts starting with geometrically simpler units that are subsequently joined to themselves and/or to metals to create components with progressively higher levels of complexity and functionality. However, for the development of robust and reliable integrated systems with optimum performance for high temperature applications, detailed understanding of various thermochemical and thermomechanical factors is critical. Different technical approaches are required for the integration of ceramic to ceramic and ceramic to metal systems. Active metal brazing, in particular, is a simple and cost-effective method to integrate ceramic to metallic components. Active braze alloys usually contain a reactive filler metal (e.g., Ti, Cr, V, Hf etc) that promotes wettability and spreading by inducing chemical reactions with the ceramics and composites. In this presentation, various examples of brazing of silicon nitride to themselves and to metallic systems are presented. Other examples of joining of ceramic composites (C/SiC and SiC/SiC) using ceramic interlayers and the resulting microstructures are also presented. Thermomechanical characterization of joints is presented for both types of systems. In addition, various challenges and opportunities in design, fabrication, and testing of integrated similar (ceramic-ceramic) and dissimilar (ceramic-metal) material systems will be discussed. Potential opportunities and need for the development of innovative design philosophies, approaches, and integrated system testing under simulated application conditions will also be presented.

Advanced SiC-based ceramic matrix composites offer significant contributions toward reducing fuel burn and emissions by enabling a high overall pressure ratio (OPR) in gas turbine engines and reducing or eliminating cooling air in hot-section components, such as shrouds, combustor liners, vanes, and blades. Additive manufacturing (AM), which builds high-value, custom-designed parts layer by layer, has been demonstrated for metals and polymer matrix composites. However, there has been limited activity on additive manufacturing of ceramic matrix composites (CMCs). In this presentation, laminated object manufacturing (LOM), the binder jet process, and 3-D printing approaches for developing ceramic composite materials are presented. For laminated object manufacturing (LOM), fiber prepreg laminates were cut into shape with a laser and stacked to form the desired part, followed by high-temperature heat treatments. For the binder jet process, processing optimization was pursued through silicon carbide powder blending, infiltration with and without SiC nano-powder loading, and integration of fibers into the powder bed. Scanning electron microscopy was conducted along with XRD, TGA, and mechanical testing. Various technical challenges and opportunities for additive manufacturing of ceramics and CMCs will be presented.

The conference Conceptual and Technical Challenges for Quantum Gravity at Sapienza University of Rome, from 8 to 12 September 2014, provided a beautiful opportunity for an encounter between different approaches and different perspectives on the quantum-gravity problem. It contributed to a higher level of shared knowledge among the quantum-gravity communities pursuing each specific research program. There were plenary talks on many different approaches, including in particular string theory, loop quantum gravity, spacetime noncommutativity, causal dynamical triangulations, asymptotic safety and causal sets. Contributions from the perspective of philosophy of science were also welcomed. In addition, several parallel sessions were organized. The present volume collects contributions from the Noncommutative Geometry and Quantum Gravity parallel session (partially funded by CNRS PEPS/PTI "Metric aspect of noncommutative geometry: from Monge to Higgs"), with additional invited contributions from specialists in the field. Noncommutative geometry in its many incarnations appears at the crossroads of many research efforts in theoretical and mathematical physics: • from models of quantum space-time (with or without breaking of Lorentz symmetry) to loop gravity and string theory, • from early considerations on UV divergences in quantum field theory to recent models of gauge theories on noncommutative spacetime, • from Connes' description of the standard model of elementary particles to recent Pati-Salam-like extensions. This volume provides an overview of these various topics, interesting for the specialist as well as accessible to the newcomer.
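As a concrete illustration of the "models of quantum space-time" surveyed above, the simplest and best-known incarnation of a noncommutative spacetime, the canonical (Moyal-type) case, replaces commuting coordinates by operators satisfying (a standard textbook relation, not specific to any contribution in this volume):

```latex
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu}
```

where \(\theta^{\mu\nu}\) is a constant antisymmetric matrix with dimensions of area. Whether \(\theta^{\mu\nu}\) is treated as a fixed background (breaking Lorentz invariance) or as transforming under a deformed symmetry corresponds to the "with or without breaking of Lorentz symmetry" distinction drawn in the overview.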

Oxyfuel combustion with carbon capture and sequestration (CCS) is a carbon-reduction technology for use in large-scale coal-fired power plants. Significant progress has been achieved in the research and development of this technology during its scaling up from 0.4 MWth to 3 MWth and 35 MWth through the combined efforts of universities and industries in China. A prefeasibility study on a 200 MWe large-scale demonstration has progressed well and is ready for implementation. The overall research, development and demonstration (RD&D) roadmap for oxyfuel combustion in China has become a critical component of the global RD&D roadmap for oxyfuel combustion. An air combustion/oxyfuel combustion compatible design philosophy was developed during the RD&D process. In this paper, we briefly address fundamental research and technology innovation efforts regarding several technical challenges, including combustion stability, heat transfer, system operation, mineral impurities, and corrosion. To further reduce the cost of carbon capture, in addition to the large-scale deployment of oxyfuel technology, increasing interest is anticipated in the novel and next-generation oxyfuel combustion technologies that are briefly introduced here, including a new oxygen-production concept and flameless oxyfuel combustion.

Concerned with the overall challenges of implementing design innovations, this paper relates to the specific case of applying smart textiles in future hospital interiors. The methodological approach is inspired by design thinking and implementation processes, and through the scope of a developed ...

At the 2015 "NARST: A Worldwide Organization for Improving Science Teaching and Learning Through Research" Annual International Conference, a group of scholars held an extended pre-conference workshop to discuss key challenges and future directions faced by argumentation researchers around the world. This wide-ranging group of…

Recent developments in polymer-based controlled delivery systems have made a significant clinical impact. The second Symposium on Innovative Polymers for Controlled Delivery (SIPCD) was held in Suzhou, China to address the key challenges and provide up-to-date progress and future perspectives in the

Landscape genetics has advanced the field of evolutionary ecology by providing a direct focus on relationships between landscape patterns and population processes, such as gene flow, selection, and genetic drift. This chapter discusses the current and emerging challenges and opportunities, which focus and facilitate future progress in the field. It presents ten...

This paper surveys selected literature, primarily from conference proceedings, journals and clinical tests, on the robotics, mechatronics, neurology and biomedical engineering of rehabilitation robotic systems. The present paper focuses on three main categories: types of rehabilitation robots, key technologies with current issues, and future challenges. Literature on fundamental research, with some examples from commercialized robots and new robot development projects related to rehabilitation, is introduced. Most of the commercialized robots presented in this paper are well known, especially to robotics engineers and scholars in the robotic field, but are less known to humanities scholars. The field of rehabilitation robot research is expanding; in light of this, some of the current issues and future challenges in rehabilitation robot engineering are recalled, examined and clarified with future directions. This paper concludes with some recommendations with respect to rehabilitation robots.

A climate-literate public is essential to resolving pressing problems related to global change. Future elementary teachers are a critical audience in climate and climate change education, as they will introduce children in early grades (USA grades K-8, children ages 5-14) to fundamentals of the climate system, natural and anthropogenic drivers of climate change, and impacts of global change on human and natural systems. Here we describe challenges we have encountered in teaching topics of the carbon cycle, greenhouse gases, past climate, recent anthropogenic change, and carbon footprints to future elementary teachers. We also describe how we have met (to varying degrees of success) these challenges in an introductory earth science course that is specifically designed for this audience. Two prominent challenges we have encountered are: the complex nature of the scientific content of climate change, and robust misconceptions held by our students about these topics. To address the first challenge, we attempt to adjust the scientific content to a level appropriate for future K-8 teachers, without sacrificing too much accuracy or critical detail. To address the second challenge, we explicitly discuss alternate conceptions of each topic. The use of authentic data sets can also address both of these challenges. Yet incorporating 'real' climate and paleoclimate data into the classroom poses still an additional challenge of instructional design. We use a variety of teaching approaches in our laboratory-based course including student-designed experiments, computer simulations, physical models, and authentic data sets. We have found that students strongly prefer the physical models and experiments, because these are 'hands-on' and perceived as easily adaptable to the K-8 classroom. Students often express dislike for activities that use authentic data sets (for example, an activity using graphs of CO2 and methane concentrations in Vostok ice cores), in particular because they

This paper presents achievements of the geosynthetics discipline and challenges facing the discipline. The paper shows that one of the main achievements of geosynthetics is that they have pervaded most branches of geotechnical engineering, to the point where it is almost impossible to practice geotechnical engineering without geosynthetics. Then, the paper addresses the challenges facing the geosynthetics discipline. Two major types of challenges are identified: education challenges and technical challenges. Regarding technical challenges, it is recommended that researchers focus on behaviors that are not traditionally considered in geotechnical engineering in order to use geosynthetics to their full potential. Note: this is a significantly expanded version of the keynote paper presented at the 2008 GeoAmericas Conference. (Author)

developed, some solid waste communal leaders were born, and solid waste handling motivation and participation of the community have grown. To accelerate this situation, the government introduces training and education to produce more municipal solid waste handling facilitators. Since 2007, environmental sanitation motivation activities have run through the yearly Sanitation Jamboree, which educates, briefly trains and motivates junior school children through competitions, among other activities. Technology innovation: local governments, with or without central government support, are beginning to make improvements in how to handle municipal solid waste, and through the Sister City Program many innovations were developed, such as in Surabaya City (home Takakura composter), Depok (waste separation and composting), Bogor City (management), Malang City, Makassar City and others. New 'closing the loops' approaches to solid waste handling should be introduced in the future to break the bottleneck that has always occurred in the past. Integration between solid waste management and farming activities, land plantation rehabilitation, city landscaping and gardening is very urgent to develop, including integration of 3R stakeholders in the region. The challenges: the municipal solid waste problem in urban areas is relatively more complicated than the same problem in rural areas. Accurate data collection and periodic analysis are very important. Road map development and the mobilization of all stakeholders, both in central government and in local government, such as NGOs, private sectors, education and research institutions, civil societies and the community, are very urgent.
New research action is required to determine the new urban municipal solid waste characteristics and the appropriate technology and management, in order to give input to the central government, local governments and the community, or others involved in municipal solid waste handling, due to the recent fast growth of urban incomes and changing

JNFL and JAEA have collaboratively started to develop an Advanced Solution Measurement and monitoring System (ASMS) as part of a technical challenge intended for next-generation safeguards NDA equipment. After we completed a feasibility study using small detectors, the second stage of the ASMS was installed on a PCDF tank located in a cell, and was then tested and calibrated experimentally with Pu nitrate solution. There was no prior experience of directly measuring an inventory of around 50 kg Pu, so it was very challenging work. The conventional SMMS (Solution Monitoring and Measurement System), which is composed of precision manometers, acquires the density, level and temperature of the solution, so sampling and analysis are essential to obtain the nuclear material amount in the tank. The SMMS has two weak points for verification and monitoring of nuclear material flow and inventory: (1) direct measurement of the inventory cannot be done; (2) solution rework and reagent adjustment operations in an actual plant can lead to misinterpretation in the monitoring evaluation. The purpose of ASMS development is to establish a technique for quantitative plutonium mass measurement directly by NDA of highly concentrated pure plutonium nitrate solution, together with a monitoring capability for solution transfers in a process. The merits of the ASMS are considered below: (1) provide direct Pu measurement and continuous monitoring capability; (2) eliminate sampling and analysis at IIV; (3) reduce unmeasured inventory. The target measurement uncertainty of the ASMS is set to less than 6% (1 sigma), which is equivalent to meeting the detection level of the partial defect at IIV by NDA. The known-alpha coincidence counting technique is applied to the ASMS, which is similar in principle to the NDAs for MOX powder. In particular, the following three points are key techniques to establish the ASMS: (1) pre-determination of the plutonium isotopic composition, because it impacts the alpha and rho-zero values needed to obtain multiplication
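As a hedged illustration of the mass arithmetic that underlies coincidence-counting NDA (not the ASMS calibration itself, whose detector constants and multiplication corrections are instrument-specific), the conventional 240Pu-effective weighting can be sketched as follows. The isotopic fractions and the measured 240Pu-effective mass below are invented values, not data from the record above.

```python
# Sketch of the 240Pu-effective convention used in neutron coincidence
# counting. NOT the ASMS calibration: detection efficiency, alpha and
# multiplication (rho-zero) corrections are deliberately omitted.
# The weights 2.52 and 1.68 are the standard spontaneous-fission
# coefficients for 238Pu and 242Pu relative to 240Pu.

def pu240_effective_fraction(f238, f240, f242):
    """240Pu-effective fraction from Pu isotopic weight fractions."""
    return 2.52 * f238 + f240 + 1.68 * f242

def total_pu_mass(m240_effective, f238, f240, f242):
    """Total Pu mass (g) inferred from a measured 240Pu-effective mass (g)."""
    return m240_effective / pu240_effective_fraction(f238, f240, f242)

# Hypothetical reactor-grade isotopics (assumed, for illustration only):
f238, f240, f242 = 0.01, 0.24, 0.05
m240_eff = 13.5  # grams, as would be deduced from the coincidence rate

print(round(pu240_effective_fraction(f238, f240, f242), 4))  # → 0.3492
print(round(total_pu_mass(m240_eff, f238, f240, f242), 1))   # → 38.7
```

This shows why pre-determined isotopics are listed as a key technique above: the inferred total Pu mass scales inversely with the 240Pu-effective fraction, so an isotopic error propagates directly into the declared inventory.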

In searching for dark skies, persistently clear weather, and minimal atmospheric interference, astronomical observing sites are generally located in remote, mountainous locations, usually far from large communities. Such locations often have weak economies and shallow workforce pools in the technical and administrative areas generally needed by the observatories. This leads to a problem, and an opportunity, for both the observatories and their local communities. Importing employees from faraway locations is costly, leads to high turnover, and deprives the community of the economic benefits and the sense of fealty with the observatories that would naturally result if local people occupied these comparatively well-paying jobs. While by no means unique, the observatories on Mauna Kea, Hawai`i, are a clear example of this dual dilemma. This presentation will report findings from a model workforce needs assessment survey of all the Mauna Kea observatories, which has established likely annual staffing requirements in several categories of technological and administrative support, including the educational entrance requirements. Results indicated that through 2023, 80% of observatory job openings on Hawai`i Island will be in technology and administration. Furthermore, the vast majority of these jobs will require only a two-year or four-year college degree in a relevant field as an entrance requirement. Efforts to realign the existing resources to better meet these common needs will be discussed, including the highly successful partnership between the County of Hawai`i Workforce Development Board, the Mauna Kea observatories, the local K-12 systems, Hawai`i Community College, the University of Hawai`i Hilo, and a number of informal education and workplace experience programs. This collaboration has resulted in no fewer than three interlocked community programs that have stepped up to meet this challenge, to the benefit of both the local community and the observatories.

Users of ecstasy (3,4-methylenedioxymethamphetamine; MDMA) may be at risk of developing MDMA-induced injury to the serotonin (5-HT) system. Previously, there were no methods available for directly evaluating the neurotoxic effects of MDMA in the living human brain. However, the development of in vivo neuroimaging tools has begun to provide insights into the effects of ecstasy on the human brain. Single photon emission computed tomography (SPECT), positron emission tomography (PET) and proton magnetic resonance spectroscopy (1H-MRS) studies which have evaluated ecstasy's neurotoxic potential will be reviewed and discussed in terms of technical aspects, conceptual issues and future prospects. Although PET and SPECT may be limited by several factors, such as low cortical uptake and the use of a non-optimal reference region (cerebellum), the few studies conducted so far provide suggestive evidence that people who heavily use ecstasy are at risk of developing subcortical, and probably also cortical, reductions in serotonin transporter (SERT) densities, a marker of 5-HT neurotoxicity. There seem to be dose-dependent and transient reductions in SERT, for which females may be more vulnerable than males. 1H-MRS appears to be a less sensitive technique for studying ecstasy's neurotoxic potential. Whether individuals with relatively low ecstasy exposure also demonstrate loss of SERT needs to be determined. Because most studies have had a retrospective design, in which evidence is indirect and differs in the degree to which any causal links can be implied, longitudinal studies in human ecstasy users are needed to draw definite conclusions.

Users' mobility imposes challenges on mobility management and the offloading process, which hinder conventional heterogeneous networks (HetNets) in meeting the huge data traffic requirements of the future. In this thesis, a trio-connectivity (TC) architecture, which includes a control-plane (C-plane), a user-plane (U-plane) and an indication-plane (I-plane), is proposed to tackle these challenges. In particular, the I-plane is created as an indicator to help the user equipment (UE) identify and...

The vast amounts of data generated nowadays are used by Big Data tools to generate knowledge and to facilitate decision-making. However, this situation creates a new challenge: how to visualize all these data without losing crucial mid/long-term information. The purpose of this article is to analyze the state of the art in massive data visualization, the main problems and challenges of current information representation techniques, as well as the evolution of the tools and their future, in other words, the new functionalities they will offer.

The world is changing rapidly. With changes in technology, demographics, and workforce trends, Washington needs colleges to not only keep pace, but lead the way. Washington's 34 community and technical colleges answer that call. The community and technical colleges have proven uniquely positioned to adapt to, embrace, and ignite change. Community…

In accordance with the concept of lifelong education, the Technical and Vocational Education (TVE) system is recognized as a system whose role is to develop individuals with the high technical skills desired by industry nowadays. Changing times and current technology development require changes to the TVET…

Among the challenges facing power supply companies seeking profitable growth are strategic corporate development in terms of product and process innovation, and the management's and employees' capacity for change. This was the outcome of a survey conducted as part of the ''Value Creator III'' study performed among 130 executives of German, Austrian and Swiss companies operating in the energy sector. Based on a review of companies' past successes and expectations for the future, the study presents business models for tomorrow's energy market and classifies companies within a past-and-future portfolio according to their market prospects.

Historically, progress in particle physics has largely been determined by development of more capable particle accelerators. This trend continues today with the recent advent of high-luminosity electron-positron colliders at KEK and SLAC operating as "B factories," the imminent commissioning of the Large Hadron Collider at CERN, and the worldwide development effort toward the International Linear Collider. Looking to the future, one of the most promising approaches is the development of muon-beam accelerators. Such machines have very high scientific potential, and would substantially advance the state-of-the-art in accelerator design. A 20-50 GeV muon storage ring could serve as a copious source of well-characterized electron neutrinos or antineutrinos (a Neutrino Factory), providing beams aimed at detectors located 3000-7500 km from the ring. Such long baseline experiments are expected to be able to observe and characterize the phenomenon of charge-conjugation-parity (CP) violation in the lepton sector, and thus provide an answer to one of the most fundamental questions in science, namely, why the matter-dominated universe in which we reside exists at all. By accelerating muons to even higher energies of several TeV, we can envision a Muon Collider. In contrast with composite particles like protons, muons are point particles. This means that the full collision energy is available to create new particles. A Muon Collider has roughly ten times the energy reach of a proton collider at the same collision energy, and has a much smaller footprint. Indeed, an energy frontier Muon Collider could fit on the site of an existing laboratory, such as Fermilab or BNL. The challenges of muon-beam accelerators are related to the facts that i) muons are produced as a tertiary beam, with very large 6D phase space, and ii) muons are unstable, with a lifetime at rest of only 2 microseconds. How these challenges are accommodated in the accelerator design will be described. Both a Neutrino Factory and a Muon
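The 2 microsecond rest lifetime cited above is a quantitative design constraint, and a short back-of-the-envelope sketch shows why muon beams are nonetheless workable: relativistic time dilation stretches the lifetime enormously at collider energies. The sketch uses only textbook constants, not any actual accelerator parameters.

```python
import math

# Illustration of time dilation for an unstable muon beam.
# Constants are textbook values; the 1 TeV energy and 5 km path
# are arbitrary round numbers chosen for the example.

M_MU_GEV = 0.1056584  # muon mass, GeV/c^2
TAU_REST = 2.197e-6   # muon rest lifetime, s
C = 2.998e8           # speed of light, m/s

def lab_lifetime(energy_gev):
    """Dilated (lab-frame) lifetime of a muon with the given total energy."""
    gamma = energy_gev / M_MU_GEV
    return gamma * TAU_REST

def survival_fraction(energy_gev, path_m):
    """Fraction of muons surviving a straight path of path_m metres
    (beta is taken as ~1, a good approximation at TeV energies)."""
    gamma = energy_gev / M_MU_GEV
    return math.exp(-path_m / (gamma * C * TAU_REST))

# A 1 TeV muon lives ~0.021 s in the lab frame instead of 2.2 microseconds,
# so the overwhelming majority survive a few kilometres of beamline:
print(lab_lifetime(1000.0))          # ~0.0208 s
print(survival_fraction(1000.0, 5000.0))
```

The same arithmetic also explains the urgency on the low-energy end: at production, before acceleration, gamma is small and the muons decay within microseconds, which is why capture and cooling of the tertiary beam must be extremely fast.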


The annual Deep Brain Stimulation (DBS) Think Tank provides a focal opportunity for a multidisciplinary ensemble of experts in the field of neuromodulation to discuss advancements and forthcoming opportunities and challenges in the field. The proceedings of the fifth Think Tank summarize progress in neuromodulation neurotechnology and techniques for the treatment of a range of neuropsychiatric conditions including Parkinson's disease, dystonia, essential tremor, Tourette syndrome, obsessive compulsive disorder, epilepsy, and cognitive and motor disorders. Each section of this overview of the meeting provides insight into the critical elements of discussion, current challenges, and identified future directions of scientific and technological development and application. The report addresses key issues in development and emphasizes major innovations that have occurred during the past year. Specifically, this year's meeting focused on technical developments in DBS, design considerations for DBS electrodes, improved sensors, neuronal signal processing, advancements in the development and use of responsive DBS (closed-loop systems), updates on National Institutes of Health and DARPA DBS programs of the BRAIN initiative, and neuroethical and policy issues arising in and from DBS research and applications in practice. PMID:29416498


The current challenges of the nuclear industry are the result of too many uncertainties: low GDP growth of OECD countries, booming state debts, deregulated electricity markets, growing safety regulation and diminishing public support. As a result, nuclear technology companies tend to entrench in their current installed base, while attempting to develop global partnerships to market their products to new nuclear countries, along with viable financing schemes. But new opportunities are lying ahead. In a future context of effective and global climate policies, nuclear energy will have to play a key role in a new energy ecosystem alongside the two other clean air energy production technologies: renewable energies and electricity storage. Moreover, the prospect of long-term sustainability for nuclear energy remains high. This paper explores the opportunity for key innovative technologies to shift the way we think about nuclear in the future energy system while addressing these major challenges. (author)

One effective way to improve the state of the art is through competitions. Following the success of the Critical Assessment of protein Structure Prediction (CASP) in bioinformatics research, a number of challenge evaluations have been organized by the text-mining research community to assess and advance natural language processing (NLP) research for biomedicine. In this article, we review the different community challenge evaluations held from 2002 to 2014 and their respective tasks. Furthermore, we examine these challenge tasks through their targeted problems in NLP research and biomedical applications, respectively. Next, we describe the general workflow of organizing a Biomedical NLP (BioNLP) challenge and involved stakeholders (task organizers, task data producers, task participants and end users). Finally, we summarize the impact and contributions by taking into account different BioNLP challenges as a whole, followed by a discussion of their limitations and difficulties. We conclude with future trends in BioNLP challenge evaluations. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

We cover a top-level introduction to hardness assurance (HA) from a robotic space system perspective, starting at the piece-part level. We discuss error sources inherent to presently accepted HA practices and why they cause us to be risk-averse. We conclude by reviewing current proposals that move towards more risk-tolerant system design approaches, as well as future challenges that will require these advanced techniques.

Drawing upon the presentations made at the fourth conference on Future-oriented Technology Analysis, this essay reflects on the implications of the current period of instability and discontinuity for the practice of FTA or foresight. In the past the demand environment for foresight on research and innovation policy favoured application to priority-setting and articulation of demand. New tendencies include a heightened search for breakthrough science and a focus on grand societal challenges. B...

Background Medication safety is a global concern among healthcare providers. However, the challenges to and the future of medication safety in Saudi Arabia have not been explored. Objectives We explored the perspectives of healthcare practitioners on current issues about medication safety in hospitals and community settings in Saudi Arabia in order to identify challenges to improving it and explore the future of medication safety practice. Methods A total of 65 physicians, pharmacists, academics and nurses attended a one-day meeting in March 2010, designed especially for the purpose of this study. The participants were divided into nine round-table discussion sessions. Three major themes were explored in these sessions, including: major factors contributing to medication safety problems, challenges to improving medication safety practice, and participants’ suggestions for improving medication safety. The round-table discussion sessions were videotaped and transcribed verbatim and analyzed by two independent researchers. Results The round-table discussions revealed that major factors contributing to medication safety problems included unrestricted public access to medications from various hospitals and community pharmacies, communication gaps between healthcare institutions, limited use of important technologies such as computerized provider order entry, and the lack of medication safety programs in hospitals. Challenges to current medication safety practice identified by participants included underreporting of medication errors and adverse drug reactions, multilingualism and differing backgrounds of healthcare professionals, lack of communication between healthcare providers and patients, and high workloads. Suggestions for improving medication safety practices in Saudi Arabia included continuous education for healthcare professionals and competency assessment focusing on medication safety, development of a culture that encourages medication error and adverse

To evaluate the efficacy of computed tomography (CT)-guided radiofrequency (RF) ablation for the treatment of osteoid osteomas in common and in technically challenging locations. Twenty-three patients with osteoid osteomas in common (nine cases) and technically challenging [14 cases: intra-articular (n = 7), spinal (n = 5), metaphyseal (n = 2)] locations were treated with CT-guided RF ablation. Therapy was performed under conscious sedation with a seven-array expandable RF electrode for 8-10 min at 80-110 °C and a power of 90-110 W. The patients were discharged home with instructions. A brief pain inventory (BPI) score was calculated before and after (1 day, 4 weeks, 6 months and 1 year) treatment. All procedures were technically successful. The primary clinical success rate was 91.3% (21 of 23 patients), regardless of lesion location. The BPI score was dramatically reduced after the procedure, and the decrease was significant (P < 0.001, paired t-test; n - 1 = 22) for all follow-up periods. Two patients had persistent pain after 1 month and were treated successfully with a second procedure (secondary success rate 100%). No immediate or delayed complications were observed. CT-guided RF ablation is safe and highly effective for the treatment of osteoid osteomas, even in technically difficult locations. (orig.)
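The statistical test reported above (a paired t-test on pre- and post-treatment BPI scores) can be sketched with the Python standard library. The scores below are invented for illustration only; they are not the study's data, which reported P < 0.001 with n - 1 = 22 degrees of freedom.

```python
import math

# Hedged sketch of a paired t-test, as used for before/after pain-score
# comparisons. The BPI values here are hypothetical, not the study's data.

def paired_t(before, after):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Unbiased sample variance of the paired differences:
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    t = mean / math.sqrt(var / n)
    return t, n - 1

before = [8, 7, 9, 6, 8, 7, 9, 8]   # hypothetical pre-treatment BPI scores
after  = [2, 1, 3, 1, 2, 2, 1, 2]   # hypothetical post-treatment BPI scores
t, df = paired_t(before, after)
print(round(t, 2), df)  # → 18.33 7
```

The t statistic is then compared against the t distribution with the given degrees of freedom to obtain the P value (here via a statistics table or a library such as SciPy, which is omitted to keep the sketch dependency-free).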

During the earlier period, the committee finalized its work on the report Optics and Photonics: Essential Technologies for Our Nation (2013). The report underwent review and initial editorial processing. The NRC released a pre-publication report on August 13, 2012. A final report is now available. The study director has been able to practice his skills in running a National Academies committee. From a research perspective, the grant has generated a report with recommendations to the government. The work itself consisted of the meetings where the committee convened to hear presenters and to discuss the status of optics and photonics, as well as writing the report.

Poland now faces a very interesting discussion on modern waste treatment methods, although the waste problems are very old. This paper presents a total waste management view, from the formation process to recycling, utilisation and landfilling. The average municipal solid waste (MSW) annual per capita generation in Poland is 250 kg per person, which is half of the waste amount generated in Norway and one third of the amount in the USA. The present low per capita generation, large variations in MSW properties and an expected growth in the standard of living make the decisions regarding future Polish waste management systems very important. Waste management must be handled carefully to prevent a rapid growth of waste generation - this is the Polish challenge, both now and for the future. Three different possibilities for future waste management systems for rural areas, small cities and larger cities are discussed in the paper. 4 figs., 1 tab

Offering a comprehensive overview of the challenges, risks and options facing the future of mechatronics, this book provides insights into how these issues are currently assessed and managed. Building on the previously published book ‘Mechatronics in Action,’ it identifies and discusses the key issues likely to impact on future mechatronic systems. It supports mechatronics practitioners in identifying key areas in design, modeling and technology and places these in the wider context of concepts such as cyber-physical systems and the Internet of Things. For educators it considers the potential effects of developments in these areas on mechatronic course design, and ways of integrating these. Written by experts in the field, it explores topics including systems integration, design, modeling, privacy, ethics and future application domains. Highlighting novel innovation directions, it is intended for academics, engineers and students working in the field of mechatronics, particularly those developing new conc...

Currently both design thinking and critical social science experience an increased interest in speculating in alternative future scenarios. This interest is not least related to the challenges issues of global sustainability present for politics, ethics and design. This paper explores the potentials of speculative thinking in relation to design and social and cultural studies, arguing that both offer valuable insights for creating a speculative space for new emergent criticalities challenging current assumptions of the relations between power and design. It does so by tracing out discussions of ‘futurity’ and ‘futuring’ in design as well as social and cultural studies. Firstly, by discussing futurist and speculative approaches in design thinking; secondly by engaging with ideas of scenario thinking and utopianism in current social and cultural studies; and thirdly by showing how the articulation...

The international nuclear safeguards community is faced with a host of challenges in the coming years, many of which have been outlined but have not been described in terms of their urgency. Literature regarding safeguards challenges is either broad and devoid of any reference to prioritization or tailored to a specific problem and removed from the overall goals of the safeguards community. For example, developing new methods of environmental sampling, improving containment and surveillance (C/S) technologies to increase efficiency and decrease inspection time, advancing nuclear material accountancy (NMA) techniques, and planning safeguards approaches for new types of nuclear facilities are all important. They have not, however, been distinctly prioritized at a high level within the safeguards community. Based on a review of existing literature and interviews with experts on these upcoming challenges, this paper offers a high-level summary of present and future priorities in safeguards, with attention both to what is feasible and to what is most imperative. In doing so, the paper addresses the potential repercussions for failing to prioritize, with a focus on the risk of diversion of nuclear material. Within the context of shifts in the American political landscape, and keeping in mind that nonproliferation issues may take a backseat to others in the near future, a prioritized view of safeguards objectives will be vital. In the interest of expanding upon this work, the paper offers several potential conceptual models for prioritization which can be explored in greater depth upon further research.

Full Text Available As the demand for wind energy continues to grow at exponential rates, reducing operation and maintenance (O&M) costs and improving reliability have become top priorities in wind turbine (WT) maintenance strategies. In addition to the development of more highly evolved WT designs intended to improve availability, the application of reliable and cost-effective condition-monitoring (CM) techniques offers an efficient approach to achieve this goal. This paper provides a general review and classification of wind turbine condition monitoring (WTCM) methods and techniques with a focus on trends and future challenges. After highlighting the relevant CM, diagnosis, and maintenance analysis, this work outlines the relationship between these concepts and related theories, and examines new trends and future challenges in the WTCM industry. Interesting insights from this research are used to point out strengths and weaknesses in today’s WTCM industry and define research priorities needed for the industry to meet the challenges in wind industry technological evolution and market growth.

Full Text Available Increments in computing speed, machine learning, and human interfaces have extended the capabilities of artificial intelligence applications to an important stage. It is predicted that the use of artificial intelligence (AI) to automate knowledge-based occupations (such as medicine, engineering and law) may have an enormous global economic impact in the near future. Applications based on artificial intelligence will be able to improve health and quality of life for millions in the coming years. Although clinical applications of computer science have been slow to move from the lab to the real world, there are promising signs that the pace of innovation will improve. In the near future, AI-based applications will have a transformative effect on the health sector by automating knowledge-based work in the fields of diagnosis and treatment, nursing and health care, robotic surgery, and the development of new drugs. Therefore many artificial intelligence systems will need to work closely with health providers and patients to gain their trust. How smart machines will naturally interact with healthcare professionals, patients, and patients' families is very important, yet challenging. In this article, we review the future of AI-enabled automation of knowledge work in medicine and healthcare in seven categories: big medical data mining, computer-aided diagnosis, online consultations, evidence-based medicine, health assistance, precision medicine, and drug creation. Challenges of this issue, including cultural, organizational, legal, and social barriers, are also described.

Careers in energy science related fields represent significant job growth in the U.S. Yet post-secondary career and technical programs have not kept pace with demand, and energy science curriculum, including fundamental concepts of energy generation and environmental impact, lacks a firm position among general or career and technical education courses. Many of these emerging energy related jobs are skilled labor and entry level technical positions requiring less than a bachelor's degree. These include jobs such as solar/photovoltaic design and installation, solar water and space heating installation, energy management, efficiency and conservation auditing, environmental technician work, etc. These energy related career pathways fit naturally within the geosciences discipline. Many of these jobs can be filled by individuals from HVAC, industrial technology, welding, and electrical degree programs needing some additional specialized training and curriculum focused on fundamental concepts of energy, fossil fuel exploration and use, atmospheric pollution, energy generation, alternative energy sources, and energy conservation. Two-year colleges (2YCs) are uniquely positioned to train and fill these workforce needs as they already have existing career and technical programs and attract both recent high school graduates and non-traditional students, including displaced workers and returning veterans. We have established geoscience related workforce certificate programs that individuals completing traditional industrial career and technical degrees can obtain to meet these emerging workforce needs. This presentation will discuss the role of geosciences programs at 2YCs in training these new workers, developing curriculum, and building a career/technical program that is on the forefront of this evolving industry.

Full Text Available The economic theories and the accounting regulations deriving from them should be reconsidered for SMEs. At the global level, there are accomplishments in this respect – the IASB IFRS for SMEs – or intentions – European Commission proposals for certain simplifications of the European directives. However, for these actions to be successful, further investigations concerning the theoretical and technical implications are necessary. In this study, we present our opinion concerning the theoretical influences (reconsideration of the conceptual framework) and the technical influences (change in the disclosure requirements and change in the content of the standards, namely recognition and valuation) implied by a standard for SMEs.

Throughout its 16-year history, CERN openlab has worked to develop and test new ICT technologies and techniques that help to make the ground-breaking physics discoveries at CERN possible. CERN openlab runs in three-year phases, with around 20 projects — addressing a wide range of IT topics — being run in its current, fifth phase. With CERN openlab’s sixth three-year phase set to begin at the start of 2018, work has been carried out throughout the first half of 2017 to identify key areas for future collaboration. A series of workshops and discussions was held to discuss the ICT challenges faced by the LHC research community and other ‘big science’ projects over the coming years. This white paper is the culmination of these investigations, and sets out specific challenges that are ripe for tackling through collaborative R&D projects.

Electrochemical CO2 reduction (ECR) offers an important pathway for renewable energy storage and fuels production. It remains a challenge to design highly selective, energy-efficient, robust, and cost-effective electrocatalysts to facilitate this kinetically slow process. Metal-free carbon-based materials have the features of low cost, good electrical conductivity, renewability, diverse structure, and tunability in surface chemistry. In particular, surface functionalization of carbon materials, for example by doping with heteroatoms, enables access to unique active site architectures for CO2 adsorption and activation, leading to interesting catalytic performances in ECR. We aim to provide a comprehensive review of this category of metal-free catalysts for ECR, providing discussions and/or comparisons among different nonmetallic catalysts, and also the possible origin of catalytic activity. Fundamentals and some future challenges are also described.

Full Text Available 3D and 4D bioprinting of the heart are exciting notions in the modern era. However, myocardial bioprinting has proven to be challenging. This review outlines the methods, materials, cell types, issues, challenges, and future prospects in myocardial bioprinting. Advances in 3D bioprinting technology have significantly improved the manufacturing process. While scaffolds have traditionally been utilized, 3D bioprinters which do not require scaffolds are increasingly being employed. Improved understanding of the cardiac cellular composition and multiple strategies to tackle the issues of vascularization and viability have led to progress in this field. In vivo studies utilizing small animal models have been promising. 4D bioprinting is a new concept that has the potential to advance the field of 3D bioprinting further by incorporating the fourth dimension of time. Clinical translation will require multidisciplinary collaboration to tackle the pertinent issues facing this field.

When the U.S. Environmental Protection Agency (EPA) was established nearly 50 years ago, the nation faced serious threats to its air, land, and water, which in turn impacted human health. These threats were effectively addressed by the creation of EPA (in 1970) and many subsequent landmark pieces of environmental legislation, which significantly reduced threats to the Nation's environment and public health. A key element of historic legislation is research aimed at dealing with current and future problems. Today we face national and global challenges that go beyond classic media-specific (air, land, water) environmental legislation and require an integrated paradigm of action and engagement based on (1) innovation based on science and technology, (2) stakeholder engagement and collaboration, and (3) public education and support. This three-pronged approach recognizes that current environmental problems, which include social as well as physical and environmental factors, are best addressed through collaborative problem solving, the application of innovation in science and technology, and multiple-stakeholder engagement. To achieve that goal, EPA's Office of Research and Development (ORD) is working directly with states and local communities to develop and apply a suite of accessible decision support tools (DST) that aim to improve environmental conditions, protect human health, enhance economic opportunity, and advance a resilient and sustainable society. This paper showcases joint EPA and state actions to develop tools and approaches that not only meet current environmental and public health challenges, but do so in a way that advances sustainable, healthy, and resilient communities well into the future. EPA's future plans should build on current work but aim to effectively respond to growing external pressures. Growing pressures from megatrends are a major challenge for the new Administration and for cities and states across the country. The recent hurricanes hitting

Recent research in underwater wireless sensor networks (UWSNs) has gained the attention of researchers in academia and industry for a number of applications. They include disaster and earthquake prediction, water quality and environment monitoring, leakage and mine detection, military surveillance and underwater navigation. However, the aquatic medium is associated with a number of limitations and challenges: long multipath delay, high interference and noise, harsh environment, low bandwidth and limited battery life of the sensor nodes. These challenges demand research techniques and strategies to be overcome in an efficient and effective fashion. The design of routing protocols for UWSNs is one of the promising solutions to cope with these challenges. This paper presents a survey of the routing protocols for UWSNs. For the ease of description, the addressed routing protocols are classified into two groups: localization-based and localization-free protocols. These groups are further subdivided according to the problems they address or the major parameters they consider during routing. Unlike the existing surveys, this survey considers only the latest and state-of-the-art routing protocols. In addition, every protocol is described in terms of its routing strategy and the problem it addresses and solves. The merit(s) of each protocol is (are) highlighted along with the cost. A description of the protocols in this fashion has a number of advantages for researchers, as compared to the existing surveys. Firstly, the description of the routing strategy of each protocol makes its routing operation easily understandable. Secondly, the demerit(s) of a protocol provides (provide) insight into overcoming its flaw(s) in future investigation. This, in turn, leads to the foundation of new protocols that are more intelligent, robust and efficient with respect to the desired parameters. Thirdly, a protocol can be selected for the appropriate application based on its described

Securing the energy demand of the next generations is one of the most challenging aspects facing any sustained development plan, owing to growing electricity demand; Egypt, as a country of limited fossil fuel resources, has to diversify its energy portfolio by utilizing its renewable energy resources, mainly wind, for its economic potential, and solar, as shown by Egypt's wind and solar atlases. In the year 2009/2010, the total installed capacity in Egypt was 24,726 MW with electricity generation of 139,000 GWh, of which 89% was delivered by thermal plants, about 10% by hydro power (2,800 MW installed, generating about 12,863 GWh), and 1% by wind energy (550 MW installed, generating 1,542 GWh). In the solar energy field, the first solar thermal power plant of 140 MW, with a solar share of 20 MW using parabolic trough technology, started initial operation on the 1st of July, with an estimated total energy generation of 852 GWh/year. Recently, Egypt has adopted an ambitious plan to cover 20% of generated electricity with renewable energy by 2020, including a 12% contribution from wind energy, translating into more than 7,200 MW of grid-connected wind farms. Such a plan gives private investment enough room to play the major role in realizing this goal. The plan also includes a 100 MW solar thermal (CSP) plant with parabolic trough technology in Kom Ombo city, as well as two PV plants in Hurgada and Kom Ombo with an installed capacity of 20 MW each. Because of their high investment cost, solar energy technologies are still limited in their spread worldwide; wind energy, on the other hand, has economic potential and has become a commercial technology, although solar energy retains future potential given the limited land available for wind energy. The current study will evaluate the Egyptian strategy for renewable energy up to 2020 and determine how far the projects planned by the Egyptian government will fulfill this target, along with the economic study
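The capacity and generation figures quoted in this record lend themselves to a one-line sanity check: annual energy divided by installed capacity times the 8,760 hours in a year gives the implied capacity factor. A minimal sketch in Python, using only the MW and GWh values from the abstract (the helper name is illustrative, not from the study):

```python
# Cross-check of the 2009/2010 generation figures quoted in the abstract.
# Capacity factor = annual energy generated / (installed capacity * hours per year).
HOURS_PER_YEAR = 8760

def capacity_factor(energy_gwh: float, capacity_mw: float) -> float:
    """Implied capacity factor from annual energy (GWh) and installed capacity (MW)."""
    return (energy_gwh * 1000.0) / (capacity_mw * HOURS_PER_YEAR)

wind = capacity_factor(1542, 550)      # 550 MW of wind generating 1,542 GWh -> ~0.32
hydro = capacity_factor(12863, 2800)   # 2,800 MW of hydro generating 12,863 GWh -> ~0.52

print(f"wind capacity factor:  {wind:.2f}")
print(f"hydro capacity factor: {hydro:.2f}")
```

Both values fall in the range typical for these technologies, which supports the internal consistency of the quoted statistics.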

Pervaporation (PV) has been considered as one of the most active and promising areas in membrane technologies in separating close boiling or azeotropic liquid mixtures, heat sensitive biomaterials, water or organics from its mixtures that are indispensable constituents for various important chemical and bio-separations. In the PV process, the membrane plays the most pivotal role and is of paramount importance in governing the overall efficiency. This article evaluates and collaborates the current research towards the development of next generation nanomaterials (NMs) and embedded polymeric membranes with regard to its synthesis, fabrication and application strategies, challenges and future prospects.

The paper gives an overview of the history of the Nuclear Research Institute Rez's development over the forty years of its existence. Its present activities are discussed in some detail. These historical and present activities form the basis for discussing: challenges faced by the NRI; interactions of the NRI with its environment; and collaboration and co-operation. Nuclear research centres would continue to be the main source of expertise for power plant operation, radiation and isotope applications, regulatory practices and waste management. Future developments should ensure the viability of these centres. (author)

Inherited retinal degenerations are a clinically and genetically heterogeneous group of conditions that have historically shared an untreatable course. In recent years, however, a wide range of therapeutic strategies have demonstrated efficacy in preclinical studies and entered clinical trials with a common goal of improving visual function for patients affected with these conditions. Gene therapy offers a particularly elegant and precise opportunity to target the causative genetic mutations underlying these monogenic diseases. The present review will provide an overview of gene therapy with particular emphasis on key clinical results to date and challenges for the future.

In May 2014, NCI’s Division of Cancer Epidemiology and Genetics (DCEG) hosted Cancer Epidemiology: From Pedigrees to Populations, a scientific symposium honoring 50 years of visionary leadership by Dr. Joseph F. Fraumeni, Jr., the founding Director of DCEG. In this video, Dr. Stephen Chanock of NCI provides opening remarks. Dr. David Schottenfeld of the University of Michigan moderates a session on the search for cancer susceptibility genes. Dr. Louise Strong of the University of Texas MD Anderson Cancer Center speaks about the discovery and future challenges of Li-Fraumeni syndrome research. For more information on this symposium, visit http://dceg.cancer.gov/news-events/Fraumeni-symposium-speakers.

In our days the concept of leadership is often met. Whether we read books or articles on business management, public administration, human resources management, etc. we are bound to “meet” the term of leadership in some way. This paper aims to present the current and future competencies and challenges of leadership, putting a bigger emphasis on the public sector, because in the last 4-5 years it has become an emerging phenomenon within public administration, which is mostly due to the fact th...

Full Text Available Stem cells are undifferentiated cells that are present in the embryonic, fetal, and adult stages of life and give rise to differentiated cells that make up the building blocks of tissue and organs. Due to their unlimited source and high differentiation potential, stem cells are considered as potentially new therapeutic agents for the treatment of infertility. Stem cells could be stimulated in vitro to develop various numbers of specialized cells including male and female gametes suggesting their potential use in reproductive medicine. During past few years a considerable progress in the derivation of male germ cells from pluripotent stem cells has been made. In addition, stem cell-based strategies for ovarian regeneration and oocyte production have been proposed as future clinical therapies for treating infertility in women. In this review, we summarized current knowledge and present future perspectives and challenges regarding the use of stem cells in reproductive medicine.

Full Text Available Fall prediction is a multifaceted problem that involves complex interactions between physiological, behavioral, and environmental factors. Existing fall detection and prediction systems mainly focus on physiological factors such as gait, vision, and cognition, and do not address the multifactorial nature of falls. In addition, these systems lack efficient user interfaces and feedback for preventing future falls. Recent advances in internet of things (IoT) and mobile technologies offer ample opportunities for integrating contextual information about patient behavior and environment along with physiological health data for predicting falls. This article reviews the state-of-the-art in fall detection and prediction systems. It also describes the challenges, limitations, and future directions in the design and implementation of effective fall prediction and prevention systems.

Technical Vocational Education and Training (TVET) is widely recognized as a vital driving force for the socio-economic growth and technological development of nations. In achieving the goals and objectives of TVET in Nigeria, the quality of the programme needs to be improved and sustained. The purpose of this study is to ascertain the challenges…

Community-based projects immerse technical writing students in intercultural communication, addressing local needs and shaping documents in human terms. Students at a South Texas university work to establish communication with clients in a city-county health department to create effective documents and disseminate family health legislation. To…

In Sub-Saharan Africa (SSA), technical and vocational education and training (TVET) is central to political discourses and educational concerns as a means for economic development, poverty alleviation, youth employment, and social mobility. Yet, there is an intriguing contradiction between this consideration and the real attention dedicated to…

Who is responsible for accidents in highly automated systems? How do we apportion liability among the various participants in complex socio-technical organisations? How can different liability regulations at different levels (supranational, national, local) be harmonized? How do we provide for

Describes a project to integrate the support and delivery of information services to faculty and staff at the University of Minnesota from the planning phase to implementation of a new organizational entity. Topics addressed include technical and organizational integration, control and delivery of services, and networking and organizational fit.…

We surveyed Extension educators in the southern Great Plains about their attitudes and beliefs regarding climate change, their interactions with constituents surrounding climate change, and challenges they face in engaging constituents on the topic of climate change. Production-oriented and sociocultural challenges in meeting constituents'…

have recently been proposed for energy production, is critically reviewed. There are major challenges remaining that are shortly outlined. Scientific/technical achievements that are required in the light of the Fukushima accident are highlighted.

The Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA) mandates a periodic assessment of the conditions and trends of the Nation's renewable resources on forests and rangelands. The RPA Assessment includes projections of resource conditions and trends 50 years into the future. The 2010 RPA Assessment used a set of future scenarios to provide a...

Full Text Available Background: Diabetes is one of the non-communicable diseases (NCDs) which are rising significantly across sub-Saharan African (SSA) countries and posing a threat to the social, economic, and cultural fabric of the SSA population. The inclusion of NCDs in the post-2015 development agenda along with the global monitoring framework provides an opportunity to monitor the progress of development programmes in developing countries. This paper examines challenges associated with dealing with diabetes within the development agenda in SSA and explores some policy options. Design: This conceptual review draws from a range of works published in Medline and the grey literature to advance the understanding of the post-2015 development agenda and how it relates to NCDs. The paper begins with the burden of diabetes in sub-Saharan Africa and then moves on to examine challenges associated with diabetes prevention, treatment, and management in Africa. It finishes by exploring policy implications. Results: With regard to development programmes on NCDs in the SSA sub-continent, several challenges exist: 1) poor documentation of risk factors, 2) demographic transitions (rapid urbanisation and ageing), 3) the complementary role of traditional healers, 4) tuberculosis and the treatment of the acquired immunodeficiency syndrome as risk factors for diabetes, 5) diabetes in complex emergencies, 6) diabetes as an international development priority and not a policy agenda for many SSA countries, and 7) a poorly regulated food and beverage industry. Conclusion: For the post-2015 development agenda for NCDs to have an impact, sufficient investments will be needed to address legislative, technical, human, and fiscal resource constraints through advocacy, accountability, political leadership, and effective public–private partnership. Striking the right balance between competing demands and priorities, policies, and implementation strategies holds the key to an effective response to diabetes

Full Text Available This article presents a jubilee which the scientific review Military Technical Courier marks in 2012 - the sixtieth anniversary of regular and continuous publication. The retrospective of marking jubilee anniversaries over the past 60 years offers evidence of persistent and thorough efforts to develop and improve the quality of the Courier's content. The article shows the great dedication of the editors' offices and editorial boards as well as the invaluable scientific and professional contribution of numerous authors. The article also deals with today's position and achievements of the Military Technical Courier which, according to the classification of the Ministry of Education and Science of the Republic of Serbia, belongs to the national category of scientific journals.

Full Text Available This paper reviews the current state and development of different numerical model classes that are used to simulate the global atmospheric system, particularly Earth’s climate and climate-chemistry connections. The focus is on Chemistry-Climate Models. In general, these serve to examine dynamical and chemical processes in the Earth’s atmosphere, their feedbacks, and their interaction with climate. Such models have been established as helpful tools in addition to analyses of observational data. Definitions of the global model classes are given and their capabilities as well as weaknesses are discussed. Examples of scientific studies indicate how numerical exercises contribute to an improved understanding of atmospheric behavior. There, the focus is on synergistic investigations combining observations and model results. The possible future developments and challenges are presented, not only from the scientific point of view but also regarding computer technology and the respective consequences for numerical modeling of atmospheric processes. In the future, a stronger cross-linkage of subject-specific scientists is necessary to tackle the looming challenges, linking the specialist disciplines with applied computer science.

PNRA is the sole organization in the country responsible for regulating all matters pertaining to ionizing radiation. For the safe transport of radioactive material in the country, PNRA has adopted IAEA TS-R-1 as a national regulation. To cover security aspects and any emergency situations during the transportation of radioactive material, PNRA has issued the regulatory guide on ‘Transportation of Radioactive Material by Road in Pakistan’. In Pakistan, low to medium activity radioactive sources are transported from one place to another by road for the purposes of industrial radiography, well logging, medical applications, etc. According to national policy, sealed radioactive sources with a half life greater than 1 year and an initial activity of 100 GBq or more imported into the country are required to be returned (exported) to the country of origin after use. Although activities related to the transport of radioactive material have remained safe and secure and no major accident/incident has been reported so far, the improvement and enhancement of the regulatory infrastructure is a continuous process. In the future, more challenges are expected in the safety of transport packages. This paper will describe the steps taken by PNRA for the safety and security of the transport of radioactive material in the country and future challenges. (author)

Problems in team communication and decision making have been implicated in accidents in high-risk industries such as aviation, offshore oil processing, and nuclear power generation. Recognition of the role that breakdowns in communication and teamwork play in patient safety incidents has led to a plethora of studies in the area of what has come to be widely known as non-technical skills (NTS), a term initially used in European aviation (1). This has led to increasing interest in i...

Hydrogen is a flexible, clean energy-carrying intermediate that enables aggressive market penetration of renewables while deeply decarbonizing our energy system. H2 at Scale is a concept that supports the electricity grid by utilizing energy for which there is no other demand at a given time, and also supports transportation and industry by providing them with low-priced hydrogen. This presentation is an update to the Hydrogen Technical Advisory Committee (HTAC).

This 50 page brochure was developed by Brookhaven National Laboratory to encourage high school students to begin considering careers in the scientific and technical fields. The topics of the brochure include career selection, career options, a review of training required for each occupation, a collection of profiles of BNL employees describing how they chose and prepared for their careers, a description of BNL educational programs for high school students, and profiles of some of the students participating in these programs.

Two of the major concerns of the NASA Ames Research Center (NASA ARC) Advisory Committee for Women (ACW) are that recruitment of women scientists, engineers, and technicians needs to increase and that barriers to advancement need to be removed for improved representation of women in middle and upper management and scientific positions. One strategy that addressed this concern was the ACW sponsorship of a Technical Paper Contest for Women at Ames Research Center. Other sponsors of the Contest were the Ames Equal Opportunity Council and the Ames Contractor Council. The Technical Paper Contest for Women greatly increased the visibility of both the civil service women and the women who work for contractors at Ames. The women had the opportunity to hone their written and oral presentation skills. Networking among Ames women increased.

Detection of non-technical losses (NTLs), which include electricity theft, faulty meters, and billing errors, has attracted increasing attention from researchers in electrical engineering and computer science. NTLs cause significant harm to the economy, as in some countries they may range up to 40% of the total electricity distributed. The predominant research direction is employing artificial intelligence to predict whether a customer causes NTLs. This paper first provides an overview ...
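
The dominant approach the abstract refers to trains a classifier on customer consumption profiles. As a minimal, hypothetical illustration (the feature, threshold, and data below are invented, and real systems use far richer models), a sudden sustained drop in metered usage is one classic NTL indicator:

```python
# Illustrative NTL-screening sketch: score customers by the ratio of their
# recent mean consumption to their long-run baseline; a low ratio is
# suspicious. Feature choice and threshold are assumptions for this example.

def ntl_score(monthly_kwh, window=3):
    """Ratio of recent mean usage to long-run baseline mean."""
    if len(monthly_kwh) <= window:
        return 1.0
    baseline = sum(monthly_kwh[:-window]) / (len(monthly_kwh) - window)
    recent = sum(monthly_kwh[-window:]) / window
    return recent / baseline if baseline > 0 else 1.0

def flag_customers(profiles, threshold=0.5):
    """Flag customers whose recent usage fell below `threshold` of baseline."""
    return {cid: ntl_score(kwh) for cid, kwh in profiles.items()
            if ntl_score(kwh) < threshold}

profiles = {
    "A": [300, 310, 295, 305, 290, 300, 120, 110, 115],  # sharp sustained drop
    "B": [200, 210, 205, 195, 200, 205, 198, 202, 207],  # stable usage
}
flags = flag_customers(profiles)
print(flags)  # only customer "A" is flagged
```

In practice such hand-crafted scores serve only as input features; the AI methods surveyed in the paper learn the decision boundary from labeled inspection data.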

This analysis was conducted to support the Vehicle Systems Safety Technology (VSST) Project of the Aviation Safety Program (AvSP) milestone VSST4.2.1.01, "Identification of VSST-Related Trends." In particular, this is a review of incident data from the NASA Aviation Safety Reporting System (ASRS). The following three VSST-related technical challenges (TCs) were the focus of the incidents searched in the ASRS database: (1) vehicle health assurance; (2) effective crew-system interactions and decisions in all conditions; and (3) aircraft loss-of-control prevention, mitigation, and recovery.

We have conducted the first in-vivo experiments in pencil-beam irradiation, a new synchrotron radiation technique based on the principle of microbeam irradiation, a concept of spatially fractionated high-dose irradiation. In an animal model of adult C57BL/6J mice we determined the technical and physiological limitations of the present technical setup of the technique. Fifty-eight animals were distributed among eleven experimental groups, ten of which received whole-brain radiotherapy with arrays of 50 µm wide beams. We tested peak doses ranging between 172 Gy and 2,298 Gy at 3 mm depth. Animals in five groups received whole-brain radiotherapy with a center-to-center (ctc) distance of 200 µm and a peak-to-valley dose ratio (PVDR) of ∼100; in the other five groups the ctc was 400 µm (PVDR ∼400). Motor and memory abilities were assessed during a six-month observation period following irradiation. The lower dose limit, determined by the technical equipment, was 172 Gy. The LD50 was about 1,164 Gy for a ctc of 200 µm and higher than 2,298 Gy for a ctc of 400 µm. Age-dependent loss of motor and memory performance was seen in all groups. Better overall performance (close to that of healthy controls) was seen in the groups irradiated with a ctc of 400 µm.

This article aimed to provide a critical review of the evolution of Chagas' disease (ChD) in Brazil: its magnitude, historical development and management, and challenges for the future. A literature search was performed using PubMed, SciELO, and Google Scholar, as well as the references of the collected articles. Narrative analysis was structured around five main themes identified: vector transmission, the control programme, and transfusion, oral, and congenital transmission. In Brazil, the Chagas' Disease Control Programme was fully implemented in the 1980s, when it reached practically all the endemic areas, and in 1991 the Southern Cone Initiative was created, aiming to control disease transmission by eliminating Triatoma infestans and controlling blood banks. As a result, the prevalence of chagasic donors in blood banks fell from 4.4% in the 1980s to 0.2% in 2005. In 2006, the Pan American Health Organization (PAHO) certified the interruption of transmission of ChD through this vector in Brazil. However, challenges remain, such as the domiciliation of new vector species, the need for medical care for infected individuals, the prevention of alternative mechanisms of transmission, the loss of political concern regarding the disease, and the weakening of the control programme. Despite the progress towards control, there are still many challenges ahead to maintain and expand such control and minimise the risk of re-emergence.

Current trends in higher education in the United States demand that nursing take stock of how it is prepared, or being prepared, to face the challenges and issues impacting its future. The intense effort made to attract students to pursue advanced training in science and engineering in the United States pales in comparison to the numbers of science and engineering majors produced yearly by international schools. As a result, more and more jobs are being outsourced to international markets. Could international outsourcing become a method of nursing education? The authors submit that, to remain competitive, the nursing profession must attract a younger cohort of technologically savvy students and faculty reflective of the growing diverse population in the United States. Additionally, nursing programs in research universities face even more daunting challenges related to mandates for funded research programs in educational units. This article offers suggestions and recommendations for nursing programs in higher education institutions on ways to attract and retain ethnic minorities and on how to harness the power of research to address burgeoning societal health challenges.

Centre Pier is a 3.9 ha property owned by the Commissioners of the Port Hope Harbour in the Municipality of Port Hope, Ontario, Canada. It is centrally located on the Port Hope waterfront and is bounded on the west by the Port Hope Harbour, on the east by the Ganaraska River, on the south by Lake Ontario, and on the north by a railway corridor. The property is currently leased by the Commissioners of the Port Hope Harbour to the Cameco Corporation, which owns the four onsite buildings that are used as warehouse space for its uranium conversion facility located on the western side of the Harbour. Remediation of this site forms part of the Port Hope Project being undertaken by Atomic Energy of Canada Limited (AECL) and Public Works and Government Services Canada (PWGSC) as part of the Port Hope Area Initiative (PHAI). Soil impacts include radiological contaminants, metals, and petroleum hydrocarbons resulting from long-term historical industrial use. Radiological impacts in soil extend across most of the site, primarily within the upper metre of fill. Metals-contaminated soil is present across the entire site in the underlying fill material. The metals-contaminated fill extends to a maximum depth of 2.0 m below grade at the north end of the site, which is underlain by peat; however, the metals-contaminated soil could extend to the top of the bedrock on the remainder of the site. Based on the elevation of the bedrock in the adjacent river and Harbour Basin, the metals-contaminated soil may extend to a depth of 5.6 m or 6.5 m below existing grade. Petroleum-contaminated soil is present on the southeast side of the site, where a storage tank farm was previously located. Challenges include the complex history of the site, relating both to site use and to Pier construction: Pier development began in the 1800s and was undertaken by many different entities, and modifications and repairs made over the years resulted in several different types of Pier walls and fill that must be considered

Although futures markets have existed in developed countries since the mid-nineteenth century, they are relatively new in developing countries. In the late twentieth and early twenty-first centuries, many new futures exchanges were established in developing countries, and in the majority of these newly established exchanges substantial growth in futures trading was observed within a short period of time. This fast growth in futures trading volume was mainly due to the tremendous leverage that futures provide to speculators. Thanks to the futures margining system, by committing only a small fraction of the money needed to maintain a position on the underlying security in the spot market, a speculator can attain a much higher return potential by buying or selling a futures contract. This paper studies this effect by employing daily return data on 19 selected stocks listed continuously in IMKB-30 (Istanbul Stock Exchange 30, Turkey) from January 2005 to December 2010. Popular technical indicators are used to generate buy and sell signals in both the spot market (Istanbul Stock Exchange) and the futures market (VOB, a fast-growing derivatives exchange located in İzmir, Turkey). The profit/loss resulting from the trading strategies is then calculated and compared. The results of the study show that, although the amount invested in both markets is the same, the profit generated from the strategies applied to futures is significantly higher than that on the spot market. A CAPM (Capital Asset Pricing Model) based hedge ratio is used to apply the trading strategies generated from spot market data to futures. The results show that this strategy generates superior returns in the futures market.
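
As a minimal sketch of the two ingredients named in the abstract (the specific indicator, its parameters, and the price series below are invented for illustration; the paper's exact indicator set is not reproduced here), a moving-average crossover generates buy/sell signals, and a CAPM beta, cov(r_stock, r_market) / var(r_market), serves as the hedge ratio:

```python
# Moving-average crossover signals plus a CAPM-style hedge ratio (beta).
# All parameters and the sample prices are illustrative assumptions.

def sma(prices, n):
    """Simple moving average series; None until n observations exist."""
    return [None if i + 1 < n else sum(prices[i + 1 - n:i + 1]) / n
            for i in range(len(prices))]

def crossover_signals(prices, fast=3, slow=5):
    """'buy' when the fast SMA crosses above the slow SMA, 'sell' on the reverse."""
    f, s = sma(prices, fast), sma(prices, slow)
    signals = []
    for i in range(1, len(prices)):
        if None in (f[i - 1], s[i - 1]):
            continue
        if f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append((i, "buy"))
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append((i, "sell"))
    return signals

def capm_beta(stock_returns, market_returns):
    """Hedge ratio as CAPM beta: cov(r_s, r_m) / var(r_m)."""
    n = len(market_returns)
    ms = sum(stock_returns) / n
    mm = sum(market_returns) / n
    cov = sum((s - ms) * (m - mm)
              for s, m in zip(stock_returns, market_returns)) / n
    var = sum((m - mm) ** 2 for m in market_returns) / n
    return cov / var

signals = crossover_signals([10, 9, 8, 7, 6, 7, 9, 11, 13, 12, 10, 8, 7])
print(signals)  # → [(7, 'buy'), (11, 'sell')]
```

The beta then scales the futures position taken for each spot-market signal, which is how a spot-derived strategy can be transported to the futures market.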

The 'JUNE and You' sessions presented at the July 2008 Undergraduate Neuroscience Education workshop, sponsored jointly by Faculty for Undergraduate Neuroscience (FUN) and Project Kaleidoscope (PKAL), featured background information about the history and mission of the Journal of Undergraduate Neuroscience Education (JUNE), followed by an informative discussion about the challenges facing JUNE, including new ideas for future developments. This article highlights some of the information and ideas generated and shared at this conference. Critical discussion points included the need to keep members of FUN actively engaged in submitting and reviewing articles for JUNE. Ways in which authors, reviewers, and interested faculty members could best help in promoting the mission and vision of JUNE were discussed. Concerns about recent hacking of the JUNE website were also raised, and possible solutions and measures that could be taken to minimize this in the future were discussed. In addition, ideas were discussed for expanding the role of JUNE to provide a forum for evaluating new and emerging website information pertinent to undergraduate neuroscience education. Ideas for future developments of JUNE included rolling postings of articles as they are accepted, providing links to several related websites, and allowing updates to articles previously published in JUNE. Finally, ideas for maintaining and expanding JUNE's stature as the resource for undergraduate neuroscience education included ensuring that JUNE is listed on important search vehicles, such as PubMed.

Research on high-level nuclear waste management has focused on technical and scientific issues since the U.S. National Academy of Sciences first studied the problem in the mid 1950s and recommended long-term disposal in deep salt formations. In this review, we trace the development of the problem's definition and its associated research since socioeconomic, political and policy issues were first given consideration and nuclear waste management became recognized as more than a technical issue. Three time periods are identified. First, from the mid 1970s to early 1980s, initial research explored institutional dimensions of nuclear waste, including ethics. The second period began in the early 1980s with a concerted effort to solve the problem and site nuclear waste repositories, and ended in the mid 1990s with minimal progress in the U.S. and general stalemate in Asia and Europe (with the notable exception of Sweden). This phase accelerated research on risk perception and stigma of nuclear waste, and elevated a focus on public trust. Great attention was given to repository siting conflicts, while minimal attention was placed on ethics, equity, political systems, and public participation. The last period, since the mid 1990s, has been characterized by continuing political stalemate and increased attention to public participation, political systems and international solutions. Questions of ethics have been given renewed attention, while research on risk perceptions and siting conflicts continues. We frame these periods in a broader context of the shifting role of applied social scientists. The paper concludes with a general discussion of this research area and prospects for future research

"Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…

Electronic health records (EHRs) are proliferating, and financial incentives encourage their use. Applying Fair Information Practice principles to EHRs necessitates balancing patients' rights to control their personal information against providers' need for data to deliver safe, high-quality care. We describe the technical and organizational challenges faced in capturing patients' preferences for patient-controlled EHR access and applying those preferences to an existing EHR. We established an online system for capturing patients' preferences for who could view their EHRs (listing all participating clinic providers individually and categorically: physicians, nurses, other staff) and what data to redact (none, all, or by specific categories of sensitive data or patient age). We then modified existing data-viewing software serving a state-wide health information exchange and a large urban health system and its primary care clinics to allow patients' preferences to guide data displays to providers. Patients could allow or restrict data displays to all clinicians and staff in a demonstration primary care clinic, to categories of providers (physicians, nurses, others), or to individual providers. They could also restrict access to all EHR data or to any or all of five categories of sensitive data (mental and reproductive health, sexually transmitted diseases, HIV/AIDS, and substance abuse) and for specific patient ages. The EHR viewer displayed data via reports, data flowsheets, and coded and free-text data displayed by Google-like searches. Unless patients recorded restrictions, by default all requested data were displayed to all providers. Data patients wanted restricted were not displayed, with no indication that they had been redacted. Technical barriers prevented redacting restricted information in free-text notes. The program allowed providers to hit a "Break the Glass" button to override patients' restrictions, recording the date, time, and next screen viewed. Establishing patient
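
The access-control behavior described above (silent redaction by provider and by category, with an audited "Break the Glass" override) can be sketched as follows. This is a hypothetical illustration only: the category names, record fields, and audit format are invented, not the actual system's data model.

```python
import datetime

# Illustrative preference-driven redaction filter. Records restricted by a
# patient's preferences are simply omitted, with no indication of redaction;
# break_glass overrides everything but leaves an audit trail.

def visible_records(records, prefs, provider, audit_log, break_glass=False):
    """Return the records this provider may see, per patient preferences.

    prefs = {"deny_providers": set of provider ids,
             "deny_categories": set of sensitive-data categories}
    """
    if break_glass:
        # "Break the Glass": show everything, but record who did it and when.
        audit_log.append((provider,
                          datetime.datetime.now().isoformat(),
                          "BREAK_GLASS"))
        return list(records)
    if provider in prefs.get("deny_providers", set()):
        return []  # nothing displayed; no hint that data was withheld
    denied = prefs.get("deny_categories", set())
    return [r for r in records if r["category"] not in denied]

records = [
    {"category": "vitals", "text": "BP 120/80"},
    {"category": "hiv", "text": "viral load undetectable"},
]
prefs = {"deny_providers": {"dr_smith"}, "deny_categories": {"hiv"}}

print(visible_records(records, prefs, "nurse_jones", []))  # vitals only
print(visible_records(records, prefs, "dr_smith", []))     # []
```

Note that, as in the study, free-text notes would defeat this category-based filtering, since the sensitive content is not tagged.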

Up to now, ecology has had a strong influence on the development of the technical and instrumental aspects of architecture, such as the renewable and efficient use of resources and energy, CO2 emissions, air quality, water reuse, and some social and economic aspects. These concepts define the physical keys and codes of current 'sustainable' architecture, which is normally instrumental but rarely and insufficiently theorised. But is there not another way of bringing us to nature? We need a theoretical referent. This is where we place Van der Laan's thought: he considers that art completes nature, and he builds his theoretical discourse on this, trying to better understand many aspects of architecture. From a conceptual point of view, we find in his works a sense of timelessness, universality, special attention to the 'locus', and a strict sense of proportion and use of materials according to nature. Could these concepts complement our current sustainable architecture? How did Laan apply the current codes of ecology in his architecture? His work may help us reach a theoretical, and not only physical, interpretation of nature. This paper develops this idea through a comparison of the thoughts and works of Laan with the current technical approach to 'sustainable' architecture.

During the past 12 years, the Northern Eurasia Earth Science Partnership Initiative (NEESPI) - an interdisciplinary program of internationally supported Earth systems and science research - has addressed large-scale and long-term manifestations of climate and environmental changes over Northern Eurasia and their impact on the global Earth system. With more than 1500 peer-reviewed journal publications and 40 books to its credit, NEESPI's activities have resulted in significant scientific outreach. This created a new research realm through the self-organization of NEESPI scientists into a broad research network, the accumulation of knowledge while developing new tools (observations, models, and collaborative networks), and the production of new, exciting results that can be applied to directly support decision-making for societal needs. This realm was summed up at the Synthesis NEESPI Workshop in Prague, Czech Republic (April 9-12, 2015), where it was decided to gradually shift the foci of regional studies in Northern Eurasia towards applications, with the following major science question: "What dynamic and interactive change(s) will affect societal well-being, activities, and health, and what might be the mitigation and adaptation strategies that could support sustainable development and decision-making activities in Northern Eurasia?" Answering this question requires a stronger socio-economic component in the ongoing and future regional studies focused on sustainable societal development under changing climatic and environmental conditions, especially under conditions in which societal decision-making impacts and feeds back on the environment. This brings the NEESPI studies closer to the ICSU research initiative "Future Earth". Accordingly, the NEESPI Research Team decided to reorganize NEESPI in the near future into the "Northern Eurasia Future Initiative" (NEFI) and began development of its Programmatic White Paper (in preparation at the time of this abstract submission). The NEFI research

Decades of research show that high school dropouts are more likely than graduates to commit crimes, abuse drugs and alcohol, have children out of wedlock, earn low wages, be unemployed, and suffer from poor health. The ChalleNGe program, currently operating in 27 states, is a residential program coupled with post-residential mentoring that seeks…

The study of exoplanets and the protoplanetary discs in which they form is a very challenging task. In this thesis we present several studies in which we investigate the potential of imaging polarimetry at visible and near-infrared wavelengths to reveal the characteristics of these objects and

A program is proposed for upgrading the usefulness and attractiveness of ZGS polarized facilities to the user community. It is hoped that the advance approval or completion of parts (or all) of such a program will lead to an accelerated interest in proposing experiments unique to the ZGS, which will open new dimensions in the understanding of high energy phenomena. Recommendations are summarized, and a discussion is given of past activities and future plans of various experimental groups and ZGS facilities. A series of arguments is made concerning possible future ZGS facilities which do not necessarily relate to any specific experiments, suggested or proposed

The reactor refuelling system provides the means of transporting, storing, and handling reactor core subassemblies. The system consists of the facilities and equipment needed to accomplish the scheduled refuelling operations. The choice of a fuel handling system (FHS) directly impacts the general design of the reactor vessel (primary vessel, storage, and final cooling before fuel goes to reprocessing), its construction cost, and its availability factor. Fuel handling design must take various items into account, in particular operating strategies such as core design, core management, and core configuration. Moreover, the FHS will have to satisfy safety assessments: a permanent cooling strategy to prevent fuel clad rupture, provisions to handle short-cooled fuel, and criteria to ensure safety during handling. In addition, the handling and elimination of residual sodium must be investigated; this implies specific cleaning treatments to prevent chemical risks such as corrosion or excess hydrogen production. The objective of this study is to identify the challenges of an SFR (sodium-cooled fast reactor) fuel handling system. It then presents the range of technical options, incorporating innovative technologies under development, to answer the GENERATION IV SFR requirements. (author)
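
The short-cooled-fuel constraint mentioned above can be made concrete with a rough, back-of-the-envelope estimate (not taken from the paper): the classic Way-Wigner correlation approximates decay power as a fraction of operating power, P/P0 = 0.0622 (t^-0.2 - (t + T)^-0.2), with t seconds since shutdown and T seconds of prior operation. The two-year operating time below is an arbitrary assumption for illustration.

```python
# Rough decay-heat estimate with the Way-Wigner correlation, showing how
# slowly the heat load of discharged fuel falls; this is why handling
# short-cooled fuel demands a permanent cooling strategy.

def decay_heat_fraction(t_s, T_s):
    """Way-Wigner estimate of decay power as a fraction of full power."""
    return 0.0622 * (t_s ** -0.2 - (t_s + T_s) ** -0.2)

T = 2 * 365 * 24 * 3600.0  # assumed two years of full-power operation
for days in (1, 30, 365):
    t = days * 24 * 3600.0
    print(f"{days:4d} d after shutdown: "
          f"{100 * decay_heat_fraction(t, T):.3f}% of full power")
```

Even a fraction of a percent of full power is megawatts for a power reactor, which is why FHS designs trade off in-vessel storage time against handling-machine cooling capability.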

The Data Quality Monitoring (DQM) software is a central tool in the CMS experiment. Its robustness and flexibility are critical for monitoring detector performance and providing fast, comprehensive feedback centrally for the experiment in real time (Online DQM), after full event processing with fine-grained analysis (Offline DQM), and as a tool to validate the CMS software (CMSSW), calibration and alignment scenarios, and extensive simulations. The entire DQM framework has undergone fundamental changes, and the first performance results of this newly updated system will be presented in the context of the first proton-proton collisions of CERN's Large Hadron Collider at a center-of-mass energy of 13 TeV. These results will encapsulate the performance of the CMS detector in the context of the upgraded DQM system, which makes available more sophisticated methods for evaluating data quality, as well as a dedicated review of the technical challenges and improvements specific to the DQM framework i...
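
As a hypothetical illustration of the kind of automated quality test such a framework can run (this is not CMS's actual implementation; the bin contents, threshold, and comparable-statistics assumption are invented for the example), a monitored histogram can be compared against a reference with a per-bin chi-square and flagged when the disagreement is too large:

```python
# Illustrative histogram quality test: average per-bin chi-square against a
# reference histogram. Assumes the two histograms have equal binning and
# comparable statistics (real tests normalize first).

def chi2_per_bin(monitored, reference):
    """Average per-bin chi-square between two equal-binning histograms."""
    assert len(monitored) == len(reference)
    chi2, used = 0.0, 0
    for m, r in zip(monitored, reference):
        if r > 0:
            chi2 += (m - r) ** 2 / r
            used += 1
    return chi2 / used if used else 0.0

def quality_flag(monitored, reference, threshold=2.0):
    """'GOOD' if the average per-bin chi-square is under threshold, else 'BAD'."""
    return "GOOD" if chi2_per_bin(monitored, reference) < threshold else "BAD"

reference = [100, 200, 400, 200, 100]
print(quality_flag([104, 195, 410, 190, 98], reference))  # small fluctuations
print(quality_flag([100, 200, 50, 200, 100], reference))  # a dead region
```

Running many such tests per histogram, per run, is what allows a DQM system to surface detector problems quickly without a human inspecting every plot.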

Growing up in an era of video games and Web-based applications has primed current medical students to expect rapid, interactive feedback. To address this need, the A.T. Still University-School of Osteopathic Medicine in Arizona (Mesa) has developed and integrated a variety of approaches using technology-enhanced active learning for medical education (TEAL-MEd) into its curriculum. Over the course of 3 years (2010-2013), the authors facilitated more than 80 implementations of games and virtual patient simulations into the education of 550 osteopathic medical students. The authors report on 4 key aspects of the TEAL-MEd initiative, including purpose, portfolio of tools, progress to date regarding challenges and solutions, and future directions. Lessons learned may be of benefit to medical educators at academic and clinical training sites who wish to implement TEAL-MEd activities.

Non-governmental organizations, or NGOs, are increasingly instrumental to the implementation of international health programs. Following an overview of current conditions in global health and the problems that could be targeted by NGOs, this article describes the activities and philosophies of several representative approaches in this sector. The attributes of NGOs that increase their potential effectiveness are discussed, including their ability to reach areas of severe need, promotion of local involvement, low cost of operations, adaptiveness and innovation, independence, and sustainability. A summary is provided of major future challenges in international health that may be addressed by NGOs, with particular emphasis on tobacco-related disease, communicable diseases and the AIDS epidemic, maternal mortality and women's health, injury prevention and control, and the need to secure durable financial support.

The aim of this paper is to describe educational programs that reportedly teach how to break bad news in the emergency department. We also offer recommendations on how to communicate bad news based on a review of the available evidence in the field. The examined evidence points toward six major components with which physicians should familiarize themselves when communicating bad news: 1) empathic doctor-patient communication, 2) establishing a proper space to give the news, 3) identifying the characteristics of the person who receives the news, 4) essential aspects of communicating the news, 5) emotional support, and 6) medical and administrative aspects of the encounter. Finally, we point out several limitations of studies in the field and future challenges identified in the communication of bad news in emergency room facilities.

The research of the past two decades has firmly established that hormones modulate numerous aspects of cognitive function, including memory, attention, decision-making, and sensory processing. That such a wide variety of hormones influence cognition mediated by multiple nonhypothalamic brain regions illustrates the critical importance of hormones to neural and cognitive function. The diversity of hormonal effects on cognition is evident in the collection of reviews and original research articles assembled for this special section. Together, these articles provide an overview of recent research on varied topics in hormones and cognition, address controversial issues in the field, and discuss challenges that must be overcome in future research to gain a better understanding of the mechanisms through which hormones modulate cognitive function.

Issues related to developing information resources for assessing the health effects of chemical exposure include the question of how to address the individual political issues relevant to identifying and determining the timeliness, scientific credibility, and completeness of such information resources. One of the important ways for agencies to share information is through connection tables; this type of software is presently being used to build information products for some DHHS agencies. One of the challenges will be to convince vendors of data of the importance of trying to make data files available to the communities that need them. In the future, information processing will be conducted with neural networks, object-oriented database management systems, and fuzzy-set technologies, and with meta-analysis techniques.

This paper attempts an analysis of some current trends and future developments in computer science, education, and educational technology. Based on these trends, two possible future predictions of AIED are presented in the form of a utopian vision and a dystopian vision. A comparison of these two visions leads to seven challenges that AIED might…

The requirements of projected space programs (1985-1995) for transportation vehicles more advanced than the space shuttle are discussed. Several future program options are described and their transportation needs are analyzed. Alternative systems approaches to meeting these needs are presented.

Background Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity direct current to cortical areas, facilitating or inhibiting spontaneous neuronal activity. In the past ten years, the physiological mechanisms of action of tDCS have been intensively investigated, supporting the investigation of its applications in clinical neuropsychiatry and rehabilitation. However, new methodological, ethical, and regulatory issues emerge when translating the findings of preclinical and phase I studies into phase II and III clinical studies. The aim of this comprehensive review is to discuss the key challenges of this process and possible methods to address them. Methods We convened a workgroup of researchers in the field to review, discuss, and provide updates on key challenges of neuromodulation use for clinical research. Main Findings/Discussion We reviewed several basic and clinical studies in the field and identified potential limitations, taking into account the particularities of the technique. We review and discuss the findings under four topics: (i) mechanisms of action of tDCS, parameters of use, and computer-based human brain modeling investigating the electric current fields and magnitudes induced by tDCS; (ii) methodological aspects of clinical research on tDCS, divided according to study phase (i.e., preclinical, phase I, phase II and phase III studies); (iii) ethical and regulatory concerns; (iv) future directions regarding novel approaches, novel devices, and future studies involving tDCS. Finally, we propose some alternative methods to facilitate clinical research on tDCS. PMID:22037126

In the face of the broad political call for an “energy turnaround”, we are currently witnessing three essential trends with regard to energy infrastructure planning, energy generation and storage: from planned production towards fluctuating production on the basis of renewable energy sources, from centralized generation towards decentralized generation, and from expensive energy carriers towards cost-free energy carriers. These changes necessitate considerable modifications of the energy infrastructure. Even though most of these modifications are inherently motivated by geospatial questions and challenges, the integration of energy system models and Geographic Information Systems (GIS) is still in its infancy. This paper analyzes the shortcomings of previous approaches to using GIS in renewable energy-related projects, extracts distinct challenges from these previous efforts and, finally, defines a set of core future research avenues for GIS-based energy infrastructure planning with a focus on the use of renewable energy. These future research avenues comprise the availability of base data and their “geospatial awareness”, the development of a generic and unified data model, the use of volunteered geographic information (VGI) and crowdsourced data in analysis processes, the integration of 3D building models and 3D data analysis, the incorporation of network topologies into GIS, the harmonization of the heterogeneous views on aggregation issues in the fields of energy and GIS, fine-grained energy demand estimation from freely available data sources, decentralized storage facility planning, the investigation of GIS-based public participation mechanisms, the transition from purely structural to operational planning, data privacy aspects and, finally, the development of a new dynamic power market design.

To describe the US public health nutrition workforce and its future social, biological and fiscal challenges. Literature review primarily for the four workforce surveys conducted since 1985 by the Association of State and Territorial Public Health Nutrition Directors. The United States. Nutrition personnel working in governmental health agencies. The 1985 and 1987 subjects were personnel in full-time budgeted positions employed in governmental health agencies providing predominantly population-based services. In 1994 and 1999 subjects were both full-time and part-time, employed in or funded by governmental health agencies, and provided both direct-care and population-based services. The workforce primarily focuses on direct-care services for pregnant and breast-feeding women, infants and children. The US Department of Agriculture funds 81.7 % of full-time equivalent positions, primarily through the WIC Program (Special Supplemental Nutrition Program for Women, Infants, and Children). Of those personnel working in WIC, 45 % have at least 10 years of experience compared to over 65 % of the non-WIC workforce. Continuing education needs of the WIC and non-WIC workforces differ. The workforce is increasingly more racially/ethnically diverse and with 18.2 % speaking Spanish as a second language. The future workforce will need to focus on increasing its diversity and cultural competence, and likely will need to address retirement within leadership positions. Little is known about the workforce's capacity to address the needs of the elderly, emergency preparedness and behavioural interventions. Fiscal challenges will require evidence-based practice demonstrating both costs and impact. Little is known about the broader public health nutrition workforce beyond governmental health agencies.

In recent years, deficits in emotion regulation have been studied as a putative maintaining factor and promising treatment target in a broad range of mental disorders. This article aims to provide an integrative review of the latest theoretical and empirical developments in this rapidly growing field of research. Deficits in emotion regulation appear to be relevant to the development, maintenance, and treatment of various forms of psychopathology. Increasing evidence demonstrates that deficits in the ability to adaptively cope with challenging emotions are related to depression, borderline personality disorder, substance-use disorders, eating disorders, somatoform disorders, and a variety of other psychopathological symptoms. Unfortunately, studies differ with regard to the conceptualization and assessment of emotion regulation, thus limiting the ability to compare findings across studies. Future research should systematically work to use comparable methods in order to clarify which individuals have what kinds of emotion regulation difficulties with which types of emotions, and which interventions are most effective in alleviating these difficulties. Despite some yet-to-be-resolved challenges, the concept of emotion regulation has a broad and significant heuristic value for research in mental health.

Gelatin is a highly purified animal protein of pig, cow, and fish origin and is extensively used in food, pharmaceuticals, and personal care products. However, the acceptability of gelatin products greatly depends on the animal source of the gelatin. Porcine and bovine gelatins have attractive features but limited acceptance because of religious prohibitions and potential zoonotic threats, whereas fish gelatin is welcomed in all religions and cultures. Thus, source authentication is a must for gelatin products, but it is greatly challenging due to the breakdown of both protein and DNA biomarkers in processed gelatins. Several methods have therefore been proposed for gelatin identification, but a comprehensive and systematic document that includes all of the techniques does not exist. This up-to-date review addresses this research gap and presents, in an accessible format, the major gelatin source authentication techniques, which are primarily nucleic acid and protein based. Rather than presenting these methods in dense paragraph form, the major methods are schematically depicted, and their comparative features are tabulated. Future technologies are forecast, and challenges are outlined. Overall, this review can serve as a reference guide for the production and application of gelatin in academia and industry and will act as a platform for the development of improved methods for gelatin authentication.

The demand for modeling tools robust to climate change and weather extremes, along with coincident increases in computational capabilities, has led to increased use of physics-based snow models in operational applications, including those of the WSL-SLF across Switzerland, the ASO in California, and the USDA-ARS in Idaho. While physics-based approaches offer many advantages, limitations and modeling challenges remain. The most evident limitation remains computation time, which often restricts forecasters to a single, deterministic model run. Other limitations, however, remain less conspicuous amidst the assumption that these models require little to no calibration because they are founded on physical principles. Yet all energy balance snow models seemingly contain parameterizations or simplifications of processes where validation data are scarce or present understanding is limited. At the research-basin scale where many of these models were developed, these modeling elements may prove adequate. However, when applied over large areas, spatially invariant parameterizations of snow albedo, roughness lengths and atmospheric exchange coefficients - all vital to determining the snow-cover energy balance - become problematic. Moreover, as we apply models over larger grid cells, the representation of sub-grid variability, such as the snow-covered fraction, adds to the challenges. Here, we will demonstrate some of the major sensitivities of distributed energy balance snow models to particular model constructs, argue the need for advanced and spatially flexible methods and parameterizations, and prompt the community toward open dialogue and future collaborations to further modeling capabilities.
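To make the sub-grid issue concrete, here is a minimal sketch of how a snow-covered-fraction (fSCA) depletion curve can modulate grid-cell melt. The power-law form, parameter values, and function names are illustrative assumptions for this note, not the parameterization of any operational model mentioned above:

```python
# Sketch: sub-grid snow-covered fraction (fSCA) as a depletion curve.
# As grid-cell mean SWE (snow water equivalent) declines, only a fraction
# of the cell remains snow-covered, which scales the effective melt.

def fsca(swe, swe_threshold=100.0, shape=0.5):
    """Snow-covered fraction from grid-cell mean SWE (mm).

    Above swe_threshold the cell is treated as fully covered; below it,
    coverage follows a power-law depletion curve (shape < 1 keeps patchy
    snow on the ground longer).
    """
    if swe <= 0.0:
        return 0.0
    if swe >= swe_threshold:
        return 1.0
    return (swe / swe_threshold) ** shape

def effective_melt(swe, potential_melt):
    """Grid-cell melt: point-scale potential melt scaled by coverage."""
    return min(swe, potential_melt * fsca(swe))

# A cell with 25 mm SWE and 10 mm/day potential melt loses only
# 10 * (25/100)**0.5 = 5 mm that day, because half the cell is bare.
print(effective_melt(25.0, 10.0))
```

The point of the sketch is the sensitivity it exposes: the choice of threshold and curve shape, which are rarely well constrained by observations, directly changes simulated melt-out timing over large domains.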

In order to respond to new challenges in transportation and traffic, it is essential to enhance the statistics in this field that provide the basis for policy research. Many of the statistics in this field in Japan are “official statistics” created by the government. This paper reviews the current status of transportation and traffic statistics (hereinafter “transportation statistics”) in Japan. Furthermore, the paper discusses the challenges such statistics face in the new environment and the direction they should take in the future. For Japan’s transportation statistics to play a vital role in more sophisticated analyses, it is necessary to improve the environment that facilitates the use of microdata for analysis. It is also necessary to establish an environment where big data can be more easily used for the compilation of official statistics and for policy research. To achieve this end, close cooperation among government, academia, and business will be essential.

The informatics landscape of clinical trials in oncology has changed significantly in the last 10 years. The current state of the infrastructure for clinical trial management, execution, and data management is reviewed. The systems, their functionality, the users, and the standards available to researchers are discussed from the perspective of the oncologist-researcher. Challenges in complexity and in the processing of information are outlined. These challenges include the lack of communication and information interchange between systems, the lack of simplified standards, and the lack of implementation of and adherence to the standards that are available. The National Cancer Institute's Common Terminology Criteria for Adverse Events (CTCAE) are cited as a successful standard in oncology, and HTTP on the Internet is referenced for its simplicity. Differences in the management of information standards between industries are discussed. Possible future advances in oncology clinical research informatics are addressed. These advances include strategic policy review of standards and the implementation of actions to make standards free, ubiquitous, simple, and easily interpretable; the need to change from a local data-capture- or transaction-driven model to a large-scale data-interpretation model that provides higher value to the oncologist and the patient; and the need for information technology investment in a readily available digital educational model for clinical research in oncology that is customizable for individual studies. These new approaches, with changes in information delivery to mobile platforms, will set the stage for the next decade in clinical research informatics.

This paper explores the opportunities and challenges of integrating technology to support mathematics teaching and learning in creative engineering disciplines. We base our discussion on data from our research in the Media Technology department of Aalborg University Copenhagen, Denmark. Our analysis proposes that, unlike in other engineering disciplines, technology in these disciplines should be used for contextualizing mathematics rather than introducing and exploring mathematical concepts.

Radiology has gained an enviable position among medical specialties. Developments in new technology expand its horizons, and the volume of radiologic imaging techniques and procedures is increasing far more than the overall growth in health care services. In this position radiology has become a prime target for restrictions, cutbacks and controlled financing in an era of managed care and a new national health care policy based on partially fixed budgets. Future health care decision-makers have to choose the best available diagnostic and therapeutic techniques. Evidence-based medicine, cost-utility analysis, diagnostic performance analysis, patient outcome analysis, technology assessment and practice guidelines are means to guide us through this obligatory choice. Our major objective is to use the best-performing available imaging technique or intervention to achieve the best possible outcome for our patients at the lowest possible cost. A strategic response from radiologists is required to meet the imperatives of this new management situation. They must do far more than interpret imaging procedures. They must work as efficient managers of imaging resources, organise their practices and define their marketing strategies using the different so-called marketing-mix elements. The challenges will be great, but the rewards are worth our best efforts. In this article we highlight the marketing responsibilities of future radiologists and their clinical practice in this new socio-economic environment, and we present different useful marketing tools.

Due to rapid modernization, energy resources are depleting rapidly throughout the world while energy demand is rising steadily. The crude oil price has soared to $140 per barrel, which has triggered the use of renewable energy resources. Pakistan in particular is a most energy-deficient country, where a shortfall of as high as 4500 MW was recorded in the recent year. Renewable Energy Technologies (RETs) are important and have gained prime importance these days, with specific focus on solar and wind power. This paper highlights the challenges and opportunities for wind power in Pakistan. The wind potential in different areas has been explored, including a vital area of about 9700 km² in Sindh. Wind power is a new energy resource in Pakistan's history; until now the main resources have been fossil fuels contributing 65%, hydel 33% and nuclear only 2%. Wind is an environment-friendly resource, and an appreciable contribution from it will be achieved in the future. The paper analyses the present energy scenario for wind power in Pakistan and charts future progress in order to secure energy security in the country. (author)

Japan Atomic Energy Agency (JAEA) started a nuclear forensics (NF) technology development project in JFY 2011, in accordance with the National Statement of Japan at the Nuclear Security Summit 2010. This paper presents the progress and future prospects of the development project during JFY 2011 to 2013. The project on NF technology in JAEA includes the development of analytical technologies such as isotope and impurity measurements, morphology analysis and age determination techniques, and a prototype nuclear forensics library (NFL) for a future national NFL. Analytical devices were installed for these developments, and various uranium materials produced in JAEA facilities at Ningyo-toge have been measured to verify the analytical technologies. A nuclear material database for the prototype NFL was also developed, with brief tools for multivariate analysis and image analysis. The implementation of the analytical technologies, the development of advanced analytical technologies and system improvements to the prototype NFL will continue from JFY 2014 in JAEA. The national regime and national response plan remain a major challenge in establishing national NF capabilities in Japan. (author)

The glaciers and snowfields of the Himalaya are the ultimate source for the many rivers that flow across the Asian subcontinent, but they are diminishing rapidly in the face of sustained climatic change. Predictions of how future river discharge may vary through space and time are hampered by two major knowledge gaps. First, simulations of glacier mass loss in high Asia are severely limited by data availability and assumptions made in the parameterisation of glacier models. Consequently, projections of glacier change vary widely; in Nepal, for example, recent estimates of volumetric ice loss by AD 2100 have ranged between 8% and 99%. A second major gap in knowledge lies in the coupling between glaciers and downstream areas, and specifically in quantifying the relative contributions of different sources to river flow. Although it is clear that ice and snow melt dominates flow for considerable distances downstream, how this contribution interacts with groundwater supplies with increasing distance from its source remains poorly understood. This presentation will review recent work that closes some of the knowledge gaps in understanding debris-covered glacier behaviour, including new results from drilling work on the Khumbu Glacier in Nepal. Additionally, it will report on the outputs from an interdisciplinary study in the Annapurna region of Nepal, which is focussing specifically on disaggregating the relative contributions to flow using isotope-based hydrograph separations. It will finish by exploring the most likely drivers of future changes to water supply, including an evaluation of the impact of glacial lake development, and by identifying the main challenges for future related research.
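The isotope-based hydrograph separation mentioned above is, in its simplest form, a two-component mixing calculation: if streamflow is a mixture of meltwater and groundwater with distinct isotopic signatures, the melt fraction follows from mass balance. The sketch below uses hypothetical δ18O values chosen only for illustration:

```python
# Two-component isotope hydrograph separation:
#   delta_stream = f * delta_melt + (1 - f) * delta_gw
# solved for f, the fraction of streamflow supplied by the melt end-member.

def melt_fraction(delta_stream, delta_melt, delta_gw):
    """Fraction of streamflow from the melt end-member (ideally 0..1)."""
    if delta_melt == delta_gw:
        raise ValueError("end-members must be isotopically distinct")
    return (delta_stream - delta_gw) / (delta_melt - delta_gw)

# Hypothetical d18O values (permil); melt is more depleted than groundwater.
f = melt_fraction(delta_stream=-12.0, delta_melt=-16.0, delta_gw=-9.0)
print(round(f, 3))  # about 0.429, i.e. roughly 43% of flow from melt
```

Real separations propagate end-member uncertainty and often need a third tracer (e.g., electrical conductivity) to resolve more than two sources, which is part of why disaggregating flow contributions remains a challenge.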

Purpose This paper aims to describe and analyse some of the ways in which good leadership can enable those working within the National Health Service (NHS) to weather the changes and difficulties likely to arise in the coming years, and takes the format of an essay written by the prize-winner of the Faculty of Medical Leadership and Management's Student Prize. The Faculty of Medical Leadership and Management ran its inaugural Student Prize in 2015-2016, aimed at medical students with an interest in medical leadership. In running the Prize, the Faculty hoped to foster an enthusiasm for and understanding of the importance of leadership in medicine. Design/methodology/approach The Faculty asked entrants to discuss the role of good leadership in addressing the current and future challenges faced by the NHS, making reference to the Leadership and Management Standards for Medical Professionals published by the Faculty in 2015. These standards were intended to help guide current and future leaders and were grouped into three categories, namely, self, team and corporate responsibility. Findings This paper highlights the political nature of health care in the UK and the increasing impetus on medical professionals to navigate debates on austerity measures and health-care costs, particularly given the projected deficit in NHS funding. It stresses the importance of building organisational cultures prizing transparency to prevent future breaches in standards of care, and the value of patient-centred approaches in improving satisfaction for both patients and staff. Identification of opportunities for collaboration and partnership is emphasised as crucial to assuage the burden that lack of appropriate social care places on clinical services. Originality/value This paper offers a novel perspective - that of a medical student - on the complex issues faced by the NHS over the coming years and utilises a well-regarded set of standards in conceptualising the role that health

This panorama takes stock of international energy developments in 2003 and discusses the instability of the geopolitical context of energy and the role of fossil fuels in the years 2030-2050. The following topics are presented: activities and markets for exploration-production, refining and petrochemistry; the world gas trade situation; petroleum supply and demand; Iraq; diesel in the USA; investments and depletion; the long-term evolution of engines and fuels; implementation of the European directive concerning the market for tradable CO2 permits; carbon sequestration; hydrogen as the energy of the future; and biofuels in Europe. (A.L.B.)

Sustainable development as a concept came to the fore only after the developed nations started experiencing the undesirable fallout of development on the local and global environment due to indiscriminate resource exploitation. Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs. This volume contains papers dealing with sustainable development in the energy sector. The papers relevant to INIS are indexed separately.

Recent advances in information and communication technology allow contact with patients at home through e-Health services (telemedicine, in particular). We provide insights on the state of the art of e-Health and telemedicine for possible wider future clinical use. Telemedicine opportunities can be summarized as i) home telenursing, ii) electronic transfer to specialists and hospitals, iii) teleconsulting between general practitioners and specialists, and iv) call centre activities and online health. At present, a priority action of the EU is the Initiative on telemedicine for chronic disease management, such as home health monitoring, and the future Vision for Europe 2020 is based on the development of Integrated Telemedicine Services. There are pros and cons in e-Health and telemedicine. Benefits can be classified as benefits for i) citizens, patients and caregivers and ii) health care provider organizations. Institutions and individuals that play key roles in the future of e-Health are doctors, patients and hospitals, while the whole system should be improved at three crucial levels: 1) organizational, 2) regulatory and 3) technological. Quality, access and efficiency are the general key issues for the success of e-Health and telemedicine implementation. The real technology is the human resource available within the organizations. For e-Health and telemedicine to grow, it will be necessary to investigate their long-term efficacy, cost effectiveness, possible improvement in quality of life and impact on the public health burden.

Since the advent of jet-powered flight in the 1960s, the threat of volcanic ash to aviation operations has become widely recognized and the mitigation of this threat has received concerted international attention. At the same time the susceptibility to operational disruption has grown. Technical improvements to airframes, engines, and avionic systems have been made in response to the need for improved fuel efficiency and the demand for increased capacity for passenger and freight traffic. Operational demands have resulted in the growth of extended overseas flight operations (ETOPS), increased flight frequency on air traffic routes, and closer spacing of aircraft on heavily traveled routes. The net result has been great advances in flight efficiency, but also increased susceptibility to flight disruption, especially in heavily traveled regions such as North Atlantic-Europe, North America, and the North Pacific. Advances in ash avoidance procedures, pilot and air manager training, and in detection of ash-related damage and maintenance of aircraft and engines have been spurred by noteworthy eruptions such as Galunggung, Indonesia, 1982; Redoubt, Alaska, 1989-1990; and Pinatubo, Philippines, 1991. Comparable advances have been made in the detection and tracking of volcanic ash clouds using satellite-based remote sensing and numerical trajectory forecast models. Following the April 2010 eruption of Eyjafjallajökull volcano, Iceland, the global aviation community again focused attention on the issue of safe air operations in airspace affected by volcanic ash. The enormous global disruption to air traffic in the weeks after the Eyjafjallajökull eruption has placed added emphasis for the global air traffic management system as well as on the equipment manufacturers to reevaluate air operations in ash-affected airspace. Under the leadership of the International Civil Aviation Organization and the World Meteorological Organization, efforts are being made to address this

Beginning in 2006, NASA initiated the J-2X engine development effort to develop an upper stage propulsion system to enable the achievement of the primary objectives of the Constellation program (CxP): provide continued access to the International Space Station following the retirement of the Space Shuttle and return humans to the moon. The J-2X system requirements identified to accomplish this were very challenging, and the five years following the beginning of the J-2X effort have been noteworthy for innovations in both liquid rocket propulsion and systems engineering.

Understanding the interaction of mountain glaciers and permafrost with weather and climate is essential for interpreting past states of the cryosphere in terms of climate change. Most of the glaciers and rock glaciers in Eastern Alpine terrain are subject to strong gradients in climatic forcing, and the persistence of these gradients under past climatic conditions is, more or less, unknown. Thus a key challenge of monitoring the cryosphere is to define the demands on a monitoring strategy for capturing essential processes and their potential changes. For example, the effects of orographic precipitation and local shading vary with general circulation patterns and the amount of solar radiation during the melting season. Recent investigations based on the Austrian glacier inventories have shown that glacier distribution is closely linked to topography and climatic situation, and that these two parameters also imply different sensitivities of specific glaciers to progressing climate change. This leads to the need to develop a monitoring system capturing past, but also largely unknown future, ensembles of climatic state and sensitivity. As a first step, the Austrian glacier monitoring network has been analyzed from the beginning of the records onwards. Today's monitoring network bears the imprints of past research interests, but also of past funding policies and personal/institutional engagements. As a limitation for long-term monitoring in general, today's monitoring strategies have to cope with being restricted to these historical commitments to preserve the length of the time series, while at the same time expanding the measurements to fulfil present and future scientific and societal demands. The decision on cryospheric benchmark sites carries an additional uncertainty: the ongoing disintegration of glaciers, their increasing debris cover, as well as the potentially low ice content and relatively unknown reaction of rock glaciers in the course of climate change.

Periodontitis is a chronic inflammation of the periodontium caused by persistent bacterial infection that leads to the breakdown of connective tissue and bone. Because the ability to reconstruct the periodontium is limited after alveolar bone loss, early diagnosis and intervention should be the primary goals of periodontal treatment. However, periodontitis often progresses without noticeable symptoms, and many patients do not seek professional dental care until the periodontal destruction progresses to the point of no return. Furthermore, the current diagnosis of periodontitis depends on time-consuming clinical measurements. Therefore, there is an unmet need for near-patient testing to diagnose periodontitis. Saliva is an optimal biological fluid to serve as a near-patient diagnostic tool for periodontitis. Recent developments in point-of-care (POC) testing indicate that a diagnostic test for periodontitis using saliva is now technically feasible. A number of promising salivary biomarkers associated with periodontitis have been reported. A panel of optimal biomarkers must be carefully selected based on the pathogenesis of periodontitis. The biggest hurdle for the POC diagnosis of periodontitis using saliva may be the process of validation in a large, diverse patient population. Therefore, we propose the organization of an International Consortium for Biomarkers of Periodontitis, which will gather efforts to identify, select, and validate salivary biomarkers for the diagnosis of periodontitis.

The down-stream processes where new product prospects undergo pilot testing and adjustments before market launch can have significant impact on development speed. Previous literature has primarily pointed to firm traits as influential factors for speed. However, product-specific measures such as technical innovativeness may be a critical factor when going through the crucial late stages of development. We have limited knowledge of the relation between product newness and the speed of NPD, and how this may be related to firm size and partnering strategies during development. Combining product traits with firm resources relevant to late-stage development speed can enrich our understanding of time-to-market in the aim of improving this crucial measure of NPD. The research model is tested on a dataset of all new drug developments approved for the US market 2000-2010. The results show that newness…

Accessing and integrating human genomic data with phenotypes is important for biomedical research. Making genomic data accessible for research purposes, however, must be handled carefully to avoid leakage of sensitive individual information to unauthorized parties and improper use of data. In this article, we focus on data sharing within the scope of data accessibility for research. Current common practices to gain biomedical data access are strictly rule based, without a clear and quantitative measurement of the risk of privacy breaches. In addition, several types of studies require privacy-preserving linkage of genotype and phenotype information across different locations (e.g., genotypes stored in a sequencing facility and phenotypes stored in an electronic health record) to accelerate discoveries. The computer science community has developed a spectrum of techniques for data privacy and confidentiality protection, many of which have yet to be tested on real-world problems. In this article, we discuss clinical, technical, and ethical aspects of genome data privacy and confidentiality in the United States, as well as potential solutions for privacy-preserving genotype–phenotype linkage in biomedical research. PMID:27681358
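One family of techniques for the privacy-preserving genotype-phenotype linkage discussed above is keyed pseudonymization: each site derives a linkage token from a normalized identifier with a keyed hash (e.g., HMAC-SHA256 under a secret shared only by the authorized parties), so records can be joined on tokens without the identifiers themselves ever being exchanged. The sketch below is illustrative only; a real deployment additionally needs key management, governance, and defenses against dictionary attacks on low-entropy identifiers:

```python
import hmac
import hashlib

def linkage_token(identifier: str, key: bytes) -> str:
    """Derive a pseudonymous linkage token from a normalized identifier.

    Both sites must normalize identically (case, whitespace) and hold the
    same secret key; without the key, tokens cannot be recomputed or forged.
    """
    normalized = " ".join(identifier.lower().split())
    return hmac.new(key, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

KEY = b"shared-secret-established-out-of-band"  # placeholder for a managed key

# A sequencing facility and an EHR site can match this patient by token:
t1 = linkage_token("Doe, Jane  1970-01-01", KEY)
t2 = linkage_token("doe, jane 1970-01-01", KEY)  # different formatting, same person
print(t1 == t2)  # normalization makes the tokens agree
```

The design choice here is that the trusted element is the shared key rather than a trusted third party; variants in the literature instead route identifiers through an honest broker or use Bloom-filter encodings to tolerate typographical variation.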

The UN Framework Convention on Climate Change (FCCC) allows for the Joint Implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. At present, all JI transactions are voluntary, and no country has claimed JI credit against existing FCCC commitments. Nevertheless, JI could have important implications for both the economic efficiency and the international equity of the implementation of the FCCC. The paper discusses some of the information needs of JI projects and seeks to clarify some of the common assumptions and arguments about JI. Issues regarding JI are distinguished according to those that are specific to JI as well as other types of regimes and transactions. The focus is on the position of developing countries and their potential risks and benefits regarding JI. 2 figs., 3 tabs., 35 refs