Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert, and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing, and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes; we examine the accuracy of the various discrimination methods and discuss issues related to their successful real-time application.
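One of the discriminants listed in the abstract above, the slowness parameter Theta, is defined in the seismological literature as the base-10 logarithm of the ratio of radiated seismic energy to seismic moment; slow-rupturing tsunami earthquakes radiate relatively little high-frequency energy and so have anomalously low Theta. The sketch below illustrates the idea only; the function names are ours, not part of the TIM system, and the cutoff value is illustrative rather than an operational threshold.

```python
import math

def theta(radiated_energy_j, seismic_moment_nm):
    """Energy-to-moment discriminant: Theta = log10(E_R / M0)."""
    return math.log10(radiated_energy_j / seismic_moment_nm)

def is_slow_rupture(radiated_energy_j, seismic_moment_nm, cutoff=-5.5):
    # Tsunami earthquakes rupture slowly, so Theta falls well below the
    # global average of roughly -4.9.  The cutoff here is illustrative.
    return theta(radiated_energy_j, seismic_moment_nm) < cutoff
```

For example, an event with E_R = 1e13 J and M0 = 1e19 N·m gives Theta = -6, a full unit below the global average and thus a candidate tsunami earthquake under this illustrative cutoff.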

IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

Scientists with the U.S. Geological Survey have adapted their methods for quickly finding the exact source of an earthquake to the problem of locating gunshots. On the basis of this work, a private company is now testing an automated gunshot-locating system in a San Francisco Bay area community. This system allows police to rapidly pinpoint and respond to illegal gunfire, helping to reduce crime in our neighborhoods.

The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

An earthquake is the sudden, rapid shaking of the earth, ... by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Some years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned could decide whether it was necessary for them to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, in about two minutes after the earthquake occurrence; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example, the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks, via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

Describes important information-handling products, predicting future devices in light of convergence and greater flexibility offered through use of microchip technology. Contends that information technology and its impact on privacy depend on how information systems are used, arguing that the privacy issue deals more with moral/physiological…

The Sichuan "5.12" Earthquake in 2008 occurred in a relatively underdeveloped area in China. The rainy weather, the mountainous environment and the local languages all posed major challenges to the dissemination of information and services after the disaster. By adopting a communication perspective, this study applies the diffusion of innovations theory to investigate how healthcare professionals diffused health technologies, health information and services during the rescue and relief operation. The authors conducted three focus group sessions with the health professionals who had attended to the rescue and relief work of the Sichuan "5.12" Earthquake in 2008. A range of questions regarding the diffusion of innovations were asked during these sessions. The health professionals used their cell phones to communicate with other healthcare providers, disseminated knowledge of health risks and injuries to affected residents with pamphlets and posters and attended daily meetings at the local government offices. They reported on the shortage of maritime satellite cell phones and large-size tents for medical use, and the absence of fully equipped ambulances. Volunteers, local health professionals and local officials provided health information and services in different ways. However, the diffusion of health information and services was less likely to reach those living next to transportation centers, in remote areas and in disaster areas neglected by the media. New communication devices such as cell phones and the mobile Internet enabled medical professionals to coordinate the rescue and relief work after this major natural disaster, at a time when the country's emergency response system still had plenty of room for improvement. In future, the mobile Internet should be used as a means of collecting bottom-up disaster reports so that the media will not neglect any disaster areas as they did during the Sichuan Earthquake. Rescue relief work would have been substantially

Amazon.com, Barnesandnoble.com Inc., Priceline.com Inc., and Hotels.com are some of the market leaders. The Business-to-Business (B2B) segment dwarfs the B2C segment. eMarketer estimates B2B revenues of $474.3 billion in 2001 will grow to $2.4 trillion in 2004. Companies like Cisco Systems and Dell...

Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered, which are used in geographical information systems assigned for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of scenario earthquake consequence estimation and individual seismic risk assessment, taking into account secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of the natural and/or technological disaster.
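The individual-risk definition stated in the abstract above reduces to a simple expectation per inhabitant. The sketch below is a minimal illustration under the assumption of a single hazardous event considered per year; the function name and parameters are ours, not taken from the paper.

```python
def individual_risk(event_probability, expected_fatalities, population):
    """Annual individual risk of death in a territory: the probability
    of the hazardous event occurring within one year, times the
    mathematical expectation of fatalities given the event, normalized
    by the number of inhabitants of the settlement."""
    return event_probability * expected_fatalities / population
```

For several independent scenarios (e.g., an earthquake plus a secondary chemical accident), the per-scenario risks computed this way would simply be summed, which matches the paper's framing of combined natural and technological hazards.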

The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software that has been funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), W-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.

This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes to the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
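The per-accelerogram computation described above (integrate the ODEs over time, record the maximum displacement) can be sketched compactly. The abstract does not specify the structural model or integration engine, so the sketch below assumes a single-degree-of-freedom damped oscillator driven at its base and a semi-implicit Euler scheme; a production code would use a Newmark-beta scheme or an ODE library, and all names here are ours.

```python
import math

def max_displacement(accel, dt, freq_hz=1.0, damping=0.05):
    """Peak relative displacement of a single-degree-of-freedom
    oscillator whose base is driven by the ground-acceleration record
    `accel` (m/s^2, one sample every `dt` seconds).

    Equation of motion: u'' + 2*z*w*u' + w^2*u = -a_g(t).
    """
    omega = 2.0 * math.pi * freq_hz   # natural circular frequency
    u, v = 0.0, 0.0                   # relative displacement, velocity
    peak = 0.0
    for a_g in accel:
        acc = -a_g - 2.0 * damping * omega * v - omega ** 2 * u
        v += acc * dt                 # semi-implicit Euler update
        u += v * dt
        peak = max(peak, abs(u))
    return peak
```

Each GRID job in the workflow above would run one such integration for its assigned accelerogram and write the returned peak back to the Storage Element.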

The Euro-Med Seismological Centre (EMSC) operates the second global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community. By definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing the expected earthquake information and services, collect their observations, and collate them into improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and thus that eyewitnesses can be considered ground motion sensors. Flashsourcing discriminates felt

The existence of ionospheric anomalies before earthquakes is now widely accepted. These phenomena have started to be considered by the GPS community as a way to mitigate GPS signal degradation over the territories of earthquake preparation. The question is still open whether they could be useful for seismology and for short-term earthquake forecasting. More than a decade of intensive studies has proved that ionospheric anomalies registered before earthquakes are initiated by processes in the boundary layer of the atmosphere over the earthquake preparation zone and are induced in the ionosphere by electromagnetic coupling through the Global Electric Circuit. A multiparameter approach based on the Lithosphere-Atmosphere-Ionosphere Coupling model demonstrated that earthquake forecasting is possible only if we consider the final stage of earthquake preparation in a multidimensional space where every dimension is one of many precursors in an ensemble, and they are synergistically connected. We demonstrate approaches developed in different countries (Russia, Taiwan, Japan, Spain, and Poland) within the framework of the ISSI and ESA projects to identify the ionospheric precursors. They are also useful for determining all three parameters necessary for an earthquake forecast: the impending earthquake's epicenter position, expectation time, and magnitude. These parameters are calculated using different technologies of GPS signal processing: time series, correlation, spectral analysis, ionospheric tomography, wave propagation, etc. The results obtained by different teams demonstrate a high level of statistical significance and physical justification, which gives us reason to suggest these methodologies for practical validation.
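The time-series analysis mentioned above is often done, in the published literature, by screening vertical TEC records for samples that deviate strongly from a sliding-window baseline. The sketch below shows one common variant of that idea; the window length, threshold factor, and function name are illustrative assumptions of ours, not parameters from any of the cited projects.

```python
from statistics import median, pstdev

def tec_anomalies(tec, window=15, k=2.0):
    """Return indices where a TEC sample deviates from the median of
    the preceding `window` samples by more than `k` standard deviations
    of that window -- a simple screen for ionospheric anomalies in a
    GPS TEC time series.  Thresholds here are illustrative only."""
    flags = []
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        m, s = median(past), pstdev(past)
        if s > 0 and abs(tec[i] - m) > k * s:
            flags.append(i)
    return flags
```

In practice such a screen would be only one input to the multiparameter ensemble described above, combined with the other precursor dimensions before any forecast statement is made.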

At least 9,500 people were killed, 30,000 were injured and 100,000 were left homeless by this earthquake. According to some unconfirmed reports, the death toll from this earthquake may have been as high as 35,000. This earthquake is estimated to have seriously affected an area of 825,000 square kilometers, caused between 3 and 4 billion dollars in damage, and been felt by 20 million people.

Landslides are caused by many different triggers, including earthquakes. In Italy, a detailed new-generation catalogue of information on historical earthquakes for the period 461 B.C. to 1997 is available (Catalogue of Strong Italian Earthquakes from 461 B.C. to 1997, ING-SGA 2000). The catalogue lists 548 earthquakes and provides information on a total of about 450 mass movements triggered by 118 seismic events. The information on earthquake-induced landslides listed in the catalogue was obtained through careful scrutiny of historical documents and chronicles, but was rarely checked in the field. We report on an attempt to combine the available historical information on landslides caused by earthquakes with standard geomorphological techniques, including the interpretation of aerial photographs and field surveys, to better determine the location, type and distribution of seismically induced historical slope failures. We present four examples in the Central Apennines. The first example describes a rock slide triggered by the 1279 April 30 Umbria-Marche Apennines earthquake (Io = IX) at Serravalle, along the Chienti River (Central Italy). The landslide is the oldest known earthquake-induced slope failure in Italy. The second example describes the location of two large landslides triggered by the 1584 September 10 earthquake (Io = IX) at San Piero in Bagno, along the Savio River (Northern Italy). The landslides were subsequently largely modified by mass movements that occurred in 1855, making the recognition of the original seismically induced failures difficult, if not impossible. In the third example we present the geographical distribution of the available information on landslide events triggered by 8 earthquakes in Central Valnerina, in the period 1703 to 1979. A comparison with the location of landslides triggered by the September-October 1997 Umbria-Marche earthquake sequence is presented. The fourth example describes the geographical distribution of the available

Recent theoretical and experimental studies have explicitly demonstrated the ability of space technologies to identify and monitor the specific variations in near-earth space plasma, the atmosphere, and the ground surface associated with approaching severe earthquakes (named earthquake precursors), which appear several days (from 1 to 5) before the seismic shock over seismically active areas. Several countries and private companies are preparing (or have already launched) dedicated spacecraft for monitoring earthquake precursors from space and for short-term earthquake prediction. The present paper intends to outline the optimal algorithm for the creation of a space-borne system for earthquake precursor monitoring and short-term earthquake prediction. It takes into account the following: selection of the precursors in terms of priority, considering their statistical and physical parameters; configuration of the spacecraft payload; configuration of the satellite constellation (orbit selection, satellite distribution, operation schedule); and different options for the satellite systems (a cheap microsatellite or a comprehensive multisatellite constellation). Taking into account that the most promising are the ionospheric precursors of earthquakes, special attention is devoted to the radiophysical techniques of ionosphere monitoring. The advantages and disadvantages of such technologies as vertical sounding, in-situ probes, ionosphere tomography, GPS TEC, and GPS MET are considered.

The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of the factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D. & Tversky, A. (1979). Prospect

MEL/ITL Task Group on Metrology for Information Technology (IT), U.S. Department of Commerce, Technology Administration, National Institute of... NIST management requested a white paper on metrology for information technology (IT). A task group was formed to develop this white paper with representatives from the Manufacturing Engineering Laboratory (MEL), the Information Technology Laboratory (ITL), and Technology Services (TS). The task

In China, the assessment of earthquake disaster information is mainly based on inversion of the seismic source mechanism and a pre-calculated population data model; the real information about an earthquake disaster is usually collected through government departments, and both the accuracy and the speed need to be improved. In a massive earthquake like the one in Mexico, the telecommunications infrastructure on the ground was damaged, and the quake zone was difficult to observe by satellites and aircraft in the bad weather. Only a little information was sent out through another country's maritime satellite. Thus, the timely and effective deployment of disaster relief was seriously affected. Now that Chinese communication satellites are in orbit, people no longer rely only on ground telecom base stations to keep in communication with the outside world, open web pages, log on to social networking sites, release information, and transmit images and videos. This paper establishes an earthquake information collection system in which the public can participate. Through popular social platforms and other information sources, the public can participate in the collection of earthquake information and supply quake-zone information, including photos, video, etc., especially information gathered by unmanned aerial vehicles (UAVs) after an earthquake; the public can use computers, portable terminals, or mobile text messages to participate in earthquake information collection. In the system, the information is divided into earthquake-zone basic information, earthquake disaster reduction information, earthquake site information, post-disaster reconstruction information, etc., and is processed and put into a database. The quality of the data is analyzed using multi-source information and is controlled through local public opinion, to supplement the data collected by government departments in a timely manner and to calibrate the simulation results, which will better guide

In this paper we discuss an approach to the teaching of information technology law to higher education computing students that attempts to prepare them for professional computing practice. As information technology has become ubiquitous, its interactions with the law have become more numerous. Information technology practitioners, and in particular…

An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

The Information Technology Resources Assessment (ITRA) is being published as a companion document to the Department of Energy (DOE) FY 1994--FY 1998 Information Resources Management Long-Range Plan. This document represents a collaborative effort between the Office of Information Resources Management and the Office of Energy Research that was undertaken to achieve, in part, the Technology Strategic Objective of IRM Vision 21. An integral part of this objective, technology forecasting provides an understanding of the information technology horizon and presents a perspective and focus on technologies of particular interest to DOE program activities. Specifically, this document provides site planners with an overview of the status and use of new information technology for their planning consideration.

A powerful 7.8-magnitude earthquake struck Nepal at 06:11 UTC on 25 April 2015. Together with several strong aftershocks, it was the deadliest earthquake in the recent history of Nepal. In total about 9,000 people died and 22,300 people were injured, and the lives of eight million people, almost one-third of the population of Nepal, were affected. The event led to a massive campaign to gather data and information on damage and loss using remote sensing, field inspection, and community surveys. Information on the distribution of relief materials is another important domain of information necessary for equitable relief distribution. Pre- and post-earthquake high-resolution satellite images helped in damage area assessment and mapping. Many national and international agencies became active to generate and fill the information vacuum. The challenges included data-access bottlenecks due to a lack of good IT infrastructure; inconsistent products due to the absence of standard mapping guidelines; and dissemination challenges due to the absence of Standard Operating Protocols and a single information gateway. These challenges negated the opportunities offered by improved earth observation data availability, increasing engagement of volunteers for emergency mapping, and centralized emergency coordination practice. This paper highlights critical practical challenges encountered during emergency mapping and information management during the earthquake in Nepal. There is a greater need to address such challenges to effectively use the technological leverage that recent advancements in space science, IT and the mapping domain provide.

Health care executives are increasingly frustrated by Information Technology (IT). Although our industry is often accused of underinvesting in technology (hospitals average 2-3 percent of their costs in IT, compared to other industries' 8-10 percent), when IT investments are made, they fail to show a demonstrable return to the bottom line. Yet the effective deployment of technology is critical to the success of the organization, and its failure can in itself cause the failure of a health care system.

J-RISQ (Japan Real-time Information System for earthquakes) has been developed at NIED to support appropriate first actions after big earthquakes. When an earthquake occurs, seismic intensities (SI) are first calculated at each observation station and sent to the Data Management Center at different times. The system begins its first estimation when the number of stations observing an SI of 2.5 or larger exceeds a threshold. It estimates the SI distribution, exposed population and earthquake damage to buildings using basic estimation data, such as subsurface amplification factors, population, and building information, accumulated in J-SHIS (Japan Seismic Hazard Information Station), a public portal for seismic hazard information across Japan developed by NIED. The estimation is performed for each 250 m square mesh, and the estimated data are finally converted into information for each municipality. Since October 2013, we have opened estimated SI, exposed population and other results to the public through a website making full use of maps and tables. In the previous system, we sometimes could not inspect information for the surrounding areas outside the range of strong motions, or the details of areas of interest, and could not confirm whether the displayed information was the latest without accessing the website. J-RISQ has been advanced by introducing the following functions to address those problems and to promote use at the local and personal levels; a website in English has also been released:
- It has become possible to focus on specific areas and inspect enlarged information.
- The estimated information can be downloaded in KML form.
- The estimated information is updated automatically and provided as the latest version.
- The newest information can be inspected using RSS readers or browsers supporting RSS.
- Dedicated pages for smartphones have been prepared.
The information estimated
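
The trigger condition described above, which begins estimation once enough stations report an SI of 2.5 or larger, can be sketched as follows. The function name and the five-station threshold are illustrative assumptions, not J-RISQ's actual values:

```python
def should_start_estimation(station_si, si_floor=2.5, min_stations=5):
    """Return True once the number of stations reporting seismic
    intensity (SI) at or above si_floor reaches min_stations.
    Mirrors the J-RISQ trigger described above; the five-station
    threshold is an assumed placeholder."""
    return sum(1 for si in station_si.values() if si >= si_floor) >= min_stations

# Hypothetical readings arriving from individual observation stations
readings = {"ST01": 3.1, "ST02": 2.7, "ST03": 1.9, "ST04": 4.0,
            "ST05": 2.5, "ST06": 3.3}
print(should_start_estimation(readings))  # five stations at SI >= 2.5 -> True
```

Readings trickle in at different times, so in practice this check would re-run as each new station report arrives.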

National Information Service for Earthquake Engineering (NISEE): The NISEE/PEER Library is an affiliated library of the University of California at the Field Station, five miles from the main Berkeley campus. Address: NISEE/PEER Library, University of California. Hours: Monday - Friday, 9:00 am - 1:00 pm. Open to the public.

At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research, however, is to reduce the geological risks that emanate from such natural hazards. To meet these objectives, to get a quick overview of the seismicity of a particular region, and to compare the situation with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. Further, this web service integrates historical and current earthquake information from the USGS earthquake database, and more historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is the constraining of time ranges via a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was built with the newest JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

Context: Damage and loss of life sustained during an earthquake result from falling structures and flying glass and objects. To address these and other problems, new information technology and systems can improve crisis management and crisis response. The most important factor in managing a crisis is readiness before the disaster, which depends on useful data. Aims: This study aimed to examine the Earthquake Information Management Systems (EIMS) in India, Afghanistan and Iran, and to describe how EIMS can reduce destruction in crisis management. Materials and Methods: This was an analytical comparison in which data were collected by questionnaire, observation and checklist. The population was the EIMS of the selected countries. Sources of information were staff in related organizations, scientific documentation and the Internet. For data analysis, the Criteria Rating Technique, the Delphi Technique and descriptive methods were used. Results: Findings showed that the EIMS in India (Disaster Information Management System), Afghanistan (Management Information for Natural Disasters) and Iran are decentralized. The Indian state has organized an expert group to inspect issues around disaster reduction strategy. In Iran, there was no useful and efficient EIMS for evaluating earthquake information. Conclusions: The outcomes make clear that an information system can only influence decisions if it is relevant, reliable and available to decision-makers in a timely fashion. It is therefore necessary to reform and design a model. The model contains the responsible organizations and their functions. PMID:23555130

NASA and the National Science Foundation (NSF) commissioned a panel of U.S. experts to study the international status of satellite communications systems and technology. The study covers emerging systems concepts, applications, services, and the attendant technologies. The panel members traveled to Europe, Japan, and Russia to gather information firsthand. They visited 17 sites in Europe, 20 in Japan, and 4 in Russia. These included major manufacturers, government organizations, service providers, and associated research and development facilities. The panel's report was reviewed by the sites visited, by the panel, and by representatives of U.S. industry. The report details the information collected and compares it to U.S. activities.

Describes six information technology projects in California libraries, including Internet access in public libraries; digital library developments at the University of California, Berkeley; the World Wide Web home page for the state library; Pacific Bell's role in statewide connectivity; state government initiatives; and services of the state…

Horiuchi et al. (2005) developed a real-time earthquake information system (REIS) using Hi-net, a densely deployed nationwide seismic network consisting of about 800 stations operated by NIED, Japan. REIS determines hypocenter locations and earthquake magnitudes automatically within a few seconds after P waves arrive at the closest station, and calculates focal mechanisms within about 15 seconds. The obtained hypocenter parameters are transferred immediately in XML format to a computer at the Japan Meteorological Agency (JMA), which started the service of EEW to special users in June 2005. JMA also developed an EEW system using 200 stations, and the results of the two systems are merged. Among all first-issued EEW reports from both systems, REIS information accounts for about 80 percent. This study examines the rapidity and credibility of REIS by analyzing the 4050 earthquakes with magnitude larger than 3.0 that have occurred around the Japanese islands since 2005. REIS re-determines hypocenter parameters every second as waveform data are revised; here we discuss only the first reports. Regarding rapidity, our results show that about 44 percent of the first reports are issued within 5 seconds after the P waves arrive at the closest stations. Note that this 5-second window includes delays of about 2 seconds due to data packaging and transmission. To obtain reliable solutions, REIS waits until two stations detect P waves for events inside the network, but four stations for events outside the network. For earthquakes with hypocentral distance less than 100 km, 55 percent of earthquakes are warned within 5 seconds and 87 percent within 10 seconds. Most events with long delays are small and triggered by S-wave arrivals. About 80 percent of events differ in epicentral distance by less than 20 km from JMA's manually determined locations. Because of the existence of large lateral heterogeneity in seismic velocity, the difference depends
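
The rapidity figures above (fractions of first reports issued within 5 and 10 seconds) amount to cumulative fractions over per-event report delays. A minimal sketch with hypothetical delay values:

```python
def delay_statistics(delays, thresholds=(5, 10)):
    """Fraction of first reports issued within each threshold, in
    seconds after the P-wave arrival at the closest station.  A
    sketch of the rapidity analysis described above."""
    n = len(delays)
    return {t: sum(1 for d in delays if d <= t) / n for t in thresholds}

# Hypothetical first-report delays (s) for six events
stats = delay_statistics([3.2, 4.8, 6.1, 9.5, 12.0, 4.0])
# three of six within 5 s, five of six within 10 s
```

The same tabulation applied to the 4050-event catalog would reproduce the 44/55/87 percent figures quoted in the abstract.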

Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual’s earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded. PMID:28272359
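
The mediation structure described above (informational subjective norms driving seeking behavior both directly and indirectly via information need) can be illustrated with the classic product-of-coefficients approach. The data below are simulated, and plain OLS stands in for the study's full structural equation model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 918  # sample size matching the survey above; the values are simulated

# X = informational subjective norms, M = information need (mediator),
# Y = earthquake risk information seeking behavior.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(scale=0.8, size=n)            # X -> M
y = 0.5 * m + 0.3 * x + rng.normal(scale=0.8, size=n)  # M -> Y plus direct X -> Y

def ols_slopes(pred, resp):
    """Slope coefficients from an ordinary least-squares fit with intercept."""
    A = np.column_stack([np.ones_like(resp), *np.atleast_2d(pred)])
    coef, *_ = np.linalg.lstsq(A, resp, rcond=None)
    return coef[1:]

a = ols_slopes(x, m)[0]                  # X -> M path
b = ols_slopes(np.vstack([m, x]), y)[0]  # M -> Y path, controlling for X
indirect = a * b                         # mediated (indirect) effect of X on Y
```

Recovering a positive `indirect` alongside a positive direct path mirrors the paper's finding that subjective norms act through both routes; a real analysis would add significance testing (e.g. bootstrapped confidence intervals).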

Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

Counties Manukau District Health Board (CMDHB) uses information technology (IT) to drive its Integrated Care strategy. IT enables the sharing of relevant health information between care providers. This information sharing is critical to closing the gaps between fragmented areas of the health system; the tragic case of James Whakaruru demonstrates how people have been falling through those gaps. The starting point of the Integrated Care strategic initiative was the transmission of electronic discharges and referral status messages from CMDHB's secondary provider, South Auckland Health (SAH), to GPs in the district. This was followed by successful pilots of a Well Child system and a diabetes disease management system embracing primary and secondary providers. The improved information flowing from hospital to GPs now enables GPs to provide better management for their patients. The Well Child system pilot helped improve reported immunization rates in a high-health-need area from 40% to 90%. The diabetes system pilot helped reduce the proportion of patients with HbA1c ≥9 from 47% to 16%. IT has been implemented as an integral component of an overall Integrated Care strategic initiative. Within this context, Integrated Care IT has helped to achieve significant improvements in care outcomes, broken down barriers between health system silos, and contributed to the establishment of a system of care continuum that is better for patients.

Significant effort has been expended to provide infrastructure and to facilitate remote collaborations within and beyond the fusion community. Through the Office of Fusion Energy Science Information Technology Initiative, communication technologies utilized by the fusion community are being improved. The initial thrust of the initiative has been collaborative seminars and meetings. Under the initiative, 23 sites, both laboratory and university, were provided with the hardware required to remotely view, or project, documents being presented. The hardware is capable of delivering documents to a web browser, or to compatible hardware, over ESNET in an access-controlled manner. Documents can also originate from virtually any of the collaborating sites. In addition, RealNetworks servers are being tested to provide audio and/or video in a non-interactive environment, with MBONE providing two-way interaction where needed. Additional effort is directed at remote distributed computing, file systems, security, and standard data storage and retrieval methods. This work supported by DoE contract No. W-7405-ENG-48.

Living in postindustrial, 21st-century society means being surrounded by the accoutrements of information technology. Information technology is in people's offices, cars and homes. One third of adults do not deal well with information technology, according to the research of Larry Rosen, psychology professor, author, and pundit. Rosen is the Paul…

The full impact of the current information technology and networking revolution remains unknown, but the experiences of organizations and individuals who are using the tools and resources offered by information technology suggest that it may change our social fabric. Some of the current and emerging trends in information technology include: the…

Natural disasters like earthquakes require a fast response from local authorities. Well-trained rescue teams have to be available, equipment and technology have to be set up and ready, and information has to be directed to the right positions so the headquarters can manage the operation precisely. The main goal is to reach the most affected areas in a minimum of time. But even with the best preparation for these cases, there will always be uncertainty about what really happened in the affected area. Modern geophysical sensor networks provide high-quality data. These measurements, however, only map disjoint values from their respective locations for a limited number of parameters. Using observations of witnesses represents one approach to enhance the values measured by sensors ("humans as sensors"). These observations are increasingly disseminated via social media platforms. These "social sensors" offer several advantages over common sensors, e.g. high mobility, high versatility of captured parameters, and rapid distribution of information. Moreover, the amount of data offered by social media platforms is quite extensive. We analyze messages distributed via Twitter after major earthquakes to get rapid information on what eyewitnesses report from the epicentral area. We use this information to (a) quickly learn about damage and losses to support fast disaster response and to (b) densify geophysical networks in areas with sparse information to gain a more detailed insight into felt intensities. We present a case study from the Mw 7.1 Philippines (Bohol) earthquake that happened on Oct. 15, 2013. We extract Twitter messages, so-called tweets, containing one or more specified keywords from the semantic field of "earthquake" and use them for further analysis. For the time frame of Oct. 15 to Oct. 18 we get a database of in total 50,000 tweets, of which 2,900 are geo-localized and 470 have a photo attached. Analyses for both national level and locally for
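
The keyword-based extraction and the geo-localized/photo subsets described above can be sketched as below. The keyword list (including "lindol", Filipino for earthquake, relevant to the Bohol case study) and the record layout are illustrative assumptions, not the authors' exact pipeline:

```python
import re

# Illustrative keyword set from the semantic field of "earthquake"
EQ_KEYWORDS = re.compile(r"\b(earthquake|quake|tremor|lindol)\b", re.IGNORECASE)

def filter_tweets(tweets):
    """Keep tweets matching the keyword set, then split the matches into
    geo-localized and photo-bearing subsets, mirroring the counts
    reported above (50,000 / 2,900 / 470 for the Bohol event)."""
    matched = [t for t in tweets if EQ_KEYWORDS.search(t["text"])]
    geo = [t for t in matched if t.get("coords") is not None]
    with_photo = [t for t in matched if t.get("photo")]
    return matched, geo, with_photo

# Hypothetical sample records
sample = [
    {"text": "Strong earthquake felt in Cebu!", "coords": (10.3, 123.9), "photo": False},
    {"text": "Lindol! The building is shaking", "coords": None, "photo": True},
    {"text": "lunch by the harbor", "coords": (14.6, 121.0), "photo": False},
]
matched, geo, with_photo = filter_tweets(sample)
```

Here two of the three sample tweets match, one of the matches is geo-localized, and one carries a photo; downstream steps would map the geo-localized subset against felt-intensity models.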

One of the main missions of the United States Geological Survey (USGS) National Earthquake Information Center (NEIC) is the dissemination of information to national and international agencies, scientists, and the general public through various products such as ShakeMap and earthquake summary posters. During the summer of 2012, undergraduate and graduate student interns helped to update and improve our series of regional seismicity posters and regional tectonic summaries. The "Seismicity of the Earth (1900-2007)" poster placed over a century's worth of global seismicity data in the context of plate tectonics, highlighting regions that have experienced great (M8.0 and larger) earthquakes, and the tectonic settings of those events. This endeavor became the basis for a series of more regionalized seismotectonic posters that focus on major subduction zones and their associated seismicity, including the Aleutian and Caribbean arcs. The first round of these posters was inclusive of events through 2007 and was made with the intent of being continually updated. Each poster includes a regional tectonic summary, a seismic hazard map, focal depth cross-sections, and a main map that illustrates the following: the main subduction zone and other physiographic features, seismicity, and rupture zones of historic great earthquakes. Many of the existing regional seismotectonic posters have been updated, and new posters highlighting regions of current seismological interest have been created, including the Sumatra and Java arcs, the Middle East region and the Himalayas (all of which are currently in review). These new editions include updated lists of earthquakes, expanded tectonic summaries, updated relative plate motion vectors, and major crustal faults. These posters thus improve upon previous editions that included only brief tectonic discussions of the most prominent features and historic earthquakes, and which did not systematically represent non-plate-boundary faults. Regional tectonic

The Naval Postgraduate School (NPS) Remote Sensing Center (RSC) and research partners have completed a remote sensing pilot project in support of California post-earthquake-event emergency response. The project goals were to dovetail emergency management requirements with remote sensing capabilities to develop prototype map products for improved earthquake response. NPS coordinated with emergency management services and first responders to compile information about essential elements of information (EEI) requirements. A wide variety of remote sensing datasets, including multispectral imagery (MSI), hyperspectral imagery (HSI), and LiDAR, were assembled by NPS for the purpose of building imagery baseline data and to demonstrate the use of remote sensing to derive ground surface information for use in planning, conducting, and monitoring post-earthquake emergency response. Worldview-2 data were converted to reflectance, orthorectified, and mosaicked for most of Monterey County, CA. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired at two spatial resolutions were atmospherically corrected and analyzed in conjunction with the MSI data. LiDAR data at point densities from 1.4 points/m2 to over 40 points/m2 were analyzed to determine digital surface models. The multimodal data were then used to develop change detection approaches and products and other supporting information. Analysis results from these data, along with other geographic information, were used to identify and generate multi-tiered products tied to the level of post-event communications infrastructure (internet access + cell, cell only, no internet/cell). Technology transfer of these capabilities to local and state emergency response organizations gives emergency responders new tools in support of post-disaster operational scenarios.
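
At its core, the change detection step above compares pre- and post-event reflectance pixel by pixel. A minimal stand-in, assuming co-registered single-band reflectance arrays and an illustrative threshold (real workflows also handle co-registration error, shadows, and sensor differences):

```python
import numpy as np

def change_mask(pre, post, threshold=0.15):
    """Flag pixels whose reflectance changed by more than `threshold`
    between co-registered pre- and post-event images.  A simplified
    sketch of the multimodal change-detection products described above."""
    return np.abs(post - pre) > threshold

# Tiny hypothetical 2x2 reflectance scenes; two pixels change markedly
pre = np.array([[0.30, 0.32],
                [0.28, 0.31]])
post = np.array([[0.31, 0.10],
                 [0.29, 0.60]])
mask = change_mask(pre, post)  # True where damage-scale change is flagged
```

Flagged pixels would then be aggregated and fused with LiDAR-derived surface models and other geographic layers to build the tiered response products.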

In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

Over the past decade, catastrophic earthquakes have garnered international attention regarding the need for improving immediate and ongoing support services for disrupted communities. Following the December 26, 2004 Indonesian earthquake, the Indian Ocean tsunami was responsible for displacing millions and taking the lives of an estimated 320,000…

Nevertheless, basic earthquake-related information has always been of consuming interest to the public and the media in this part of California (fig. 2). So it is not surprising that earthquake prediction continues to be a significant research program at the laboratory. Several of the current spectrum of projects related to prediction are discussed below.

Dartmouth College’s Institute for Security, Technology, and Society conducted three workshops on securing information technology in healthcare, attended by a diverse range of experts in the field. This article summarizes the three workshops. PMID:25379030

eHealth, the use of information technology to improve or enable health and health care, has recently been high on the health care development agenda. Despite the vivid interest in eHealth, little reference has been made to the use of these technologies in the promotion of health. The aim of the present study was to review recent uses of information technology in health promotion by looking at research articles published in peer-reviewed journals. Fifteen relevant journals with issues published between 2003 and June 2005 yielded altogether 1352 articles, 56 of which contained content related to the use of information technology in the context of health promotion. As reflected by this rather small proportion, research on the role of information technology is only starting to emerge. Four broad thematic application areas within health promotion were identified: use of information technology as an intervention medium, as a research focus, as a research instrument, and for professional development. In line with this rather instrumental focus, the concepts 'ePromotion of Health' or 'Health ePromotion' would come close to describing the role of information technology in health promotion.

According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air-to-liquid heat exchanger that cools warm air generated by the rack of information technology equipment. The system may also include a liquid-to-liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid-to-liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat from the rack of information technology equipment.

User acceptance of information technology has been a significant area of research for more than two decades in the field of information technology. This study assessed the acceptance of information technology in the context of Health Information Management (HIM) by utilizing the Technology Acceptance Model (TAM), which was modified and applied to assess user acceptance of health information technology as well as the viability of TAM as a research construct in the context of HIM. This was a descriptive-analytical study in which a sample of 187 personnel was selected from a population of 363 working in medical records departments of hospitals affiliated with Tehran University of Medical Sciences. Users' perception of applying information technology was studied with a researcher-developed questionnaire. Collected data were analyzed with SPSS software (version 16) using descriptive statistics and regression analysis. The results suggest that TAM is a useful construct to assess user acceptance of information technology in the context of HIM. The findings also showed that perceived ease of use (PEOU) and perceived usefulness (PU) were positively associated with favorable users' attitudes towards HIM. PU was relatively more strongly associated (r = 0.22, p = 0.05) than PEOU (r = 0.014, p = 0.05) with favorable user attitudes towards HIM. Users' perceptions of usefulness and ease of use are important determinants providing the incentive for users to accept information technologies when the application of a successful HIM system is attempted. The findings of the present study suggest that user acceptance is a key element and should subsequently be the major concern of health organizations and health policy makers.

Addresses the use of information technology for intelligence and information warfare in the context of national security and reviews the status of clandestine collection. Discusses hacking, human agent collection, signal interception, covert action, counterintelligence and security, and communications between intelligence producers and consumers…

We present the progress achieved by GPS TEC technologies in the study of pre-seismic anomalies appearing in the ionosphere a few days before strong earthquakes. Starting from the first case studies, such as the 17 August 1999 M7.6 Izmit earthquake in Turkey, the technology has been developed and converted into global near-real-time monitoring of seismo-ionospheric effects, which is now used in the multiparameter nowcast and forecast of strong earthquakes. Development of techniques for identifying seismo-ionospheric anomalies was carried out in parallel with the development of a physical mechanism explaining the generation of these anomalies. It was established that seismo-ionospheric anomalies have a self-similarity property, depend on local time, and persist for at least 4 hours; the deviation from the undisturbed level can be either positive or negative depending on the lead time (in days) to the moment of the impending earthquake and on the longitude of the anomaly relative to the epicenter. Low-latitude and near-equatorial earthquakes demonstrate a magnetically conjugated effect, while middle- and high-latitude earthquakes demonstrate a single anomaly over the earthquake preparation zone. From the morphology of the anomalies, the physical mechanism was derived within the framework of the more complex Lithosphere-Atmosphere-Ionosphere-Magnetosphere Coupling concept. In addition to multifactor analysis of GPS TEC time series, the GIM MAP technology was applied, clearly showing the locality of seismo-ionospheric anomalies and the correspondence of their spatial size to the Dobrovolsky estimate of the earthquake preparation zone radius. The application of ionospheric tomography techniques made it possible to study not only total electron content variations but also the modification of the vertical distribution of electron concentration in the ionosphere before earthquakes. The statistical check of the ionospheric precursors passed the
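
Identifying a deviation of TEC from its undisturbed level is, at its simplest, an outlier test against a trailing baseline. The following sketch flags samples that depart from the recent median by more than a spread-based bound; the window length, the factor k, and the TEC series are illustrative assumptions, far simpler than the multifactor analysis the abstract describes:

```python
import statistics

def tec_anomalies(tec, window=15, k=1.5):
    """Flag TEC samples deviating from the trailing median by more than
    k times the trailing standard deviation.  A simplified stand-in for
    the GPS TEC anomaly identification described above; note real
    analyses must also separate geomagnetic-storm effects."""
    flags = []
    for i, v in enumerate(tec):
        past = tec[max(0, i - window):i]
        if len(past) < 5:            # not enough history to judge yet
            flags.append(False)
            continue
        med = statistics.median(past)
        spread = statistics.stdev(past)
        flags.append(abs(v - med) > k * spread)
    return flags

# Quiet background around 20 TECU with one positive excursion
series = [20, 20.5, 19.8, 20.2, 20.1, 20.3, 19.9, 20.0, 27.0, 20.2]
flags = tec_anomalies(series)  # only the 27.0 sample is flagged
```

Both positive and negative excursions are caught by the absolute deviation, matching the observation above that pre-seismic deviations can have either sign.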

A flood of new electronic technologies promises to usher in the Information Age and alter economic and social structures. Telematics, a potent combination of telecommunications and computer technologies, could eventually bring huge volumes of information to great numbers of people by making large data bases accessible to computer terminals in…

Addressed to the new administration and the Congress, this summary report on Federal Government information management and technology issues begins by describing the environment in which information technology has been managed. Arguing that effective government depends directly on effective automation to support programs and initiatives, the…

According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools air utilized by the rack of information technology equipment to cool the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat generated by the rack of information technology equipment.

This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was in fact expected, and this lent credibility to the earthquake prediction among the public. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days earlier, in which we discussed this prediction, presented the Open Day, and held a scientific discussion with journalists about earthquake prediction and, more generally, about the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies, and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the May 11 Open Day. INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, staffed 24/7 all year; iii) guided tours through interactive exhibitions on earthquakes and the Earth's deep structure; iv) lectures on general topics, from the social impact of rumors to seismic risk reduction; v) 13 new videos on the YouTube.com/INGVterremoti channel explaining the earthquake process and giving updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

If you thought managed care was a tough nut to crack, wait until you have to start making decisions about your organization's information technology (IT). Information systems are complex and expensive, they can take years to implement, and, once installed, they need costly and regular upgrades. But for a contemporary clinical organization to function, this technology is as essential as power and water. For many years, information technology was seen as a black box, impenetrable and beyond real understanding. If done with knowledge and care, however, cracking the box opens up possibilities, not ruin.

The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

When looking at the history of technology, we can see that all inventions are not of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life, and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use this technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence of gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431527

This Policy establishes EPA's responsibilities and procedures for making its Electronic and Information Technology (EIT) products accessible to all people, including people with disabilities, in accordance with Section 508 of the Rehabilitation Act.

The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

Four issues of this newsletter on information technology and disabilities (ITD) contain the following articles: "Developing an Accessible Online Public Access Catalog at the Washington Talking Book and Braille Library" (Charles Hamilton); "Assistive Technology in the Science Laboratory: A Talking Laboratory Work Station for Visually Impaired…

This textbook is designed for a one-semester introductory course in which the goal is to give students a foundation in the basics of information technology (IT). It focuses on how the technology works, issues relating to its use and development, how it can lend personal and business advantages, and how it is creating a globally networked society.…

The role of information technology (IT) is changing and is becoming more important for the overall success of colleges today. The structure of IT has not changed much through the years, but a growing number of institutions are merging multiple areas of technology back into a single IT organization. The model of IT explored in…

Brian Wagner (l to r) with the U.S. Navy, Andrew Hiukenbein with NVision Solutions and Theresa Avoskey with the Naval Oceanographic Office at Stennis Space Center learn about the latest improvements in making flash drives secure during an Information Technology Expo held June 16. Various area companies visited Stennis during the day to offer exhibits for employees on a range of information technology topics. The theme of the daylong expo was 'The Road to Green IT Computing.'

A central regulatory mandate of the Environmental Protection Agency, spanning many Program Offices and issues, is to assess the potential health and environmental risks of large numbers of chemicals released into the environment, often in the absence of relevant test data. Models for predicting potential adverse effects of chemicals based primarily on chemical structure play a central role in prioritization and screening strategies yet are highly dependent and conditional upon the data used for developing such models. Hence, limits on data quantity, quality, and availability are considered by many to be the largest hurdles to improving prediction models in diverse areas of toxicology. Generation of new toxicity data for additional chemicals and endpoints, development of new high-throughput, mechanistically relevant bioassays, and increased generation of genomics and proteomics data that can clarify relevant mechanisms will all play important roles in improving future SAR prediction models. The potential for much greater immediate gains, across large domains of chemical and toxicity space, comes from maximizing the ability to mine and model useful information from existing toxicity data, data that represent huge past investment in research and testing expenditures. In addition, the ability to place newer “omics” data, data that potentially span many possible domains of toxicological effects, in the broader context of historical data is the means for opti

Information Technology and Indigenous People provides theoretical and empirical information related to the planning and execution of IT projects aimed at serving indigenous people. It explores many cultural concerns with IT implementation, including language issues and questions of cultural appropriateness, and brings together cutting-edge…

This document reports on a study of the application of information technology to emergency response for hazardous materials incidents. Focus is on the information needs of first responders, i.e., those who are first on the site of an incident. The re...

This profile includes a comprehensive set of information technology competencies that are grounded in core academic subject areas and built around four occupational clusters (information services and support, network systems, programming and software development, and interactive media) that reflect the job opportunities and skills required for…

The role of information technology in educational models of undergraduate and postgraduate medical education grew in the 1980s, influenced by the entry of PCs into medical practice and the creation of relevant databases, and particularly in the 1990s by the integration of information technology at the international level and the development of international networks, the Internet, telemedicine, etc. The development of new educational information technology is evident, proving that information in the transfer of medical knowledge, medical informatics, and communication systems forms the base of medical practice, medical education, and research in the medical sciences. Compared with traditional approaches to the concept, content, and techniques of medical education, new models of education for training health professionals using new information technology offer a number of benefits, such as: decentralization of and access to relevant data sources, collection and updating of data, a multidisciplinary approach to problem-solving and effective decision-making, and the affirmation of teamwork across medical and non-medical disciplines. Regardless of the dynamics of change and the progressive reform orientation within the health sector, the development of modern medical education is inevitable for all systems in which information technology and available databases, as the base of effective and scientifically grounded medical education of health care providers, guarantee efficient health care and improved population health.

Historical earthquakes are only known to us through written recollections, and so seismologists have long experience interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered the digital nervous system, comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We will show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We will also describe how public expectations within tens of seconds of ground shaking form the basis of improved, diversified information tools that integrate this user-generated content. Special attention will be given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application, and browser add-on, which deals with the only earthquakes that matter to the public: the felt and damaging ones. In conclusion, we will demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.
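The flashcrowd idea lends itself to a compact sketch: declare a candidate felt event when per-minute website hits jump far above a recent baseline. The class below is a hypothetical illustration; the window length, trigger factor, and minimum-hit floor are invented for the example and are not EMSC's actual parameters.

```python
# Illustrative sketch (not EMSC's production code): a felt event is
# suspected when per-minute hits jump well above a short-term baseline.

from collections import deque

class FlashcrowdDetector:
    def __init__(self, window=30, factor=5.0, min_hits=100):
        self.history = deque(maxlen=window)  # recent per-minute hit counts
        self.factor = factor                 # spike multiplier over baseline
        self.min_hits = min_hits             # absolute floor to suppress noise

    def update(self, hits_this_minute):
        """Feed one minute of traffic; return True if a spike is detected."""
        baseline = sum(self.history) / len(self.history) if self.history else 0.0
        spike = (hits_this_minute >= self.min_hits
                 and baseline > 0
                 and hits_this_minute >= self.factor * baseline)
        self.history.append(hits_this_minute)
        return spike

det = FlashcrowdDetector()
quiet = [det.update(20) for _ in range(30)]   # normal traffic: no triggers
print(any(quiet), det.update(400))            # False True
```

Because eyewitnesses reach the site within tens of seconds of shaking, such a traffic trigger can fire before seismic waves reach enough stations for an instrumental detection.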

its activity in 1981 and consists of 7 stations equipped with seismometers and digital acquisition technology, working 24 hours per day. Moreover, a network of 9 accelerometers has been set up in southern Trentino, where most of the seismic events are concentrated. All the data recorded at each station flow to the "Data Acquisition Central Office", where they are checked, processed, and recorded. The Geological Service manages the seismometric network and elaborates and publishes information on the seismicity of the area and its surroundings. In case of an earthquake, the "Seismic Alert", an automatic alarm system, is activated for Civil Protection purposes. The "Seismic Alert" is managed by "Antilope", the consortium of the Eastern Alpine seismometric networks. Seismotectonics is another research field pursued by this Geological Service, to investigate the formation mechanism of earthquakes and estimate the causative tectonic stress in relation to the main tectonic structures of the region and of the whole Alpine chain. Hence the Trento case study reported in this exhibition illustrates the general methodology used to understand the "seismic behaviour" of a region. Finally, this exhibition sector also presents the activity of the Trento Civil Protection in the Abruzzo region, where a dramatic seismic event occurred on 6 April 2009, describing the investigation of the still-occurring surface deformations. This activity is part of a general framework in which the Trento Province provided first aid and assistance to the local communities. The collaboration between the Natural Science Tridentino Museum and the Geological Service of Trento, already fruitful in field geological research, has also been effective in this science communication project. In the future the two institutions could collaborate on other major themes in the relationship between science and society regarding the dissemination of the Earth Sciences.

Scientific organizations like the United States Geological Survey (USGS) release information to support effective responses during an earthquake crisis. Information is delivered to the White House, the National Command Center, the Departments of Defense, Homeland Security (including FEMA), Transportation, Energy, and Interior. Other crucial stakeholders include state officials and decision makers, emergency responders, numerous public and private infrastructure management centers (e.g., highways, railroads and pipelines), the media, and the public. To meet the diverse information requirements of these users, rapid earthquake notifications have been developed for delivery by e-mail and text message, and a suite of earthquake information resources such as ShakeMaps, Did You Feel It?, PAGER impact estimates, and data are delivered via the web. The ShakeAlert earthquake early warning system being developed for the U.S. West Coast will identify and characterize an earthquake a few seconds after it begins, estimate the likely intensity of ground shaking, and deliver brief but critically important warnings to people and infrastructure in harm's way. The USGS is also currently developing a capability to deliver Operational Earthquake Forecasts (OEF). These provide estimates of potential seismic behavior after large earthquakes and during evolving aftershock sequences. Similar work is underway in New Zealand, Japan, and Italy. In the development of OEF forecasts, social science research conducted during these sequences indicates that aftershock forecasts are valued for a variety of reasons, from informing critical response and recovery decisions to psychologically preparing for more earthquakes. New tools will allow users to customize map-based, spatiotemporal forecasts to their specific needs. Hazard curves and other advanced information will also be available. For such authoritative information to be understood and used during the pressures of an earthquake

The experience of collecting data on earthquake effects and diffusing information to the public, carried on through the site "haisentitoilterremoto.it" (didyoufeelit) managed by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has shown constantly growing interest from Italian citizens. Started in 2007, the site has collected more than 520,000 compiled intensity questionnaires, producing intensity maps of almost 6,000 earthquakes. One of the most distinctive features of this experience is its bidirectional information exchange. Anyone can record observed effects of the earthquake and, at the same time, look at the generated maps. Seismologists, on the other hand, can see each earthquake described in real time through its effects across the whole territory. In this way people, giving punctual information, receive global information from the community, mediated and interpreted through seismological knowledge. The relationship among seismologists, mass media, and civil society is thus deep and rich. The presence of almost 20,000 permanent subscribers distributed over the whole Italian territory, alerted in case of an earthquake, has reinforced participation: subscribers are constantly informed by the seismologists, through e-mail, about events occurring in their area, even those of very small magnitude. The "alert" service serves as a reminder that earthquakes are a continuously present phenomenon, while also showing that high-magnitude events are very rare. This kind of information is helpful as it is fully complementary to that given by the media. We analyze the effects of our activity on society and the mass media. Awareness of seismic phenomena is present in every person, rooted in fear and in ideas of death and destruction, often with the deep belief that occurrence is very rare. This attitude feeds refusal and repression. When a strong earthquake occurs, surprise immediately changes into shock and desperation. A
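Turning questionnaires into community intensity values reduces, at its simplest, to grouping felt reports by locality and averaging. The sketch below uses a plain mean as a stand-in for the seismological weighting an operational system would apply; the function name and place names are invented for illustration.

```python
# Hypothetical sketch: aggregate felt-report questionnaires into a
# per-municipality intensity summary. A real system weights reports by
# questionnaire quality and uses macroseismic intensity scales.

from collections import defaultdict

def community_intensity(reports):
    """reports: iterable of (municipality, intensity) pairs from questionnaires.
    Returns {municipality: (mean intensity rounded to 0.1, report count)}."""
    by_place = defaultdict(list)
    for place, intensity in reports:
        by_place[place].append(intensity)
    return {p: (round(sum(vals) / len(vals), 1), len(vals))
            for p, vals in by_place.items()}

reports = [("Roma", 4), ("Roma", 5), ("Roma", 4), ("Frascati", 3)]
print(community_intensity(reports))  # {'Roma': (4.3, 3), 'Frascati': (3.0, 1)}
```

Keeping the report count alongside the mean matters: a map cell backed by three questionnaires is far less reliable than one backed by three hundred.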

The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detection, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing, such as automated S-wave arrival-time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and verification of aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure for access to automatic detection data for both research and algorithm development.
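Correlation-style detection, one ingredient of the subspace methods mentioned above, can be illustrated with a toy template matcher: slide a known waveform along a continuous trace and flag offsets where the normalized cross-correlation exceeds a threshold. This pure-Python sketch is for exposition only and is far simpler than any production detector.

```python
# Toy normalized cross-correlation detector. A subspace detector
# generalizes this to a basis of several templates; here we use one.

from math import sqrt

def correlate_template(trace, template, threshold=0.8):
    """Return (offset, correlation) pairs where the normalized
    cross-correlation of `template` against `trace` exceeds `threshold`."""
    n = len(template)
    t_energy = sqrt(sum(x * x for x in template))
    hits = []
    for i in range(len(trace) - n + 1):
        window = trace[i:i + n]
        w_energy = sqrt(sum(x * x for x in window))
        if w_energy == 0 or t_energy == 0:
            continue  # flat segment: correlation undefined
        cc = sum(a * b for a, b in zip(window, template)) / (w_energy * t_energy)
        if cc >= threshold:
            hits.append((i, round(cc, 2)))
    return hits

template = [0.0, 1.0, -1.0, 0.5]
trace = [0.0] * 5 + template + [0.0] * 5  # template buried at offset 5
print(correlate_template(trace, template))  # [(5, 1.0)]
```

The appeal for monitoring is sensitivity: a repeat of a known source fires the correlator even when its amplitude is near the noise floor of an energy-based trigger.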

Moving the administrative approval processes of the earthquake industry online is an important measure for improving work efficiency and public convenience. Based on an analysis of the characteristics and processes of administrative licensing in the earthquake industry, this paper proposes an online processing model based on ASP technology and an online processing system based on the B/S architecture, and presents the design and implementation methods. Application of the system shows that it is simple in design and complete in function, can be used on platforms such as computers and mobile phones, and has good practicability and a forward-looking design.

This research evaluates the public-education earthquake information available prior to the Canterbury earthquake sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of such campaigns in future. The research takes the practitioner perspective of someone who worked on these campaigns in Canterbury prior to the earthquake sequence and who was also the Public Information Manager Second in Command during the earthquake response in February 2011. Documents created prior to the earthquake sequence, specifically those addressing seismic risk, were analyzed for how closely they aligned with best-practice academic research, using a "best practice matrix" created by the researcher. Readability tests and word counts were also employed to assist with triangulation of the data, as was practitioner involvement. This research also outlines the lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the inception of the booklets. The findings showed that these documents lacked many of the attributes of best practice. The overly long, jargon-filled text contained few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, these booklets may well have created fatalism in the publics who read them. While the overall intention was positive, for scientists to explain earthquakes, tsunami, landslides, and other risks so as to encourage the public to prepare for these events, the implementation could be greatly improved. The final component of the research highlights points of improvement in implementation for more successful campaigns in future. The importance of preparedness and science information campaigns can lie not only in preparing the population but also in development of
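As an illustration of the readability tests mentioned, here is a minimal Flesch Reading Ease calculation with a crude vowel-group syllable counter. The study does not name its exact tool, so treat the syllable heuristic as an assumption; real readability software handles many exceptions this sketch ignores.

```python
# Illustrative Flesch Reading Ease score. Higher = easier to read;
# dense, jargon-filled text scores lower. Syllables are approximated
# by counting runs of consecutive vowels, a deliberately crude heuristic.

import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return round(206.835 - 1.015 * len(words) / sentences
                 - 84.6 * syllables / len(words), 1)

simple = "The cat sat. The dog ran."
hard = ("Comprehensive seismological preparedness necessitates "
        "considerable interdisciplinary collaboration.")
print(flesch_reading_ease(simple) > flesch_reading_ease(hard))  # True
```

Run against a long, jargon-heavy booklet, a score like this gives a quick quantitative signal of the accessibility problem the study identified.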

A tech prep/associate degree program in information technology was developed to prepare workers for entry into and advancement in occupations entailing applications of scientific principles and higher mathematics in situations involving various office machines. According to the articulation agreement reached, students from five country regional…

Enumerates the challenges of adopting information technology (IT)-based teaching and learning strategies in higher education. Concerns addressed include whether IT should supplant rather than augment traditional teaching methods, the financing of IT acquisition, change of teaching and learning processes to increase productivity per person, and…

Four issues of this newsletter on information technology and disabilities (ITD) contain the following articles: "Building an Accessible CD-ROM Reference Station" (Rochelle Wyatt and Charles Hamilton); "Development of an Accessible User Interface for People Who Are Blind or Vision Impaired as Part of the Re-Computerisation of Royal Blind Society…

The last twenty years has seen the development of demand for a new type of computing professional, which has resulted in the emergence of the academic discipline of Information Technology (IT). Numerous colleges and universities across the country and abroad have responded by developing programs without the advantage of an existing model for…

Information technologies are important tools for individual, team, and organizational learning. Developments in virtual reality and the Internet, performance support systems that increase the efficiency of individuals and groups, and other innovations have the potential to enhance the relationship between work and learning. (SK)

This document consists of all issues/pages of the electronic journal "Information Technology and Disabilities" published during 1996, i.e., a total of 13 ITD articles: (1) "New CSUF (California State University at Fullerton) Braille Transcription Center Promotes Access to Postsecondary Instructional Materials for the California State University…

Computerized geographic information systems (GISs) are emerging as the spatial data handling tools of choice for solving complex geographical problems. However, few guidelines exist for assisting potential users in identifying suitable hardware and software. A process to be followed in evaluating the merits of GIS technology is presented. Related standards and guidelines, software functions, hardware components, and benchmarking are discussed. By making users aware of all aspects of adopting GIS technology, they can decide if GIS is an appropriate tool for their application and, if so, which GIS should be used.

To provide an editorial introduction to the 2006 IMIA Yearbook of Medical Informatics with an overview of its contents and contributors. A brief overview of the main theme of 'Assessing Information Technology for Health Care', and an outline of the purposes, readership, contents, new format, and acknowledgment of contributions for the 2006 IMIA Yearbook. Assessing information technology (IT) in biomedicine and health care is emphasized in a number of survey and review articles. Synopses of a selection of best papers for the past 12 months are included, as are original papers on the history of medical informatics by pioneers in the field, and selected research and education programs. Information about IMIA and its constituent societies is given, as well as the authors, reviewers, and advisors to the Yearbook. The 2006 IMIA Yearbook of Medical Informatics highlights as its theme one of the most significant yet difficult aspects of information technology in health: the assessment of IT as part of the complex enterprise of biomedical research and practice. It is being published in a new format with a wide range of original survey and review articles.

The era of the networked society--and medical care depending on networked intelligence--is dawning. Physicians need to plan for office practice information systems in common, with an eye to conveying data electronically between all the locations of care and all the providers involved in caring for defined populations of people. The shared database will become the most important asset of the collection of providers who make up the delivery system that creates it. This will be accomplished by layering technology on local and wide-area networks of group practices, hospitals, health plans, and payers and developing standards that make data accessible in the same format to all users, no matter where they are.

This paper proposes a new viewpoint on building a CSPMS (Coal-mine Safety Production Management System) by means of information technology. The system, whose core is a four-grade automatically triggered warning system, achieves smooth, lossless, and timely information transmission. At the same time, the system provides a comprehensive, collective technology platform for various public management organizations and coal-mine production units to deal with safety management, advance warning, unexpected incidents, preplan implementation, and resource deployment at different levels. The system's database will effectively support the related national industry's resource control, planning, statistics, and taxation, and the construction of laws and regulations.
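A four-grade automatically triggered warning can be illustrated as a simple threshold ladder. The methane thresholds and grade actions below are invented for the example; the paper does not specify its trigger values or monitored quantities.

```python
# Hypothetical sketch of a four-grade triggered warning: map a sensor
# reading to one of four response grades. Thresholds are illustrative only.

GRADES = [  # (threshold, grade), checked from most to least severe
    (1.5, "Grade I: evacuate and cut power"),
    (1.0, "Grade II: stop work, reinforce ventilation"),
    (0.8, "Grade III: heighten monitoring"),
]

def warning_grade(methane_pct):
    """Map a measured methane concentration (%) to a warning grade."""
    for threshold, grade in GRADES:
        if methane_pct >= threshold:
            return grade
    return "Grade IV: normal operation"

print(warning_grade(0.5))  # Grade IV: normal operation
print(warning_grade(1.2))  # Grade II: stop work, reinforce ventilation
```

Ordering the thresholds from most to least severe and returning on the first match keeps the escalation logic auditable, which matters when the same trigger drives reporting to multiple management levels.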

This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.

Earthquake Summary Posters (ESP's), a new product of the U.S. Geological Survey's Earthquake Program, are produced at the National Earthquake Information Center (NEIC) in Golden. The posters consist of rapidly-generated, GIS-based maps made following significant earthquakes worldwide (typically M>7.0, or events of significant media/public interest). ESP's consolidate, in an attractive map format, a large-scale epicentral map, several auxiliary regional overviews (showing tectonic and geographical setting, seismic history, seismic hazard, and earthquake effects), depth sections (as appropriate), a table of regional earthquakes, and a summary of the regional seismic history and tectonics. The immediate availability of the latter text summaries has been facilitated by the availability of Rapid, Accurate Tectonic Summaries (RATS) produced at NEIC and posted on the web following significant events. The rapid production of ESP's has been facilitated by generating, during the past two years, regional templates for tectonic areas around the world by organizing the necessary spatially-referenced data for the map base and the thematic layers that overlay the base. These GIS databases enable scripted Arc Macro Language (AML) production of routine elements of the maps (for example background seismicity, tectonic features, and probabilistic hazard maps). However, other elements of the maps are earthquake-specific and are produced manually to reflect new data, earthquake effects, and special characteristics. By the end of this year, approximately 85% of the Earth's seismic zones will be covered for generating future ESP's. During the past year, 13 posters were completed, comparable to the yearly average expected for significant earthquakes. Each year, all ESPs will be published on a CD in PDF format as an Open-File Report. In addition, each is linked to the special event earthquake pages on the USGS Earthquake Program web site (http://earthquake.usgs.gov). Although three formats

The development of computer science over the last 30 years has had a profound influence on human life, changing paradigms in daily activities such as public policy, commerce, education, and science. The aim of this editorial is to offer some considerations on how the development of information and communication technology has influenced the spread of scientific knowledge through its use in medical publications.

Information Technology (IT) Security Risk Management is a critical task for the organization to protect against the loss of confidentiality, integrity, and availability of IT resources. As systems become more complex and diverse and attacks from intrusions and malicious content increase, it is becoming increasingly difficult to manage IT security risk. This paper describes a two-pronged approach to addressing IT security risk and risk management in the organization: 1) an institutional enterprise approach, and 2) a project life cycle approach.

A new Operator display subsystem is being incorporated as part of the next-generation United States Navy (USN) helicopter avionics system to be integrated into the Multi-Mission Helicopter (MMH), which will replace both the SH-60B and the SH-60F in 2001. This subsystem exploits state-of-the-art technology for the display hardware, the display driver hardware, information presentation methodologies, and software architecture. The base technologies have evolved during the development period, and the solution has been modified to include current elements such as high-resolution AMLCD color displays that are sunlight readable, highly reliable, and significantly lighter than CRT technology, as well as Reduced Instruction Set Computer (RISC) based high-performance display generators that have only recently become feasible to implement in a military aircraft. This paper describes the overall subsystem architecture, gives some detail on the individual elements along with supporting rationale, and explains the manner in which the display subsystem provides the necessary tools to significantly enhance the performance of the weapon system through the vital Operator-System Interface. Also addressed are a summary of the evolution of the design leading to the current approach to MMH Operator displays and display processing, as well as the growth path that the MMH display subsystem will most likely follow as additional technology evolution occurs.

The 2010 earthquake in Haiti, which killed an estimated 316,000 people, offered many lessons in mass-fatality management (MFM). The dissertation defined MFM as seeking information and the recovery, preservation, identification, and disposition of human remains. Specifically, it examined how mass fatalities were managed in Haiti, how affected…

The Physics of Information Technology explores the familiar devices that we use to collect, transform, transmit, and interact with electronic information. Many such devices operate surprisingly close to fundamental physical limits. Understanding how such devices work, and how they can (and cannot) be improved, requires deep insight into the character of physical law as well as engineering practice. The book starts with an introduction to units, forces, and the probabilistic foundations of noise and signaling, then progresses through the electromagnetics of wired and wireless communications, and the quantum mechanics of electronic, optical, and magnetic materials, to discussions of mechanisms for computation, storage, sensing, and display. This self-contained volume will help both physical scientists and computer scientists see beyond the conventional division between hardware and software to understand the implications of physical theory for information manipulation.

The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform provides an integrated system from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit the particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

Technology is the sum of the ways in which social groups manipulate order in the world to achieve their ends. It enables our active engagement with the world. Technology is central to our present well-being and vital for our future survival. As such it needs a coherent world view, a conceptual framework which will enable the fundamental problems that it poses for society to be approached in an illuminating way. Furthermore, such an approach, while remaining convincing, must not be overwhelmed by an ever-increasing welter of specialization and diversity of application. It is the purpose of the set of papers presented here to examine some key aspects of such a conceptual framework; not in the sense of offering a fully worked out philosophy of technology--that would be a huge and complex undertaking--but rather by considering some key topics. Subsidiary aims are to survey important relevant areas, to identify key sources that can provide access points for further study, and to consider some possible future developments. Major, coherent domains of activity are characterized by a few, fundamental, extensively used and essentially unifying concepts. Technology is such a domain, and its fundamental concepts are information, knowledge and agency. The following sections give a synoptic overview of the material presented in this theme issue, and set it within a wider context.

NASA's Information Technology (IT) resources and IT support continue to be a growing and integral part of all NASA missions. Furthermore, the growing IT support requirements are becoming more complex and diverse. The following are a few examples of the growing complexity and diversity of NASA's IT environment. NASA is conducting basic IT research in the Intelligent Synthesis Environment (ISE) and Intelligent Systems (IS) Initiatives. IT security, infrastructure protection, and privacy of data are requiring more and more management attention and an increasing share of the NASA IT budget. Outsourcing of IT support is becoming a key element of NASA's IT strategy, as exemplified by the Outsourcing Desktop Initiative for NASA (ODIN) and the outsourcing of NASA Integrated Services Network (NISN) support. Finally, technology refresh is helping to provide improved support at lower cost. Recently the NASA Automated Data Processing (ADP) Consolidation Center (NACC) upgraded its bipolar technology computer systems with Complementary Metal Oxide Semiconductor (CMOS) technology systems. This NACC upgrade substantially reduced the hardware maintenance and software licensing costs, significantly increased system speed and capacity, and reduced customer processing costs by 11 percent.

Non-health-care uses of information technology (IT) provide important lessons for health care informatics that are often overlooked because of the focus on the ways in which health care is different from other domains. Eight examples of IT use outside health care provide a context in which to examine the content and potential relevance of these lessons. Drawn from personal experience, five books, and two interviews, the examples deal with the role of leadership, academia, the private sector, the government, and individuals working in large organizations. The interviews focus on the need to manage technologic change. The lessons shed light on how to manage complexity, create and deploy standards, empower individuals, and overcome the occasional “wrongness” of conventional wisdom. One conclusion is that any health care informatics self-examination should be outward-looking and focus on the role of health care IT in the larger context of the evolving uses of IT in all domains. PMID:10495095

The real-time earthquake information system (REIS) of the Japanese seismic network automatically determines earthquake parameters within a few seconds after the P-waves arrive at the closest stations, using both the P-wave arrival times and the fact that P-waves have not yet arrived at other stations. REIS results play a fundamental role in real-time information for earthquake early warning in Japan. We show the rapidity and accuracy of REIS from an analysis of 4,050 earthquakes in the three years since 2005: 44 percent of the first reports are issued within 5 seconds after the first P-wave arrival, and 80 percent of the events have an epicenter within 20 km of the manually determined location. We compared the formal catalog to the magnitudes estimated by the real-time analysis and found that 94 percent of the events had a magnitude difference within +/-1.0 unit.

Satellites will operate more like wide area broadband computer networks in the 21st Century. Space-based information and communication technologies will therefore be a lot more accessible and functional for the individual user. These developments are the result of earth-based telecommunication and computing innovations being extended to space. The author predicts that the broadband Internet will eventually be available on demand to users of terrestrial networks wherever they are. Earth and space communication assets will be managed as a single network. Space networks will assure that online access is ubiquitous. No matter whether users are located in cities or in remote locations, they will always be within reach of a node on the Internet. Even today, scalable bandwidth can be delivered to active users when moving around in vehicles on the ground, or aboard ships at sea or in the air. Discussion of the innovative technologies produced by NASA's Advanced Communications Technology Satellite (1993-2004) demonstrates future capabilities of satellites that make them uniquely suited to serve as nodes on the broadband Internet.

situation of whole damage of the city and necessity of evacuation with optimum timing and access. According to the evaluation by the city staff through the experiments, information technology is useful for rationally implementing initial responses immediately after a large earthquake, despite some needed improvements to the systems used in the experiments.

Since the Wenchuan earthquake in 2008, dramatic progress on earthquake early warning (EEW) has been made by the Institute of Care-life (ICL) in China. ICL's EEW research covers choosing appropriate sensors, methods of installing the sensors, automatic processing of seismic-wave data for EEW, and methods of delivering EEW warnings to the public, schools, and lifeline projects. ICL innovatively applies distributed computing and cloud computing technology. So far, ICL has deployed over 5,500 EEW sensors in China, five times the number of EEW sensors in Japan, covering more than 2.1 million square kilometers. Since June 2011, over 5,000 earthquakes, 28 of them destructive, have triggered the EEWS with no false alerts. The root mean square (RMS) error of the magnitude for the 28 destructive quakes is 0.32. In addition, innovative work has been done to suppress false and missed alarms, which pushes forward the application of EEW in China. The technology is now also being applied in Nepal.

The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational

The Berkeley Seismological Laboratory operates seismic and geodetic stations in the San Francisco Bay area and northern California for earthquake and deformation monitoring. The seismic systems, part of the Berkeley Digital Seismic Network (BDSN), include strong motion and broadband sensors, and 24-bit dataloggers. The data from 20 GPS stations, part of the Bay Area Regional Deformation (BARD) network of more than 70 stations in northern California, are acquired in real-time. We have developed methods to acquire GPS data at 12 stations that are collocated with the seismic systems using the seismic dataloggers, which have large on-site data buffer and storage capabilities, merge it with the seismic data stream in MiniSeed format, and continuously stream both data types using reliable frame relay and/or radio modem telemetry. Currently, the seismic data are incorporated into the Rapid Earthquake Data Integration (REDI) project to provide notification of earthquake magnitude, location, moment tensor, and strong motion information for hazard mitigation and emergency response activities. The geodetic measurements can provide complementary constraints on earthquake faulting, including the location and extent of the rupture plane, unambiguous resolution of the nodal plane, and distribution of slip on the fault plane, which can be used, for example, to refine strong motion shake maps. We are developing methods to rapidly process the geodetic data to monitor transient deformation, such as coseismic station displacements, and for combining this information with the seismic observations to improve finite-fault characterization of large earthquakes. The GPS data are currently processed at hourly intervals with 2-cm precision in horizontal position, and we are beginning a pilot project in the Bay Area in collaboration with the California Spatial Reference Center to do epoch-by-epoch processing with greater precision.

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
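The tweet-rate jump described above (a background below 1 "earthquake" tweet per hour rising to about 150 per minute) lends itself to a simple sliding-window rate detector. The sketch below is illustrative only: the window length and threshold are our assumptions, not the USGS's actual parameters.

```python
from collections import deque

def spike_detector(timestamps, window_s=60.0, threshold=10):
    """Return the first time at which the number of 'earthquake' tweets
    in a sliding window exceeds a threshold far above the background
    rate. `timestamps` are tweet arrival times in seconds, ascending."""
    window = deque()
    for t in timestamps:
        window.append(t)
        # Drop tweets older than the window.
        while window[0] < t - window_s:
            window.popleft()
        if len(window) >= threshold:
            return t  # detection time
    return None  # no spike found

# Background of ~1 tweet/hour, then a burst starting at t = 3600 s
# (rates here are synthetic, chosen to mimic the abstract's numbers).
quiet = [0.0, 1800.0, 3599.0]
burst = [3600.0 + i * 0.4 for i in range(150)]  # ~150 tweets/minute
t_detect = spike_detector(quiet + burst)
```

With these synthetic arrivals the detector fires within a few seconds of the burst onset, consistent with the abstract's observation that the spike is visible within the first minute.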

It is important to note that this document provides a brief introduction to the work of dozens of software developers and IT specialists, spanning in many cases more than a decade. References to significant amounts of supporting documentation, code, and information are supplied within.

NASA Langley Research Center's product is aerospace research information. To this end, Langley uses information technology tools in three distinct ways. First, information technology tools are used in the production of information via computation, analysis, data collection, and reduction. Second, information technology tools assist in streamlining business processes, particularly those that are primarily communication based. By applying these tools to administrative activities, Langley spends fewer resources on managing itself and can allocate more resources to research. Third, Langley uses information technology tools to disseminate its aerospace research information, resulting in faster turnaround time from the laboratory to the end customer.

The FDSN web service provides a web interface for accessing earthquake metadata (e.g., event or station information) and waveform data over the internet. Requests are sent to a server as URLs, and the output is either XML or miniSEED, which is hard for humans to read but easy to process with software. Several data centers already support the FDSN web service, e.g., USGS, IRIS, and ORFEUS. The FDSN web service is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is a single web page presenting the observed events and further event and station information in a table and on an interactive map. In addition, the user can download event information, waveform data, and station data in different formats such as miniSEED, quakeML, or FDSNxml. The developed code and all used libraries are open source and freely available.
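As the abstract notes, FDSN web service requests are plain URLs, and the responses are machine-friendly. A minimal sketch of building an fdsnws-event query and parsing its pipe-delimited "text" output format (the endpoint shown is IRIS's public service, following the FDSN web service specification; the sample response line is illustrative):

```python
from urllib.parse import urlencode

# Build an FDSN event-service query URL; parameter names follow the
# FDSN web service specification (fdsnws-event).
def build_event_query(base, **params):
    return base + "/fdsnws/event/1/query?" + urlencode(params)

url = build_event_query(
    "https://service.iris.edu",
    starttime="2014-01-01", endtime="2014-02-01",
    minmagnitude=6.5, format="text",
)

# The "text" output format is pipe-delimited, one event per line,
# with a "#"-prefixed header; a minimal parser:
def parse_event_text(payload):
    events = []
    for line in payload.splitlines():
        if not line or line.startswith("#"):
            continue
        f = line.split("|")
        events.append({
            "id": f[0], "time": f[1],
            "lat": float(f[2]), "lon": float(f[3]),
            "depth_km": float(f[4]), "mag": float(f[10]),
        })
    return events

# Illustrative sample of what such a server response looks like.
sample = (
    "#EventID|Time|Latitude|Longitude|Depth/km|Author|Catalog|Contributor|"
    "ContributorID|MagType|Magnitude|MagAuthor|EventLocationName\n"
    "4417721|2014-01-01T16:03:29|-13.86|167.25|187.0|us|NEIC|us|usb000m4lc|"
    "MW|6.5|us|VANUATU ISLANDS"
)
events = parse_event_text(sample)
```

Fetching `url` with any HTTP client and feeding the body to `parse_event_text` yields a list of event dictionaries ready for a table or map, much as the browser-side JavaScript described above does.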

All phases of disaster management require up-to-date and accurate information. Different in-situ and remote sensor systems help to monitor dynamic properties such as air quality, water level, or inundated areas. The rapid emergence of web-based services has facilitated the collection, dissemination, and cartographic representation of spatial information from the public, giving rise to the idea of using Volunteered Geographic Information (VGI) to aid disaster management. In this study, with a brief review of the concept and development of disaster management, opportunities and challenges for applying VGI in disaster management were explored. The challenges, including data availability, data quality, data management, and legal issues of using VGI for disaster management, were discussed in detail, with particular emphasis on the actual needs of disaster management practice in China. Three approaches to assuring VGI data quality, namely the classification and authority design of volunteers, a government-led VGI data acquisition framework for disaster management, and a quality assessment system for VGI, were presented and discussed. As a case study, a prototype of a VGI-oriented earthquake disaster databank and sharing platform, an open WebGIS system in which volunteers and other interested individuals collaboratively create and manage earthquake disaster related information, was proposed to provide references for improving the level of earthquake emergency response and disaster mitigation in China.

Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible with both COTS hardware and flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.

Progress in electronics and optics offers faster computers and rapid communication via the internet, matched by ever larger and evolving storage systems. Instinctively one assumes that this must be totally beneficial. However, advances in software and storage media are progressing in ways that are frequently incompatible with earlier systems, and economics and commercial pressures rarely guarantee total compatibility; instead, the industries actively choose to force users to purchase new systems and software. Thus we are moving forward with new technological variants that can access only the most recent systems, and we will have lost the earlier alternatives. The reality is that increased processing speed and storage capacity are matched by an equally rapid decline in the access and survival lifetime of older information. This pattern is not limited to modern electronic systems but is evident throughout history, from writing on stone and clay tablets to papyrus and paper. It is equally evident in image systems from painting, through film, to magnetic tapes and digital cameras. In sound recording we have variously progressed from wax discs to vinyl, magnetic tape, and CD formats. In each case the need for better definition and greater capacity has forced the earlier systems into oblivion. Indeed, proposed interactive music systems could similarly relegate music CDs to specialist collections. The article will track some of these examples and discuss the consequences, noting that this information loss is further compounded by developments in language and changes in the cultural views of different societies.

Five examples illustrate how geologic and seismologic information can be used to reduce the effects of earthquakes. Included are procedures for anticipating damage to critical facilities; preparing, adopting, or implementing seismic safety studies, plans, and programs; retrofitting highway bridges; regulating development in areas subject to fault rupture; and strengthening or removing unreinforced masonry buildings. The collective effect of these procedures is to improve the public safety, health, and welfare of individuals and their communities. © 1984 Springer-Verlag New York Inc.

In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects, and involvement with the San Bernardino County Museum in Redlands beginning in 2007, led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities, from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

The language of information technology is discussed, with a focus on accessibility in the information society. The metaphors of information technology as an "information superhighway" or "infobahn" are analyzed; limitations of the "road system" and developments of Internet systems are considered. The concept of…

Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

The Internet has accelerated the collection of felt reports and macroseismic data after global earthquakes. At the European-Mediterranean Seismological Centre (EMSC), where the traditional online questionnaires have been replaced by thumbnail-based questionnaires, an average of half of the reports are collected within 10 minutes of an earthquake's occurrence; in regions where EMSC is well identified, this drops to 5 minutes. The user simply selects the thumbnail corresponding to the observed effects, erasing language barriers and improving collection via small smartphone screens. A previous study has shown that EMSC data are well correlated with "Did You Feel It" (DYFI) data and three independent, manually collected datasets. The efficiency and rapidity of felt-report collection through thumbnail-based questionnaires does not necessarily mean that they offer a complete picture of the situation for all intensity values, especially the higher ones. There are several potential limitations. Demographics probably play a role, but so might eyewitnesses' behavior: it is probably not their priority to report when their own safety and that of their loved ones is at stake. We propose to test this hypothesis on EMSC felt reports and to extend the study to LastQuake smartphone application use. LastQuake is a free smartphone app providing very rapid information on felt earthquakes. There are currently 210,000 active users around the world, covering almost every country except a few in Sub-Saharan Africa. Along with felt reports, we also analyze the characteristics of LastQuake app launches. For both composite datasets, created from 108 earthquakes, we analyze the rapidity of eyewitnesses' reactions, how they change with intensity values, and surmise how they reflect different types of behavior. We will show the intrinsic limitations of crowdsourced information for rapid situation awareness. More importantly, we will show in which cases the lack of crowdsourced information could

Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses to disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing a more vital role in all stages of disaster risk management and response. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly emphasizes the importance of utilizing geospatial information technology for disaster risk reduction. This presentation reports recent practical applications of geospatial information technology for disaster risk management and response.

To realize cross-sectional information sharing in the Tokyo metropolitan area, we develop disaster management applications that reduce the negative impact of critical issues in the initial response phase, and the cooperation of those applications is demonstrated to public officials in charge of disaster management. We report on a demonstration of information sharing among disaster-related organizations, focusing on simultaneous multiple post-earthquake fires and on rescue operations after an earthquake directly beneath Tokyo.

The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough in terms of a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC (2013) identified. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom friction terms by a finite-difference method. Run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and alpha = 0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSA, by following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that such a re-distribution of the probability is only tentative, because present seismology cannot yet provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize a number of tsunami hazard curves at every evaluation point on the coasts by integrating the information about 30 years occurrence
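The renewal-process step (3) can be sketched numerically. The BPT (inverse-Gaussian) CDF has a closed form in terms of the standard normal CDF, so the 30-year conditional probability follows directly from the mean recurrence interval T = 88.2 yr and aperiodicity alpha = 0.24 quoted above. The elapsed time of 67 years (counted from the 1946 Nankai earthquake) is an illustrative assumption, not a value taken from the abstract.

```python
import math

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse-Gaussian) distribution
    with mean recurrence interval `mean` and aperiodicity `alpha`."""
    phi = lambda x: 0.5 * math.erfc(-x / math.sqrt(2.0))  # standard normal CDF
    u1 = (math.sqrt(t / mean) - math.sqrt(mean / t)) / alpha
    u2 = (math.sqrt(t / mean) + math.sqrt(mean / t)) / alpha
    return phi(u1) + math.exp(2.0 / alpha ** 2) * phi(-u2)

def conditional_prob(elapsed, window, mean, alpha):
    """P(event within `window` yr | quiet for `elapsed` yr) under a renewal process."""
    f_e = bpt_cdf(elapsed, mean, alpha)
    return (bpt_cdf(elapsed + window, mean, alpha) - f_e) / (1.0 - f_e)

T, ALPHA = 88.2, 0.24      # values quoted from ERC (2013)
ELAPSED = 67.0             # illustrative elapsed time (yr) since the last event
p30 = conditional_prob(ELAPSED, 30.0, T, ALPHA)
print(f"P30 ~ {p30:.0%}")  # lands in the 60-70% band reported by ERC
```

With these inputs the conditional probability falls in the 60-70% band that the ERC evaluation reports; the subgroup probabilities of step (4) then follow by simple division, P30(i)/N.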

The Strategic Information Technology Plan of Washington is introduced and explained. The plan is mandated by state law to create a new framework for communication and collaboration to bring together agency technology planning with the achievement of statewide information technology goals and strategies. It provides a point of reference for the…

The current post-secondary graduation rates in computing disciplines suggest American universities are only training enough students to fill one third of the projected 1.4 million technology and computing jobs available (National Center for Women and Information Technology, 2011). Pursuit of information technology (IT) majors depends, to a great…

Presented in this paper is a modified interpretation of the traditional TRLs aimed solely at information technology. The intent of this new set of definitions is twofold: First, to enable a definitive measurement of progress among developing information technologies for spacecraft; and second, to clarify particular challenges and requirements that must be met as these technologies are validated in increasingly realistic environments.

Why do information technology companies support exploratory research in physics and allied fields? The answer is simple: because of the need to bring new technology quickly to market. Ultimately, even long-term research is all about speed.

Considers problems and solutions for the transfer of technological information for developing nations. Imbalances created by industrial growth have brought the concept of choice of technologies to the forefront of national objectives. (RAA)

Waves of enthusiasm for technological innovations that promise to revitalize teaching and learning are at least a century old. Unfortunately, the record of accomplishment for the many varieties of hardware and software introduced into schools over the decades is remarkably thin. Today's promoters of technology in education tend to forget similar…

This article presents a timeline of NEIC response to a major global earthquake for the first time in a formal journal publication. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar, and necessary, improvements in the future.

In the current rapidly changing business environment, companies need to be efficient and agile to survive and thrive. That is why flexible systems integration is an urgent and crucial concern for any enterprise. Meanwhile, systems integration technology is becoming more complicated, and the boundaries between middleware types have been blurring for decades. We sort systems integration into four types: "Delayed Federation", "Real-time Federation", "Delayed Integration", and "Real-time Integration". We also outline the appropriate technology and architecture for each type.

A rapid technological information dissemination system for the field of remote sensing is presented. The technology transfer staff systematically designed instructional materials and activities, using the matrix as an organizer, to meet the needs of students, scientists, and users in a rapidly expanding technology.

These proceedings contain 27 papers developed for a conference at which information was provided on currently available and future technological alternatives for delivery of career information. The presentations by staff of State Occupational Information Coordinating Committees, Career Information Delivery Systems, and hardware vendors are grouped…

Intellectual pursuit and the recognition of ideas are central concepts. Copyrights protect the rights of intellectual creators while balancing those rights with the need for access. As technologies have expanded and production has become more sophisticated, the legal regulations surrounding their use have become more complex. With the advent of…

This paper explores the claims of technology's ability to enhance citizen participation, with particular attention focused on the Internet. The claims are grounded within the larger context of political theory, specifically the tension between representative and direct forms of democracy. Sections of the paper are: Introduction; "What's Wrong with…

Discusses a unit on nuclear technology which is taught in a physics class. Explains the unit design, implementation process, demonstrations used, and topics of discussion that include light and optics, naturally and artificially produced sources of radioactivity, nuclear equations, isotopes and half-lives, and power-generating nuclear reactors.…

There is an assumption that psychiatric nurses are late adopters of technology because psychiatric nursing has traditionally been viewed as a nontechnological nursing specialty. This article reviews current nursing literature to outline the value and significance of computers and information technologies to psychiatric nursing, drawing on existing bodies of research literature related to computers and information technology for psychiatric nurses. Three areas of psychiatric nursing are identified, and the specific advantages and benefits of computers and information technologies in each of these areas are discussed. In addition, the importance of informatics competencies for psychiatric nursing practice is reiterated in order to accelerate their acquisition.

Social technologies such as Weblogs, wikis, and social bookmarking are emerging both as information resources and as tools for research. This paper reflects on these technologies and suggests they may be well placed to build fluency in the higher-order thinking skills outlined in various information literacy frameworks, particularly in an…

Information technology is having a profound effect on higher education in North America, and Brandon University in Manitoba (Canada) is in a position to join this movement in its early stages. The case for integrating information technology into the curriculum is argued, and the potential role of the new library complex in the teaching function is…

Provides an overview of the theoretical arguments and problems encountered in the implementation of information technology in Chinese language teaching. States there is a belief that teaching and learning can be enhanced with the introduction of information technology, explaining that it may increase students' motivation to learn. (CMK)

The Information Technology Plan as set forth in this document provides the organizational structure, assignment of duties, infrastructure, and philosophic environment that should permit the Kern Community College District to proceed in an orderly manner in developing information technology to enhance student learning, serve the faculty, and…

We study the breadth of inclusion of information technology in sport management (SM) programs, surveying program sponsoring colleges and universities within a prominent state-university system. Our results indicate a very low number of SM programs require any type of information technology courses as part of their core requirements. In fact, only…

This study investigated variables contributing to older adults' information technology acceptance through a survey, which was used to find factors explaining and predicting older adults' information technology acceptance behaviors. Four factors, including needs satisfaction, perceived usability, support availability, and public acceptance, were…

Since little empirical research has been conducted on adoption of currently available information technology by the advertising industry, a study explored the extent of advertising agencies' adoption of selected information technologies such as online database services and electronic mail. The study discussed data from earlier studies and analyzed…

The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology-focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

The NSF has been a leader in the development of new information technologies, including support for work in education and technology. Often, opportunities for educators are found in larger efforts. This is the case for the Information Technology Research (ITR) program. It has now been extended to education areas, as announced in NSF Publication 00-126. Links to the program announcement in multiple formats are found at http://www.nsf.gov/cgi-bin/getpub?nsf00126.

Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public. Wherever they are, users can be automatically informed when an earthquake has struck simply by setting a magnitude threshold and an area of interest. No need to browse the Internet: the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? A while after a damaging earthquake, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter to the public by collating different information threads covering tsunamigenic, damaging, and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the USGS, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. We will present the identification process for felt earthquakes, the smartphone application, and the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.
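As a rough illustration of the flashsourcing idea described above, a surge in website hits can be flagged against a trailing baseline. The EMSC's actual detector is more sophisticated (it also geolocates the converging eyewitnesses); the window, threshold factor, and traffic numbers below are invented for this sketch.

```python
from collections import deque

def detect_surges(hits_per_minute, window=10, factor=5.0, min_hits=50):
    """Flag minutes whose hit count jumps above `factor` times the trailing
    `window`-minute average (a toy stand-in for EMSC flashsourcing)."""
    baseline = deque(maxlen=window)
    surges = []
    for minute, hits in enumerate(hits_per_minute):
        avg = sum(baseline) / len(baseline) if baseline else 0.0
        if baseline and hits >= min_hits and hits > factor * avg:
            surges.append(minute)
        baseline.append(hits)  # surge minutes enter the baseline, raising the bar
    return surges

# Synthetic per-minute hit counts: quiet traffic, then eyewitnesses converge.
traffic = [12, 10, 14, 11, 13, 12, 480, 950, 730, 320]
print(detect_surges(traffic))  # [6, 7]
```

Because flagged minutes feed back into the baseline, a sustained surge is only reported at its onset, which is the moment of interest for rapid detection.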

It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is less easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large data sets. Three limiting paradigms are as follows: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear model of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

As everyone knows, knowledge (along with its less-sophisticated sibling, information) is power. For a long time, detailed knowledge (in agriculture) has been generally inaccessible or was prohibitively expensive to acquire. Advances in electronics, communications, and software over the past several decades have removed those earlier impediments. Inexpensive sensors and...

Based on a synopsis of system intelligence and information services, this paper puts forward the attributes and logical structure of information services, sets forth an intelligent-technology framework for electronic information systems, presents a series of measures, such as optimizing business information flow, advancing data-driven decision capability, improving information fusion precision, strengthening deep learning applications, and enhancing prognostics and health management, and demonstrates system operational effectiveness. This will benefit the enhancement of system intelligence.

The Information Age is transforming our economy and our lives. In its pathbreaking 1999 report to President Clinton, the Presidential Information Technology Advisory Committee (PITAC) outlined the ten crucial ways that new technologies are transforming society in the U.S. It is clear that the Federal government will need to provide the critical R&D investments that will help retain and bolster the U.S. technological lead in the 21st century. These investments will also support efforts to make new technologies and their benefits available to all U.S. citizens.

The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

Although children are often exposed to technological devices early in life, little is known about how they evaluate these novel sources of information. In two experiments, children aged 3, 4, and 5 years old ("n" = 92) were presented with accurate and inaccurate computer informants, and they subsequently relied on information provided by…

Purpose--The purpose of this paper is to study the provisions of information technology (IT) for the development of academic resources and to examine the effect of IT in academic institutions for sharing information. Design/methodology/approach--The paper examines the role of IT in sharing information in academic institutions and explores the IT…

The objectives of this survey were: to gather information on the development of institutional information technology policies and guidelines for responsible computing and use of electronic information; to identify the scope of such policies and guidelines; and to determine the role of the library in the development and/or use of the policies and…

This slide presentation reviews how information technology supports the Human Research Facility (HRF) and, specifically, the uses that the contractor has for the information. It includes information about the contractor, the HRF, some of the experiments performed using the HRF on board the Shuttle, overviews of the data architecture, and software, both commercial and specially developed for the specific experiments.

Health information technology has been embraced as a strategy to facilitate patients' access to their health information and engagement in care. However, not all patients are able to access, or are capable of using, a computer or mobile device. Although family caregivers assist individuals with some of the most challenging and costly health needs, their role in health information technology is largely undefined and poorly understood. This perspective discusses challenges and opportunities of engaging family caregivers through the use of consumer-oriented health information technology. We compile existing evidence to make the case that involving family caregivers in health information technology as desired by patients is technically feasible and consistent with the principles of patient-centered and family-centered care. We discuss how more explicit and purposeful engagement of family caregivers in health information technology could advance clinical quality and patient safety by increasing the transparency, accuracy, and comprehensiveness of patient health information across settings of care. Finally, we describe how clarifying and executing patients' desires to involve family members or friends through health information technology would provide family caregivers greater legitimacy, convenience, and timeliness in health system interactions, and facilitate stronger partnerships between patients, family caregivers, and health care professionals.

The work environment surrounding integrated office systems is reviewed. The known effects of automated office technologies are synthesized, and their known impact on work efficiency is reviewed. These effects are explored with regard to their impact on networks, work flow/processes, as well as organizational structure and power. Particular emphasis is given to structural changes due to the introduction of newer information technologies in organizations. The new information technologies have restructured the average organization's middle ranks and, as a consequence, these have shrunk drastically. Organizational pyramids have flattened, with fewer levels, since executives have realized that they can obtain the needed information via the new technologies more quickly and directly and do not have to rely on middle-level managers. Power shifts typically accompany the introduction of these technologies, resulting in the generation of a new form of organizational power.

Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground

The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design next-generation space vehicles; provide high-rate data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.

The importance of digital data capture at the point of health care service within the military environment is highlighted. Current paper-based data capture does not allow for efficient data reuse throughout the medical support information domain. A simple, high-level process and data flow model is used to demonstrate the importance of data capture at point of service. The Department of Defense is developing a personal digital assistant, called MEDTAG, that accomplishes point of service data capture in the field using a prototype smart card as a data store in austere environments.

The Information Management Technology Architecture (TA) is being driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of the LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document, constantly reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.

Health care executives are increasingly frustrated by Information Technology (IT). Although our industry is often accused of underinvesting in technology (hospitals average 2-3 percent of their costs in IT, compared with other industries' 8-10 percent), when IT investments are made, they fail to reflect demonstrable return to the bottom line. Yet the effective deployment of technology is critical to the success of the organization and can in itself cause the failure of a health care system.

NASA Missions and Programs create a wealth of science data and information that are essential to understanding our earth, our solar system and the universe. Advancements in information technology will allow many people within and beyond the Agency to more effectively analyze and apply these data and information to create knowledge. The desired end result is to see that NASA data and science information are used to generate the maximum possible impact to the nation: to advance scientific knowledge and technological capabilities, to inspire and motivate the nation's students and teachers, and to engage and educate the public.

Explores the theme of self-censorship in libraries in relation to new technologies. Highlights include the results of investing in high-cost electronic resources at the expense of traditional, lower-status formats; the effect of information technologies on literacy and historical records; and market censorship, including centralization and…

Discusses the so-called third industrial revolution, or the information revolution. Topics addressed include the progression of the revolution in the U.S. economy, in Europe, and in Third World countries; the empowering technologies, including digital switches, optical fiber, semiconductors, CD-ROM, networks, and combining technologies; and future…

Describes developments in information technology during 1988, including new telecommunications and networking services, advances in optical disk technologies, the increased use of facsimile transmissions, and new microcomputer hardware and software products. Litigation within the computer industry is reviewed, and the implications for needed…

Recent technological progress in the generation, manipulation and detection of individual single photons has opened a new scientific field of photonic quantum information. This progress includes the realization of single photon switches, photonic quantum circuits with specific functions, and the application of novel photonic states to novel optical metrology beyond the limits of standard optics. In this review article, the recent developments and current status of photonic quantum information technology are overviewed based on the author's past and recent works.

A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

The California Integrated Seismic Network (CISN) Display is part of a Web-enabled earthquake notification system alerting users in near real time to seismicity, as well as to valuable geophysical information following a large earthquake. It will replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering graphical earthquake information to users at emergency operations centers and other organizations. Features distinguishing the CISN Display from other GUI tools are a stateful client/server relationship, a scalable message format supporting automated hyperlink creation, and a configurable, platform-independent client with a GIS mapping tool supporting the decision-making activities of critical users. The CISN Display is the front end of a client/server architecture known as the QuakeWatch system. It is comprised of the CISN Display (and other potential clients), message queues, a server, server "feeder" modules, and messaging middleware, schema, and generators. It is written in Java, making it platform-independent and offering the latest in Internet technologies. QuakeWatch's object-oriented design allows components to be easily upgraded through a well-defined set of application programming interfaces (APIs). Central to the CISN Display's role as a gateway to other earthquake products is its comprehensive XML schema. The message model starts with the CUBE message format but extends it by provisioning additional attributes for currently available products and those yet to be considered. The supporting metadata in the XML message provide the data necessary for the client to create a hyperlink and associate it with a unique event ID. Earthquake products deliverable to the CISN Display are ShakeMap, Ground Displacement, Focal Mechanisms, Rapid Notifications, OES Reports, and Earthquake Commentaries. Leveraging the power of the XML format, the CISN Display provides prompt access to

can give Emergency Response managers' information needed to allocate limited personnel and resources after a major event. The shaking intensity shape files may be downloaded out-of-band to the client computer, and with the GIS mapping tool, users can plot organizational assets on the CISN Display map and analyze their inventory against potentially damaged areas. Lastly, in support of a robust design is a well-established and reliable set of communication protocols. To achieve a state-full server connection and messaging via a signaling channel the application uses a Common Object Request Broker Architecture (CORBA). The client responds to keep-alive signals from the server, and alerts users of changes in the connection status. This full-featured messaging service will allow the system to trigger a reconnect strategy whenever the client detects a loss of connectivity. This sets the CISN Display apart from its predecessors, which do not provide a failover mechanism, or a state of connection. Thus by building on past programming successes and advances in proven Internet technologies, the CISN Display will augment the emergency responder's ability to make informed decisions following a potentially damaging earthquake.
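The keep-alive and reconnect behavior described in this abstract is middleware-specific (the actual client uses CORBA signaling), but the core client-side logic can be sketched independently of the transport. The class and all names below are hypothetical illustrations, not the CISN Display implementation:

```python
import time

class KeepAliveWatchdog:
    """Client-side connection monitor: if no keep-alive signal arrives
    within `timeout` seconds, declare the link down so the client can
    trigger a reconnect with exponential backoff. Illustrative sketch
    only; the real CISN Display client uses CORBA, not this code."""

    def __init__(self, timeout=30.0, max_backoff=300.0):
        self.timeout = timeout
        self.max_backoff = max_backoff
        self.last_heard = time.monotonic()
        self.connected = True

    def on_keepalive(self):
        # Server signal received: refresh the liveness timestamp.
        self.last_heard = time.monotonic()
        self.connected = True

    def check(self, now=None):
        # Returns True if the connection is still considered alive.
        now = time.monotonic() if now is None else now
        if now - self.last_heard > self.timeout:
            self.connected = False
        return self.connected

    def backoff_delays(self):
        # Reconnect schedule: 1, 2, 4, ... seconds, capped at max_backoff.
        delay = 1.0
        while True:
            yield delay
            delay = min(delay * 2.0, self.max_backoff)

# Demo: simulate 45 s of silence against a 30 s timeout.
w = KeepAliveWatchdog(timeout=30.0)
w.last_heard = 0.0          # pretend the last keep-alive arrived at t=0
alive = w.check(now=45.0)   # no signal for 45 s -> considered down
```

A real client would run `check` on a timer and, on failure, walk the `backoff_delays` schedule between reconnect attempts so a flapping server is not hammered.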

Discusses the need for proposing and implementing an information technology standards policy for the federal government. Topics addressed include the National Information Infrastructure (NII); voluntary standards among federal agencies; private sector organizations; coordinating the use of standards; enforcing compliance; policy goals; a framework…

The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network-centric core enterprise services and, when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information designed to support a specific Community of Interest (COI), geographic area, or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is persisted for quick and easy retrieval. This paper will address information representation, persistence, and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way. This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and
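The relational-versus-XML persistence trade-off compared in the abstract above can be illustrated with standard-library tools. This is a toy sketch with a made-up message format, not the JBI schema or its actual persistence layer:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Toy XML message in a hypothetical format (not the JBI schema).
doc = '<msg id="m1"><topic>track</topic><priority>2</priority></msg>'

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Relational approach: "shred" known fields into typed, indexable
# columns. Fast to query, but the schema must be designed up front.
cur.execute(
    "CREATE TABLE msgs (id TEXT PRIMARY KEY, topic TEXT, "
    "priority INTEGER, raw TEXT)"
)
root = ET.fromstring(doc)
cur.execute(
    "INSERT INTO msgs VALUES (?, ?, ?, ?)",
    (root.get("id"), root.findtext("topic"),
     int(root.findtext("priority")), doc),
)

# XML-enabled style: keep the raw document alongside the shredded
# columns, so structure that was never modeled still survives.
topic, priority, raw = cur.execute(
    "SELECT topic, priority, raw FROM msgs WHERE id = ?", ("m1",)
).fetchone()

# A native XML store would index the tree itself and accept
# XPath/XQuery; here we approximate that by re-parsing on demand.
reparsed = ET.fromstring(raw)
```

The design tension the paper describes is visible even at this scale: the relational columns give cheap indexing and typed queries, while the retained document preserves fidelity for attributes the schema designer did not anticipate.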

Communication and information technologies can reduce the barriers of distance and space that disadvantage rural areas. This report defines a set of distinct voice, computer, and video telecommunication services; describes several rural information applications that make use of these services; and surveys various wireline and wireless systems and…

The information revolution is making various impacts on social studies. Students are children of this age and are learning social ideas from technology. The information revolution should be part of the social studies curriculum. Unresolved questions (e.g., Who should write computer software?) and some thoughts on the future are discussed. (RM)

Nursing homes are considered to lag behind in adopting health information technology (HIT). Many studies have highlighted the use of HIT as a means of improving health care quality. However, these studies overwhelmingly do not provide empirical evidence that HIT can actually achieve these improvements. The main research goal of this…

Like it or not, an institution's IT infrastructure is a matter with which institutional strategic planners must concern themselves. Information systems represent a significant investment, they perform mission-critical functions, and the appropriate use of information and learning technologies can have a critical part to play in delivering against…

To enhance the information technology literacy of optometry students, the Southern College of Optometry (Tennessee) developed an academic assignment, the Electronic Media Paper, in which second-year students must search two different electronic media for information. Results suggest Internet use for searching may be a useful tool for specific…

This dissertation aims to investigate how advanced information technologies cope with the various demands of disaster response. It consists of three essays on the exploration of micro-blogging and FOSS environments. The first essay looks at the usage of micro-blogging in the aftermath of the massive 2008 China earthquake and explores the…

The lecture is devoted to the urgent problem of improving the quality of cytological diagnosis by reducing the factor of subjectivism through the introduction of up-to-date computer information technologies into the cytologist's practice. Its main lines are described, from the standardization of cytological specimen preparation to the registration of a cytologist's opinion and the assessment of specialists' work quality at laboratories that successfully use the capacities of current information systems. Information technology capabilities to improve the interpretation of the cellular composition of cytological specimens are detailed.

This volume contains the reports of three working groups which were convened separately over a 3-year period at the request of the Advisory Committee for the Division of Information Science and Technology of the National Science Foundation to obtain the opinion of experts concerning research opportunities and trends in information science and…

Management's dilemma, when allocating financial resources towards the improvement of technological readiness and IT flexibility within their organizations, is to control financial risk and maximize IT effectiveness. Technological readiness is people's propensity to embrace and use technology. Its drivers are optimism, innovativeness, discomfort,…

This essay is based on the assumption that current practices and knowledge of Information and Information Technology in Health are unable to deal with the complexity of the health/disease/care processes and the contemporary problems that must be overcome, curbing the expansion of the response capacity of the Brazilian State. It aims to further explore the understanding of the roots and determining factors behind these constraints, analyzing alternatives for confronting them that depend less on location-specific initiatives in the field of information and more, among others, on the adoption of new benchmarks, starting with the meaning and concept of Health. It identifies the existence of an 'information and information technology interfield' that arises from an epistemology based on a transdisciplinary approach, as well as from the consolidation of a political and historical process of institutional construction, an area endowed with power and relevance: a political-epistemological interfield. The analysis proceeds through an exploratory study of the social, political, and epistemological processes found in the historical construction of the health information networks established by Science and Technology in health, as well as by healthcare systems and services, in addition to social, political, and economic information.

tropical deforestation, which is particularly acute in the Amazon rainforests of Brazil; and the encroachment of desertification, especially in Africa...in the general decline in authoritative regimes during the last 15 years. But a definitive impact of these trends on the fate of nations became...the world in myriad ways, but the impact of information technology is likely to become even deeper and broader in the coming decades as the technology

Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and the problem of missing or misinterpreted events leads to additional difficulties. Taking these shortcomings into account, long-term recurrence intervals are usually unstable as long as no additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows us to reduce the uncertainties in the estimation of the mean recurrence interval significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.
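The Brownian Passage Time (inverse Gaussian) density underlying the clock-change model above has a closed form in terms of the mean recurrence interval mu and the aperiodicity alpha. The sketch below evaluates it numerically; the parameter values (mu = 150 yr, alpha = 0.5) are made up for illustration and are not taken from the study:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean `mu`
    and aperiodicity (coefficient of variation) `alpha`:
    f(t) = sqrt(mu / (2*pi*alpha^2*t^3)) * exp(-(t-mu)^2 / (2*mu*alpha^2*t))."""
    if t <= 0.0:
        return 0.0
    return (math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3))
            * math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t)))

# Illustrative parameters: mean recurrence 150 yr, aperiodicity 0.5.
mu, alpha = 150.0, 0.5

# Crude numerical checks via a Riemann sum on a fine grid.
dt = 0.05
grid = [i * dt for i in range(1, 200000)]                  # 0.05 .. 10000 yr
total = sum(bpt_pdf(t, mu, alpha) * dt for t in grid)      # should be near 1
mean = sum(t * bpt_pdf(t, mu, alpha) * dt for t in grid)   # should be near mu
```

Unlike the exponential model of a stationary Poisson process, whose hazard is constant, this density gives a hazard that grows as elapsed time approaches the mean recurrence interval, which is what makes the conditional forecast time-dependent.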

The Internet represents a technological revolution that is transforming our society. In the healthcare industry, physicians have been typified as slow adopters of information technology. However, young physicians, having been raised in a computer-prevalent society, may be more likely to embrace technology. We attempt to characterize the use and acceptance of the Internet and information technology among resident physicians in a large academic medical center and to assess concerns regarding privacy, security, and credibility of information on the Internet. A 41-question survey was distributed to 150 pediatric, medical, and surgical residents at an urban, academic medical center. One hundred thirty-five residents completed the survey (response rate of 90%). Responses were evaluated and statistical analysis was done. The majority of resident physicians in our survey have adopted the tools of information technology. Ninety-eight percent use the Internet and 96% use e-mail. Two-thirds of the respondents used the Internet for healthcare-related purposes, and a similar percentage thought that the Internet has affected their practice of medicine positively. The majority of residents thought that Internet healthcare services such as electronic medical records, peer-support websites, and remote patient monitoring would be beneficial for the healthcare industry. However, they are concerned about the credibility, privacy, and security of health and medical information online. The majority of resident physicians in our institution use the Internet and information technology in their practice of medicine. Most think that the Internet will continue to have a beneficial role in the healthcare industry.

This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

Infection prevention requires collecting, analyzing, and delivering enormous amounts of medical information, a cumbersome, inefficient process. Hospital information system (HIS) data not intended for infection prevention cannot be used directly for that purpose. The rapid introduction of information technology in infection prevention can potentially solve these problems. The structure of an IT-based infection prevention system (ITIPS) depends on its specified purpose, however, and using this information in hospitals requires that the detailed HIS structure be clarified, especially the connection between the HIS and the ITIPS. The future role of ITIPS is envisioned in early infection detection and warning. This, in turn, requires that ITIPS field operational support systems for medical staff mature further.

A team was formed to assess NASA Office of Space Science (OSS) information technology research and development activities. These activities were reviewed for their relevance to OSS missions, for their potential for using products better supplied by industry or other government agencies, and for recommending an information technology (IT) infusion strategy for appropriate products for OSS missions. Assessment scope and methodology are presented. IT needs and interests for future OSS missions and current NASA IT research and development (R&D) are discussed. Non-NASA participants provide overviews of some of their IT R&D programs. Implementation and infusion issues and the findings and recommendations of the assessment team are presented.

Recent technological progress in the generation, manipulation and detection of individual single photons has opened a new scientific field of photonic quantum information. This progress includes the realization of single photon switches, photonic quantum circuits with specific functions, and the application of novel photonic states to novel optical metrology beyond the limits of standard optics. In this review article, the recent developments and current status of photonic quantum information technology are overviewed based on the author's past and recent works. PMID:26755398

Health information technology is an emerging area of focus in clinical medicine with the potential to improve injury and violence prevention practice. With injuries being the leading cause of death for Americans aged 1–44 years, greater implementation of evidence-based preventive services, referral to community resources, and real-time surveillance of emerging threats is needed. Through a review of the literature and a capture of current practice in the field, this paper showcases how health information technology applied to injury and violence prevention can lead to strengthened clinical preventive services, more rigorous measurement of clinical outcomes, and improved injury surveillance, potentially resulting in health improvement. PMID:25441230

... GOVERNMENT ACCOUNTABILITY OFFICE Health Information Technology Policy Committee Appointment AGENCY... Recovery and Reinvestment Act of 2009 (ARRA) established the Health Information Technology Policy Committee to make recommendations on the implementation of a nationwide health information technology...

The article presents the structure of the complex of information technologies applied in modern school education, describes the most important educational methods, and shows the results of their implementation. It presents the forms and methods of using informative support in the educational process, examined with respect to different aspects of their use and taking into account the psychological features of students. The article also points out a range of worrying facts and dangerous trends connected with the use and spread of information technologies that must be taken into account in the informatization of the educational process. The materials of the article are based on many years of experience in the operation and development of the informational educational sphere at a secondary school specializing in physics and mathematics.

Healthcare is evolving from a task-based industry to a knowledge-based one. To gain and retain value as intellectual capital, nursing likewise must evolve from a vocation of task performers to a profession of knowledge-workers. Information technology can transform nursing tasks into nursing knowledge.

The National Science Foundation was the original organizational leader for the Internet, and it is still engaged in funding research and infrastructure related to the use of networked information. As it is written in the strategic plan for the Directorate for Computer and Information Science and Engineering, "These technologies promise to have at least as great an impact as did the invention of written language thousands of years ago."

The Clearinghouse provides emergency management and response professionals and the scientific and engineering communities with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes or tsunamis. Clearinghouse activations include participation from Federal, State, and local government, law enforcement, fire, EMS, emergency management, public health, environmental protection, the military, public and non-governmental organizations, and the private sector. For the August 24, 2014 South Napa earthquake, over 100 people from 40 different organizations participated during the 3-day Clearinghouse activation. Every organization has its own role and responsibility in disaster response; however, all require authoritative data about the disaster for rapid hazard assessment and situational awareness. The Clearinghouse has been proactive in fostering collaboration and sharing Essential Elements of Information across disciplines. The Clearinghouse-led collaborative promotes the use of standard formats and protocols to allow existing technology to transform data into meaningful incident-related content and to enable data to be used by the largest number of participating Clearinghouse partners, thus providing responding personnel with enhanced real-time situational awareness, rapid hazard assessment, and more informed decision-making in support of response and recovery. The Clearinghouse efforts address national priorities outlined in USGS Circular 1242, the Plan to Coordinate NEHRP Post-Earthquake Investigations, and in S. 740, the Geospatial Data Act of 2015, introduced by Sen. Orrin Hatch (R-UT), to streamline and coordinate geospatial data infrastructure, maximizing geospatial data in support of the Robert T. Stafford Act. Finally, the US Dept. of Homeland Security Geospatial Management Office recognized the Clearinghouse's data sharing efforts as a Best Practice to be included in the forthcoming 2015 HLS Geospatial Concept of Operations.

This paper aims at explaining the outcomes of information technology education for international students using anthropological theories of cultural schemas. Even though computer science and engineering are usually assumed to be culture-independent, the practice in classrooms seems to indicate that learning patterns depend on culture. The…

National Science Foundation (NSF); the United States Department of Agriculture (USDA); and the Department of Commerce (DOC), plus eight other government S&T resources. It is intended as a starter set to help users locate U.S. government technology information. For convenience, it uses Internet addresses, e-mail addresses, and telephone numbers. The Appendix contains a

Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

This book presents an overview of the present status of the use of library automation hardware and software in Pakistan. The following 20 articles are included: (1) "The Status of Library Automation in Pakistan"; (2) "Promoting Information Technology in Pakistan: the Netherlands Library Development Project"; (3) "Library…

Explores library-related implications of the U.S. Department of Justice's investigations into the operations of Microsoft and Intel and suggests that developing a broader understanding of information technology marketing is crucial to the short- and long-term future of libraries. (MES)

This paper evaluates the impacts of the Internet on organizational structures and identifies new forms of organizations in light of information technology (IT) advances. Four traditional forms of organizations are summarized, i.e., the bureaucratic hierarchy, the entrepreneurial organization, the matrix organization, and the adhocracy. The…

The diffusion and adoption of information technology (IT) infrastructure, enabled by special funding, was posited to have a positive impact on student achievement. Four urban school districts provided the context for this study to assess the impact of IT adoption on standardized test scores.

Following the onset of the Canterbury, New Zealand earthquakes, there were widespread concerns that mental health services were under severe strain as a result of adverse consequences for mental health. We therefore examined Health of the Nation Outcome Scales (HoNOS) data to see whether this could inform our understanding of the impact of the Canterbury earthquakes on patients attending local specialist mental health services. HoNOS admission data were analysed for Canterbury mental health services prior to and following the Canterbury earthquakes. These findings were compared to HoNOS admission data from seven other large District Health Boards to delineate local from national trends. Percentage changes in admission numbers were also calculated before and after the earthquakes for Canterbury and the seven other large district health boards. Admission HoNOS scores in Canterbury increased after the earthquakes for adult inpatient and community services, old age inpatient and community services, and Child and Adolescent inpatient services compared to the seven other large district health boards. Admission HoNOS scores for Child and Adolescent community services did not change significantly, while admission HoNOS scores for Alcohol and Drug services in Canterbury fell compared to other large district health boards. Subscale analysis showed that the majority of HoNOS subscales contributed to the overall increases found. Percentage changes in admission numbers for the Canterbury District Health Board and the seven other large district health boards before and after the earthquakes were largely comparable, with the exception of admissions to inpatient services for the group aged 4-17 years, which showed a large increase. The Canterbury earthquakes were followed by an increase in Health of the Nation

Global Positioning System (GPS) data are useful for understanding both interseismic and postseismic deformation. Models of GPS data suggest that the lower crust, lateral heterogeneity, and fault slip, all provide a role in the earthquake cycle.

The utilization of hospital information technology (HIT) as a tool for home care is a recent trend in health science. Those benefiting from this new endeavor include middle-aged individuals with serious chronic illness living at home. Published data on the utilization of health care information technology, especially for the home care of patients with chronic illness, have increased enormously in the recent past. The common chronic illnesses reported in these studies were primarily heart and lung diseases. Furthermore, health professionals have confirmed in these studies that HIT was beneficial in gaining better access to information regarding their patients, and that they were also able to save that information easily for future use. On the other hand, some health professionals have observed that the use of HIT in home care is not suitable for everyone and that individuals cannot be replaced by HIT. On the whole, it is clear that the use of HIT can complement communication in home care. The present review aims to shed light on these latest aspects of health care information technology in home care.

Managers work to achieve the greatest output for the least input, balancing all factors of delivery to accomplish the most with the smallest resource effort. Documentation of actual health information technology (HIT) cost savings has been elusive. Information technology and linear programming help to control hospital costs without harming service quality or staff morale. This study presents production function results from a study of hospital output during the period 2008-2011. The results suggest that productivity varies widely among the 58 hospitals as a function of staffing patterns, methods of organization, and the degree of reliance on information support systems. Financial incentives help to enhance productivity: incentive pay for staff based on actual productivity gains is associated with improved productivity. HIT can enhance the marginal value product of nurses and staff, so that they concentrate their workday on patient care activities. The implementation of electronic health records (EHR) was associated with a 1.6 percent improvement in productivity.
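The marginal-product claim in the abstract above can be made concrete with a textbook Cobb-Douglas production function. The functional form, exponents, and input levels below are illustrative assumptions, not the study's estimated model; the 1.6 percent figure is treated, as an interpretive assumption, as a total-factor-productivity gain:

```python
# Hypothetical Cobb-Douglas production function for a hospital:
# output Q = A * L**a * K**b, where A is total factor productivity
# (capturing, e.g., information support systems), L is staff hours,
# and K is capital. All numbers are made up for illustration.

def output(A, L, K, a=0.6, b=0.4):
    return A * L**a * K**b

def marginal_product_labor(A, L, K, a=0.6, b=0.4):
    # For Cobb-Douglas, dQ/dL = a * Q / L.
    return a * output(A, L, K, a, b) / L

L, K = 100.0, 50.0
base = marginal_product_labor(1.000, L, K)
with_ehr = marginal_product_labor(1.016, L, K)  # assumed 1.6% TFP gain
gain = with_ehr / base - 1.0                    # relative MPL increase
```

Because the marginal product of labor scales linearly with A in this form, a 1.6 percent productivity-factor gain translates one-for-one into a 1.6 percent higher marginal value product of staff time, which is the mechanism the abstract ascribes to HIT.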

The Medicaid Information Technology Architecture (MITA) is a roadmap and toolkit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing, and personal health records (PHRs).

The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes, and when.

Advances in medicine in recent decades correlate significantly with advances in information technology. Modern information technologies (IT) have enabled faster, more reliable, and more comprehensive data collection. These technologies have also begun to generate a large amount of irrelevant information, which represents a limiting factor and a real, growing gap between medical knowledge on the one hand and the ability of doctors to follow its growth on the other. Furthermore, in our environment, the term technology is generally reserved for its technical component. Education means learning, teaching, or the process of acquiring skills or modifying behavior through various exercises. Traditionally, medical education has meant the oral, practical, and more passive transfer of knowledge and skills from educators to students and health professionals. For the clinical disciplines, principles such as "learning at the bedside," aided by the medical literature, are of special importance. These techniques give students contact with their teachers and refer them to the appropriate literature. The disadvantage of these educational methods lies in the fact that teachers often do not have enough time. They are also not well suited to the horizontal and vertical integration of teaching, foster weak or almost no self-education, and produce low skill levels and poor integration of education with the real social environment. In this paper the authors describe the application of modern IT in medical education, its advantages and disadvantages compared with traditional methods of education. PMID:23408471


Information technology and mobile devices may be beneficial and useful in many aspects of stroke management, including recognition of stroke, transport and triage of patients, emergent stroke evaluation at the hospital, and rehabilitation. In this review, we address the contributions of information technology and mobile health to stroke management. Rapid detection and triage are essential for effective thrombolytic treatment. Awareness of stroke warning signs and responses to stroke could be enhanced by using mobile applications. Furthermore, prehospital assessment and notification could be streamlined for use in telemedicine and teleradiology. A mobile telemedicine system for assessing National Institutes of Health Stroke Scale scores has shown high correlation with, and faster assessment than, the face-to-face method. Because the benefits of thrombolytic treatment are time-dependent, treatment should be initiated as quickly as possible. In-hospital communication between multidisciplinary team members can be enhanced using information technology. A computerized in-hospital alert system using computerized physician order entry was shown to be effective in reducing the time intervals from hospital arrival to medical evaluation and thrombolytic treatment. Mobile devices can also be used as supplementary tools for neurologic examination and clinical decision-making. In post-stroke rehabilitation, virtual reality and telerehabilitation are helpful. Mobile applications may be useful for public awareness, lifestyle modification, and the education and training of healthcare professionals. Information technology and mobile health are useful tools for the management of stroke patients from the acute period through rehabilitation. Further improvement of technology will change and enhance stroke prevention and treatment.

To develop a technological tool that improves the initial learning of sign language in hearing-impaired children. This research was conducted in three phases: requirements gathering; design and development of the proposed device; and validation and evaluation of the device. Through the use of information technology, and with the advice of special education professionals, we were able to develop an electronic device that facilitates the learning of sign language in deaf children. It consists mainly of a graphic touch screen, a voice synthesizer, and a voice recognition system. Validation was performed with deaf children at the Filadelfia School in the city of Bogotá. A learning methodology was established that improves learning times through a small, portable, lightweight, educational technological prototype. Tests showed the effectiveness of this prototype, achieving a 32% reduction in the initial learning time for sign language in deaf children.

The clinical information system architecture at the Columbia-Presbyterian Medical Center in New York is being incorporated into an intranet using Internet and World Wide Web protocols. The result is an Enterprise-Wide Web which provides more flexibility for access to specific patient information and general medical knowledge. Critical aspects of the architecture include a central data repository and a vocabulary server. The new architecture provides ways of displaying patient information in summary, graphical, and multimedia forms. Using customized links called Infobuttons, we provide access to on-line information resources available on the World Wide Web. Our experience to date has raised a number of interesting issues about the use of this technology for health care systems.

Developing countries emphasize expansion of the educated population, with demand for quality improvement following later. Current science education reform is driven in part by the post-cold-war restructuring of the global economy and the associated focus on educating a more scientifically literate society, reflecting the industrial shift from labor-intensive to high-technology production and the societal changes inherent in the present information era. Industry needs employees with broad and flexible backgrounds and interdisciplinary training, engineers with better physics training, and well-trained physicists. Educational research has shown that active-learning-based methods are superior to traditional methods, and information technology (IT) has a lot to offer here. The use of IT for improving physics education is briefly discussed, with prospects for collaboration in the Asia-Pacific region via the Asian Physics Education Network (ASPEN), the UNESCO University Foundation Course in Physics (UUFCP), etc.

Modern information technologies and worldwide communication through the Internet promise both universal access to information and the globalization of the medico-social network's modes of communication between doctors, laboratories, patients, and other players. The authors, specialists in public health and members of an association that aims to create opportunities for access to training in public health in developing countries, warn that the use of the term "globalization" ignores the reality of the "digital divide," that is, the fact that social inequalities may preclude the realization of this promise on a truly global scale. PMID:11720953

alternative root be economically advantageous, an actor's ability to exploit market forces and create an alternative root would be significantly improved... conduct their operations. Therefore, a loss or disruption to Internet services would not be advantageous for the desired outcomes of these syndicates... eCommerce Service loss or disruption [C] Traffic Redirection [C] = Undesired consequence. Information Technology Sector Baseline Risk Assessment

The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

A team was formed to assess NASA Office of Space Science (OSS) information technology research and development activities. These activities were reviewed for their relevance to OSS missions, for their potential to use products better supplied by industry or other government agencies, and to recommend an IT infusion strategy for appropriate products for OSS missions. The assessment scope and methodology, along with the findings and recommendations of OSS IT users and providers, are presented.

For the civilization of the post-industrial society, defined as an 'informational civilization,' a heightened interest in new technological means of interacting with information is characteristic. The concept of 'information' is now defined as an independent category. The organization, storage, transmission, and display of information is, in essence, one of the ways such a civilization self-organizes. Therefore, the industries concerned with creating and supporting information highways are at the center of attention for those who seek strategically favorable positions in the world community. The motives of geopolitical and economic strategy in this case are direct, 'easy-to-see' mechanisms for creating structures in research activity and industry: easy to see, analyzable, even predictable, but not defining. Modern science knows the physical laws of self-organization in complex systems, and, as is now clear, the open nonequilibrium system that is the World Net is subject to the laws of such self-organization. The fractal nature of the Web's entire infrastructure as a supercomplex, self-organizing transport-information system has now been established. Moreover, among its characteristics we observe openness, coherence, and nonlinearity, which together with other aspects allow us to identify it as a physical fractal structure living by the laws of self-organization. The interest in this phenomenon is quite understandable. We have to know the laws of the Web's behavior because it is one aspect of the emergence of the Noosphere, and we must interact with it adequately. There are numerous important problems bound up with the Web, and attempts to resolve them are a first task of the human community, since it is, after all, the human person who creates the Web (or nature, by the hand of the person?). It is very interesting to analyze the history of the emergence of such a phenomenon as the Web, and to attempt prognoses for the future. For

Information and communication technology (ICT) has brought many changes to medical education and practice over the last couple of decades. Teaching and learning medicine in particular has undergone profound changes due to computer technologies, and medical schools around the world have invested heavily both in new computer technologies and in the process of adapting to this technological revolution. In order to catch up with the rest of the world, developing countries need to research their options for adapting to new computer technologies. This descriptive survey study was designed to assess medical students' computer and Internet skills and their attitudes toward ICT. Research findings showed that the mean self-perceived computer knowledge score for male students in general was greater than for female students. Students who had participated in prior computer workshops, had access to a computer, the Internet, and e-mail, and frequently checked their e-mail also had higher mean self-perceived knowledge and skill scores. Finally, students with a positive attitude toward ICT rated their computer knowledge higher than those who had no opinion. The results confirm that medical schools, particularly in developing countries, need to make fundamental changes: modifying the curriculum to integrate ICT into medical education, creating the essential infrastructure for ICT use in medical education and practice, and providing structured computer training for faculty and students.

December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the National Coordinator for Health Information Technology; Health Information Technology; Request for Information Regarding the President's Council of Advisors on Science and Technology (PCAST) Report Entitled ``Realizing the Full Potential of Health...

Cryptology and information security are set to play a more prominent role in the near future. In this regard, quantum communication and cryptography offer new opportunities to tackle ICT security. Quantum Information Processing and Communication (QIPC) is a scientific field where new conceptual foundations and techniques are being developed. They promise to play an important role in the future of information security. It is therefore essential to have cross-fertilizing development between quantum technology and cryptology in order to address the security challenges of the emerging quantum era. In this article, we discuss the impact of quantum technology on current as well as future crypto-techniques. We then analyse the assumptions under which quantum computers may operate. Next, we present our vision for the distribution of security attributes using a novel form of trust based on Heisenberg's uncertainty principle, and for building highly secure quantum networks based on the transmission of single photons and/or bundles of photons, able to withstand unauthorized reading through secure protocols grounded in the observations of quantum mechanics. We argue that quantum cryptographic systems need to be developed that can take advantage of the laws of physics to provide long-term security based on solid assumptions. This requires a structured integration effort to deploy quantum technologies within the existing security infrastructure. Finally, we conclude that classical cryptographic techniques need to be redesigned and upgraded in view of the growing threat of cryptanalytic attacks posed by quantum information processing devices, leading to the development of post-quantum cryptography.
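The best-known protocol of the kind described above is BB84 quantum key distribution, where trust rests on the uncertainty principle: a measurement in the wrong basis yields a random bit, so an eavesdropper unavoidably disturbs the channel. The toy simulation below illustrates only the basis-sifting step; the function name and structure are our own illustration under ideal, noiseless assumptions, not the article's system.

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 sifting step (idealized, no eavesdropper, no channel noise).

    Alice encodes random bits in randomly chosen bases ('+' rectilinear,
    'x' diagonal); Bob measures in his own random bases. Both parties then
    publicly compare bases and keep only the positions where they agree.
    """
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice('+x') for _ in range(n_bits)]
    bob_bases   = [rng.choice('+x') for _ in range(n_bits)]
    # When bases match, Bob's measurement reproduces Alice's bit exactly;
    # when they differ, quantum mechanics gives him a uniformly random bit,
    # which sifting discards.
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_alice = [bit for bit, ab, bb in
                 zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob = [bit for bit, ab, bb in
               zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_alice, key_bob

key_a, key_b = bb84_sift(64)
assert key_a == key_b  # sifted keys agree when no one has eavesdropped
```

On average, half the transmitted bits survive sifting; in a real protocol the parties would additionally sacrifice a subset of the sifted key to estimate the error rate and detect eavesdropping.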

The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) that uses observed phase arrivals, ground motion amplitudes, and selected prior information to estimate earthquake magnitude, location, and origin time, and to predict the distribution of peak ground motion throughout a region using envelope attenuation relationships. Implementation of the VS algorithm in California is an on-going effort of the Swiss Seismological Service (SED) at ETH Zürich. VS is one of three EEW algorithms - the other two being ElarmS (Allen and Kanamori, 2003) and On-Site (Wu and Kanamori, 2005; Boese et al., 2008) - that form the basis of the California Integrated Seismic Network ShakeAlert system, a prototype end-to-end EEW system that could potentially be implemented in California. The current prototype version of VS in California requires picks at 4 stations to initiate an event declaration. On average, taking into account data latency, variable station distribution, and processing time, this initial estimate is available about 20 seconds after the earthquake origin time, corresponding to a blind zone of about 70 km around the epicenter that would receive no warning, yet where warning would be most useful. To increase the available warning time, we want to produce EEW estimates faster (with fewer than 4 stations). However, working with fewer than 4 stations under our current approach would increase the number of false alerts, for which there is very little tolerance in a useful EEW system. We explore the use of back-azimuth estimation and the Voronoi-based concept of not-yet-arrived data to reduce false alerts in the earliest VS estimates. The concept of not-yet-arrived data was originally used to provide evolutionary location estimates in EEW (Horiuchi, 2005; Cua and Heaton, 2007; Satriano et al. 2008). However, it can also be applied to discriminating between earthquake and non-earthquake signals. For real earthquakes, the
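The quoted blind-zone radius follows from simple kinematics: any site the damaging S wave reaches before the alert is issued gets no warning. A minimal sketch of that arithmetic, assuming a typical average crustal S-wave speed of about 3.5 km/s (our assumption; the abstract states only the 20 s delay and the ~70 km result):

```python
# Back-of-envelope blind-zone radius for an EEW alert.
S_WAVE_SPEED_KM_S = 3.5   # assumed average crustal S-wave velocity
alert_delay_s = 20.0      # origin time to first VS estimate, per the text

# Sites within this radius are hit by the S wave before the alert arrives.
blind_zone_km = S_WAVE_SPEED_KM_S * alert_delay_s
print(round(blind_zone_km))  # → 70, matching the figure quoted above
```

The same relation shows why shaving even a few seconds off the alert delay matters: each second saved shrinks the blind-zone radius by roughly 3-4 km.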

The great earthquake occurred in the Tohoku District, Japan, on 11th March, 2011. This earthquake is named "the 2011 off the Pacific coast of Tohoku Earthquake", and the disaster it caused is named "the Great East Japan Earthquake". About twenty thousand people were killed or went missing in the tsunami of this earthquake, and a large area was flooded and a large number of buildings were destroyed by the tsunami. The Geospatial Information Authority of Japan (GSI) has provided data on the tsunami-flooded area interpreted from aerial photos taken just after the great earthquake. This is fundamental data on tsunami damage and very useful for consideration of reconstruction planning for the tsunami-damaged area. The authors analyzed the relationships among land use, landform classification, DEM data, and flooded depth in the tsunami-flooded area of the Great East Japan Earthquake in the Sendai Plain using GIS. The land use data are 100-meter grid data from the National Land Information Data of the Ministry of Land, Infrastructure, Transport and Tourism (MLIT). The landform classification data are vector data from the Land Condition Map produced by GSI. The DEM data are 5-meter grid data measured with LiDAR by GSI after the earthquake. In particular, the authors focused on the relationship between tsunami damage and flooded depth. The authors divided tsunami damage into three categories by interpreting aerial photos: first, the completely destroyed area, where almost all wooden buildings were lost; second, the heavily damaged area, where a large number of houses were destroyed by the tsunami; and third, the flooded-only area, where few houses were destroyed. The flooded depth was measured by a photogrammetric method using digital images taken by a Mobile Mapping System (MMS). The results of these geographic analyses show that the distribution of tsunami damage levels is as follows: 1) The completely destroyed area was located within 1 km of the coastline, the flooded depth of this area is over 4 m, and no relationship
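The damage/depth relationship above lends itself to a simple per-cell classification rule in a GIS overlay. In the sketch below, only the >4 m threshold for the completely destroyed area comes from the text; the 2 m boundary between the other two classes and the function name are hypothetical illustrations.

```python
def damage_category(flooded_depth_m):
    """Assign a tsunami damage class from flooded depth (meters).

    The >4 m rule for 'completely destroyed' is reported in the study;
    the 2 m boundary separating the other two classes is a hypothetical
    placeholder, since the text truncates before giving it.
    """
    if flooded_depth_m > 4.0:
        return "completely destroyed"
    elif flooded_depth_m > 2.0:   # hypothetical boundary
        return "heavily damaged"
    return "flooded only"

# Example: classify depths sampled from three grid cells.
for depth in (5.2, 3.1, 0.8):
    print(depth, "->", damage_category(depth))
```

In practice such a rule would be applied cell-by-cell to the 5 m LiDAR-derived depth grid and cross-tabulated against the land use and landform layers.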

Information and communication technologies (ICTs) are electronic tools used to convey, manipulate and store information. The exponential growth of Internet access and ICTs greatly influenced social, political, and economic processes in the United States, and worldwide. Regardless of the level of practice, ICTs will continue influencing the careers of social workers and the clients they serve. ICTs have received some attention in the social work literature and curriculum, but we argue that this level of attention is not adequate given their ubiquity, growth and influence, specifically as it relates to upholding social work ethics. Significant attention is needed to help ensure social workers are responsive to the technological changes in the health care system, including the health care infrastructure and use of technology among clients. Social workers also need ICT competencies in order to effectively lead different types of social change initiatives or collaborate with professionals of other disciplines who are using ICTs as part of existing strategies. This paper also identifies potential pitfalls and challenges with respect to the adoption of ICTs, with recommendations for advancing their use in practice, education, and research. PMID:21691444

The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

Contents include the following: What is OTIS? OTIS use. Proposed implementation method. Development history of the Solid Waste Management (SWM) Technology Information Form (TIF) and OTIS. Current development state of the SWM TIF and OTIS. Data collection approach. Information categories. Critiques/questions/feedback.

Research aimed at enriching the number of available documentary sources on earthquakes plays an important role in seismology. To this end, this paper documents the history of prominent earthquakes associated with the NW-SE-trending Sultandag-Aksehir Fault and the Aksehir-Afyon graben system in western-central Anatolia from historical times through 1766. This work also combines earthquake data for both the historical and instrumental periods, previously listed in various catalogues and resources, for the studied area. Documents from the Ottoman archives and libraries, as well as Ottoman and Turkish newspapers, were scrutinized, revealing eight previously unreported earthquakes in the latter half of the nineteenth century and four new earthquakes in the period 1900-1931. For the period from 1766 to 1931, the total number of known earthquakes in the area under investigation increased from eighteen to thirty thanks to the document search. Furthermore, the existing information on eleven previously reported earthquakes is updated for the period from 1862 to 1946. Earthquakes from 1946 to 1964 are compiled from the catalogues for data completeness.

This paper describes Toshiba's training program in information technology in India. It is not simple technology training, but training for the globalization of Japanese engineers so that they can work with people from different cultures and business practices. We first describe why such a training program became necessary. We then describe how the training courses and contents were developed. The operation of the training program and our effort at continual improvement are explained, and the effectiveness of the program is evaluated. The training program presented is the first of its kind, and we believe it can contribute to changing Toshiba from the inside toward a more globalized corporation. We also believe that this kind of overseas training is effective for young students as well, preparing them to cope with a globalizing society after graduation.

Information technology advances are spawning visions of a radically altered modus operandi for commerce, education, business, and information storage and retrieval. Proponents of virtual technology offer a world of instant communications, information sharing, and binary commerce. Others express alarm at the electronic visionaries, foreseeing a world vacated of human interaction and populated by e-hermits. The reality is that access to the Internet is becoming pervasive worldwide and affords virtual communities and markets. Governments, education, markets, businesses, and consumers are rushing to exploit and adjust to an electronic, virtual world. Exploiting and adjusting to this 'ether-world' that transcends boundaries is a challenge to stakeholders. Public policy, international agreements, education, businesses, and consumers face monumental change in the way they live and conduct their lives. As with most paradigm shifts, pioneers rush forward and launch a myriad of new startups, many failing and some standing the test of time and utility. An example is the early pioneers in North America who headed westward in search of a new vision of riches. They established towns, developed farms, dug mines, and began new businesses. However, many of the pioneers moved from one venture to another, and some of their endeavors ended in ghost towns, abandoned farms and mines, and bankrupt businesses. In the end, however, a great nation was born. This author expects the ether-world to go through similar starts, fits, and adjustments before it emerges as a more stable part of the fabric of society.

It had been almost a decade since the hospitals that make up the Daughters of Charity Health System (DCHS) had engaged in a formal information technology strategic planning process. In the summer of 2002, as the health system re-formed, there was a unique opportunity to introduce a planning process that reflected the governance style of the new health system. DCHS embarked on this journey, with the CIO initiating and formally sponsoring the information technology strategic planning process in a dynamic and collaborative manner. The system sought to develop a plan tailored to encompass both enterprise-wide and local requirements; to develop a governance model to engage the members of the local health ministries in plan development, both now and in the future; and to conduct the process in a manner that reflected the values of the Daughters of Charity. The DCHS CIO outlined a premise that the CIO would guide and be continuously involved in the development of this tailored process, in conjunction with an external resource. Together, they would share responsibility for introducing a flexible information technology strategic planning methodology; providing education on the current state of healthcare IT, including future trends and success factors; facilitating support to tap into existing internal talent; cultivating a collaborative process to support both current requirements and future vision; and developing a well-functioning governance structure that would enable the plan to evolve and reflect user community requirements. This article highlights the planning process, including the lessons learned, the benchmarking during and after planning, and finally, but most importantly, the unexpected benefit that resulted from this planning process.

The slow but progressive adoption of health information technology (IT) nationwide promises to usher in a new era in health care. Electronic health record systems provide a complete patient record at the point of care and can help to alleviate some of the challenges of a fragmented delivery system, such as drug-drug interactions. Moreover, health IT promotes evidence-based practice by identifying gaps in recommended treatment and providing clinical decision-support tools. In addition, the data collected through digital records can be used to monitor patient outcomes and identify potential improvements in care protocols. Kaiser Permanente continues to advance its capability in each of these areas.

We fully agree with Carol Diamond and Clay Shirky that deployment of health information technology (IT) is necessary but not sufficient for transforming U.S. health care. However, the recent work to advance health IT is far from an exercise in "magical thinking." It has been strategic thinking. To illustrate this, we highlight recent initiatives and progress under four focus areas: adoption, governance, privacy and security, and interoperability. In addition, solutions exist for health IT to advance rapidly without adversely affecting future policy choices. A broad national consensus is emerging in support of advancing health IT to enable the transformation of health and care.

Not only will healthcare investments in information technology (IT) continue, they are sure to increase. Just as other industries learned over time how to extract more value from IT investments, so too will the healthcare industry, and for the same reason: because they must. This article explores the types of business value IT has generated in other industries, what value it can generate in healthcare, and some of the barriers encountered in achieving that value. The article ends with management principles for IT investment.

Developers of space systems must deal with an increasing amount of information in responding to extensive requirements and standards from numerous sources. Accessing these requirements and standards, understanding them, comparing them, negotiating them and responding to them is often an overwhelming task. There are resources to aid the space systems developer, such as lessons learned and best practices; again, though, accessing, understanding, and using this information is often more difficult than helpful. This results in space systems that: 1. Do not meet all their requirements. 2. Do not incorporate prior engineering experience. 3. Cost more to develop. 4. Take longer to develop. The NASA Technical Standards Program (NTSP) web site at http://standards.nasa.gov has made significant improvements in making standards, lessons learned, and related material available to space systems developers agency-wide. The Standards Advisor was conceived to take the next steps beyond the current product, applying evolving information technology to further improve information delivery to space systems developers. This report describes the features of the Standards Advisor and suggests a technical approach to its development.

The application of information and communication technologies (ICT) in nursing is an integral part of the educational curriculum at the university graduate level of nursing, and is also part of scientific and professional meetings on nursing informatics. As part of seminars, students are required to choose e-health topics from their working environment, present them, and discuss them with colleagues. The same happens at meetings on nursing informatics. Selected papers on the issue are chosen to cover the information literacy of nurses, examples of e-nursing, ICT infrastructure, possible future developments, and organizational aspects of e-health at healthcare institutions. Among other topics, special attention is paid to improving the quality of work in nursing.

Some pre-instructional misconceptions held by children can persist through scientific instruction and resist changes. Identifying these misconceptions would be beneficial for science instruction. In this preliminary study, scores on a 60-item true-false test of knowledge and misconceptions about earthquakes were compared with previous interview…

Future Marine Corps warfighting concepts will make it more difficult to locate casualties, which will complicate casualty evacuation, lengthen casualty wait times, and require infantrymen or corpsmen to provide more extensive treatment. In these future scenarios, information flow and communications will be critical to medical functions. We asked, for Navy medical support to the Marines, what information will future combat medicine require and what technologies should supply those information needs? Based on analyses of patient data streams, focus groups of Navy medical personnel, and our estimates of the cost and feasibility of communications systems, we recommend the following: (1) increase medical training for some fraction of Marines, especially in hemorrhage control; (2) augment corpsmen's training; (3) furnish data systems for evacuation and supply that would provide in-transit visibility and simplify requests; (4) provide all ground medical personnel with access to treatment information systems and limited voice communications; and (5) exploit e-mail systems to reduce reliance on voice communications. Implementation time frames are discussed.

The use of information technology (IT) in dentistry is far-ranging. In order to produce a working document for the dental educator, this paper focuses on those methods where IT can assist in the education and competence development of dental students and dentists (e.g. e-learning, distance learning, simulations and computer-based assessment). Web pages and other information-gathering devices have become an essential part of our daily life, as they provide extensive information on all aspects of our society. This is mirrored in dental education, where there are many different tools available, as listed in this report. IT offers added value to traditional teaching methods, and examples are provided. In spite of the continuing debate on the learning effectiveness of e-learning applications, students request such approaches as an adjunct to the traditional delivery of learning materials. Faculty require support to enable them to use the technology effectively to the benefit of their students. This support should be provided by the institution, and it is suggested that, where possible, institutions should appoint an e-learning champion with good interpersonal skills to support and encourage faculty change. From a global perspective, all students and faculty should have access to e-learning tools. This report encourages open access to e-learning material, platforms and programs. Such learning materials must have well-defined learning objectives and involve peer review to ensure content validity, accuracy, currency, the use of evidence-based data and the use of best practices. To ensure that the developers' intellectual rights are protected, the original content needs to be secure from unauthorized changes. Strategies and recommendations on how to improve the quality of e-learning are outlined. In the area of assessment, traditional examination schemes can be enriched by IT, whilst the Internet can provide many innovative approaches. Future trends in IT will

There is growing recognition that a more sophisticated information technology (IT) infrastructure is needed to improve the quality of nursing home care in the United States. The purpose of this study was to explore the concept of IT sophistication in nursing homes in terms of technological diversity, maturity, and level of integration in resident care, clinical support, and administration. Twelve IT stakeholders were interviewed from 4 nursing homes considered to have high IT sophistication, using focus groups and key informant interviews. Common themes were derived using qualitative analytics and axial coding of field notes collected during interviews and focus groups. Respondents echoed the diversity of the innovative IT systems being implemented; these included resident alerting mechanisms for clinical decision support, enhanced reporting capabilities for patient-provider interactions, remote monitoring, and networking among affiliated providers. Nursing home IT is in its early stages of adoption; early adopters are beginning to realize benefits across clinical domains including resident care, clinical support, and administrative activities. The most important thread emerging from these discussions was the need for further interface development between IT systems to enhance integrity and connectivity. The study shows that some early adopters of sophisticated IT systems in nursing homes are beginning to achieve added benefit for resident care, clinical support, and administrative activities.

Summary Objective No framework exists to identify and study unintended consequences (UICs) with a focus on organizational and social issues (OSIs). To address this shortcoming, we conducted a literature review to develop a framework for considering UICs and health information technology (HIT) from the perspective of OSIs. Methods A literature review was conducted for the period 2000-2015 using the search terms “unintended consequences” and “health information technology”. 67 papers were screened, of which 18 met inclusion criteria. Data extraction was focused on the types of technologies studied, types of UICs identified, and methods of data collection and analysis used. A thematic analysis was used to identify themes related to UICs. Results We identified two overarching themes. One was the definition and terminology of how people classify and discuss UICs. Second was OSIs and UICs. For the OSI theme, we also identified four sub-themes: process change and evolution, individual-collaborative interchange, context of use, and approaches to model, study, and understand UICs. Conclusions While there is a wide body of research on UICs, there is a lack of overall consensus on how they should be classified and reported, limiting our ability to understand the implications of UICs and how to manage them. More mixed-methods research and better proactive identification of UICs remain priorities. Our findings and framework of OSI considerations for studying UICs and HIT extend existing work on HIT and UICs by focusing on organizational and social issues. PMID:27830231

The objective of this paper is to describe some of the major healthcare information technology (IT) infrastructures in Turkey, namely, Sağlık-Net (Turkish for "Health-Net"), the Centralized Hospital Appointment System, the Basic Health Statistics Module, the Core Resources Management System, and the e-prescription system of the Social Security Institution. International collaboration projects that are integrated with Sağlık-Net are also briefly summarized. The authors provide a survey of some of the major healthcare IT infrastructures in Turkey. Sağlık-Net has two main components: the National Health Information System (NHIS) and the Family Medicine Information System (FMIS). The NHIS is a nation-wide infrastructure for sharing patients' Electronic Health Records (EHRs). So far, EHRs of 78.9 million people have been created in the NHIS. Similarly, family medicine is operational in the whole country via FMIS. The Centralized Hospital Appointment System enables citizens to easily make appointments with healthcare providers. The Basic Health Statistics Module is used for collecting information about health status, risks and indicators across the country. The Core Resources Management System speeds up the flow of information between the headquarters and Provincial Health Directorates. The e-prescription system is linked with Sağlık-Net and seamlessly integrated with the healthcare provider information systems. Finally, Turkey is involved in several international projects for experience sharing and disseminating national developments. With the introduction of the "Health Transformation Program" in 2003, a number of successful healthcare IT infrastructures have been developed in Turkey. Currently, work is going on to enhance and further improve their functionality.

Summary Objectives The objective of this paper is to describe some of the major healthcare information technology (IT) infrastructures in Turkey, namely, Sağlık-Net (Turkish for “Health-Net”), the Centralized Hospital Appointment System, the Basic Health Statistics Module, the Core Resources Management System, and the e-prescription system of the Social Security Institution. International collaboration projects that are integrated with Sağlık-Net are also briefly summarized. Methods The authors provide a survey of some of the major healthcare IT infrastructures in Turkey. Results Sağlık-Net has two main components: the National Health Information System (NHIS) and the Family Medicine Information System (FMIS). The NHIS is a nation-wide infrastructure for sharing patients’ Electronic Health Records (EHRs). So far, EHRs of 78.9 million people have been created in the NHIS. Similarly, family medicine is operational in the whole country via FMIS. The Centralized Hospital Appointment System enables citizens to easily make appointments with healthcare providers. The Basic Health Statistics Module is used for collecting information about health status, risks and indicators across the country. The Core Resources Management System speeds up the flow of information between the headquarters and Provincial Health Directorates. The e-prescription system is linked with Sağlık-Net and seamlessly integrated with the healthcare provider information systems. Finally, Turkey is involved in several international projects for experience sharing and disseminating national developments. Conclusion With the introduction of the “Health Transformation Program” in 2003, a number of successful healthcare IT infrastructures have been developed in Turkey. Currently, work is going on to enhance and further improve their functionality. PMID:24853036

The availability of political information throughout society made possible by the evolution of contemporary information communication technology has precipitated conflicting debate regarding the effects of technology use on real life political participation. Proponents of technology argue that the use of new information technology stimulates…

The public sector plays an important role in promoting child health information technology. Public sector support is essential in 5 main aspects of child health information technology, namely, data standards, pediatric functions in health information systems, privacy policies, research and implementation funding, and incentives for technology adoption. Some innovations in health information technology for adult populations can be transferred to or adapted for children, but there also are unique needs in the pediatric population. Development of health information technology that addresses children's needs and effective adoption of that technology are critical for US children to receive care of the highest possible quality in the future.

This paper surveys more than 70 articles published in the IEEE Multimedia journal and other journals. The survey summarizes aspects of multimedia information technology and categorizes application areas of multimedia information technology and interesting research areas related to it.

An interdisciplinary glossary is proposed to ensure mutual understanding among specialists from various fields of science and technology. The glossary is designed using information technologies. The field of information technologies itself is considered, as it is necessary for the understanding and cooperation of specialists in various areas. Technological solutions and applications for multi-disciplinary areas, and results of testing of the developed techniques, are presented.

This book examines the intersection of information technologies, power, people, and bodies. It explores how information technologies are on a path of creating efficiency, productivity, profitability, surveillance, and control, and looks at the ways in which human-machine interface technologies, such as wearable computers, biometric technologies,…

The ever-increasing data demands in a radiation oncology (RO) clinic require medical physicists to have a clearer understanding of the information technology (IT) resource management issues. Clear lines of collaboration and communication among administrators, medical physicists, IT staff, equipment service engineers and vendors need to be established. In order to develop a better understanding of the clinical needs and responsibilities of these various groups, an overview of the role of IT in RO is provided. This is followed by a list of IT related tasks and a resource map. The skill set and knowledge required to implement these tasks are described for the various RO professionals. Finally, various models for assessing one's IT resource needs are described. The exposition of ideas in this white paper is intended to be broad, in order to raise the level of awareness of the RO community; the details behind these concepts will not be given here and are best left to future task group reports.

The focus of the May-June issue is on training and information technology. Major articles/reports in this issue include: Communicating effectively, by Alain Bucaille, AREVA; Reputation management, by Susan Brisset, Bruce Power; Control room and HSI modernization guidance, by Joseph Naser, EPRI; How far are we from public acceptance, by Jennifer A. Biedscheid and Murthy Devarakonda, Washington TRU Solutions LLC; Spent fuel management options, by Brent W. Dixon and Steven J. Piet, Idaho National Laboratory; Industry Awards; A secure energy future for America, by George W. Bush, President, United States of America; Vision of the future of nuclear energy, by Anne Lauvergeon, AREVA; and, Plant profile: strategy for transition to digital, TXU Power.

Humanism in medicine is defined as health care providers' attitudes and actions that demonstrate respect for patients' values and concerns in relation to their social, psychological and spiritual life domains. Specifically, humanistic clinical medicine involves showing respect for the patient, building a personal connection, and eliciting and addressing a patient's emotional response to illness. Health information technology (IT) often interferes with humanistic clinical practice, potentially disabling these core aspects of the therapeutic patient-physician relationship. Health IT has evolved rapidly in recent years, and the imperative to maintain humanism in practice has never been greater. In this vision paper, we aim to discuss why preserving humanism is imperative in the design and implementation of health IT systems.

We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
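The time-dependent probabilities that OEF disseminates are typically driven by aftershock-clustering statistics. As a hedged illustration (not the specific method of the work above), here is a minimal pure-Python sketch of a Reasenberg-Jones-style calculation: an Omori-law aftershock rate is integrated over a forecast window and converted to an exceedance probability under a Poisson assumption. All parameter values are generic illustrative choices, not calibrated results.

```python
import math

def aftershock_rate(t, mag_main, mag_min, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones-style rate (events/day with M >= mag_min) at t days
    after a mainshock of magnitude mag_main. Parameters are generic
    illustrative values, not tuned to any region."""
    return 10 ** (a + b * (mag_main - mag_min)) * (t + c) ** (-p)

def exceedance_probability(t_start, t_end, mag_main, mag_min, steps=1000):
    """P(at least one aftershock with M >= mag_min in [t_start, t_end]),
    assuming a Poisson process: P = 1 - exp(-expected count).
    The expected count is the rate integrated numerically (trapezoid)."""
    dt = (t_end - t_start) / steps
    expected = 0.0
    for i in range(steps):
        t0 = t_start + i * dt
        t1 = t0 + dt
        expected += 0.5 * (aftershock_rate(t0, mag_main, mag_min)
                           + aftershock_rate(t1, mag_main, mag_min)) * dt
    return 1.0 - math.exp(-expected)

# Probability of an M>=5 aftershock in the week after an M7 mainshock,
# versus the same-length window starting a month later (clustering decays):
p_week1 = exceedance_probability(0.01, 7.0, 7.0, 5.0)
p_later = exceedance_probability(30.0, 37.0, 7.0, 5.0)
```

The Omori-law decay is what makes the probabilities "rise and fall over periods of days to years": the first-week probability is near certainty, while the same window a month later is far lower.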

The Airport Surface Movement Area is but one of the actions taken to increase the capacity and safety of existing airport facilities. The System Integration Branch (SIB) has designed an integrated system consisting of an electronic moving display in the cockpit, which includes a display of taxi routes and will automatically warn controllers and pilots of the position of other traffic. Although this system has proven to be accurate and helpful in test simulation, the initial process of obtaining an airport layout of the taxi routes and designing each of them is very tedious and time-consuming. Other methods of preparing the display maps are being researched. One such method is the use of a Geographical Information System (GIS). GIS is an integrated system of computer hardware and software linking topographical, demographic and other resource data that is being referenced. The software can support many areas of work with virtually unlimited information compatibility due to the system's open architecture. GIS will allow us to work faster with increased efficiency and accuracy while providing decision-making capabilities. GIS is currently being used at the Langley Research Center for other applications and has been validated as an accurate system for those tasks. GIS usage for our task will involve digitizing aerial photographs of the topology for each taxi runway and identifying each position according to its specific spatial coordinates. The information currently being used can be integrated with the GIS system, due to its ability to provide a wide variety of user interfaces. Much more research and data analysis will be needed before this technique is used; however, we are hopeful it will lead to better usage of manpower and technological capabilities in the future.

During the Medical Council of India's Platinum Jubilee Year (1933-2008) celebrations in 2008, several scientific meetings, seminars and symposia on various topics of contemporary importance and relevance in the field of medical education and ethics were organized by different medical colleges at local, state and national levels. The present discussion is a comprehensive summary of various aspects of "Medical Information Communication Technology", especially useful for an audience of medical and paramedical staff with no previous working knowledge of computer applications. It outlines (i) administrative applications, such as medical records; (ii) clinical applications, such as the prospective scope of telemedicine; and (iii) other applications, such as efforts to improve medical education, medical presentations and medical research. Medical transcription and related recent fields of study (e.g. modern pharmaceuticals, bio-engineering, bio-mechanics, bio-technology), along with important general aspects of computers and computer ergonomics, are assembled to summarize awareness of the basic fundamentals of medical computing and its practically successful uses.

Health information technology (HIT) purports to increase quality and efficiency in health care organizations. However, health care organizations are situated in constantly changing environments. They need dynamic capabilities to implement HIT effectively. This article builds on the dynamic capabilities perspective and generates propositions about implementing HIT in dynamic environments. Specifically, I identify (1) the necessary resources and capabilities for organizations to implement HIT; (2) the organizational capabilities and benefits that can be enhanced by HIT; and (3) the similarities and differences between three distinct forms of HIT. I synthesized the literature on dynamic capabilities and HIT to identify dynamic capabilities that are associated with (1) electronic medical records, (2) telemedicine, and (3) social media. In addition, I discuss the benefits of these HITs for improving the dynamic capabilities of health care organizations. PROPOSITIONS/FINDINGS: This article generates three sets of propositions that can be tested empirically. First, I am concerned with how organizational size and human resources affect successful implementation of HIT. In addition, I argue that three technology-specific factors--hospital type, medical specialty, and socially desirable technical features--may affect the implementation of HIT. To cope with constantly changing environmental pressures, health administrators need to deploy, modify, and/or acquire organizational resources skillfully. Practitioners need to identify dynamic capabilities to support specific forms of HIT and understand how HIT enables health care organizations in turn. The concept of evolutionary fitness in the dynamic capabilities perspective may be developed to measure HIT implementation.

Ever since the Y2K scare, boards have grown increasingly nervous about corporate dependence on information technology. Since then, computer crashes, denial of service attacks, competitive pressures, and the need to automate compliance with government regulations have heightened board sensitivity to IT risk. Unfortunately, most boards remain largely in the dark when it comes to IT spending and strategy, despite the fact that corporate information assets can account for more than 50% of capital spending. A lack of board oversight for IT activities is dangerous, the authors say. It puts firms at risk in the same way that failing to audit their books would. Companies that have established board-level IT governance committees are better able to control IT project costs and carve out competitive advantage. But there is no one-size-fits-all model for board supervision of a company's IT operations. The correct approach depends on what strategic "mode" a company is in: whether its operations are extremely dependent on IT or not, and whether or not it relies heavily on keeping up with the latest technologies. This article spells out the conditions under which boards need to change their level of involvement in IT decisions, explaining how members can recognize their firms' IT risks and decide whether they should pursue more aggressive IT governance. The authors delineate what an IT governance committee should look like in terms of charter, membership, duties, and overall agenda. They also offer recommendations for developing IT policies that take into account an organization's operational and strategic needs and suggest what to do when those needs change. Given the dizzying pace of change in the world of IT, boards can't afford to ignore the state of their IT systems and capabilities. Appropriate board governance can go a long way toward helping a company avoid unnecessary risk and improve its competitive position.

... Information Technology Agreement: Advice and Information on the Proposed Expansion: Part 1; The Information Technology Agreement: Advice and Information on the Proposed Expansion: Part 2 AGENCY: United States... Technology Agreement: Advice and Information on the Proposed Expansion: Part 1, and investigation No. 332-536...

This paper examines the relationship between democratic citizenship and information technology. Modern information technology disputes the idea that citizens can be properly educated to assume the burdens necessary to reap the blessing of freedom. Information technologies challenge the ability of citizens to fulfill the fundamental requirement of…

... Electronic and information technology. As prescribed in 48 CFR 1339.270(a), insert the following provision: Electronic and Information Technology (APR 2010) (a) To be considered eligible for award, offerors must propose electronic and information technology (EIT) that meet the applicable Access Board accessibility...

A Newfoundland study examining how information technologies affect teaching interviewed 13 teachers at a high school that leads in the use of information technology. Teachers used information technology to interact on a global basis, expand resources, enhance local content, and customize material. Problems included the need for training, information…

College administrators are offered a series of questions to ask in evaluating the appropriateness of information technology for their campuses. Issues addressed include defining institutional goals and the role of information technology in them, determining the most effective organization of information resources and technology, and allocation of…

ABSTRACT Following the 7.8 magnitude earthquake that struck Ecuador on 16 April 2016, multiple salient public health concerns were raised, including the need to provide mental health and psychosocial support for individual survivors and their communities. The World Health Organization and the United Nations High Commissioner for Refugees recommend conducting a desk review to summarize existing information, specific to the affected communities, that will support timely, culturally-attuned assessment and delivery of mental health and psychosocial support shortly after the onset of a disaster or humanitarian emergency. The desk review is one component of a comprehensive toolkit designed to inform and support humanitarian actors and their responders in the field. This commentary provides a case example of the development of a desk review that was used to inform personnel responding to the 2016 earthquake in Ecuador. The desk review process is described in addition to several innovations that were introduced to the process during this iteration. Strengths and limitations are discussed, as well as lessons learned and recommendations for future applications. PMID:28265485

Abstract Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case. PMID:9067877

Contracting to provide technological information (TI) is a significant challenge. TI is an unusual commodity in five ways. (i) TI is difficult to count and value; conventional indicators, such as patents and citations, hardly indicate value. TI is often sold at different prices to different parties. (ii) To value TI, it may be necessary to "give away the secret." This danger, despite nondisclosure agreements, inhibits efforts to market TI. (iii) To prove its value, TI is often bundled into complete products, such as a computer chip or pharmaceutical product. Efficient exchange, by contrast, would involve merely the raw information. (iv) Sellers' superior knowledge about TI's value makes buyers wary of overpaying. (v) Inefficient contracts are often designed to secure rents from TI. For example, licensing agreements charge more than marginal cost. These contracting difficulties affect the way TI is produced, encouraging self-reliance. This should be an advantage to large firms. However, small research and development firms spend more per employee than large firms, and nonprofit universities are major producers. Networks of organizational relationships, particularly between universities and industry, are critical in transmitting TI. Implicit barter--money for guidance--is common. Property rights for TI are hard to establish. Patents, quite suitable for better mousetraps, are inadequate for an era when we design better mice. Much TI is not patented, and what is patented sets fuzzy demarcations. New organizational forms are a promising approach to contracting difficulties for TI. Webs of relationships, formal and informal, involving universities, start-up firms, corporate giants, and venture capitalists play a major role in facilitating the production and spread of TI.

The Great East Japan Earthquake of March 11, 2011 caused extensive damage over a widespread area. Our hospital library, which is located in the affected area, was no exception. A large collection of books was lost, and some web content was inaccessible due to damage to the network environment. This greatly hindered our efforts to continue providing post-disaster medical information services. Information support, such as free access to databases, journals, and other online content related to the disaster areas, helped us immensely during this time. We were fortunate to have the cooperation of various medical employees and library members via social networks, such as Twitter, during the process of attaining this information support.

This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context

Though information technology adoptions have always been referred to as innovations in firms, much of the business value literature has concentrated on the tangible and immediately measurable impacts of information technology (IT) adoptions. This study aims to explore the impact of information technology investments on the innovativeness of a…

Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey (SATREPS Project: Science and Technology Research Partnership for Sustainable Development by JICA-JST). Yoshiyuki KANEDA, Disaster Mitigation Center, Nagoya University / Japan Agency for Marine-Earth Science and Technology (JAMSTEC); Mustafa ELDIK, Boğaziçi University, Kandilli Observatory and Earthquake Research Institute (KOERI); and members of the SATREPS Japan-Turkey project. The target of this project is the anticipated Marmara Sea earthquake, following the 1999 Izmit (Kocaeli) earthquake along the North Anatolian Fault. Historical earthquake epicenters have migrated from east to west along the North Anatolian Fault, leaving a seismic gap in the Marmara Sea. The Marmara region includes Istanbul, a city with a very large population, much like Tokyo. Japan and Turkey can therefore share their experiences of past damaging earthquakes and prepare for future large earthquakes and tsunamis in cooperation with each other through the SATREPS project. The project comprises multidisciplinary research, including observational, simulation and educational studies, with the following goals: ① to develop disaster mitigation policies and strategies based on multidisciplinary research activities; ② to provide decision makers with newly found knowledge for its implementation in current regulations; ③ to organize disaster education programs in order to increase disaster awareness in Turkey; ④ to contribute to the evaluation of active fault studies in Japan. In this SATREPS project, we will integrate multidisciplinary research results for disaster mitigation in the Marmara region and disaster education in Turkey.

The fields of Computer Information Systems (CIS) and Information Technology (IT) are experiencing rapid change. In 2003, an analysis of IT degree programs and those of competing disciplines at 10 post-secondary institutions concluded that an information technology program is perceived differently from information systems and computer science. In…

Information technology infuses all aspects of modern life, and the growth of digital information continues at an unprecedented rate. Widely influential documents, such as the National Research Council's "Being Fluent with Information Technology" and the American Library Association's "Information Literacy Competency Standards for Higher…

Many structures in Indonesia use reinforced concrete frames with brick walls as infill. Engineers commonly treat the brick walls as partitions and count them as non-structural elements in the structural design. However, brick walls can resist earthquake loads by adding substantial stiffness to the structure when they are well integrated with the frames. This reduces the non-structural damage to structures that is one of the most frequent impacts of earthquakes. This paper examines the effects of treating brick walls as structural elements by comparing such a structure with one in which the brick walls serve only as partitions. The brick walls are modeled with the equivalent strut method, and the seismic analysis uses the response spectrum method. Including the brick walls reduces the natural period by up to 42%. It also reduces the structural displacements by 53% in the X-direction and 67% in the Y-direction, and the story drifts by 57% in the X-direction and 71% in the Y-direction. In exchange, the base shear increases by only up to 3% in the X-direction and 7% in the Y-direction.
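The reported period reduction follows directly from the single-degree-of-freedom relation T = 2π√(m/k): infill struts act as springs in parallel with the frame, so their stiffnesses add, and tripling the total lateral stiffness shortens the period by 1 − 1/√3 ≈ 42%, the order of the reduction quoted above. A minimal sketch, with all mass and stiffness values hypothetical:

```python
import math

def natural_period(mass, stiffness):
    """Fundamental period of an SDOF idealization: T = 2*pi*sqrt(m/k)."""
    return 2.0 * math.pi * math.sqrt(mass / stiffness)

# Hypothetical values: bare frame vs. frame with brick infill modeled as an
# equivalent diagonal strut adding lateral stiffness in parallel.
m = 200e3          # storey mass, kg (assumed)
k_frame = 8.0e6    # bare-frame lateral stiffness, N/m (assumed)
k_infill = 16.0e6  # added equivalent-strut stiffness, N/m (assumed)

T_bare = natural_period(m, k_frame)
T_infilled = natural_period(m, k_frame + k_infill)  # stiffnesses add in parallel
reduction = 1.0 - T_infilled / T_bare
# With k tripled, T drops by a factor 1/sqrt(3), i.e. a ~42% reduction.
```

A shorter period generally moves the structure to a different part of the response spectrum, which is why the displacement and drift reductions can be large while the base shear changes only slightly.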

Executive Summary The Interagency Biological Restoration Demonstration—a program jointly funded by the Department of Defense's Defense Threat Reduction Agency and the Department of Homeland Security's (DHS's) Science and Technology Directorate—is developing policies, methods, plans, and applied technologies to restore large urban areas, critical infrastructures, and Department of Defense installations following the intentional release of a biological agent (anthrax) by terrorists. There is a perception that there should be a common system that can share information both vertically and horizontally amongst participating organizations as well as support analyses. A key question is: "How far away from this are we?" As part of this program, Pacific Northwest National Laboratory conducted research to identify the current information technology tools that would be used by organizations in the greater Seattle urban area in such a scenario, to define criteria for use in evaluating information technology tools, and to identify current gaps. Researchers interviewed 28 individuals representing 25 agencies in civilian and military organizations to identify the tools they currently use to capture data needed to support operations and decision making. The organizations can be grouped into five broad categories: defense (Department of Defense), environmental/ecological (Environmental Protection Agency/Ecology), public health and medical services, emergency management, and critical infrastructure. The types of information that would be communicated in a biological terrorism incident include critical infrastructure and resource status, safety and protection information, laboratory test results, and general emergency information. The most commonly used tools are WebEOC (web-enabled crisis information management systems with real-time information sharing), mass notification software, resource tracking software, and NW WARN (web-based information to protect critical

technology-based learning generally) may be exciting technically, it does not automatically lead to better educational programs. Good instructional design...expected to act on the first Aircraft Carrier to attempt substantial manning reductions if nothing is learned from Smart Ship. Beyond the technologies ... technology of the day. Many of the lessons learned then are in use today. However, technology breakthroughs we are now experiencing invite us to

Concepts related to consumption have shifted to include social processes not previously covered by traditional categories. The current review analyzes the application of classical concepts of consumerism to practices recently identified in the health field, like the phenomenon of cyberchondria. The theoretical challenge relates to the difficulty in extrapolating from the economic perspectives of consumerism to self-care issues in the context of information and communication technologies (ICTs). Drawing on recent anthropological categories, the study seeks to understand the phenomenon of self-care commodification under the imperative of self-accountability for health. New consumer identities are described in light of the unprecedented issues concerning technical improvements currently altering the nature of self-care. The study concludes that health is consumed as vitality, broken down into commercial artifacts in the context of a new bioeconomy - no longer linked to the idea of emulation and possession, but to forms of self-perception and self-care in the face of multiple risks and new definitions of the human being.

This paper seeks to examine the extent to which technological advances can enhance inter-organizational information sharing in disaster relief. Our case is the Virtual OSOCC (On-Site Operations Coordination Centre), which is part of the Global Disaster Alert and Coordination System (GDACS) under the United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA). The online platform, which has been in development for more than a decade, provides a unique insight into coordination behaviour among disaster management agencies and individual actors. We build our study on the analysis of a complete database of user interaction, including more than 20,000 users and 11,000 comments spread across approximately 300 disaster events. Controlling for types and severities of the events, location-specific vulnerabilities, and the overall trends, we find that the introduction of new features has led to increases in user activity. We supplement the data-driven approach with evidence from semi-structured interviews with administrators and key users, as well as a survey among all users specifically designed to capture and assess the elements highlighted by both interviews and data analysis. PMID:27584053

It appears that thinking about networked welfare services, and the implementation of network projects, merely follows the development of data-transfer possibilities. The danger is that a seamless chain of care in health care becomes just a data-transfer generator built on easy connections, creating needs only for further data transfer. This is an 'illusion of core skills' that does not extend to developing the content of services. Easy access to the system generates more contacts and thus more demand, including for clinical services. New data-transfer needs burden the personnel with unnecessary information, and the networked functional model does not free them to use their substantive skills. The result is higher costs, and there is also a danger that normal life will be medicalized. The public sector cannot finance all these new possibilities and consequences of modern technology. Will all this create a new combination of the public and private sectors and push them to allocate responsibilities in development work? If the public and private sectors do not find a balance in controlling this development, actors outside health care will also come to influence the choices, and health care will lose its autonomy, becoming a business vehicle for companies that produce data-transfer and network services. From a prioritization point of view, this is not a good vision for the financing and delivery of health care services in either the public or the private sector.

The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8, and six earthquakes larger than Mw 8.5, since 2004 has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843 Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that seismic hazard and/or the maximum possible magnitude is likely significantly underestimated in many regions, including parts of the Caribbean, unless the best available catalogs of historical earthquakes are incorporated.

An evidence report was prepared to assess the evidence base regarding the benefits and costs of health information technology (HIT) systems, that is, the value of discrete HIT functions and systems in various healthcare settings, particularly those providing pediatric care. PubMed, the Cochrane Controlled Clinical Trials Register, and the Cochrane Database of Reviews of Effectiveness (DARE) were electronically searched for articles published since 1995. Several reports prepared by private industry were also reviewed. Of 855 studies screened, 256 were included in the final analyses. These included systematic reviews, meta-analyses, studies that tested a hypothesis, and predictive analyses. Each article was reviewed independently by two reviewers; disagreement was resolved by consensus. Of the 256 studies, 156 concerned decision support, 84 assessed the electronic medical record, and 30 were about computerized physician order entry (categories are not mutually exclusive). One hundred twenty-four of the studies assessed the effect of the HIT system in the outpatient or ambulatory setting; 82 assessed its use in the hospital or inpatient setting. Ninety-seven studies used a randomized design. There were 11 other controlled clinical trials, 33 studies using a pre-post design, and 20 studies using a time series. Another 17 were case studies with a concurrent control. Of the 211 hypothesis-testing studies, 82 contained at least some cost data. We identified no study or collection of studies, outside of those from a handful of HIT leaders, that would allow a reader to determine how generalizable the reported benefits are. Besides these studies from HIT leaders, no other research assessed HIT systems that had comprehensive functionality and included data on costs, relevant information on organizational context and process change, and data on implementation. A small body of literature supports a role for HIT in improving the quality of pediatric

The development of learning technology today has a direct impact on improving teachers' information technology competence. This paper presents the results of research on teachers' information technology competence. The study was conducted as a survey of some 245 vocational high school teachers. There are two types of instrument…

The purpose of the research is to analyze the value addition in students' information communication and technology (ICT) literacy level and confidence in using technology after completing a general education information technology course at a four-year university. An online survey was created to examine students' perceptions. The findings revealed…

SAIC’s highly experienced team has developed technology, techniques and expertise in protecting these information assets from electronic attack by...criminals, terrorists, hackers or nation states. INFORMATION PROTECTION ENGINEERING: Using Technology and Experience to Protect Assets William J. Marlow... Engineering: Using Technology and Experience to Protect Assets Contract or Grant Number Program Element Number Authors Marlow, William J. Project

The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Information Technology Extension Program ACTION: Public Notice. SUMMARY: This notice announces changes to the Health Information Technology Extension... technology, as authorized under section 3012(c) of the Public Health Service Act, as added by the American...

... technology. 542.306 Section 542.306 Money and Finance: Treasury Regulations Relating to Money and Finance... Definitions § 542.306 Information and communications technology. The term information and communications technology means any hardware, software, or other product or service primarily intended to fulfill or enable...

Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt et al., 2008). The…

The Defense Technical Information Center (DTIC), an organization charged with providing information services to the Department of Defense (DoD) scientific and technical community, actively seeks ways to promote resource sharing as a means for speeding access to information while reducing the costs of information processing throughout the technical…

Recently, the importance of engineering education has increased with the development of information technology (IT). Developing information literacy education is important for dealing with new IT in education at colleges of technology. Our group investigated the current state of information literacy education at colleges of technology in the Kyushu area and in secondary education. In addition, we investigated the talent that industry requests. From these investigation results, this paper proposes cooperation with elementary and secondary education, enhancement of intellectual property education, introduction of information ethics education, introduction of career education, and enhancement of PBL in information literacy education at colleges of technology.

My summer internship was spent supporting various projects within the Environmental Management Office and Glenn Safety Office. Mentored by Eli Abumeri, I was trained in areas of Information Technology such as servers, printers, scanners, CAD systems, the Web, programming, database management, and ODIN (networking, computers, and phones). I worked closely with the Chemical Sampling and Analysis Team (CSAT) to redesign a database to more efficiently manage and maintain data collected for the Drinking Water Program. This program has been established for over fifteen years here at the Glenn Research Center. It involves the continued testing and retesting of all drinking water dispensers. The quality of the drinking water is of great importance and is determined by comparing the concentration of contaminants in the water with specifications set forth by the Environmental Protection Agency (EPA) in the Safe Drinking Water Act (SDWA) and its 1986 and 1991 amendments. The Drinking Water Program consists of periodic testing of all drinking water fountains and sinks. Each is tested at least once every 2 years for contaminants and naturally occurring species. The EPA's protocol is to collect an initial and a 5-minute draw from each dispenser. The 5-minute draw is what is used for the maximum contaminant level. However, the CSAT has added a 30-second draw, since most individuals do not run the water 5 minutes prior to drinking. This data is then entered into a relational Microsoft Access database. The database allows for the quick retrieval of any test(s) done on any dispenser. The data can be queried by building number, date, or test type, and test results are documented in an analytical report for employees to read. To aid with the tracking of recycled materials within the lab, my help was enlisted to create a database that could make this process less cumbersome and more efficient. It records the date of pickup, type of material, weight received, and unit cost per recyclable. This
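A minimal sketch of the kind of relational store and queries described, using SQLite in place of Access. The table name, columns, and readings are hypothetical, and 15 ppb (the EPA action level for lead) is used here only as an illustrative threshold.

```python
import sqlite3

# Hypothetical schema mirroring the described database: one row per draw
# from one dispenser, keyed by building, date, and test type.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE water_test (
    building   TEXT,
    test_date  TEXT,
    test_type  TEXT,    -- e.g. 'lead'
    draw       TEXT,    -- 'initial', '30s', or '5min'
    result_ppb REAL)""")
conn.executemany(
    "INSERT INTO water_test VALUES (?, ?, ?, ?, ?)",
    [("Bldg 5", "2004-06-01", "lead", "initial", 18.0),
     ("Bldg 5", "2004-06-01", "lead", "5min", 9.5),
     ("Bldg 7", "2004-06-02", "lead", "5min", 22.0)])

# As described, the 5-minute draw is what gets compared against the
# maximum contaminant level; query by test type and draw.
rows = conn.execute(
    "SELECT building, result_ppb FROM water_test "
    "WHERE test_type = 'lead' AND draw = '5min' AND result_ppb > 15.0").fetchall()
print(rows)  # -> [('Bldg 7', 22.0)]
```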

Rationale for and development of an information technology architecture are presented. The architectural approach described produces a technology environment that is integrating, flexible, robust, productive, and future-oriented. Issues accompanying architecture development and potential impediments to success are discussed.

Information technology is nearly ubiquitous in health care settings. Nurses need basic computer skills and information literacy to practice nursing effectively. In addition, nurses must be prepared not only to work around complex health information technology, but also to communicate with individuals who can address the underlying problems.

The increasing complexity and sophistication of ever-evolving information technologies have spurred unique and unprecedented challenges for organizations to protect their information assets. Companies suffer significant financial and reputational damage due to ineffective information technology security management, which has extensively been shown to…

Since the original Institute of Medicine (IOM) report was published, there has been an accelerated development and adoption of health information technology, with varying degrees of evidence about the impact of health information technology on patient safety. This article is intended to review the currently available scientific evidence on the impact of different health information technologies on improving patient safety outcomes. We conclude that health information technology improves patient safety by reducing medication errors, reducing adverse drug reactions, and improving compliance with practice guidelines. There should be no doubt that health information technology is an important tool for improving healthcare quality and safety. Healthcare organizations need to be selective in which technology to invest in, as the literature shows that some technologies have limited evidence in improving patient safety outcomes.

A survey was conducted about nursing information in volunteer activities of nursing faculty members and students after the Great East Japan Earthquake. Results indicated that it was important to attempt collecting information in every possible way and to always be prepared. During activities, it is important to record information, to share information with individuals other than nursing professionals and to make good use of it.

During this early stage of HIT adoption, it is critical that we engage in discussions regarding informed consent's proper role in a health care environment in which electronic information sharing holds primary importance. This article discusses current implementation of the doctrine within health information exchange networks; the relationship between informed consent and privacy; the variety of ways that the concept is referenced in discussions of information sharing; and challenges that surround incorporation of the doctrine into the evolving HIT environment. The article concludes by reviewing the purpose behind the traditional obligation to obtain informed consent and the possibility of maintaining its relevance in the new environment.

This research aimed to explore information and communication technology (ICT) coordinators' discourse in relation to ICT integration in a sample of Irish post-primary schools. As ICT leaders in their schools, how they conceptualise ICT significantly influences school-based policy and use. The research involved semi-structured interviews with a…

... conduct the 2010 through 2012 Information and Communication Technology Survey (ICTS). The annual survey... payments) for four types of information and communication technology equipment and software (computers and... through the use of automated collection techniques or other forms of information technology. Comments...

Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Science's Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.
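The few seconds of warning described by the report come from the lag between the fast P wave, which triggers detection, and the slower, more damaging S wave. A minimal sketch, assuming typical crustal velocities of 6.0 km/s (P) and 3.5 km/s (S):

```python
def s_minus_p_lag(distance_km, vp=6.0, vs=3.5):
    """Seconds between P-wave arrival and S-wave arrival at a site
    `distance_km` from the hypocenter; velocities in km/s are typical
    assumed crustal values, not measured ones."""
    return distance_km / vs - distance_km / vp

# A site 50 km from the rupture gets roughly 6 seconds between the two
# arrivals.
print(round(s_minus_p_lag(50.0), 1))  # -> 6.0
```

This is an upper bound on the usable warning time; detection and processing delays subtract from it.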

Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of the country of origin, objective, methodology, and time horizon. The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that papers related to forecasting the future of health information technology were mostly literature reviews, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are set to be achieved. In the field of health information technology, the results of foresight studies can help to establish realistic long-term expectations of the future of health information

Lightening the load: Toning the Marine Corps’ Information Technology EWS Contemporary Issues Paper Submitted by Captain Robert...TITLE AND SUBTITLE Lightening the Load: Toning the Marine Corps’ Information Technology 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT...far behind the other services Marine Corps’ information technology was and the first time I was ever embarrassed by that knowledge. Today the

Introduction Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. Method This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of the country of origin, objective, methodology, and time horizon. Results The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that papers related to forecasting the future of health information technology were mostly literature reviews, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. Conclusion To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are set to be achieved. In the field of health information technology, the results of foresight studies can help to establish realistic long

This article is an analysis of the published research on Health Information Technology Education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper has identified intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing concepts and domains of health information systems, technology, and management to contemporary issues such as education, outsourcing, web services, and security. The research findings have implications for educators, researchers, and journals.

New technologies such as audiotex and videotex raise public policy issues related to access, use of a common carrier network, basic versus enhanced services, financing the system, government responsibility, the Bell companies as providers, consumer protection, and privacy. (SK)

This essay explores how ethics, computing, and health care intersect in medical informatics. It discusses the power technology places in the hands of health care professionals and the ethical problems they may encounter as a result of that power.

Biomedicine is a branch of medicine that studies the human body, its structure and function in health and disease, pathological conditions, and methods of diagnosis, treatment, and correction [1]. At present, biomedicine makes extensive use of modern technical equipment to solve its diverse problems associated with data collection, storage, and analysis, and with process modeling. The goal of this article is to make a brief analysis of existing technologies (big data, mobile, and cloud technologies) in terms of their applicability to the needs of biomedicine.

This study aims to design and build a mobile-based dictionary of information and communication technology, an application that provides access to a glossary of terms in the context of information and communication technologies. The application built in this study uses the Android platform with an SQLite database. The research uses the prototype development method, which covers the stages of Communication, Quick Plan, Quick Design Modeling, Construction of Prototype, Deployment Delivery & Feedback, and Full System Transformation. The application is designed to help the user learn and understand new terms or vocabulary encountered in the world of information and communication technology. The mobile-based dictionary that has been built can serve as an alternative learning resource; in its simplest form, it meets the need for a comprehensive and accurate dictionary of information and communication technology.
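A minimal sketch of the glossary lookup such an app performs against its SQLite store. The schema, sample terms, and prefix-search behaviour are assumptions for illustration, not the application's actual design.

```python
import sqlite3

# Hypothetical glossary table: one row per term with its definition.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE glossary (term TEXT PRIMARY KEY, definition TEXT)")
db.executemany("INSERT INTO glossary VALUES (?, ?)",
               [("latency", "Delay between a request and its response."),
                ("bandwidth", "Maximum data-transfer rate of a link.")])

def lookup(prefix):
    """Prefix search, as a dictionary UI's search box would issue it."""
    return db.execute(
        "SELECT term, definition FROM glossary WHERE term LIKE ? ORDER BY term",
        (prefix + "%",)).fetchall()

print([term for term, _ in lookup("lat")])  # -> ['latency']
```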

Examines current trends in data collection and information use in human services organizations. Describes issues for managers who are planning information systems, including practitioner resistance to automation. Proposes that conceptual integration of agendas for human services automation, practice evaluation, and service effectiveness enables…

... Earthquake Hazards Reduction Meeting AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction... sent to National Earthquake Hazards Reduction Program Director, National Institute of Standards and...

This site contains information on ambient air quality monitoring programs, monitoring methods, quality assurance and control procedures, and federal regulations related to ambient air quality monitoring.

The management of chronic diseases such as diabetes is becoming a crucial issue in developed countries. Innovative communication technologies should now be included as new partners in the health care system. These technologies can help both in managing patients and measuring quality of care. Internet-based health promotion programs may improve compliance with treatment. Decision systems are available on the Net to help patients monitoring their diet and insulin doses. The use of electronic medical record shared on Internet may help both physicians and patients to monitor on the long term the effect of interventions. It is now time to perform appropriate trials to determine, beside other interventions, the precise role of innovative communication technologies in diabetes management.

The author reviews the history and technology of the microcomputer and discusses the various classes of software that are presently available. Three major categories of software are described: numeric data processing, text processing, and communications. The application of this software to psychiatric education and practice is briefly discussed. A short curriculum on computers for psychiatric residents is outlined, and a brief bibliography of the recent relevant literature on computer applications to medicine and psychiatry is presented. Predictions are made about the future direction of computer technology and its application to psychiatry.

Using a model of scientific and technical information transfer as a framework, this document focuses on four types of activities: the generation or authorship of telecommunications information and its publication, distribution, and use. Different forms of publication are considered in each functional area, though primary emphasis is on the…

Discusses current issues in the design of information systems, noting contributions from three professions--computer science, human factors engineering, and information science. The eclectic nature of human factors engineering and the difficulty of drawing together studies with human engineering or software psychological components from diverse…

This doctoral dissertation is focused on both empirical and conceptual contributions relative to the roles social technologies play in informal knowledge sharing practices, both within and across organizations. Social technologies include (a) traditional social technologies (e.g., email, phone and instant messengers), (b) emerging social…

Attention to integrate technology in teaching and learning has provided a major transformation in the landscape of education. Therefore, many innovations in teaching and learning have been technology-driven. The study attempted to examine what is engineering students' perception regarding the use of Information and Communication Technologies (ICT)…

As of March 2010, there were fourteen Information Technology programs accredited by the Accreditation Board for Engineering and Technology, known as ABET, Inc. (ABET Inc. 2009). ABET Inc. is the only recognized institution for the accreditation of engineering, computing, and technology programs in the U.S. There are currently over 128 U.S. schools…

This paper uses the Massachusetts Institute of Technology (MIT) Management in the 1990s Research Framework as a basis for examining the challenges of managing information technology in higher education, with particular reference to open and distance education. Budgeting, technology trends, centralization, prototyping, staff, and competition versus…

Survey of the history and current development of information technology covers hardware (economies of scale, communications technology, magnetic and optical forms of storage), and the evolution of systems software ("tool" software, applications software, and nonprocedural languages). The effect of new computer technologies on human…

With advances in information technology, the velocity of information production on the global level has expanded as well. This acceleration has led to the delegitimizing of knowledge, the equating of information with knowledge, and the giving of predominance to information rather than knowledge. This advance has created epistemological challenges…

This review contains eight papers on topics within the field of information science and technology. The papers are divided into three sections as follows: (1) Planning Information Systems and Services, including "Information Ownership and Control" (Tomas A. Lipinski); and "Pricing and Marketing Online Information Services" (Sheila Anne Elizabeth…

The shift to longitudinal, comprehensive electronic health records (EHRs) means that any health care provider (e.g., dentist, pharmacist, physical therapist) or third-party user of the EHR (e.g., employer, life insurer) will be able to access much health information of questionable clinical utility and possibly of great sensitivity. Genetic test results, reproductive health, mental health, substance abuse, and domestic violence are examples of sensitive information that many patients would not want routinely available. The likely policy response is to give patients the ability to segment information in their EHRs and to sequester certain types of sensitive information, thereby limiting routine access to the totality of a patient's health record. This article explores the likely effect on the physician-patient relationship of patient-directed sequestration of sensitive health information, including the ethical and legal consequences.

Technology in secondary schools has become of increasing interest as the power of the microchip has developed. For the students of Mathematics, computers and handheld graphic calculators need to be accessible to all. They are relevant to the needs of the students' courses and to support and develop their Mathematical learning (Smith, 1997).…

In this inaugural year of the faculty technology study, EDUCAUSE Center for Analysis and Research (ECAR) partnered with 151 college/university sites yielding responses from 17,451 faculty respondents across 13 countries. The findings are exploratory in nature, as they cover new ground to help us tell a more comprehensive story about technology…

Due to the ready availability of new technologies, opportunities for the incidental as well as deliberate practice of English have multiplied and far exceed what can be done in more formal environments. Yet, despite the sizeable literature on the classroom-based use of specific digital resources, few studies have investigated how students evaluate…

This keynote address examines the social relations of information technology in the future. Examples of the recent history of technology related to transportation, printing, and nuclear energy are presented. Some troubling examples of the social relations of new information technologies that are emerging on a global scale are then discussed,…

Congress of the U.S., Washington, DC. Office of Technology Assessment.

This report considers the management, use, and congressional oversight of information technology in the Federal Government as rapid advances in technology--e.g., microcomputers, computer networking, computer modeling, videoconferencing, and electronic information exchange--are generating many new applications, opportunities, and issues which are…

In 2003, Nicholas Carr published in "Harvard Business Review" his article "IT Doesn't Matter," which rekindled the debate on the strategic importance of information technology (IT). Chief Information Officers (CIOs) of community colleges are now faced with the challenge of seeking the best technology for their institutions. The…

The purpose of this project was to determine workforce needs in the new information technology/knowledge-based world in order to design a coherent minor program in information technology at the University of Arkansas at Little Rock for the non-technically oriented college student. The process consisted of three phases: site visits to five…

This paper discusses the difficulty of accurately predicting the future role of information technology, presents an overview of technological advances, and highlights such special interest areas as virtual reality, the information highway, and the influence of computers on traditional ways of thinking. (SM)

This article discusses how information technologies and globalization have opened new avenues and horizons for educators and learners. It discusses different experiences of using information and communication technologies (ICTs) in the teaching-learning process the world over in the age of globalization. It focuses on the ways these new trends have…

The paper reports on an approach to teaching a course in information technology research methodology in a doctoral program, the Doctor of Management in Information Technology (DMIT), in which research, with focus on finding innovative solutions to problems found in practice, comprises a significant part of the degree. The approach makes a…

It is common to find final or near-final year undergraduate Information Technology students undertaking a substantial development project; a project where the students have the opportunity to be fully involved in the analysis, design, and development of an information technology service or product. This involvement has been catalyzed and prepared…

As the demand for Information Systems (IS) and Information Technology (IT) graduates remains strong, it is imperative that the curricula in IS and IT programs meet employer needs. IS and IT educators encounter a continuing challenge to ensure that their courses and curriculum stay up to date with new and evolving technological changes in the…

The Technological Innovation and Cooperation for Foreign Information Access (TICFIA) Program supports projects focused on developing innovative technologies for accessing, collecting, organizing, preserving, and disseminating information from foreign sources to address the U.S.' teaching and research needs in international education and foreign…

Our nation's competitive edge is highly dependent on the success of STEM education and the ability of information technology (IT) graduates to find jobs. The School of Information Technology at Illinois State University (ISU) is strategically positioned to offer S-STEM scholarships to talented, financially disadvantaged students in the IT…

Information technology (IT) has become a strategic vehicle for small businesses to achieve and sustain their competitive advantage. Prior research has suggested that information technology plays an important role in the decision-making process. The purpose of this study is to examine the relationship between organizational IT performance and…

This 57-minute videotape covers the "Florida Educators Using Information Technology" session of the "Eco-Informa '96" conference. Two speakers presented examples of environmental educators using information technology. The first speaker, Brenda Maxwell, is the Director and Developer of the Florida Science Institute based at…

Looking out an office window or exploring a community park, one can easily see the tremendous challenges that biological information presents the computer science community. Biological information varies in format and content depending on whether it pertains to a particular species (e.g., the Brown Tree Snake) or to a specific ecosystem, which often includes multiple species, land-use characteristics, and geospatially referenced information. The complexity and uniqueness of each individual species or ecosystem do not easily lend themselves to today's computer science tools and applications. To address the challenges that the biological enterprise presents, the National Biological Information Infrastructure (NBII) (http://www.nbii.gov) was established in 1993. The NBII is designed to address these issues on a national scale within the United States and, through international partnerships, abroad. This paper discusses current computer science efforts within the National Biological Information Infrastructure Program and future computer science research endeavors that are needed to address the ever-growing issues related to our Nation's biological concerns.

As China's urbanization advances, ensuring that people survive earthquakes requires scientific, routine emergency evacuation drills. Drawing on cellular automata, shortest-path algorithms, and collision avoidance, we designed a model of earthquake emergency evacuation drills for school scenes and, based on this model, built simulation software for such drills. The software simulates an earthquake emergency evacuation drill by building a spatial structural model and selecting occupants' locations based on the actual conditions of the buildings. Based on the simulation data, a drill can then be run in the same building. RFID technology can be used for drill data collection, reading personal information and sending it to the evacuation simulation software via WiFi. The simulation software then compares the simulated data with the actual evacuation process, including evacuation time, evacuation paths, congestion nodes, and so on, and finally provides a comparative analysis report with assessment results and optimization proposals. We hope the earthquake emergency evacuation drill software and trainer can provide an end-to-end disposal concept for earthquake emergency evacuation drills in assembly occupancies. The trainer can make earthquake emergency evacuation more orderly, efficient, reasonable, and scientific, increasing cities' capacity to cope with urban hazards.
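The shortest-path component this abstract describes can be sketched with a breadth-first search over a grid floor plan. The grid layout, exit placement, and function names below are illustrative assumptions for a minimal sketch, not the authors' implementation.

```python
from collections import deque

def evacuation_distances(grid, exits):
    """Multi-source BFS from all exit cells over a 0/1 grid (1 = wall).
    Returns a dict mapping each reachable cell to its step distance to
    the nearest exit; following decreasing distances yields a shortest
    evacuation path."""
    rows, cols = len(grid), len(grid[0])
    dist = {e: 0 for e in exits}
    queue = deque(exits)
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    return dist

# Toy floor plan: 0 = walkable, 1 = wall; one exit at the top-left corner.
floor = [
    [0, 0, 1],
    [1, 0, 0],
    [0, 0, 0],
]
d = evacuation_distances(floor, [(0, 0)])
print(d[(2, 2)])  # steps from the far corner to the exit
```

A cellular-automaton drill simulator would then move each agent one cell per tick toward a lower-distance neighbor, with collision-avoidance rules deciding ties at congested cells.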

The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.
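A statistical ground-failure model of the kind described, driven by peak ground motion, slope, and distance to water, is often a logistic regression. The sketch below illustrates that form only; the coefficients and function name are invented for illustration and are not the USGS models.

```python
import math

def liquefaction_probability(pga_g, slope_deg, dist_to_water_km,
                             b0=-4.0, b_pga=6.0, b_slope=-0.1, b_water=-0.2):
    """Hypothetical logistic susceptibility model: probability of
    liquefaction from peak ground acceleration (g), topographic slope
    (degrees), and distance to the nearest water body (km). All
    coefficients are made-up placeholders, not calibrated values."""
    x = (b0 + b_pga * pga_g + b_slope * slope_deg
         + b_water * dist_to_water_km)
    return 1.0 / (1.0 + math.exp(-x))

# Stronger shaking raises the probability; greater distance to water lowers it.
print(liquefaction_probability(0.3, 1.0, 0.5))
```

In a real-time product, this function would be evaluated per grid cell against a ShakeMap, with the predictor rasters supplying `pga_g`, `slope_deg`, and `dist_to_water_km`.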

Provides an introduction to fulfilling the information needs of technology transfer. Highlights include a definition of technology transfer; government and university involvement; industry's role; publishers; an annotated list of information sources and contacts; technology assessment, including patent searching, competitive intelligence, and…

The City Technology Colleges in Great Britain were developed to teach science and information technology to students from 11 to 18 years of age. The approach adopted at the Bradford City Technology College, which was to open in September 1990, is one in which emphasis is given to combining established technologies with a new educational context.…

M ≥ 7 earthquakes have shown obvious commensurability and orderliness in Xinjiang, China, and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11–12 a, 41–43 a, 18–19 a, and 5–6 a. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered network structure analysis with complex network technology, focus on summarizing predictions of M ≥ 7 earthquakes using the ordered network structure, and add new information to further optimize the network, thereby constructing 2D and 3D ordered network structures of M ≥ 7 earthquakes. The network structure fully reveals the regularity of M ≥ 7 seismic activity in the study region over the past 210 years. On this basis, the 1996 Karakorum M7.1 earthquake, the 2003 M7.9 earthquake on the frontier of Russia, Mongolia, and China, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. We also present a new prediction: the next two M ≥ 7 earthquakes in this region will probably occur around 2019–2020 and 2025–2026. The results show that large earthquakes occurring in a defined region can be predicted. The method of ordered network structure analysis produces satisfactory results for mid- and long-term prediction of M ≥ 7 earthquakes.
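The arithmetic behind period-based projections like these can be illustrated with a toy scorer: project each past event forward by integer multiples of the candidate recurrence periods and count how often candidate years are hit. This is a hypothetical sketch of the idea of commensurability, not the authors' ordered-network method.

```python
from collections import Counter

def project_events(event_years, periods, horizon=(2015, 2030)):
    """Score candidate future years: each (past event + k * period) that
    lands inside the horizon earns one vote. Returns (year, votes) pairs,
    most-voted first."""
    hits = Counter()
    for y in event_years:
        for p in periods:
            k = 1
            while y + k * p <= horizon[1]:
                t = y + k * p
                if t >= horizon[0]:
                    hits[t] += 1
                k += 1
    return hits.most_common()

# Illustrative only: the 2014 Yutian event plus the 5-6 a period from the
# abstract projects to 2020 and 2026.
print(project_events([2014], [6]))
```

The real analysis overlays many periods and events so that genuinely commensurable years accumulate votes while coincidental hits stay low.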

Such a patchwork of separate systems neither improves information sharing nor guarantees the safety and security of communities and personnel in the field. In many organizations, security may not necessarily be the expertise of people working in the field, or security and safety issues may be set aside; yet functioning information sharing between organisations improves situational awareness and the safety and security of all crisis management personnel in crisis areas.

The aim of the study was to evaluate the incidences and obstetric outcomes of women who conceived using assisted reproductive technology (ART) procedures in Fukushima Prefecture before and after the Great East Japan Earthquake and Fukushima nuclear power plant accident. Information was collected and analyzed from 12,070 women who conceived with or without ART in Fukushima Prefecture during the 9 months before and after the disaster. During the 9 months before and after the disaster, 138 (2.0%) and 102 (1.9%) women conceived with in vitro fertilization-embryo transfer (IVF-ET), respectively. The proportion of women who conceived with IVF-ET decreased during the 2 months immediately after the disaster, but returned to pre-disaster levels 3 months after the disaster. In the case of women who conceived without IVF-ET, the incidences of preterm birth and low birth weight increased after the disaster. In contrast, women who conceived with IVF-ET did not differ significantly in obstetric outcomes before and after the disaster but had a higher incidence of cesarean section and low birth weight compared to those who conceived without IVF-ET, regardless of the study period. The influence of the disaster on women who conceived using ART procedures was minimal.

Quantitative approaches to the insider threat, however, "are of little value for examining complex social relationships or intricate patterns of interaction" (Marshall and …). U.S. relations with the rest of the world will have some bearing on the social interactions, trends, and psyches involved, while technology trends have given organizations the ability to bring economic and social issues into focus in a media-rich environment.

PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
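The comparison PAGER performs, population exposed at each shaking level against empirical loss rates, can be reduced to a one-line expected-value sum. The numbers and function below are toy placeholders for illustration, not PAGER's calibrated country-specific models.

```python
def expected_fatalities(exposure, fatality_rate):
    """Toy PAGER-style estimate: sum over shaking-intensity bins of
    (population exposed at that intensity) * (empirical fatality rate).
    Bins with no rate entry contribute zero."""
    return sum(pop * fatality_rate.get(mmi, 0.0)
               for mmi, pop in exposure.items())

# Illustrative exposure (people per Modified Mercalli Intensity bin) and
# made-up fatality rates (deaths per exposed person).
exposure = {6: 500_000, 7: 120_000, 8: 30_000, 9: 4_000}
rates = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3}
print(expected_fatalities(exposure, rates))
```

Alert levels can then be thresholds on this expected loss (and on its economic counterpart) rather than on magnitude or raw exposure alone, which is the change the abstract describes.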

With the passage of The American Reinvestment and Recovery Act of 2009 that includes the Health Care Information Technology for Economic & Clinical Health Act, the opportunity for states to develop a Health Information Technology Center (THITC) has emerged. The Center provides the intellectual, financial, and technical leadership along with the governance and oversight for all health information technology-related activities in the state. This Center would be a free-standing, not-for-profit, public-private partnership that would be responsible for operating one or more (in large states) Regional Health Information Technology Extension Centers (Extension Centers) along with several Regional Health Information Exchanges (HIEs) and one or more Regional Health Information Data Centers (Data Centers). We believe that if these features and functions could be developed, deployed, and integrated statewide, the health and welfare of the citizens of the state could be improved while simultaneously reducing the costs associated with the provision of care.

A major challenge for Information Technology (IT) programs is that the rapid pace of evolution of computing technology leads to frequent redesign of IT courses. The problem is exacerbated by several factors. Firstly, the changing technology is the subject matter of the discipline and is also frequently used to support instruction; secondly, this…

This paper studies the role and the importance of Information and Communication Technologies for Development (ICT4D) education in Information Technology (IT) programs. The research included the students who attended an ICT4D course at NYiT Amman Campus in the academic years of 2006 to 2009. Data were collected through two questionnaires developed…

We are investigating the development and application of collaborative multimedia conferencing software for education and other groupwork activities, and are extending an alternative environment for place-based synchronous groupwork built on the same collaborative infrastructure. This information is being used as an initial user profile and requirements analysis.

Information systems educators must balance the need to protect the stability, availability, and security of computer laboratories with the learning objectives of various courses. In advanced courses where students need to install, configure, and otherwise manipulate application and operating system settings, this is especially problematic as these…

There is a growing body of research focused on the use of social media and Internet technologies for health education and information sharing. The authors reviewed literature on this topic, with a specific focus on the benefits and concerns associated with using online social technologies as health education and communication tools. Studies suggest that social media technologies have the potential to safely and effectively deliver health education, if privacy concerns are addressed. Utility of social media-based health education and communication will improve as technology developers and public health officials determine ways to improve information accuracy and address privacy concerns. PMID:24465171

Studies report physician resistance to information technology in a time when the practice of medicine could benefit from technological support. Anecdotally, it is suspected that lack of training, discomfort with technological innovations, a perceived shift in the doctor/patient relationship, or medical/legal issues may account for this circumstance. Empirical studies attribute this lag to age, personality factors, behavioral issues, and occupational influences. This paper integrates the information technology and consumer behavior literatures to discuss physicians' acceptance, adoption, and application of IT.

Results of geologic studies using data collected by the NASA/JPL Thermal Infrared Imaging Spectrometer (TIMS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and the Airborne Synthetic Aperture Radar (AIRSAR) are discussed. These instruments represent prototypes for the Earth Observing System (EOS) satellite instruments ASTER, High Resolution Imaging Spectrometer (HIRIS), and EOS SAR. Integrated analysis of this data type is one of the keys to successful geologic research using EOS. TIMS links the physical properties of surface materials in the 8–12-μm region to their composition. Calibrated aircraft data make direct lithological mapping possible. AVIRIS, an analog for HIRIS, provides quantitative information about the surface composition of materials based on their detailed visible and infrared spectral signatures (0.4–2.45 μm). Calibrated AVIRIS data make direct identification of minerals possible. The AIRSAR provides additional complementary information about the surface morphology of rocks and soils.

Remaining challenges include the absence of Enterprise funding models for shared services, security access regulations for shared services, and difficulties associated with the Office of the Chief Information Officer's oversight role. Progress has nonetheless been made within the DHS IT community: a common infrastructure and shared services is the vision for the Infrastructure Transformation Program at DHS and is the means by which to reduce IT commodity …

Under AR 25-1 and the related Department of the Army Pamphlet 25-1-1 (DA PAM 25-…), organizations must periodically certify that their APMS information is accurate and complete. In the workflow examined, the GFEBS purchase request was reviewed by the Resource Manager, who certified that funds were available to cover the request, and was then routed to the servicing Contracting Office for review. Once a system is certified by the Defense Business Systems Management Committee (DBSMC), it must be registered in APMS and its data maintained there.

Results of a survey of the information resources available in industrialized countries which might be used in a United Nations technology transfer program for developing countries are presented. Information systems and networks, organized information collections of a scientific and technical character, and the machinery used to disseminate this…

This discussion of the development and application of lasers and microprocessors to information processing stresses laser communication in relation to capacity, reliability, and cost and the advantages of this technology to real-time information access and information storage. The increased capabilities of microprocessors are reviewed, and a…

This article concerns one of the main problems facing nurse education, that of meeting individualised learner needs. This endeavour is inescapable because of current trends in the curriculum, trends towards continuous assessment and more recently, advice from the English National Board (ENB) regarding continuous theoretical assessment. Computer assisted learning, it is suggested, can be helpful in nurturing individual learner progress. Sophisticated technologies are available to educationalists which develop individual learning strategies, but the cost of producing the necessary courseware is high, both in terms of money and tutor time. Hopefully a solution has been found as a project has been funded and is being run by the ENB allowing tutors to develop skills in this area of education.

Mobile technology is becoming involved in every sphere of life, including medical health care. There has been an immense upsurge in mobile phone-based health innovations in recent years. The expansion of mobile phone networks and the proliferation of inexpensive mobile handsets have made digital information and communication technology capabilities readily available for people to exploit for any purpose, including health care. Mobile phone-based innovations can transform weak and underperforming health information systems into more modern and efficient ones. The present review article covers all these aspects of mobile technology in health care.

Families, clinicians and researchers involved with varying neurological disorders face amazing challenges to understand, treat, and assist the people they are serving. Autism brings unique challenges and serves as an important model for the application of important concepts in information technology and telemedicine. The rising incidence of autism with limited professional resources has led to more consideration for using information technology and related specialties to link families and professionals, and to implement strategies which apply information technology to improve the outcomes for individuals with autism and their families. These are reviewed in context of the unique health, education, and the research issues facing those dealing with autism.

Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

The Cascadia Subduction Zone (CSZ) can generate earthquakes as powerful as moment magnitude 9, causing great damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4, and M8.1 scenarios, which present persistent, long-lasting shaking together with other geological threats such as landslides, liquefaction-induced ground deformations, fault-rupture vertical displacement, tsunamis, etc. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines, and other lifelines. Lifeline providers in Oregon, including private and public practices responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three- to five-minute-long earthquake scenario expected in Oregon necessitates a very different method of risk mitigation for these major lifelines than those created for shorter shakings from crustal earthquakes. A web-based geographic information system tool was developed to fully assess the potential hazard from the multiple threats impending from Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access to the latest and best available hazard information over the web, including work completed in the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to help make appropriate decisions, and it requires only minimal knowledge of GIS to use.

The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

With rapid economic development, the Internet of Things (IoT), built on Internet technology, has drawn increasing attention from all sectors of society and has begun to penetrate various fields. The IoT is an extension of the Internet; the difference between the two is that the IoT aims to achieve the exchange of information and data and to connect people and goods, item to item, through a variety of technologies and devices. Information perception and interaction technology are two very important technologies in the development of the IoT, and also important technologies in the history of network technology. This paper briefly analyzes the characteristics of information perception and the difference between IoT interaction technology and human-computer interaction technology, and on this basis elaborates on these two aspects: information perception and interaction technology.

…science fields in order to combine efforts to better understand multiple network systems, including technical, biological, and social networks... The Flowing Valued Information (FVI) project has been discussed at the Network Science Workshops linked from the Center website, and in the FVI reports and

To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed more detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

To deliver information to providers across the U.S., the Agency for Healthcare Research and Quality's National Resource Center for Health IT (NRC) created a public domain Web site containing a number of tools and resources. Specifically lacking from this Web site is a glossary of health IT terminology. To address this omission and respond to requests from Web site users, the Regenstrief Institute created the Health IT Glossary. This glossary is designed to provide novices, providers and others new to health IT, with a single source to find basic definitions for a broad range of terms, consistent with the Office of the National Coordinator (ONC) effort. The glossary is a living document, and feedback is welcomed from the health informatics community.

Project #OA-FY14-0307, June 10, 2014. The U.S. Environmental Protection Agency (EPA) Office of Inspector General (OIG) plans to begin preliminary research on the EPA's management of information technology (IT) investments.

The Center's primary function is to facilitate technology transfer within DoD, other government agencies, and industry. The DoD has recognized the importance of technology transfer, not only to support specific weapon system manufacture, but to strengthen the industrial base that sustains DoD. MTIAC uses an experienced technical staff of engineers and information specialists to acquire, analyze, and disseminate technical information. Besides ManTech project data, MTIAC collects manufacturing technology from other government agencies, commercial publications, proceedings, and various international sources. MTIAC has various means of disseminating this information. Much of the technical data is in user-accessible databases. The Center researches and writes a number of technical reports each year and publishes a monthly newsletter. Customized research is performed in response to specific inquiries from government and industry. MTIAC serves as a link between government and industry to strengthen the manufacturing technology base through the dissemination of advanced manufacturing information.

Previous articles in this column have discussed how new information technologies are revolutionizing medical education. In this article, two staff members from the Association of American Medical College's Division of Medical Education discuss how the Association (the AAMC) is working both to support the introduction of new technologies into medical education and to facilitate dialogue on information technology and curriculum issues among AAMC constituents and staff. The authors describe six AAMC initiatives related to computing in medical education: the Medical School Objectives Project, the National Curriculum Database Project, the Information Technology and Medical Education Project, a professional development program for chief information officers, the AAMC ACCESS Data Collection and Dissemination System, and the internal Staff Interest Group on Medical Informatics and Medical Education.

The Commercial Remote Sensing and Spatial Information Technologies (CRS&SI) program was a congressionally mandated program authorized in the Safe, Accountable, Flexible and Efficient Transportation Equity Act: A Legacy for Users (SAFETEA-LU). Under t...

Practice management companies are becoming more prominent players in the health care industry. To improve the performance of the group practices that they acquire, these companies are striving to use updated information technologies.

Educational marketing--which uses marketing methods unique to education institutions, including segmentation, direct mail, and information technology--is discussed. A model for student recruitment developed by the University of Pittsburgh is described. (Author/MLW)

Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

... negative determination regarding the eligibility of workers and former workers of MetLife, Technology, Operations, and Information Technology Groups, Moosic, Pennsylvania (TA-W-73,210) and MetLife, Technology... software testing and quality assurance services, and are not separately identifiable by service supplied...

International Federation of Library Associations and Institutions, The Hague (Netherlands).

Four papers on information technology were presented at the 1986 International Federation of Library Associations (IFLA) conference. In the paper "Optical Disc Technology Used for Large-Scale Data Base," Naoto Nakayama (Japan) considers the rapid development of optical technology and the role of applications such as optical discs,…

Learning and using new technologies by the older people is seen as a demand for their integration in society and as a factor related to active aging. The goal of this article is to understand the attitudes of the elderly towards information and communication technologies in the context of a training course about the utilization of a digital…

The author believes that information technologies are powerfully heuristic in addressing one of education's deepest ambitions. Following Engelbart's paradigm, he sees these technologies as augmenting human intellect, not simply because they permit high-speed calculations but also because they externalize our own cognitive processes in a way that…

Educational changes in Wyoming that are linked to the emergence of new informational technologies are considered. Attention is directed to the following topics: assumptions for Wyoming educators as they plan to respond to the impact of technology on teacher education; the importance of educational goals and objectives; the national climate…

Digital dentistry is not the wave of the future; it is occurring now. Whether a dentist embraces new technology will define his or her practice and, possibly, future. The aim of this article is to inform practitioners of the various components that constitute a digital dental practice, the technologies available today, and those on the horizon.

Experience with the Doctor of Management in Information Technology (DMIT), offered in the College of Management, Lawrence Technological University in the six years since 2002 is described. The mission of the program is to offer doctoral level education for working professionals with high levels of managerial and IT expertise. With a number of…

Business intelligence (BI) and healthcare analytics are emerging technologies that provide analytical capability to help the healthcare industry improve service quality, reduce cost, and manage risks. However, such a component for analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…

Advances in information and communication technologies (ICT) require educators to deliver more efficient, modern education using these technologies. The role of ICT in the development of education has therefore become a popular research subject. Attention has turned not only to educational content but also to how…

A significant number of universities and higher educational institutions have adopted the latest technology and implemented it productively, as part of their responsibility for the development of skilled human resources in their respective areas of specialization. Information and Communication Technology (ICT) has grown tremendously around the globe…

This publication highlights 48 recent ERIC listings which help to explain the variety of emerging technologies for the delivery of information in educational settings. Specific technologies addressed include cable television, electronic mail, satellite communication, teleconferencing, videodisc, and videotex. Entries were selected for inclusion…

The issues of introducing building information modeling (BIM) in the construction industry are considered in this work. The advantages of this approach and the prospects for the transition to new design technologies, construction process management, and operation in the near future are stated. The importance of developing pilot projects that should identify the ways and means of verifying the regulatory and technical base, as well as economic indicators, in the transition to building information technologies in construction is noted.

Modern day information technology (IT) is converging around wireless networks. It is now possible to check E-mail and view information from the World Wide Web from commercially available mobile phones. For individuals with disabilities, the ability to access multiple and different types of information not only promises convenience, but also can help to promote independence and facilitate access to public and private information systems. There are many barriers to access for people with disabilities, including technological hurdles, security, privacy, and access to these emerging wireless technologies. However, legislation, advocacy, standards, and research and development can ensure that users of augmentative and alternative communication (AAC) and assistive technology have access to these technologies. This article provides a historical context for the field of AAC and IT development, a review of the current state of these technologies, a glimpse of the potential of wireless information access for the lives of AAC users, and a description of some of the barriers and enablers to making access available to users of AAC and assistive technologies.

This pilot study looks into how information technology practices are being conducted in student affairs. It compares common practices against which exemplary programs and best practices can be measured. After gathering information from five universities, a model was created that encompassed policy, staffing, technology, and practice as the best…

This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
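
The pipeline the abstract describes, pick detection, association of picks into discrete events, and computation of network magnitudes, can be illustrated with a toy sketch. This is not Hydra's actual algorithm; the time-gap association rule, the median aggregation, the station codes, and the thresholds are all illustrative assumptions:

```python
from dataclasses import dataclass, field
from statistics import median

@dataclass
class Pick:
    station: str
    arrival_time: float       # seconds since some epoch
    station_magnitude: float  # single-station magnitude estimate

@dataclass
class Event:
    picks: list = field(default_factory=list)

    @property
    def magnitude(self):
        # Network magnitude as the median of station magnitudes,
        # a common robust aggregation (illustrative choice here).
        return median(p.station_magnitude for p in self.picks)

def associate(picks, window=120.0):
    """Group time-sorted picks into events: start a new event whenever
    the gap to the previous arrival exceeds `window` seconds."""
    events = []
    for pick in sorted(picks, key=lambda p: p.arrival_time):
        if events and pick.arrival_time - events[-1].picks[-1].arrival_time <= window:
            events[-1].picks.append(pick)
        else:
            events.append(Event([pick]))
    return events

picks = [
    Pick("ANMO", 0.0, 6.1), Pick("COLA", 40.0, 6.3), Pick("KIP", 75.0, 6.2),
    Pick("ANMO", 4000.0, 5.0), Pick("KIP", 4030.0, 5.2),
]
events = associate(picks)
print(len(events))           # 2 events separated by the long gap
print(events[0].magnitude)   # 6.2
```

A real associator would of course use travel-time predictions and spatial consistency rather than a simple time gap, but the data flow, picks in, events with derived parameters out, is the same.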

To review published evidence about using information technology interventions in diabetes care and determine their effects on managing diabetes. Systematic review of information technology based interventions. MEDLINE®/PubMed was electronically searched for articles published between 2004/07/01 and 2014/07/01. A comprehensive, electronic search strategy was used to identify eligible articles. Inclusion criteria were defined based on type of study and effect of information technology based intervention in relation to glucose control and other clinical outcomes in diabetic patients. Studies must have used a controlled design to evaluate an information technology based intervention. A total of 3613 articles were identified based on the searches conducted in MEDLINE from PubMed. After excluding duplicates (n = 6), we screened titles and abstracts of 3607 articles based on inclusion criteria. The remaining articles matching the inclusion criteria (n = 277) were reviewed in full text, and 210 articles were excluded based on exclusion criteria. Finally, 67 articles complied with our eligibility criteria and were included in this study. The effects of various information technology based interventions on clinical outcomes in diabetic patients, extracted from the selected articles, are described and compared. Information technology based interventions combined with usual care are associated with improved glycemic control, with differing efficacy on various clinical outcomes in diabetic patients.
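
The screening counts the review reports form a simple PRISMA-style flow, and their internal consistency can be checked arithmetically; the snippet below just restates the numbers from the abstract:

```python
# Screening counts as reported in the review abstract.
identified = 3613
duplicates = 6
screened = identified - duplicates          # titles/abstracts screened
full_text = 277                             # matched inclusion criteria
excluded_full_text = 210
included = full_text - excluded_full_text   # studies in the final synthesis

# The reported figures are internally consistent:
# 3613 - 6 = 3607 screened, and 277 - 210 = 67 included.
print(screened, included)
```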

The aim of this study was to investigate the influence of information and communication technology use and skills on nursing performance. Questionnaires were prepared relating to using the technology, practical skills in utilizing information, the Six-Dimension Scale of Nursing Performance, and demographics. In all, 556 nurses took part (response rate, 72.6%). A two-way analysis of variance was used to determine the influence of years of nursing experience on the relationship between nursing performance and information and communication technology use. The results showed that the group possessing high technological skills had greater nursing ability than the group with low skills; the level of nursing performance improved with years of experience in the former group, but not in the latter group. Regarding information and communication technology use, the results showed that nursing performance improved among participants who used computers for sending and receiving e-mails, but it decreased for those who used cell phones for e-mail. The results suggest that nursing performance may be negatively affected if information and communication technology are inappropriately used. Informatics education should therefore be provided for all nurses, and it should include information use relating to cell phones and computers.

Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.
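
The core joining step, assigning each surveyed child an exposure category from the ShakeMap shaking intensity at their area of residence, can be sketched as follows. The intensity cut points, record layout, and category names here are illustrative assumptions, not those used in the study:

```python
def exposure_group(mmi):
    """Map a ShakeMap Modified Mercalli Intensity (MMI) value at a
    child's survey cluster to a coarse exposure category.
    Cut points are illustrative, not the study's actual thresholds."""
    if mmi is None or mmi < 4.0:
        return "unexposed"
    if mmi < 6.0:
        return "moderately exposed"
    return "severely exposed"

# Hypothetical records after spatially joining survey clusters to the
# ShakeMap grid: (child_id, MMI at the child's cluster location).
children = [("c1", 2.5), ("c2", 5.1), ("c3", 7.8)]
groups = {cid: exposure_group(mmi) for cid, mmi in children}
print(groups)
```

Stunting prevalence can then be compared across these groups, and before versus after the event, exactly as the abstract describes.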